Inverse distribution
Not to be confused with Inverse distribution function.
In probability theory and statistics, an inverse distribution is the distribution of the reciprocal of a random variable. Inverse distributions arise in particular in the Bayesian context of prior distributions and posterior distributions for scale parameters. In the algebra of random variables, inverse distributions are special cases of the class of ratio distributions, in which the numerator random variable has a degenerate distribution.
History
Main article: History of statistics § Bayesian statistics
The term Bayesian derives from Thomas Bayes (1702–1761), who proved a special case of what is now called Bayes' theorem in a paper titled "An Essay towards solving a Problem in the Doctrine of Chances". In that special case, the prior and posterior distributions were beta distributions and the data came from Bernoulli trials. It was Pierre-Simon Laplace (1749–1827) who introduced a general version of the theorem and used it to approach problems in celestial mechanics, medical statistics, reliability, and jurisprudence. Early Bayesian inference, which used uniform priors following Laplace's principle of insufficient reason, was called "inverse probability" (because it infers backwards from observations to parameters, or from effects to causes). After the 1920s, "inverse probability" was largely supplanted by a collection of methods that came to be called frequentist statistics.
Justification of Bayesian probabilities
The use of Bayesian probabilities as the basis of Bayesian inference has been supported by several arguments, such as the Cox axioms, the Dutch book argument, arguments based on decision theory, and de Finetti's theorem.
Axiomatic approach
Richard T. Cox showed that Bayesian updating follows from several axioms, including two functional equations and a hypothesis of differentiability. The assumption of differentiability or even continuity is controversial; Halpern found a counterexample based on his observation that the Boolean algebra of statements may be finite. Other axiomatizations have been suggested by various authors with the purpose of making the theory more rigorous.
Dutch book approach
Main article: Dutch book
Bruno de Finetti proposed the Dutch book argument based on betting. A clever bookmaker makes a Dutch book by setting the odds and bets to ensure that the bookmaker profits—at the expense of the gamblers—regardless of the outcome of the event (a horse race, for example) on which the gamblers bet. It is associated with probabilities implied by the odds not being coherent.
However, Ian Hacking noted that traditional Dutch book arguments did not specify Bayesian updating: they left open the possibility that non-Bayesian updating rules could avoid Dutch books. For example, Hacking writes "And neither the Dutch book argument, nor any other in the personalist arsenal of proofs of the probability axioms, entails the dynamic assumption. Not one entails Bayesianism. So the personalist requires the dynamic assumption to be Bayesian. It is true that in consistency a personalist could abandon the Bayesian model of learning from experience. Salt could lose its savour."
In fact, there are non-Bayesian updating rules that also avoid Dutch books (as discussed in the literature on "probability kinematics" following the publication of Richard C. Jeffrey's rule, which is itself regarded as Bayesian). The additional hypotheses sufficient to (uniquely) specify Bayesian updating are substantial and not universally seen as satisfactory.
Decision theory approach
A decision-theoretic justification of the use of Bayesian inference (and hence of Bayesian probabilities) was given by Abraham Wald, who proved that every admissible statistical procedure is either a Bayesian procedure or a limit of Bayesian procedures. Conversely, every Bayesian procedure is admissible.
Personal probabilities and objective methods for constructing priors
In the 20th century, the ideas of Laplace developed in two directions, giving rise to objective and subjective currents in Bayesian practice. Harold Jeffreys' Theory of Probability (first published in 1939) played an important role in the revival of the Bayesian view of probability, followed by works by Abraham Wald (1950) and Leonard J. Savage (1954). The adjective Bayesian itself dates to the 1950s; the derived terms Bayesianism and neo-Bayesianism are 1960s coinages. In the objectivist stream, the statistical analysis depends only on the model assumed and the data analysed; no subjective decisions need to be involved. In contrast, "subjectivist" statisticians deny the possibility of fully objective analysis for the general case.
In the 1980s, there was a dramatic growth in research and applications of Bayesian methods, mostly attributed to the discovery of Markov chain Monte Carlo methods and the consequent removal of many of the computational problems, and to an increasing interest in nonstandard, complex applications. While frequentist statistics remains strong (as demonstrated by the fact that much of undergraduate teaching is based on it), Bayesian methods are widely accepted and used, e.g., in the field of machine learning.
Relation to original distribution
In general, given the probability distribution of a random variable X with strictly positive support, it is possible to find the distribution of the reciprocal, Y = 1 / X. If the distribution of X is continuous with density function f(x) and cumulative distribution function F(x), then the cumulative distribution function, G(y), of the reciprocal is found by noting that
G(y) = \Pr(Y \leq y) = \Pr\left(X \geq \frac{1}{y}\right) = 1 - \Pr\left(X < \frac{1}{y}\right) = 1 - F\left(\frac{1}{y}\right).
Then the density function of Y is found as the derivative of the cumulative distribution function:
g(y) = \frac{1}{y^{2}} f\left(\frac{1}{y}\right).
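The change-of-variables relation above can be checked by simulation. The following sketch (not part of the original text) takes X to be standard exponential, an arbitrary choice with strictly positive support, for which F(x) = 1 − e^(−x) and hence G(y) = 1 − F(1/y) = e^(−1/y):

```python
import math
import random

random.seed(0)

# X ~ Exponential(1): F(x) = 1 - exp(-x), support (0, inf).
# Then Y = 1/X has CDF G(y) = 1 - F(1/y) = exp(-1/y) for y > 0.
n = 200_000
ys = [1.0 / random.expovariate(1.0) for _ in range(n)]

for y in (0.5, 1.0, 2.0):
    empirical = sum(v <= y for v in ys) / n
    exact = math.exp(-1.0 / y)
    print(f"G({y}) ~ {empirical:.4f}  (exact {exact:.4f})")
```

The empirical and exact CDF values agree to roughly Monte Carlo precision (about ±0.003 at this sample size).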
Examples
Reciprocal distribution
The reciprocal distribution has a density function of the form
f(x) \propto x^{-1} \quad \text{for } 0 < a < x < b,
where ∝ means "is proportional to". It follows that the inverse distribution in this case is of the form
g(y) \propto y^{-1} \quad \text{for } 0 \leq b^{-1} < y < a^{-1},
which is again a reciprocal distribution.
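This closure property can be illustrated numerically. The sketch below (an illustration only, with arbitrarily chosen endpoints a = 2 and b = 8) samples the reciprocal distribution by inverse-CDF sampling and checks that Y = 1/X has the reciprocal-distribution CDF G(y) = ln(yb)/ln(b/a) on (1/b, 1/a):

```python
import math
import random

random.seed(1)

a, b = 2.0, 8.0        # arbitrary endpoints, 0 < a < b
n = 200_000

# Inverse-CDF sampling from the reciprocal distribution on (a, b):
# F(x) = ln(x/a) / ln(b/a), so x = a * (b/a)**u for u ~ Uniform(0, 1).
xs = [a * (b / a) ** random.random() for _ in range(n)]
ys = [1.0 / x for x in xs]

# If Y = 1/X is again reciprocal, now on (1/b, 1/a), its CDF is
# G(y) = ln(y*b) / ln(b/a).
y = 0.25               # a point inside (1/b, 1/a) = (0.125, 0.5)
empirical = sum(v <= y for v in ys) / n
exact = math.log(y * b) / math.log(b / a)
print(f"G({y}) ~ {empirical:.4f}  (exact {exact:.4f})")
```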
Inverse uniform distribution
If the original random variable X is uniformly distributed on the interval (a, b), where a > 0, then the reciprocal variable Y = 1/X takes values in the range (b^{-1}, a^{-1}), and its probability density function in this range is
g(y) = y^{-2} \frac{1}{b - a},
and is zero elsewhere.
The cumulative distribution function of the reciprocal, within the same range, is
G(y) = \frac{b - y^{-1}}{b - a}.
For example, if X is uniformly distributed on the interval (0, 1), then Y = 1/X has density g(y) = y^{-2} and cumulative distribution function G(y) = 1 - y^{-1} when y > 1.
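A quick simulation of this (0, 1) example, included here as an illustrative sketch:

```python
import random

random.seed(2)

# X ~ Uniform(0, 1); using 1 - random() keeps X in (0, 1] and avoids
# a (measure-zero but possible) division by zero.
# Y = 1/X should then have CDF G(y) = 1 - 1/y for y > 1.
n = 200_000
ys = [1.0 / (1.0 - random.random()) for _ in range(n)]

for y in (2.0, 5.0, 10.0):
    empirical = sum(v <= y for v in ys) / n
    exact = 1.0 - 1.0 / y
    print(f"G({y}) ~ {empirical:.4f}  (exact {exact:.4f})")
```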
Inverse t distribution
Let X be a t distributed random variate with k degrees of freedom. Then its density function is
f(x) = \frac{1}{\sqrt{k\pi}} \frac{\Gamma\left(\frac{k+1}{2}\right)}{\Gamma\left(\frac{k}{2}\right)} \frac{1}{\left(1 + \frac{x^{2}}{k}\right)^{\frac{k+1}{2}}}.
The density of Y = 1 / X is
g(y) = \frac{1}{\sqrt{k\pi}} \frac{\Gamma\left(\frac{k+1}{2}\right)}{\Gamma\left(\frac{k}{2}\right)} \frac{1}{y^{2}\left(1 + \frac{1}{y^{2}k}\right)^{\frac{k+1}{2}}}.
With k = 1, the distributions of X and 1/X are identical (X is then Cauchy distributed with location 0 and scale 1). If k > 1, the distribution of 1/X is bimodal.
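The k = 1 case can be checked numerically. The sketch below generates standard Cauchy variates as ratios of independent standard normals (a standard construction, used here as an assumption rather than taken from the text) and verifies that X and 1/X both satisfy Pr(· ≤ 1) = 1/2 + arctan(1)/π = 3/4:

```python
import random

random.seed(3)

def cauchy():
    """Standard Cauchy variate as the ratio of two independent N(0,1) variates."""
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    while z2 == 0.0:          # guard against a (practically impossible) zero
        z2 = random.gauss(0, 1)
    return z1 / z2

n = 200_000
xs = [cauchy() for _ in range(n)]
inv = [1.0 / x for x in xs]   # x == 0 occurs with probability zero

p_x = sum(v <= 1.0 for v in xs) / n
p_inv = sum(v <= 1.0 for v in inv) / n
print(f"Pr(X <= 1)   ~ {p_x:.4f}")
print(f"Pr(1/X <= 1) ~ {p_inv:.4f}")
```

Both empirical probabilities come out near 0.75, consistent with X and 1/X sharing the same Cauchy distribution.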
Coin flipping
For other uses, see Heads or Tails (disambiguation). For the use of "Heads and Tails" in computer programming, see CAR and CDR. For coin tossing in cricket, see Toss (cricket).
Coin flipping, coin tossing, or heads or tails is the practice of throwing a coin in the air and checking which side is showing when it lands, in order to choose between two alternatives, heads or tails, sometimes used to resolve a dispute between two parties. It is a form of sortition which inherently has two possible outcomes. The party who calls the side that is facing up when the coin lands wins.
A Roman coin with the head of Pompey the Great on the obverse and a ship on the reverse
Coin flipping was known to the Romans as navia aut caput ("ship or head"), as some coins had a ship on one side and the head of the emperor on the other. In England, this was referred to as cross and pile.
Process
During a coin toss, the coin is thrown into the air such that it rotates edge-over-edge several times. Either beforehand or when the coin is in the air, an interested party declares "heads" or "tails", indicating which side of the coin that party is choosing. The other party is assigned the opposite side. Depending on custom, the coin may be caught; caught and inverted; or allowed to land on the ground. When the coin comes to rest, the toss is complete and the party who called correctly or was assigned the upper side is declared the winner.
It is possible for a coin to land on its side, usually by landing up against an object (such as a shoe) or by getting stuck in the ground. Even on a flat surface, however, a coin can land on its edge: a computational model suggests that the chance of an American nickel landing on its edge and staying there is about 1 in 6000. A flipped coin's angular momentum typically prevents it from balancing on its edge unsupported; such edge landings are exceptionally rare, and in most cases the coin is simply re-flipped.
The coin may be any type as long as it has two distinct sides; it need not be a circulating coin as such. Larger coins tend to be more popular than smaller ones. Some high-profile coin tosses, such as the Cricket World Cup and the Super Bowl, use custom-made ceremonial medallions.
Three-way
Three-way coin flips are also possible, by a different process – this can be done either to choose two out of three or to choose one out of three. To choose two out of three, three coins are flipped; if two come up the same and one different, the different one loses (is out), leaving two players. To choose one out of three, either the previous rule is reversed (the odd coin out is the winner) or a regular two-way coin flip between the two remaining players decides. Note that a three-way flip is 75% likely to be decisive each time it is tried: all three coins land the same way (all heads or all tails) with probability 2 × (1/2)³ = 1/4, in which case the flip is repeated until the results differ. The method does not require that "heads" or "tails" be called. A well-known example of such a three-way coin flip (choose two out of three) is dramatized in Friday Night Lights (originally a book, subsequently a film and TV series), wherein three Texas high school football teams use a three-way coin flip. A legacy of that particular 1988 coin flip was reduced use of coin flips to break ties in Texas sports, with point systems used instead to reduce the frequency of ties.
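The 75% figure can be confirmed with a short simulation (an illustrative sketch, not part of the original text):

```python
import random

random.seed(4)

def three_way_round():
    """One round of three coin flips; returns the odd player's index, or None on a tie."""
    flips = [random.random() < 0.5 for _ in range(3)]
    if all(flips) or not any(flips):
        return None                    # all three the same: round is indecisive
    # Exactly one flip differs from the other two; find it.
    minority = flips.count(True) == 1  # True if heads is the odd one out
    return flips.index(minority)

n = 100_000
decisive = sum(three_way_round() is not None for _ in range(n))
print(f"decisive rounds: {decisive / n:.4f}  (exact 0.75)")
```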
Reciprocal normal distribution
See also: Propagation of uncertainty § Reciprocal and shifted reciprocal
If variable X follows a normal distribution \mathcal{N}(\mu, \sigma^{2}), then the inverse or reciprocal Y = 1/X follows a reciprocal normal distribution:
f(y) = \frac{1}{\sqrt{2\pi}\,\sigma y^{2}} e^{-\frac{1}{2}\left(\frac{1/y - \mu}{\sigma}\right)^{2}}.
Graph of inverse of the standard normal distribution
If variable X follows a standard normal distribution \mathcal{N}(0, 1), then Y = 1/X follows a reciprocal standard normal distribution, which is heavy-tailed and bimodal, with modes at \pm\tfrac{1}{\sqrt{2}} and density
f(y) = \frac{e^{-\frac{1}{2y^{2}}}}{\sqrt{2\pi}\,y^{2}},
and the first and higher-order moments do not exist. For such inverse distributions and for ratio distributions, probabilities for intervals can still be defined; these can be computed either by Monte Carlo simulation or, in some cases, by using the Geary–Hinkley transformation.
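As an illustrative sketch of the Monte Carlo approach (the interval [1, 2] and the sample size are arbitrary choices, not from the text), note that for the reciprocal standard normal, Pr(1 ≤ Y ≤ 2) equals Pr(1/2 ≤ X ≤ 1) = Φ(1) − Φ(1/2), which a simulation can reproduce:

```python
import math
import random

random.seed(5)

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

n = 200_000
# Y = 1/X with X ~ N(0, 1); X == 0 occurs with probability zero.
ys = [1.0 / random.gauss(0, 1) for _ in range(n)]

# Pr(1 <= Y <= 2) = Pr(1/2 <= X <= 1) = phi(1) - phi(1/2).
empirical = sum(1.0 <= v <= 2.0 for v in ys) / n
exact = phi(1.0) - phi(0.5)
print(f"Pr(1 <= Y <= 2) ~ {empirical:.4f}  (exact {exact:.4f})")
```

Even though Y has no finite moments, interval probabilities like this one are perfectly well defined and straightforward to estimate.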