Geometric distribution

In probability theory and statistics, the geometric distribution is either of two discrete probability distributions:

  • The probability distribution of the number X of Bernoulli trials needed to get one success, supported on the set { 1, 2, 3, ... }
  • The probability distribution of the number Y = X − 1 of failures before the first success, supported on the set { 0, 1, 2, 3, ... }

Which of these one calls "the" geometric distribution is a matter of convention and convenience.

These two different geometric distributions should not be confused with each other. Often the name shifted geometric distribution is adopted for the former (the distribution of the number X); however, to avoid ambiguity, it is wise to state which is intended by giving the support explicitly.
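
The distinction is easy to see in simulation. Below is a minimal sketch in Python (the function name sample_geometric is ours, for illustration only) that runs Bernoulli trials until the first success and reports both counts: the number of trials X and the number of failures Y = X − 1.

```python
import random

def sample_geometric(p):
    """Run Bernoulli(p) trials until the first success.

    Returns (x, y) where x is the number of trials needed
    (support {1, 2, 3, ...}) and y = x - 1 is the number of
    failures before the first success (support {0, 1, 2, ...}).
    """
    x = 1
    while random.random() >= p:  # this trial failed; try again
        x += 1
    return x, x - 1

random.seed(0)
print([sample_geometric(1/6) for _ in range(5)])  # five (x, y) pairs; y is always x - 1
```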

If the probability of success on each trial is p, then the probability that the kth trial is the first success is

Pr(X = k) = (1 − p)^(k − 1) p

for k = 1, 2, 3, ....

Equivalently, if the probability of success on each trial is p, then the probability that there are k failures before the first success is

Pr(Y = k) = (1 − p)^k p

for k = 0, 1, 2, 3, ....

In either case, the sequence of probabilities is a geometric sequence.

For example, suppose an ordinary die is thrown repeatedly until the first time a "1" appears. The probability distribution of the number of times it is thrown is supported on the infinite set { 1, 2, 3, ... } and is a geometric distribution with p = 1/6.
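
As a concrete check, both probability mass functions are one-liners. The sketch below (the helper names pmf_trials and pmf_failures are ours) evaluates the die example: the probability that the first "1" appears on the third throw is (5/6)^2 · (1/6) ≈ 0.116.

```python
def pmf_trials(k, p):
    """Pr(X = k): first success on the kth trial, k = 1, 2, 3, ..."""
    return (1 - p) ** (k - 1) * p

def pmf_failures(k, p):
    """Pr(Y = k): exactly k failures before the first success, k = 0, 1, 2, ..."""
    return (1 - p) ** k * p

p = 1 / 6
print(pmf_trials(3, p))    # (5/6)**2 * (1/6) ≈ 0.1157
print(pmf_failures(2, p))  # the same event, described via Y = X - 1
```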

Moments and cumulants

The expected value of a geometrically distributed random variable X is 1/p and the variance is (1 − p)/p^2:

E(X) = 1/p,   var(X) = (1 − p)/p^2

Similarly, the expected value of the geometrically distributed random variable Y is (1 − p)/p, and its variance is (1 − p)/p^2:

E(Y) = (1 − p)/p,   var(Y) = (1 − p)/p^2
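
These formulas are easy to sanity-check by simulation. A minimal sketch, assuming NumPy is available (NumPy's geometric sampler follows the X convention, i.e. the number of trials, with support {1, 2, 3, ...}):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.25
x = rng.geometric(p, size=1_000_000)  # X variant: number of trials
y = x - 1                             # Y variant: number of failures

print(x.mean(), 1 / p)                # both ≈ 4.0
print(x.var(), (1 - p) / p ** 2)      # both ≈ 12.0
print(y.mean(), (1 - p) / p)          # both ≈ 3.0
print(y.var(), (1 - p) / p ** 2)      # shifting by 1 leaves the variance unchanged
```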

Let μ = (1 − p)/p be the expected value of Y. Then the cumulants κ_n of the probability distribution of Y satisfy the recursion

κ_{n+1} = μ(μ + 1) dκ_n/dμ
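
One way to convince oneself of the recursion is to compute the first few cumulants from the cumulant generating function of Y, K(t) = log(p / (1 − (1 − p)e^t)), and compare symbolically. A sketch using SymPy (the substitution p = 1/(1 + μ) rewrites each cumulant in terms of μ):

```python
import sympy as sp

t, p, m = sp.symbols('t p m', positive=True)  # m plays the role of mu

# Cumulant generating function of Y: K(t) = log E[exp(t*Y)]
K = sp.log(p / (1 - (1 - p) * sp.exp(t)))

def kappa(n):
    """nth cumulant of Y: nth derivative of K at t = 0."""
    return sp.simplify(sp.diff(K, t, n).subs(t, 0))

# Check kappa_{n+1} = mu*(mu + 1) * d(kappa_n)/d(mu) for n = 1, 2, 3.
for n in range(1, 4):
    kn = kappa(n).subs(p, 1 / (1 + m))
    rhs = m * (m + 1) * sp.diff(kn, m)
    lhs = kappa(n + 1).subs(p, 1 / (1 + m))
    assert sp.simplify(lhs - rhs) == 0
print("recursion verified for n = 1, 2, 3")
```

For n = 1 this reproduces the variance: κ_1 = μ, so κ_2 = μ(μ + 1) = (1 − p)/p^2, in agreement with the formula above.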
Outline of proof: That the expected value is (1 − p)/p can be shown in the following way. Let Y be as above. Then

E(Y) = Σ_{k=0}^∞ k (1 − p)^k p
     = p (1 − p) Σ_{k=0}^∞ k (1 − p)^(k − 1)
     = p (1 − p) [ −(d/dp) Σ_{k=0}^∞ (1 − p)^k ]
     = p (1 − p) [ −(d/dp) (1/p) ]
     = p (1 − p) / p^2
     = (1 − p)/p.

(The interchange of summation and differentiation is justified by the fact that convergent power series converge uniformly on compact subsets of the set of points where they converge.)
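
The series itself can also be checked numerically: because the terms decay geometrically, truncating after a few thousand terms already matches (1 − p)/p essentially exactly. A quick sketch:

```python
p = 0.3
approx = sum(k * (1 - p) ** k * p for k in range(10_000))
print(approx, (1 - p) / p)  # both ≈ 2.3333...
```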

Parameter estimation

For both variants of the geometric distribution, the parameter p can be estimated by equating the expected value with the sample mean. This is the method of moments, which in this case happens to yield maximum likelihood estimates of p.
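
Concretely, solving E(X) = x̄ gives the estimate p̂ = 1/x̄ for the trials variant, and solving E(Y) = ȳ gives p̂ = 1/(1 + ȳ) for the failures variant; since ȳ = x̄ − 1 on the same data, the two estimates coincide. A minimal sketch, assuming NumPy (hand-rolled estimators, not a library routine):

```python
import numpy as np

rng = np.random.default_rng(1)
true_p = 0.2
x = rng.geometric(true_p, size=100_000)  # trials variant, support {1, 2, ...}
y = x - 1                                # failures variant

p_hat_x = 1 / x.mean()        # method of moments = MLE for the X variant
p_hat_y = 1 / (1 + y.mean())  # method of moments = MLE for the Y variant
print(p_hat_x, p_hat_y)       # both ≈ 0.2, and identical on the same data
```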
