
In probability theory and statistics, kurtosis (from the Greek word κυρτός, kyrtos or kurtos, meaning "bulging") is a measure of the "peakedness" of the probability distribution of a real-valued random variable, although some sources insist that what kurtosis really measures is heavy tails, not peakedness.[1] Higher kurtosis means more of the variance is due to infrequent extreme deviations, as opposed to frequent, modestly sized deviations.
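To make the idea above concrete, the sketch below (stdlib only; the sample sizes and function name are illustrative) compares two samples of similar spread: a uniform sample, whose variance comes from many moderate deviations, and a heavy-tailed Laplace sample, whose variance owes more to rare large deviations and which therefore shows much higher kurtosis.

```python
import random

def sample_excess_kurtosis(xs):
    """Sample estimate of excess kurtosis, mu_4 / mu_2^2 - 3 (no bias correction)."""
    n = len(xs)
    mean = sum(xs) / n
    mu2 = sum((x - mean) ** 2 for x in xs) / n   # second central moment (variance)
    mu4 = sum((x - mean) ** 4 for x in xs) / n   # fourth central moment
    return mu4 / mu2 ** 2 - 3

random.seed(1)
# Uniform(-1, 1): theoretical excess kurtosis is -1.2 (light tails).
uniform = [random.uniform(-1, 1) for _ in range(100_000)]
# Laplace(0, 1), built as a sign-flipped exponential: theoretical excess kurtosis is 3.
laplace = [random.expovariate(1) * random.choice([-1, 1]) for _ in range(100_000)]

print(sample_excess_kurtosis(uniform))   # near -1.2
print(sample_excess_kurtosis(laplace))   # near 3
```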



The fourth standardized moment is defined as

    β₂ = μ₄ / σ⁴,

where μ₄ is the fourth moment about the mean and σ is the standard deviation. This is sometimes used as the definition of kurtosis in older works, but is not the definition used here.
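The definition above translates directly into a sample estimate. A minimal sketch (stdlib only; the function name is illustrative and no bias correction is applied):

```python
import random

def fourth_standardized_moment(xs):
    """Sample estimate of mu_4 / sigma^4."""
    n = len(xs)
    mean = sum(xs) / n
    mu2 = sum((x - mean) ** 2 for x in xs) / n   # variance (second central moment)
    mu4 = sum((x - mean) ** 4 for x in xs) / n   # fourth central moment
    return mu4 / mu2 ** 2

random.seed(0)
sample = [random.gauss(0, 1) for _ in range(100_000)]
print(fourth_standardized_moment(sample))   # near 3 for a normal distribution
```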

Kurtosis is more commonly defined as the fourth cumulant divided by the square of the second cumulant, which is equal to the fourth moment around the mean divided by the square of the variance of the probability distribution, minus 3:

    Kurt[X] = κ₄ / κ₂² = μ₄ / σ⁴ − 3,

which is also known as excess kurtosis. The "minus 3" at the end of this formula is often explained as a correction that makes the kurtosis of the normal distribution equal to zero. Another reason can be seen by looking at the formula for the kurtosis of the sum of random variables. Because cumulants of independent random variables add, if Y is the sum of n independent random variables, all with the same distribution as X, then Kurt[Y] = Kurt[X] / n, whereas the formula would be more complicated if kurtosis were defined as μ₄ / σ⁴.
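The property Kurt[Y] = Kurt[X] / n can be checked exactly without sampling. Taking X ~ Bernoulli(½), whose excess kurtosis is −2, the sum of n independent copies is Binomial(n, ½), so its excess kurtosis should be −2/n. The sketch below (function names are illustrative) computes the moments exactly from the probability mass function:

```python
from math import comb

def excess_kurtosis(values, probs):
    """Exact excess kurtosis mu_4 / mu_2^2 - 3 of a finite discrete distribution."""
    mean = sum(v * p for v, p in zip(values, probs))
    mu2 = sum((v - mean) ** 2 * p for v, p in zip(values, probs))
    mu4 = sum((v - mean) ** 4 * p for v, p in zip(values, probs))
    return mu4 / mu2 ** 2 - 3

for n in (1, 2, 4, 10):
    values = list(range(n + 1))
    probs = [comb(n, k) / 2 ** n for k in range(n + 1)]   # Binomial(n, 1/2) pmf
    print(n, excess_kurtosis(values, probs))              # equals -2 / n
```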

More generally, if X₁, ..., Xₙ are independent random variables all having the same variance, then

    Kurt[X₁ + ··· + Xₙ] = (1/n²) (Kurt[X₁] + ··· + Kurt[Xₙ]),

whereas this identity would not hold if the definition did not include the subtraction of 3.

The fourth standardized moment must be at least 1, so the excess kurtosis must be −2 or more. This lower bound is realized by the Bernoulli distribution with p = ½, or "coin toss". There is no upper limit to the excess kurtosis and it may be infinite.
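The lower bound of −2 can be verified directly. For a Bernoulli(p) variable the excess kurtosis works out to (1 − 6p(1−p)) / (p(1−p)), which is minimized at p = ½, where it equals exactly −2. A short exact computation (the function name is illustrative):

```python
def bernoulli_excess_kurtosis(p):
    """Exact excess kurtosis of a Bernoulli(p) random variable."""
    q = 1 - p
    mu2 = p * q                        # variance
    mu4 = p * q ** 4 + q * p ** 4      # fourth central moment
    return mu4 / mu2 ** 2 - 3

print(bernoulli_excess_kurtosis(0.5))   # -2.0, the lower bound
for p in (0.1, 0.3, 0.7, 0.9):
    print(p, bernoulli_excess_kurtosis(p))   # all values are above -2
```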


related documents
Poisson distribution
Benford's law
Algorithms for calculating variance
Logistic function
Pareto distribution
Metropolis–Hastings algorithm
Weighted mean
Rank (linear algebra)
Statistical independence
Haar measure
Assignment problem
Unicity distance
Referential transparency (computer science)
Lagrange inversion theorem
Tree (graph theory)
Counting sort
Mathematical model
Shannon–Fano coding
Examples of groups
Legendre symbol
Axiom of pairing
Real analysis
Dual number