
In probability theory and statistics, skewness is a measure of the asymmetry of the probability distribution of a real-valued random variable. The skewness value can be positive or negative, or even undefined. Qualitatively, a negative skew indicates that the tail on the left side of the probability density function is longer than the right side and the bulk of the values (including the median) lie to the right of the mean. A positive skew indicates that the tail on the right side is longer than the left side and the bulk of the values lie to the left of the mean. A zero value indicates that the values are relatively evenly distributed on both sides of the mean, typically but not necessarily implying a symmetric distribution.
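The usual moment-based measure of this asymmetry is the Fisher-Pearson coefficient g1 = m3 / m2^(3/2), the third central sample moment scaled by the variance. A minimal sketch in Python (the helper name and the example data are illustrative, not from the article):

```python
def sample_skewness(xs):
    """Fisher-Pearson moment coefficient g1 = m3 / m2**1.5,
    where m_k is the k-th central sample moment."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

right = [1, 1, 1, 1, 2, 2, 3, 10]   # long tail on the right
left = [-x for x in right]          # mirror image: long tail on the left

print(sample_skewness(right))  # positive
print(sample_skewness(left))   # negative
```

Mirroring a data set around zero flips the sign of every odd central moment, which is why the second call returns exactly the negative of the first.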



Consider the distribution in the figure. The bars on the right side of the distribution taper differently from the bars on the left side. These tapering sides are called tails, and they provide a visual means for determining which of the two kinds of skewness a distribution has:

If the distribution is symmetric, then the mean equals the median and the skewness is zero. (If, in addition, the distribution is unimodal, then mean = median = mode.) This is the case for a fair coin toss, or for the series 1, 2, 3, 4, ...
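The symmetric case above can be checked directly: for the series 1, 2, 3, 4 the mean and median coincide and the third central moment vanishes (a small illustration, not from the article):

```python
import statistics

xs = [1, 2, 3, 4]  # symmetric series from the text

mean = statistics.mean(xs)      # 2.5
median = statistics.median(xs)  # 2.5

# Third central moment: zero for a symmetric sample,
# since deviations cancel in mirrored pairs.
m3 = sum((x - mean) ** 3 for x in xs) / len(xs)
print(mean, median, m3)  # 2.5 2.5 0.0
```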

"Many textbooks," a 2005 article points out, "teach a rule of thumb stating that the mean is right of the median under right skew, and left of the median under left skew. [But] this rule fails with surprising frequency. It can fail in multimodal distributions, or in distributions where one tail is long but the other is heavy. Most commonly, though, the rule fails in discrete distributions where the areas to the left and right of the median are not equal. Such distributions not only contradict the textbook relationship between mean, median, and skew, they also contradict the textbook interpretation of the median."[2]

[Figure: Skewness Statistics.svg]
