What Is Skewness?
Skewness is a measure of the asymmetry of a probability distribution. A distribution with zero skewness is perfectly symmetric (like the normal distribution). Positive skewness means the right tail is longer or fatter, while negative skewness means the left tail is longer or fatter. Understanding skewness is essential for choosing appropriate statistical methods and interpreting data correctly.
Many real-world datasets are skewed. Income distributions typically show positive skewness (a long right tail of high earners), while scores on an easy exam often show negative skewness (most students cluster near the top, with a left tail of low scores). When data is significantly skewed, the mean is pulled toward the tail and may not be the best measure of central tendency; the median is usually more representative.
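As a concrete illustration of the mean being pulled toward the tail, compare the mean and median of a small income-like sample with one high earner (the numbers here are made up for illustration):

```python
from statistics import mean, median

# Hypothetical annual incomes in thousands; one high earner creates a right tail.
incomes = [30, 32, 35, 38, 40, 42, 45, 50, 60, 250]

print(mean(incomes))    # 62.2 -- pulled up by the single outlier
print(median(incomes))  # 41.0 -- unaffected by how extreme the tail value is
```

Nine of the ten values sit below the mean, which is why the median describes a "typical" income better here.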
Formula
The most common sample estimator is the adjusted Fisher-Pearson standardized moment coefficient, which corrects for small-sample bias:

G₁ = [n / ((n − 1)(n − 2))] · Σᵢ ((xᵢ − x̄) / s)³

where n is the sample size, x̄ is the sample mean, and s is the sample standard deviation (computed with the n − 1 denominator).
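A minimal pure-Python sketch of this estimator (the function name `sample_skewness` is my own; SciPy's `scipy.stats.skew(data, bias=False)` computes the same quantity):

```python
import math

def sample_skewness(xs):
    """Adjusted Fisher-Pearson standardized moment coefficient (G1)."""
    n = len(xs)
    xbar = sum(xs) / n
    # Sample standard deviation with the n - 1 denominator.
    s = math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))
    return (n / ((n - 1) * (n - 2))) * sum(((x - xbar) / s) ** 3 for x in xs)

print(sample_skewness([1, 2, 3, 4, 5]))   # symmetric data -> 0 (up to rounding)
print(sample_skewness([1, 1, 1, 1, 10]))  # long right tail -> clearly positive
```

Note the formula needs n ≥ 3 and non-constant data, otherwise the denominator or s is zero.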
Types of Skewness
| Skewness Value | Shape | Mean vs Median |
|---|---|---|
| < -1 | Highly left-skewed | Mean << Median |
| -1 to -0.5 | Moderately left-skewed | Mean < Median |
| -0.5 to 0.5 | Approximately symmetric | Mean ≈ Median |
| 0.5 to 1 | Moderately right-skewed | Mean > Median |
| > 1 | Highly right-skewed | Mean >> Median |
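The table's categories can be sketched as a small helper. The table's ranges overlap at the boundaries (−1, −0.5, 0.5, 1), so which side each cutoff falls on below is my own convention:

```python
def describe_skewness(g1):
    """Map a skewness value to the rough verbal categories in the table above."""
    if g1 < -1:
        return "highly left-skewed"
    if g1 <= -0.5:
        return "moderately left-skewed"
    if g1 < 0.5:
        return "approximately symmetric"
    if g1 <= 1:
        return "moderately right-skewed"
    return "highly right-skewed"

print(describe_skewness(2.3))   # highly right-skewed
print(describe_skewness(-0.2))  # approximately symmetric
```

These cutoffs are rules of thumb, not sharp statistical thresholds; sample size strongly affects how much observed skewness matters.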
Frequently Asked Questions
Is skewness always bad?
No. Skewness is a property of the data, not a flaw. However, many statistical tests assume normality (zero skewness). When data is significantly skewed, you may need to use non-parametric tests, transform the data (log, square root), or use robust methods that do not assume symmetry.
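As a sketch of how a log transform can reduce right skew (the data here is made up, and `sample_skewness` is the bias-corrected estimator from the Formula section, repeated so the snippet stands alone):

```python
import math

def sample_skewness(xs):
    """Adjusted Fisher-Pearson standardized moment coefficient (G1)."""
    n = len(xs)
    xbar = sum(xs) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))
    return (n / ((n - 1) * (n - 2))) * sum(((x - xbar) / s) ** 3 for x in xs)

data = [1, 2, 2, 3, 3, 3, 10, 20, 50]     # hypothetical right-skewed sample
logged = [math.log(x) for x in data]       # log transform (requires all x > 0)

print(sample_skewness(data))    # strongly positive
print(sample_skewness(logged))  # still positive, but much smaller
```

The log compresses the right tail, pulling the distribution toward symmetry; the square-root transform does the same more gently, and neither is applicable to zero or negative values without shifting the data first.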
What is the relationship between skewness and kurtosis?
Both are standardized higher-order moments that describe distribution shape. Skewness measures asymmetry (third standardized moment), while kurtosis measures tail heaviness (fourth standardized moment). Together they give a much fuller picture of how a distribution deviates from normality, though two numbers cannot characterize a distribution's shape completely.
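A short sketch computing both sample statistics side by side, using the common bias-corrected estimators (the function name is my own):

```python
import math

def standardized_moments(xs):
    """Return (sample skewness G1, sample excess kurtosis G2),
    using the common bias-corrected estimators."""
    n = len(xs)
    xbar = sum(xs) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))
    m3 = sum(((x - xbar) / s) ** 3 for x in xs)
    m4 = sum(((x - xbar) / s) ** 4 for x in xs)
    g1 = n / ((n - 1) * (n - 2)) * m3
    g2 = (n * (n + 1)) / ((n - 1) * (n - 2) * (n - 3)) * m4 \
        - 3 * (n - 1) ** 2 / ((n - 2) * (n - 3))
    return g1, g2

# Flat, uniform-like data: symmetric (skewness near 0) but light-tailed
# relative to the normal distribution (negative excess kurtosis).
print(standardized_moments(list(range(1, 11))))  # approximately (0.0, -1.2)
```

This shows why both moments are needed: the data is perfectly symmetric, yet its flat shape still deviates from normality, and only the kurtosis detects that.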