What Is Expected Value?
The expected value of a random variable, written E(X) or μ, is its long-run average over many repetitions: the probability-weighted average of all possible outcomes. For a fair six-sided die, E(X) = 3.5, meaning the average of many rolls approaches 3.5 even though no single roll can show 3.5.
Expected value is fundamental to decision theory, insurance, gambling, investment analysis, and game theory. It allows comparison between alternatives with uncertain outcomes by reducing complex probability distributions to single numbers.
Formula
For a discrete random variable with outcomes x₁, x₂, …, xₙ and probabilities P(x₁), …, P(xₙ):
E(X) = x₁×P(x₁) + x₂×P(x₂) + … + xₙ×P(xₙ)
Each outcome is weighted by how likely it is, and the weighted values are summed.
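The probability-weighted sum is straightforward to compute directly. A minimal sketch in Python (the `expected_value` helper is illustrative, not from any library), using the fair-die example:

```python
def expected_value(values, probs):
    """Probability-weighted average: sum of x_i * P(x_i)."""
    return sum(x * p for x, p in zip(values, probs))

# Fair six-sided die: faces 1..6, each with probability 1/6
die_ev = expected_value(range(1, 7), [1/6] * 6)
print(die_ev)  # 3.5
```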
Examples
| Scenario | Values ($) | Probabilities | E(X) |
|---|---|---|---|
| Fair coin ($1 heads, $0 tails) | 1, 0 | 0.5, 0.5 | $0.50 |
| Lottery ticket ($1 cost, $1,000 prize) | -1, 999 | 0.999, 0.001 | $0.00 |
| Insurance ($500 premium, $10,000 payout) | -500, 9500 | 0.95, 0.05 | $0.00 |
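As a quick check, each row of the table can be reproduced with the same weighted-sum calculation (values and probabilities taken directly from the table; the helper function is illustrative):

```python
def expected_value(values, probs):
    return sum(x * p for x, p in zip(values, probs))

scenarios = {
    "fair coin": ([1, 0], [0.5, 0.5]),
    "lottery":   ([-1, 999], [0.999, 0.001]),
    "insurance": ([-500, 9500], [0.95, 0.05]),
}
for name, (vals, probs) in scenarios.items():
    # :+.2f shows the sign and rounds away tiny floating-point residue
    print(f"{name}: E(X) = {expected_value(vals, probs):+.2f}")
```

Note that the lottery and insurance rows both come out to exactly zero: the losing and winning branches cancel, which is what an actuarially "fair" price looks like.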
Properties
- E(aX+b) = a×E(X)+b (linearity)
- E(X+Y) = E(X)+E(Y) (always, even if dependent)
- E(XY) = E(X)×E(Y) if X, Y are independent (the converse does not hold: uncorrelated variables satisfy this without being independent)
- Probabilities must sum to 1
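These identities can be verified numerically over explicit distributions; a sketch using fair dice (the `expected_value` helper is illustrative):

```python
def expected_value(values, probs):
    return sum(x * p for x, p in zip(values, probs))

die = list(range(1, 7))
p = [1/6] * 6

# Linearity: E(aX + b) = a*E(X) + b
a, b = 2, 3
lhs = expected_value([a * x + b for x in die], p)
rhs = a * expected_value(die, p) + b
print(lhs, rhs)  # both 10.0

# Independence: E(XY) over the joint distribution of two
# independent dice (36 equally likely pairs) equals E(X)*E(Y)
exy = sum(x * y / 36 for x in die for y in die)
ex_ey = expected_value(die, p) ** 2
print(exy, ex_ey)  # both 12.25
```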
Frequently Asked Questions
Can expected value be negative?
Yes. Most gambling games have negative expected value for the player (the house edge). A game with E(X) = -$0.05 per dollar bet means you lose 5 cents per dollar on average.
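A standard concrete example: a single-number bet in American roulette pays 35 to 1, but only 1 of the 38 pockets wins, giving the familiar 5.26% house edge:

```python
# Win $35 with probability 1/38, lose the $1 bet otherwise
ev = 35 * (1 / 38) + (-1) * (37 / 38)
print(f"E(X) = {ev:.4f} per dollar bet")  # E(X) = -0.0526 per dollar bet
```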
Does expected value guarantee anything?
No. Expected value describes the long-run average but says nothing about any single trial. You can lose money many times before the average converges. The law of large numbers ensures convergence eventually.
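A short simulation illustrates this: the average of a few rolls can sit far from 3.5, while the average of many rolls settles close to it (seed fixed only for reproducibility):

```python
import random

random.seed(42)
rolls = [random.randint(1, 6) for _ in range(100_000)]

# Running average at increasing sample sizes
for n in (10, 1_000, 100_000):
    avg = sum(rolls[:n]) / n
    print(f"after {n:>7} rolls: average = {avg:.3f}")
```

The early averages wander; only the large-sample average is reliably near the expected value, which is exactly what the law of large numbers promises.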
What if probabilities do not sum to 1?
Then the distribution is invalid. All probabilities must be non-negative and sum to exactly 1. If they sum to more or less, check for missing outcomes or calculation errors.
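A simple validity check along these lines (an illustrative helper; `math.isclose` tolerates the tiny floating-point error that sums like 0.1 + 0.2 + 0.7 can carry):

```python
import math

def validate(probs):
    """Raise ValueError unless probs form a valid distribution."""
    if any(p < 0 for p in probs):
        raise ValueError("probabilities must be non-negative")
    if not math.isclose(sum(probs), 1.0):
        raise ValueError(f"probabilities sum to {sum(probs)}, not 1")

validate([0.5, 0.5])    # passes silently
# validate([0.5, 0.4])  # raises ValueError: probabilities sum to 0.9, not 1
```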