# Expected Value & Probability Distributions
If you play a game many times, what do you expect to win on average? If you run an experiment over and over, where does the average outcome settle? The answer is the expected value — the long-run average, weighted by probability.
## Part 1: What Is Expected Value?
The expected value (EV) of a random variable is the weighted average of all possible outcomes, where each outcome is weighted by its probability:

E(X) = x1·p1 + x2·p2 + … + xn·pn = Σ xi·P(xi)
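For a discrete random variable, this definition can be computed directly. A minimal sketch in Python (the `expected_value` helper is illustrative, not a library function):

```python
def expected_value(outcomes, probs):
    """Weighted average of outcomes, weighted by probability."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(x * p for x, p in zip(outcomes, probs))

# A fair six-sided die: each face has probability 1/6.
faces = [1, 2, 3, 4, 5, 6]
fair = [1/6] * 6
print(expected_value(faces, fair))  # ≈ 3.5
```

Note that 3.5 is not a possible roll — the expected value is a long-run average, not a prediction of any single outcome.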
For a continuous distribution, the sum becomes an integral over the probability density f(x):

E(X) = ∫ x·f(x) dx

Geometrically, the expected value is the “center of mass” — the balance point of the curve. Let’s see this with a normal distribution:
The red spike marks the expected value — the balance point of the distribution. Notice that:
- Moving mu shifts the balance point (and the whole curve)
- Changing sigma changes the spread but keeps the balance point in the same place
- The EV is always at the center of a symmetric distribution
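These claims are easy to check numerically: the sample mean tracks mu and is unaffected by sigma. A quick sketch (the sample size, seed, and parameter values are arbitrary choices for illustration):

```python
import random

random.seed(0)

mu = 2.0
means = []
for sigma in (0.5, 1.5, 3.0):
    # Draw many samples from N(mu, sigma) and average them.
    samples = [random.gauss(mu, sigma) for _ in range(100_000)]
    means.append(sum(samples) / len(samples))
    print(f"sigma = {sigma}: sample mean = {means[-1]:.3f}")  # stays near mu = 2.0
```

Widening sigma spreads the samples out, but their average still settles at the balance point mu.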
## Part 2: Weighted Average in Action
Think of expected value as a weighted average. High-probability outcomes pull the average toward them more than low-probability outcomes.
Here’s a distribution with adjustable “weight” on two regions — a main peak and a secondary one:
Center of mass analogy: Imagine the curve is a piece of wire. The expected value is where you’d place your finger to balance it. A heavier peak pulls the balance point toward it. When w1 > w2, the balance tips left. When w1 = w2, it’s centered.
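If we idealize the two peaks as point masses at their centers, the balance-point arithmetic is a one-liner. A toy sketch (the peak positions -2 and +2 are made up for illustration):

```python
def mixture_mean(w1, m1, w2, m2):
    # Balance point of a two-peak mixture: the weighted average of the peak centers.
    assert abs(w1 + w2 - 1.0) < 1e-9, "weights must sum to 1"
    return w1 * m1 + w2 * m2

# Hypothetical peaks at -2 (left) and +2 (right):
print(mixture_mean(0.7, -2, 0.3, 2))  # ≈ -0.8: heavier left peak tips the balance left
print(mixture_mean(0.5, -2, 0.5, 2))  # 0.0: equal weights, perfectly centered
```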
## Part 3: EV with Many Trials
The more times you repeat an experiment, the closer the average gets to the expected value. This is the Law of Large Numbers.
Imagine rolling a weighted die. With few rolls, the average bounces around. With many rolls, it converges:
Watch what happens as trials increase:
- n = 1: Wide spread — the single outcome could be anywhere
- n = 10: The average is probably within 0.5 of the EV
- n = 50: Very concentrated around E(X) = 2
- n = 100: Almost pinpointed — the average is extremely close to 2
This is why casinos win in the long run: each game has a small edge in the house’s favor, and over millions of “trials” the average result converges to that edge.
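The die-roll experiment is easy to reproduce in code. A minimal sketch, using an illustrative weighting chosen so that E(X) = 2 to match the numbers above:

```python
import random

random.seed(42)

faces = [1, 2, 3, 4, 5, 6]
# Hypothetical weighting, picked so the expected value is exactly 2:
# 1(0.40) + 2(0.38) + 3(0.12) + 4(0.04) + 5(0.04) + 6(0.02) = 2.0
probs = [0.40, 0.38, 0.12, 0.04, 0.04, 0.02]
ev = sum(f * p for f, p in zip(faces, probs))

for n in (1, 10, 50, 100, 10_000):
    rolls = random.choices(faces, weights=probs, k=n)
    avg = sum(rolls) / n
    print(f"n = {n:>6}: average = {avg:.3f}")  # wanders at first, then hugs 2.0
```

With a different seed the small-n averages land elsewhere, but the large-n averages always crowd around 2 — that is the Law of Large Numbers in action.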
## Part 4: Variance and Standard Deviation
The variance measures how spread out the distribution is from the expected value — it is the probability-weighted average of the squared deviations:

Var(X) = E[(X − E(X))²] = E(X²) − E(X)²
The standard deviation is just the square root of the variance:

sigma = √Var(X)
Higher variance means outcomes are more unpredictable. Lower variance means they cluster tightly around the expected value.
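Both quantities follow directly from the definitions. A small sketch computing them for a fair die:

```python
import math

def variance(outcomes, probs):
    """Var(X) = E[(X - E(X))^2]: probability-weighted squared deviation from the mean."""
    ev = sum(x * p for x, p in zip(outcomes, probs))
    return sum(p * (x - ev) ** 2 for x, p in zip(outcomes, probs))

faces = [1, 2, 3, 4, 5, 6]
fair = [1/6] * 6
var = variance(faces, fair)
sd = math.sqrt(var)
print(f"fair die: Var = {var:.4f}, sd = {sd:.4f}")  # Var = 35/12 ≈ 2.9167, sd ≈ 1.71
```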
## Part 5: Comparing Distributions
Two games might have the same expected value but very different risks. Compare a “safe” bet with a “risky” one:
Game theory challenge: Two games cost $5 to play.
- Game A: Win $6 with probability 0.9, win $0 with probability 0.1. E(X) = 5.40
- Game B: Win $50 with probability 0.12, win $0 with probability 0.88. E(X) = 6.00
Game B has a higher expected value, but Game A is much more consistent. Which would you prefer to play 100 times? Consider the variance!
Use the sliders to model the “safe” and “risky” distributions.
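The same comparison can also be made numerically. A sketch computing EV and standard deviation for the two games above, with the $5 entry fee subtracted to show the net expected profit per play:

```python
import math

def ev_and_sd(outcomes, probs):
    """Return (expected value, standard deviation) of a discrete distribution."""
    ev = sum(x * p for x, p in zip(outcomes, probs))
    var = sum(p * (x - ev) ** 2 for x, p in zip(outcomes, probs))
    return ev, math.sqrt(var)

ev_a, sd_a = ev_and_sd([6, 0], [0.9, 0.1])     # Game A: EV 5.40, sd 1.80
ev_b, sd_b = ev_and_sd([50, 0], [0.12, 0.88])  # Game B: EV 6.00, sd ≈ 16.25

print(f"Game A: EV = {ev_a:.2f}, sd = {sd_a:.2f}, net per play = {ev_a - 5:+.2f}")
print(f"Game B: EV = {ev_b:.2f}, sd = {sd_b:.2f}, net per play = {ev_b - 5:+.2f}")
```

Game B’s net edge is larger, but its standard deviation is roughly nine times Game A’s — over 100 plays you would still expect B to come out ahead, at the cost of a much bumpier ride.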
## Wrapping Up
| Concept | What It Means |
|---|---|
| Expected value E(X) | Long-run average outcome, weighted by probability |
| Center of mass | E(X) is the balance point of the distribution |
| Law of Large Numbers | Averages converge to E(X) with more trials |
| Variance | How spread out outcomes are from E(X) |
| Standard deviation | Square root of variance — same units as data |
Expected value is the single most important number in probability. It tells you what to expect in the long run — whether you’re evaluating a business decision, a game strategy, or a scientific hypothesis.