Expected Utility Calculator

Calculate the expected utility of uncertain outcomes to make better decisions under risk. This tool helps quantify the value of different scenarios based on their probabilities and your risk preferences.

Single Outcome Analysis

[Interactive calculator: enter the outcome's likelihood (0-100%), its dollar value, your risk preference model, and your current wealth. The tool reports expected utility, expected value, utility of the outcome, certainty equivalent, and risk premium. For example, a 50% chance of a $1,000 outcome under a risk-averse (square root) model gives an expected utility of 15.81 utils, an outcome utility of 31.62, an expected value of $500, a certainty equivalent of $250, and a risk premium of $250.]
Multiple Outcomes Analysis

[Interactive calculator: add multiple possible outcomes, each with a name, probability (%), and dollar value, to calculate total expected utility. Probabilities should sum to 100%; the tool warns when they do not. It reports each outcome's utility along with the total expected utility, total expected value, and certainty equivalent.]
[Chart: Utility Function Visualization]
Risk Attitude Comparison

| Risk Attitude | Utility Function | Characteristic | Certainty Equivalent |
|---|---|---|---|
| Risk Neutral | U(x) = x | Expected value equals certainty equivalent | = Expected Value |
| Risk Averse | U(x) = √x | Prefers certain outcomes over gambles | < Expected Value |
| Very Risk Averse | U(x) = ln(x) | Strongly prefers safety; diminishing marginal utility | << Expected Value |
| Risk Seeking | U(x) = x² | Prefers gambles over certain outcomes | > Expected Value |
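The four attitudes can be compared numerically. Here is a minimal Python sketch (not part of the calculator itself) that computes the certainty equivalent CE = U⁻¹(E[U]) for a 50/50 gamble between $400 and $1,600, whose expected value is $1,000:

```python
import math

# The four utility functions above, paired with their inverses
utilities = {
    "Risk Neutral (U=x)":         (lambda x: x,      lambda u: u),
    "Risk Averse (U=sqrt(x))":    (math.sqrt,        lambda u: u ** 2),
    "Very Risk Averse (U=ln(x))": (math.log,         math.exp),
    "Risk Seeking (U=x^2)":       (lambda x: x ** 2, math.sqrt),
}

# A 50/50 gamble between $400 and $1,600; expected value = $1,000
outcomes = [(0.5, 400.0), (0.5, 1600.0)]
expected_value = sum(p * x for p, x in outcomes)

for name, (u, u_inv) in utilities.items():
    eu = sum(p * u(x) for p, x in outcomes)  # expected utility
    ce = u_inv(eu)                           # certainty equivalent = U^-1(E[U])
    print(f"{name}: CE = ${ce:,.2f} vs EV = ${expected_value:,.2f}")
```

This prints certainty equivalents of $1,000, $900, $800, and about $1,166 against the $1,000 expected value, matching the =, <, <<, and > pattern in the table.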

What is Expected Utility?

Expected utility is a foundational concept in economics and decision theory that helps quantify the value of uncertain outcomes. Unlike simple expected value calculations that treat all dollars equally, expected utility accounts for the psychological reality that people's satisfaction (utility) from money isn't linear.

The theory was developed by Daniel Bernoulli in 1738 to resolve the St. Petersburg Paradox - a theoretical lottery with infinite expected value that no rational person would pay unlimited money to play. Bernoulli proposed that people evaluate outcomes based on utility (satisfaction) rather than raw monetary value.

Key Insight: Expected utility theory explains why most people prefer a guaranteed $50 over a 50% chance of winning $100, even though both have the same expected value. The certain outcome provides higher utility for risk-averse individuals.

Expected Utility Formula

The basic expected utility formula is:

E[U] = Σ (Probability_i × Utility(Outcome_i))

For a single outcome:

Expected Utility = Probability × U(Value)

Step-by-Step Calculation

  1. Determine the probability of each possible outcome
  2. Calculate the utility of each outcome using your chosen utility function
  3. Multiply each probability by its corresponding utility
  4. Sum all the weighted utilities to get expected utility
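The four steps can be sketched as a small function (Python assumed; square root utility used for illustration):

```python
import math

def expected_utility(outcomes, utility=math.sqrt):
    """Steps 1-4: weight each outcome's utility by its probability, then sum.

    outcomes: iterable of (probability, value) pairs whose probabilities
    sum to 1.0 (step 1); utility is the chosen utility function (step 2).
    """
    total = sum(p for p, _ in outcomes)
    if abs(total - 1.0) > 1e-9:
        raise ValueError(f"probabilities sum to {total:.0%}, not 100%")
    # Steps 3-4: multiply each probability by its utility and sum
    return sum(p * utility(value) for p, value in outcomes)

print(expected_utility([(0.5, 1000.0), (0.5, 0.0)]))  # 15.81...
```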

Example Calculation

Consider a gamble with:

- A 50% chance of winning $1,000
- A 50% chance of winning nothing

Using the square root utility function U(x) = √x:

- U($1,000) = √1000 ≈ 31.62
- Expected Utility = 0.50 × 31.62 + 0.50 × 0 = 15.81
- Certainty Equivalent = 15.81² ≈ $250, versus an expected value of $500
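A quick numeric check of a 50% chance at $1,000 under square root utility (Python; the certainty equivalent inverts U via CE = E[U]²):

```python
import math

p, value = 0.5, 1000.0
u_outcome = math.sqrt(value)   # U($1,000) = sqrt(1000) ~= 31.62
eu = p * u_outcome             # expected utility ~= 15.81
ev = p * value                 # expected value = $500
ce = eu ** 2                   # certainty equivalent: U(CE) = E[U]
print(f"E[U] = {eu:.2f}, EV = ${ev:.0f}, CE = ${ce:.0f}, risk premium = ${ev - ce:.0f}")
```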

Types of Utility Functions

Different utility functions model different attitudes toward risk:

1. Linear Utility (Risk Neutral)

U(x) = x

Utility equals monetary value. A risk-neutral person only cares about expected value and is indifferent between a guaranteed amount and a gamble with the same expected value.

2. Square Root Utility (Risk Averse)

U(x) = √x

Exhibits diminishing marginal utility - each additional dollar provides less satisfaction than the previous one. This models typical human behavior where people prefer certainty.

3. Logarithmic Utility (Very Risk Averse)

U(x) = ln(x)

Shows stronger diminishing marginal utility. Used in the Kelly Criterion for optimal bet sizing. Requires positive values only.

4. Quadratic Utility (Risk Seeking)

U(x) = x²

Exhibits increasing marginal utility - each additional dollar provides MORE satisfaction. Models risk-seeking behavior where people prefer gambles over certain outcomes.

Understanding Risk Attitudes

Risk Aversion

Most people exhibit risk aversion, especially for large stakes. A risk-averse person:

- Prefers a guaranteed amount over a gamble with the same expected value
- Has a concave utility function (e.g., √x or ln(x)) with diminishing marginal utility
- Has a certainty equivalent below the gamble's expected value, and will pay a risk premium (such as an insurance premium) to avoid uncertainty

Risk Neutrality

A risk-neutral person:

- Cares only about expected value
- Has a linear utility function U(x) = x
- Is indifferent between a certain amount and any gamble with the same expected value

Risk Seeking

A risk-seeking person:

- Prefers a gamble over a guaranteed amount with the same expected value
- Has a convex utility function (e.g., x²) with increasing marginal utility
- Has a certainty equivalent above the gamble's expected value

Important: People often display different risk attitudes depending on the context. Someone might be risk-averse with their retirement savings but risk-seeking when buying lottery tickets. This behavior violates classical expected utility theory but is explained by prospect theory.

Certainty Equivalent and Risk Premium

Certainty Equivalent

The certainty equivalent is the guaranteed amount that provides the same utility as a risky gamble. It answers: "What certain amount would make you indifferent to this gamble?"

U(Certainty Equivalent) = Expected Utility of Gamble

For a risk-averse person, the certainty equivalent is less than the expected value. For example, someone might be willing to accept a guaranteed $40 instead of a 50/50 chance at $100 or $0 (expected value = $50).

Risk Premium

The risk premium is the difference between expected value and certainty equivalent:

Risk Premium = Expected Value - Certainty Equivalent

This represents the "cost" of risk - how much expected value someone is willing to give up to eliminate uncertainty.
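Both quantities can be computed by inverting the utility function. A sketch in Python, assuming square root utility (so U⁻¹(u) = u²):

```python
import math

def ce_and_risk_premium(outcomes, utility=math.sqrt, inverse=lambda u: u ** 2):
    """Solve U(CE) = E[U] for CE, then Risk Premium = EV - CE."""
    eu = sum(p * utility(x) for p, x in outcomes)
    ev = sum(p * x for p, x in outcomes)
    ce = inverse(eu)
    return ce, ev - ce

# 50/50 chance of $100 or $0 (expected value $50)
ce, premium = ce_and_risk_premium([(0.5, 100.0), (0.5, 0.0)])
print(f"CE = ${ce:.2f}, risk premium = ${premium:.2f}")  # CE = $25.00, risk premium = $25.00
```

With √x utility the certainty equivalent of this gamble is $25, even lower than the illustrative $40 above; the exact figure depends on how risk-averse the chosen utility function is.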

Real-World Applications

Insurance Decisions

Expected utility explains why people buy insurance even though it has negative expected value (premiums exceed expected payouts). The utility loss from a catastrophic uninsured event outweighs the small utility cost of premiums.

Investment Portfolio Selection

Investors use utility functions to balance expected returns against risk. A risk-averse investor might accept lower expected returns for a more stable portfolio.

Medical Decision Making

Patients and doctors use expected utility (often implicitly) when choosing between treatments with different risk profiles.

Business Strategy

Companies evaluate projects not just by expected NPV but by risk-adjusted measures that account for the utility implications of potential outcomes.

Worked Examples

Example 1: Insurance Decision

Scenario: You have a $200,000 home with a 1% annual chance of a fire causing $150,000 in damage. Insurance costs $2,000/year.

Without Insurance (using logarithmic utility U(x) = ln(x)):
- 99% chance: keep $200,000 (utility = ln 200,000 ≈ 12.2061)
- 1% chance: have $50,000 (utility = ln 50,000 ≈ 10.8198)
- Expected Utility = 0.99 × 12.2061 + 0.01 × 10.8198 ≈ 12.1922

With Insurance:
- 100% chance: have $198,000 (utility = ln 198,000 ≈ 12.1960)

Decision: Buy insurance! Expected utility is higher (12.1960 vs 12.1922), even though expected value is lower ($198,000 vs $198,500). Note that under the milder square root utility the buyer's risk premium (about $495) falls just short of the $500 premium loading, so that buyer would narrowly decline; this example needs the stronger risk aversion of ln(x).
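A numeric check of this scenario in Python, comparing both a square root and a logarithmic utility, since the conclusion depends on how risk-averse the buyer is assumed to be:

```python
import math

wealth, premium = 200_000.0, 2_000.0
loss, p_fire = 150_000.0, 0.01

for name, u in [("sqrt", math.sqrt), ("ln", math.log)]:
    eu_uninsured = (1 - p_fire) * u(wealth) + p_fire * u(wealth - loss)
    eu_insured = u(wealth - premium)  # certain wealth of $198,000
    choice = "insure" if eu_insured > eu_uninsured else "stay uninsured"
    print(f"{name}: insured {eu_insured:.4f} vs uninsured {eu_uninsured:.4f} -> {choice}")
```

Under ln(x), insurance wins (12.1960 vs 12.1922); under the milder sqrt(x), the two options are nearly indistinguishable and the comparison narrowly flips, because the $500 premium loading slightly exceeds that buyer's risk premium.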

Example 2: Investment Choice

Scenario: Choose between:
A) Guaranteed $10,000 return
B) 50% chance of $25,000, 50% chance of $0

Expected Values:
- Option A: $10,000
- Option B: $12,500

Expected Utilities (using √x):
- Option A: √10,000 = 100
- Option B: 0.5 × √25,000 + 0.5 × 0 = 79.06

Decision: A risk-averse person chooses Option A despite lower expected value, because it has higher expected utility.
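Checking both options in Python (square root utility, as above):

```python
import math

def eu(outcomes, u=math.sqrt):
    """Expected utility of a list of (probability, value) pairs."""
    return sum(p * u(x) for p, x in outcomes)

option_a = [(1.0, 10_000.0)]              # guaranteed $10,000
option_b = [(0.5, 25_000.0), (0.5, 0.0)]  # 50/50 at $25,000 or $0

eu_a, eu_b = eu(option_a), eu(option_b)
print(f"E[U] A = {eu_a:.2f}, E[U] B = {eu_b:.2f}")  # A = 100.00, B = 79.06
print(f"CE of B = ${eu_b ** 2:,.2f}")               # $6,250.00
```

Option B's certainty equivalent ($6,250) is well below Option A's guaranteed $10,000, which is another way to see the same decision.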

Frequently Asked Questions

Can expected utility be negative?

Yes, expected utility can be negative when outcomes have negative utility values. This occurs when dealing with losses or when using utility functions defined for negative numbers. For example, losing $500 with a risk-averse utility function might yield a utility of -22.36 (using a modified square root function that handles negatives).
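One way to produce such figures is a sign-preserving square root, U(x) = sign(x)·√|x| (a hypothetical extension chosen for illustration; other ways of handling losses exist):

```python
import math

def signed_sqrt_utility(x):
    """Odd extension of sqrt: U(-x) = -U(x), so losses map to negative utility."""
    return math.copysign(math.sqrt(abs(x)), x)

print(signed_sqrt_utility(-500.0))  # -22.36...
print(signed_sqrt_utility(500.0))   # 22.36...
```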

How do I calculate expected utility in three steps?

  1. Determine the probability of each outcome occurring
  2. Calculate the utility value for each outcome using your chosen utility function
  3. Multiply each probability by its utility and sum the results

What utility function should I use?

The choice depends on your risk preferences and the context. For most personal financial decisions, the square root or logarithmic functions work well as they capture typical risk aversion. For corporate decisions where diversification is possible, linear (risk-neutral) might be appropriate.

How does expected utility differ from expected value?

Expected value simply averages monetary outcomes weighted by probabilities. Expected utility first transforms monetary values into utility (satisfaction) units, then averages. This captures the psychological reality that $1,000,000 isn't twice as satisfying as $500,000.
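For instance, under square root utility (one concrete assumption), doubling the money multiplies utility by only √2 ≈ 1.41:

```python
import math

ratio = math.sqrt(1_000_000) / math.sqrt(500_000)
print(ratio)  # 1.414..., not 2
```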

Why is the certainty equivalent important?

The certainty equivalent translates utility back into monetary terms, making it easier to compare options and communicate decisions. It tells you the guaranteed amount that's equivalent (in terms of satisfaction) to a risky gamble.

What are the limitations of expected utility theory?

The theory assumes people are perfectly rational and consistent in their preferences. Real behavior often violates these assumptions (e.g., the Allais Paradox). Prospect theory offers an alternative that better describes actual human decision-making, including loss aversion and probability weighting.