
The Two Envelopes Paradox

Two envelopes contain money: one has twice as much as the other. You pick one at random. Before opening it, you can switch to the other. Should you switch? Does it matter?

This classic "paradox" suggests that switching always boosts your expected winnings, but I believe that is false. Let's test it empirically with a simulation and see why.


Play the Game

Experiment with strategies: always switch, always stay, or randomize. Run auto-plays to watch convergence. What do you observe?

Click "Start New Game" to begin!
Or use the Auto-Play button to see the convergence!

Games Played: 0

Total Winnings: 0

Average Winnings: 0

Expected Value: 0X

Settings




As you play, track the "Expected Value" (your average winnings relative to the smaller amount, X). It always hovers around 1.5X, no matter the strategy. Why?


Quick Math Breakdown

Let X be the smaller amount (drawn uniformly from 1-100 in this toy); a simulation sketch follows the list.

  • Initial pick: 50% chance of X, 50% of 2X.
  • Always switch: If you have X, get 2X; if 2X, get X. Expected: 1.5X.
  • Always stay: Opposite, but same expected value: 1.5X.
  • Random (50/50): Average of above, still 1.5X.
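
If you'd rather check this outside the widget, here is a minimal Python sketch of the same game under the same assumptions (uniform X from 1 to 100); the function and strategy names are mine, not the code behind the toy:

    import random

    def play(strategy, trials=100_000):
        """Average winnings, in units of X, for one strategy over many games."""
        total_ratio = 0.0
        for _ in range(trials):
            x = random.randint(1, 100)              # smaller amount X
            envelopes = [x, 2 * x]
            random.shuffle(envelopes)
            pick = 0                                # the envelope you grabbed at random
            if strategy == "switch" or (strategy == "random" and random.random() < 0.5):
                pick = 1                            # take the other envelope instead
            total_ratio += envelopes[pick] / x      # winnings relative to X
        return total_ratio / trials

    for s in ("stay", "switch", "random"):
        print(s, round(play(s), 3))                 # all three land near 1.5

All three strategies print roughly 1.5, matching the breakdown above.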

No advantage! The "paradox" stems from faulty reasoning: if you assume the unchosen envelope is equally likely to hold half or double whatever you picked, you get E[other] = (1/2)(y/2) + (1/2)(2y) = 1.25y, which seems to say you should always switch. But that 50/50 assumption is wrong: the amount in your hand isn't fixed independently of which envelope you hold, so treating it as a constant while the other envelope varies around it double-counts the scenarios.
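
You can also see the broken 50/50 assumption directly in this bounded setup. Conditioned on the amount you happen to be holding, the other envelope is not equally likely to be double or half. Here is a sketch under the same uniform 1-100 assumption (the 100 cutoff below is purely an artifact of that prior):

    import random

    random.seed(0)
    low, high = [0, 0], [0, 0]                      # [times other was double, games] per bucket

    for _ in range(200_000):
        x = random.randint(1, 100)                  # smaller amount, as in the toy
        held, other = random.sample([x, 2 * x], 2)  # random assignment of the two envelopes
        bucket = low if held <= 100 else high
        bucket[0] += other == 2 * held
        bucket[1] += 1

    print("held <= 100: other is double about", round(low[0] / low[1], 2), "of the time")
    print("held >  100: other is double about", round(high[0] / high[1], 2), "of the time")

Holding more than 100 means you certainly have the larger envelope, and holding 100 or less makes doubling more likely than halving, so the "equally likely either way" step never holds for a specific amount; it only averages out to 1.5X overall.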

The Fallacy Exposed

The error treats your envelope's value as a fixed, known quantity while letting the other envelope vary around it, but before you open anything there is no value to condition on. Holding an envelope gives you no new information, unlike Monty Hall, where the host's reveal does. Switching is just re-picking at random, with exactly the same odds.
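
For contrast, here is the same kind of quick check for the standard three-door Monty Hall game (my own sketch, not part of the toy): there, switching genuinely helps, because the host's reveal carries information.

    import random

    def monty_hall(switch, trials=100_000):
        """Fraction of wins for stay/switch in the classic three-door game."""
        wins = 0
        for _ in range(trials):
            doors = [0, 1, 2]
            prize = random.choice(doors)
            pick = random.choice(doors)
            # Host opens a door that is neither your pick nor the prize.
            opened = random.choice([d for d in doors if d != pick and d != prize])
            if switch:
                pick = next(d for d in doors if d != pick and d != opened)
            wins += pick == prize
        return wins / trials

    print("stay:  ", round(monty_hall(False), 3))   # about 0.33
    print("switch:", round(monty_hall(True), 3))    # about 0.67

Switching wins about 2/3 of the time in Monty Hall; in the two-envelope game it changes nothing.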

Play 1,000+ games per strategy and watch the lines converge to 1.5. That's not a proof, but it is pretty strong evidence.

Thoughts? Share your results or counterarguments. I built this because I question the "it doesn't matter" dismissal in unbounded cases, but for this bounded setup the simulations say otherwise. Let's debate!