## DEV Community

Armando C. Santisbon


# The Monty Hall problem

Suppose you're on a game show, and you're given the choice of three doors: $a$ , $b$ , and $c$ . Behind one door is a car; behind the others, goats. You pick a door, say $a$ , and the host, who knows what's behind the doors, opens another door, say $b$ , which has a goat. He then says to you, "Do you want to pick door $c$ ?" Is it to your advantage to switch your choice?

We'll go through three different explanations for the solution. I've found that different explanations will "click" with different people depending on their background and way of thinking. The first two will be mathematically rigorous and the last will be more informal but useful for building an intuition for why the math works out the way that it does.

## Explanation #1

### Conditional probability

You picked door $a$ and the host opened door $b$ to reveal a goat. For each possible location of the car, what is the probability that the host would choose that door to open?

• If the car is behind door $a$ then he could open either $b$ or $c$ to reveal a goat. The probability is $1/2$ .
• If the car is behind door $b$ there's no way he would open $b$ . The probability is $0$ .
• If the car is behind door $c$ then Monty would definitely open door $b$ to reveal a goat. The probability is $1$ .

So what's the probability that the prize is behind door $c$ given that he opened $b$ ?

Luckily, we have a way of calculating a probability based on prior knowledge. That is, the probability of an event happening given the fact that another event has already happened:

ℹ️ Bayes' Theorem

$P(A|B)={P(B|A)P(A) \over P(B)}$

But wait, where did that come from?

💡 It can be derived from the definition of conditional probability, which tells us in how many of the cases where $B$ happens both events $A$ and $B$ happen:

$P(A|B)={P(A \cap B) \over P(B)}$

Similarly, $P(B|A)={P(A \cap B) \over P(A)}$

which means

$P(A \cap B)=P(B|A)P(A)$

and substituting in the expression for conditional probability yields Bayes' Theorem:

$P(A|B)={P(B|A)P(A) \over P(B)}$
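Before applying it, here's a quick numerical sanity check of the identity in Python, using the `fractions` module and a made-up joint distribution (the numbers here are arbitrary, chosen only to exercise the formula):

```python
from fractions import Fraction

# Hypothetical joint distribution over two binary events A and B
# (arbitrary numbers, not the Monty Hall ones yet)
p_joint = {
    (True, True): Fraction(1, 6),   # P(A and B)
    (True, False): Fraction(1, 6),  # P(A and not B)
    (False, True): Fraction(1, 3),  # P(not A and B)
    (False, False): Fraction(1, 3),
}

p_a = p_joint[(True, True)] + p_joint[(True, False)]  # P(A)
p_b = p_joint[(True, True)] + p_joint[(False, True)]  # P(B)

p_a_given_b = p_joint[(True, True)] / p_b  # P(A|B) from the definition
p_b_given_a = p_joint[(True, True)] / p_a  # P(B|A) from the definition

# Bayes' Theorem: P(A|B) = P(B|A) P(A) / P(B)
assert p_a_given_b == p_b_given_a * p_a / p_b
print(p_a_given_b)  # 1/3
```

Using `Fraction` keeps everything exact, so the two sides of the theorem match with no floating-point fuzz.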

So the probability that the car is behind door $c$ given that he opened door $b$ is:

$P(Car_c|Open_b)={P(Open_b|Car_c)P(Car_c) \over P(Open_b)}$

We can get the denominator $P(Open_b)$ by summing over the three possible locations of the car:

• $P(Car_a)P(Open_b|Car_a)=\frac 1 3 \cdot \frac 1 2=\frac 1 6$
• $P(Car_b)P(Open_b|Car_b)=\frac 1 3 \cdot 0=0$
• $P(Car_c)P(Open_b|Car_c)=\frac 1 3 \cdot 1=\frac 1 3$

so we'll add them up and plug them into our formula:

$P(Car_c|Open_b)={1 \cdot \frac 1 3 \over \frac 1 6+\frac 1 3}={\frac 1 3 \over \frac 1 2}={\frac 2 3}$

giving us a $2/3$ probability that the car is behind the other door, and therefore you should switch.
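If you'd like to check the arithmetic, here's the whole calculation as a small Python sketch with exact fractions (the `likelihood` table is just the three bullet points above):

```python
from fractions import Fraction

third = Fraction(1, 3)  # prior: the car is equally likely behind each door

# P(host opens b | car location), given you picked door a
likelihood = {"a": Fraction(1, 2), "b": Fraction(0), "c": Fraction(1)}

# Total probability that the host opens door b
p_open_b = sum(third * likelihood[door] for door in "abc")

# Bayes' Theorem: P(car behind c | host opened b)
p_car_c = likelihood["c"] * third / p_open_b
print(p_open_b, p_car_c)  # 1/2 2/3
```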

## Explanation #2

### Odds

There is another way to see Bayes' Theorem: As a way to understand two competing hypotheses $H_1$ and $H_2$ . We start by having some prior belief about the odds of those two. After observing some data, what are the new odds of the two hypotheses given the new data? In other words, how do we update our belief based on new evidence?

ℹ️
Let's start by specifying what we mean by odds: how large one probability is relative to the other, expressed as a ratio. For example, if prior to seeing any data one hypothesis has a $3/4$ probability of being true and the other has $1/4$ :

$\frac {P(H_1)} {P(H_2)}=\frac {3/4} {1/4}=\frac 3 1$

Meaning the odds of $H_1$ to $H_2$ are $3$ to $1$ .

So how do we incorporate the new data $D$ into our belief?

💡 By calculating the new probability of each hypothesis given the new data.

$\frac {P(H_1|D)} {P(H_2|D)}=\frac {\frac {P(H_1 \cap D)} {P(D)}} {\frac {P(H_2 \cap D)} {P(D)}}=\frac {P(H_1 \cap D)} {P(H_2 \cap D)}$

Now we'll use a mathematical trick of multiplying and dividing by the same number to get the same value but expressed in terms of the probability of seeing the observed data given each hypothesis.

$=\frac {P(H_1 \cap D)} {P(H_2 \cap D)} \cdot \frac {P(H_2)} {P(H_1)} \cdot \frac {P(H_1)} {P(H_2)}$
$=\frac {\frac {P(H_1 \cap D)} {P(H_1)}} {\frac {P(H_2 \cap D)} {P(H_2)}} \cdot \frac {P(H_1)} {P(H_2)}$

So,

$\frac {P(H_1|D)} {P(H_2|D)}=\frac {P(D|H_1)} {P(D|H_2)} \cdot \frac {P(H_1)} {P(H_2)}$

And so we have that, as a ratio, the posterior odds equal the prior odds times the likelihood ratio: the ratio of the probabilities of generating the new data under each hypothesis.

Now let's apply this to the Monty Hall Problem. Let

• $H_1=$ your door has the prize
• $H_2=$ your door has a goat
• $D=$ the host reveals a goat

$\frac {P(H_1|D)} {P(H_2|D)}=\frac {1/1} {1/1} \cdot \frac {1/3} {2/3}=\frac 1 2$

Meaning $H_2$ is twice as likely as $H_1$ given the new data and therefore you should switch. If you want to convert that to probabilities:

$P(H_1)=\frac 1 {1+2}=1/3$
$P(H_2)=\frac 2 {1+2}=2/3$
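The same update, expressed in odds form, is a few lines of Python (again with exact fractions; `bayes_factor` is the likelihood ratio from the formula above):

```python
from fractions import Fraction

# Prior odds of H1 (your door has the prize) to H2 (your door has a goat)
prior_odds = Fraction(1, 3) / Fraction(2, 3)

# Likelihood ratio: the host reveals a goat with certainty either way
bayes_factor = Fraction(1, 1) / Fraction(1, 1)

posterior_odds = bayes_factor * prior_odds

# Convert odds back to probabilities
p_h1 = posterior_odds / (1 + posterior_odds)
p_h2 = 1 - p_h1
print(posterior_odds, p_h1, p_h2)  # 1/2 1/3 2/3
```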

This is also a nice way to see that the new data did not change the odds: the host reveals a goat with probability $1$ under either hypothesis, so the likelihood ratio is $1$ . You had a $1/3$ chance of getting the prize before the reveal and you still do after the reveal if you keep your choice.

## Explanation #3

### An intuition

The explanations above using Bayes' Theorem are not the only way to look at this problem, though. You can arrive at the same result by realizing that the door you chose has a $1/3$ probability of containing the prize. That means there’s a $2/3$ probability that the prize is somewhere else. Then the game’s host makes it easy for you by turning the “somewhere else” into a single door.

In fact, you can generalize it like this: in a game with $n$ doors, choosing one gives you a $1/n$ chance of winning, which means there's an $(n-1)/n$ chance that the prize is somewhere else. When the host knowingly opens $n-2$ doors with goats, they leave only one door you could switch to, and that door carries, as we established, the entire $(n-1)/n$ probability of containing the prize. Do this mental exercise with 100 doors or a billion doors and it'll become clear.

Still not convinced? You could just play the game yourself with 3 doors and keep track of how many times you win by switching. The more you play the closer and closer you'll get to winning $2/3$ of the time when you switch.
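Here's one way to run that experiment without cardboard doors: a Monte Carlo sketch in Python (the `play` helper is made up for this post, and the door count is a parameter so you can try the 100-door version too):

```python
import random

def play(n_doors, switch, trials=100_000, seed=0):
    """Simulate the Monty Hall game and return the fraction of wins."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        doors = list(range(n_doors))
        car = rng.choice(doors)
        pick = rng.choice(doors)
        # The host opens n-2 goat doors, leaving your pick and one other door.
        # If you didn't pick the car, the door left closed must hide it.
        others = [d for d in doors if d != pick and d != car]
        remaining = car if car != pick else rng.choice(others)
        if switch:
            pick = remaining
        wins += pick == car
    return wins / trials

print(play(3, switch=True))    # ≈ 2/3
print(play(3, switch=False))   # ≈ 1/3
print(play(100, switch=True))  # ≈ 99/100
```

The more trials you run, the closer the switching win rate gets to $(n-1)/n$ , matching the generalization above.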