A thoroughly honest game-show host has placed a car behind one of three doors. There is a goat behind each of the other doors. You have no prior knowledge that allows you to distinguish among the doors. "First you point toward a door," he says. "Then I'll open one of the other doors to reveal a goat. After I've shown you the goat, you make your final choice whether to stick with your initial choice of doors, or to switch to the remaining door. You win whatever is behind the door."
You begin by pointing to door number 1. The host shows you that door number 2 has a goat. Do your chances of getting the car increase by switching to door 3?

The answer is yes.
There are different ways of looking at the problem.
One is to carry out the computation, once we have a clear view of what to compute. In situations like this, you need to deal with the information given to you, namely which door was opened for you, and then always pick the door with the highest probability conditioned on all of that information.
In this case, all we observe is the result of an operation the host performed. This is a typical place to use Bayes' theorem, as it lets us invert the conditioning into probabilities we do know, since we know the process the host followed to pick the opened door.
So we compute the probability of each possibility, with the following notation:

- c1, c2, c3: the car is behind the corresponding door
- o2: door 2 has been opened by the host
- p1: you picked door 1 in the first place

- P(c1 | o2, p1) = P(o2 | c1, p1) * P(c1 | p1) / P(o2 | p1) = (1/2 * 1/3) / (1/2) = 1/3
- P(c2 | o2, p1) = P(o2 | c2, p1) * P(c2 | p1) / P(o2 | p1) = (0 * 1/3) / (1/2) = 0
- P(c3 | o2, p1) = P(o2 | c3, p1) * P(c3 | p1) / P(o2 | p1) = (1 * 1/3) / (1/2) = 2/3

The first equality is just Bayes' theorem. In plain English, the factors are:

- P(o2 | c1, p1) = 1/2: given p1, the host has to pick between door 2 and door 3, and since the car is behind door 1 he has no other constraint, so the chances are 1/2 that he opens door 2.
- P(o2 | c2, p1) = 0: if the car is behind door 2, there is no way he will ever open door 2. (This alone forces P(c2 | o2, p1) = 0, so we did not even have to compute that line.)
- P(o2 | c3, p1) = 1: given p1, the host has to pick between door 2 and door 3, but with the car behind door 3 he has no choice but to open door 2.

The denominator follows by total probability: P(o2 | p1) = 1/2 * 1/3 + 0 * 1/3 + 1 * 1/3 = 1/2.
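This Bayes update can be reproduced exactly in a few lines of Ruby; here is a small sketch (the variable names are mine) using Rational arithmetic so the 1/3 and 2/3 come out exactly:

```ruby
# Exact Bayes update for the Monty Hall posteriors.
# Priors: the car is equally likely behind each door.
prior = { 1 => Rational(1, 3), 2 => Rational(1, 3), 3 => Rational(1, 3) }

# Likelihoods P(o2 | ci, p1): probability the host opens door 2,
# given the car position and our initial pick of door 1.
likelihood = {
  1 => Rational(1, 2),  # car behind 1: host picks door 2 or 3 at random
  2 => Rational(0),     # car behind 2: the host never opens it
  3 => Rational(1)      # car behind 3: door 2 is his only legal option
}

# P(o2 | p1), by the law of total probability.
evidence = prior.keys.sum { |i| likelihood[i] * prior[i] }

# Posterior P(ci | o2, p1) for each door.
posterior = prior.keys.to_h { |i| [i, likelihood[i] * prior[i] / evidence] }
posterior.each { |door, p| puts "P(c#{door} | o2, p1) = #{p}" }
```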

So we see through Bayes' theorem that we have an incentive to switch from the door we chose.
To get a more intuitive view of this, imagine there are one million doors to choose from.
You pick one door, and the host opens the 999,998 doors you did not choose that do not contain the car. What are the chances your pick was right in the first place? It is far more likely that the remaining door you did not choose hides the car.
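The 1/3 versus 2/3 split can also be checked empirically. Here is a quick Monte Carlo sketch; the helper `monty_hall_trial` and all of its details are my own hypothetical code, not part of the original discussion:

```ruby
# Simulate one Monty Hall game and report whether staying and
# whether switching would have won.
def monty_hall_trial(rng)
  doors  = [1, 2, 3]
  car    = doors.sample(random: rng)
  guess  = doors.sample(random: rng)
  # The host opens a door that is neither our guess nor the car.
  opened = (doors - [guess, car]).sample(random: rng)
  # Exactly one unopened door remains besides our guess.
  switched = (doors - [guess, opened]).first
  [guess == car, switched == car]   # [staying wins, switching wins]
end

rng    = Random.new(42)
trials = 100_000
stay_wins = switch_wins = 0
trials.times do
  stay, switch = monty_hall_trial(rng)
  stay_wins   += 1 if stay
  switch_wins += 1 if switch
end
puts "stay:   #{stay_wins.fdiv(trials).round(3)}"   # close to 1/3
puts "switch: #{switch_wins.fdiv(trials).round(3)}" # close to 2/3
```

With enough trials the frequencies settle near 1/3 and 2/3, matching the exact Bayes computation.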
A different approach is through raw computation. A clever Rubyist, **Daniel Martin**, posted a nice illustration of this:

```ruby
puts('===Monty Hall, classic version===')
ProbabilityTree.runreport(1.to_r) { |u|
  treasuredoor = u.choose(1, 2, 3)
  guessdoor    = u.choose(1, 2, 3)
  remaining_doors = [1, 2, 3].select { |x|
    x != treasuredoor and x != guessdoor
  }
  showdoor = u.choose(*remaining_doors)
  if treasuredoor == guessdoor
    u.report "You should stay"
  else
    u.report "You should switch"
  end
}.display
```

Produces:

```
===Monty Hall, classic version===
You should switch
==> 2/3
You should stay
==> 1/3
```

This piece of code illustrates the flexibility of Ruby blocks (lambda functions): it completely dissociates a particular drawing (you win, you lose) from the context in which it is used (here, enumerating the possible games and accumulating their probabilities to see which outcome is more likely).
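To make the example self-contained, here is a minimal, hypothetical re-implementation of a `ProbabilityTree`-like class in the same spirit. The real library is Daniel Martin's and surely differs in detail; this sketch only illustrates the idea that each call to `choose` forks the computation into one branch per option, while `runreport` explores every branch and accumulates the exact probability of each reported outcome:

```ruby
# Sketch of a ProbabilityTree-style exact enumerator (my own code,
# not Daniel Martin's library). Every full sequence of choices is
# visited exactly once, with its exact Rational probability.
class ProbabilityTree
  Report = Struct.new(:results) do
    def display
      results.each do |msg, p|
        puts msg
        puts "==> #{p}"
      end
      self
    end
  end

  def self.runreport(start_prob = Rational(1))
    results = Hash.new(Rational(0))
    pending = [[]]                       # forced-choice paths still to explore
    until pending.empty?
      run = new(pending.pop, start_prob)
      yield run
      results[run.outcome] += run.prob
      pending.concat(run.forks)          # branches discovered on this run
    end
    Report.new(results)
  end

  attr_reader :prob, :outcome, :forks

  def initialize(path, start_prob)
    @path, @taken, @forks = path, [], []
    @prob, @outcome = start_prob, nil
  end

  def choose(*options)
    if @taken.size < @path.size
      pick = @path[@taken.size]          # replay a choice forced by the path
    else
      pick = options.first               # take the first branch now...
      options.drop(1).each { |o| @forks << @taken + [o] }  # ...queue the rest
    end
    @prob *= Rational(1, options.size)   # each option is equally likely
    @taken << pick
    pick
  end

  def report(msg)
    @outcome = msg
  end
end

# Run against the Monty Hall block above, this sketch reproduces
# the exact 1/3 - 2/3 split.
ProbabilityTree.runreport(1.to_r) { |u|
  treasuredoor = u.choose(1, 2, 3)
  guessdoor    = u.choose(1, 2, 3)
  u.choose(*([1, 2, 3] - [treasuredoor, guessdoor]))
  u.report(treasuredoor == guessdoor ? "You should stay" : "You should switch")
}.display
```

The key design point is the one the text describes: the block passed to `runreport` knows nothing about enumeration; it just makes choices and reports an outcome, and the surrounding context decides how those choices are explored and weighted.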