
>By switching, you have a 2/3 chance.

I guess I need to read up on the Monty Hall problem again, because I always thought it was 1/2 chance if you switch (vs 1/3 chance if you don't).




I find the Monty Hall problem becomes intuitive when scaled.

100 doors, 1 car, 99 goats. Pick a door. The host then opens 98 of the other doors, all revealing goats. Do you think you picked the 1/100 door with the car, or that it's behind the only other door left standing?
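The scaled version is easy to check with a minimal Java sketch (the class name HundredDoors is just for illustration). Since the host knowingly opens 98 goat doors, switching wins exactly when your first pick missed the car:

```java
import java.util.Random;

public class HundredDoors {
  // Fraction of games won by always switching in the 100-door variant:
  // the host opens 98 goat doors, leaving your pick and one other door.
  static double switchWinRate(int games, Random random) {
    int wins = 0;
    for (int i = 0; i < games; i++) {
      int car = random.nextInt(100);
      int firstPick = random.nextInt(100);
      // After 98 goat doors are opened, switching wins iff the first pick was wrong.
      if (firstPick != car) {
        wins++;
      }
    }
    return (double) wins / games;
  }

  public static void main(String[] args) {
    System.out.printf("Switch win rate: %.3f%n", switchWinRate(1_000_000, new Random()));
  }
}
```

The win rate comes out around 0.99, i.e. the 99/100 chance that the other closed door hides the car.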


Exactly. It also helps to emphasise that the eponymous host knows which door contains the prize and will never open the prize door before asking you whether you want to switch.


The variants where the host does not know are also worth considering. If they’re just guessing and happened to not open the prize door by chance, you’re back at 50/50. If they have a hangover and you think there’s a 5% chance they forgot where the prize is and still just happened to open the doors correctly, well that makes the math even more interesting.


As long as the host doesn't accidentally open the prize door, it doesn't matter whether he forgot or not. To test my statement you could write a simulation where the host randomly opens a door.


That's not right, and it's one of the more confusing parts of the problem in my opinion.

I can make sense of it by drawing out the probability trees and seeing that it's 1/2 rather than 2/3 in this case, but it actually sticks with me if I think of it more like this:

My prior is that there's a 1/3 chance I picked the car; that's the world where the host can't reveal anything other than a goat.

In the scenario where the host then reveals a goat intentionally, my prior doesn't change. The host can always reveal a goat. This gives me no information. Therefore, it's still 1/3 that I picked the car, so the remaining door must be 2/3, so I switch and get a 2/3 chance.

But in the scenario where the host reveals a door at random, and it happens to be a goat, it's time to update my priors. Since he didn't accidentally reveal the prize, the probability that I'm in the world where he couldn't accidentally reveal the prize is increased relative to the probability that I'm in the world where he could have.


See my simulation code.


Yeah, you still win 2/3 of the time. Some of those wins come from the host accidentally showing you the prize, which happens before you make your decision.


Completely false. Draw up a probability table for the case where the host picks a door at random (so 1/3 times they reveal the prize) and you'll see.


What happens if the host reveals the prize? Is the game repeated? I'm saying as long as he doesn't accidentally open the prize door, it doesn't matter. If you insta-lose in that case, we're talking about a different game.


> What happens if the host reveals the prize? Is the game repeated?

Maybe. Maybe you insta-lose. Maybe you insta-win. It doesn't matter. (But by definition if the host picks a door at random, there is a possibility that they will pick the door with the prize behind it, so something must happen in that case).

> I'm saying as long as he doesn't accidentally open the prize door, it doesn't matter.

But that's false. If the host picked the door to reveal at random then your chance is 1/2 if you switch and 1/2 if you keep your original door. The 1/3-2/3 case only happens if the host deliberately picked a door that didn't have the prize.


See for yourself:

  import java.util.List;
  import java.util.Random;
  import java.util.stream.Collectors;
  import java.util.stream.Stream;
  
  public class MontyHall {
    public static void main(String[] args) {
      Random random = new Random();
      final boolean GUEST_ALWAYS_SWITCHES = true /* change guest strategy */;
      final boolean HOST_IS_DRUNK = false /* simulate host knowledge */;
      final boolean GUEST_WINS_IF_HOST_PICKS_PRIZE_BY_ACCIDENT = true;
      final double GAME_COUNT = 1_000_000;
      double winCount = 0;
      for(int i = 0; i < GAME_COUNT; i++) {
        List<String> remainingDoors = Stream.of("A", "B", "C").collect(Collectors.toList());
        String prizeDoor = remainingDoors.get(random.nextInt(3));
        String guestInitialPick = remainingDoors.remove(random.nextInt(remainingDoors.size()));
  
        String hostPick;
        if(HOST_IS_DRUNK) {
          hostPick = /* host picks random */ remainingDoors.remove(random.nextInt(remainingDoors.size()));
        }
        else {
          // host is sober and always opens empty door
          hostPick = remainingDoors.get(0).equals(prizeDoor) ? remainingDoors.remove(1) : remainingDoors.remove(0);
        }
  
        if(hostPick.equals(prizeDoor)) {
          if(GUEST_WINS_IF_HOST_PICKS_PRIZE_BY_ACCIDENT) {
            winCount++;
          }
          continue /* insta win/loss */;
        }
  
        // guest decides to stick with initial pick or switch
        String guestFinalPick = guestInitialPick;
        if(GUEST_ALWAYS_SWITCHES) {
          guestFinalPick = remainingDoors.get(0);
        }
  
        if(guestFinalPick.equals(prizeDoor)) {
          winCount++;
        }
      }
  
      System.out.println(String.format("Win rate: %.2f", winCount / GAME_COUNT));
    }
  }


Try setting GUEST_ALWAYS_SWITCHES=false, you'll find your winrate is still 2/3.

In the if(hostPick.equals(prizeDoor)) block, rather than maybe incrementing winCount, you need to decrement GAME_COUNT (or rather the number you divide by at the end - don't reduce the number of loop iterations). We're talking about the probability at the point where you're deciding whether to switch, i.e. after the host has already opened a door and not hit the prize.
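For what it's worth, the conditioning I mean can be sketched as a standalone Java snippet (class and method names are mine, not from the code above): discard the rounds where the random host reveals the prize, and the always-switch win rate lands at 1/2.

```java
import java.util.Random;

public class DrunkHostConditional {
  // Win rate for a guest who always switches, conditioned on the random
  // ("drunk") host happening to reveal a goat. Doors are numbered 0, 1, 2.
  static double conditionalSwitchWinRate(int games, Random random) {
    int wins = 0, validGames = 0;
    for (int i = 0; i < games; i++) {
      int car = random.nextInt(3);
      int guestPick = random.nextInt(3);
      // Host opens one of the two doors the guest didn't pick, at random.
      int hostPick;
      do {
        hostPick = random.nextInt(3);
      } while (hostPick == guestPick);
      if (hostPick == car) {
        continue; // host revealed the prize: the switch decision never happens
      }
      validGames++;
      // The remaining door is the one that is neither the guest's nor the host's.
      int switchPick = 3 - guestPick - hostPick;
      if (switchPick == car) {
        wins++;
      }
    }
    return (double) wins / validGames;
  }

  public static void main(String[] args) {
    System.out.printf("Conditional switch win rate: %.3f%n",
        conditionalSwitchWinRate(1_000_000, new Random()));
  }
}
```

This prints a value close to 0.5: once the prize-reveal branches are excluded from the divisor, switching and staying are equally good.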


My statement was that if you always switch, it doesn't matter whether the host is blind or not. But I must say it is an interesting finding from the simulation that your win rate is always 2/3 independent of your strategy (provided the host is blind and you win when the host makes a mistake).

I'm always talking about the probability of winning the complete game when sticking to a certain strategy. I never look at intermediate probabilities.

Decrementing the game count or the divisor would be like crossing out or ignoring decision tree branches.

If you change the game rules and say the game is repeated if the drunken host picks the prize, then the simulation would have to repeat until the guest has either won or lost, but the game count would still remain the same (= the number of times a simulated guest is invited to the game show).


> My statement was that if you always switch, it doesn't matter whether the host is blind or not.

Well, it does matter unless you have the rather unusual rule that you win if the host reveals the prize - normally it would make a lot more sense to say you lose in that case. (In fact this is the deep reason why switching works in the original problem - if you switch, then you win if a blind host "would have" revealed the prize, whereas if you don't switch, you lose).

The relevant general fact is that if the host is blind, your strategy never makes any difference.

> Decrementing the game count or the divisor would be the like crossing out or ignoring decision tree branches.

Well, there is no decision to be made if the host revealed the prize - you never get the choice of whether to switch or not. When we ask about the probabilities after a certain choice, then at the point where you're making that choice you already know that you've certainly had to make that choice. Otherwise you could equally well say that you have to include the probability of getting through to the final round, the probability of getting onto the gameshow in the first place, ...


You said: "I can make sense of it by drawing out the probability trees and see that it's 1/2 rather than 2/3 in this case"

(Edit: sorry, that was not your statement, but furyofantares's.)

Now you agreed that it is still 2/3, so I still stand by my statement "As long as the host doesn't accidentally open the prize door, it doesn't matter whether he forgot or not."

> Well, it does matter unless you have the rather unusual rule that you win if the host reveals the prize

If you change the game and say the host is blind, you have to also make a rule about what happens if the blind host reveals the prize. Otherwise how could you decide on a strategy as a guest?

> Well, there is no decision to be made if the host revealed the prize

No chance for the guest to switch, yes, but it is still a branch in the whole probability tree, and you have to count it either as a win or a loss for the guest. They can't just all go home and pretend the show didn't happen.

> Otherwise could equally well say that you have to include the probability of getting through to the final round, the probability of getting onto the gameshow in the first place, ...

The problem statement is: "What strategy (stick or switch) is best for the guest (= has higher probability to win), provided he gets into the game show final." No ambiguity here.


> If you change the game and say the host is blind, you have to also make a rule about what happens if the blind host reveals the prize.

The most natural rule is that the guest loses: the guest picked a door that didn't have the prize, they get what's behind "their" door. (And note that the much-argued formulation already "changes" the game from what was done on the original gameshow).

> The problem statement is: "What strategy (stick or switch) is best for the guest (= has higher probability to win), provided he gets into the game show final."

Why "provided he gets into the gameshow final" and not "provided he gets into the gameshow final and is offered the chance to switch"? Like, if you're asking whether a chess player should take an en passant capture, you'd look at whether capturing or declining en passant leads to winning more often, you wouldn't look at all possible chess games (including those where there was no chance to capture en passant) because that just adds a bunch of irrelevant cases.


> The most natural rule is that the guest loses

I don't want to argue which rule would be more natural or make a more interesting show. But without a clear rule, I cannot decide on a playing strategy.

> Why "provided he gets into the gameshow final"?

Because I assumed that the host always has to open a door and give a choice to switch.


> I don't want to argue which rule would be more natural or make a more interesting show. But without a clear rule, I cannot decide on a playing strategy.

It doesn't make any difference to your strategy! You don't get any choice if the host opens the door and reveals the prize, so your strategy can't possibly be affected by what the payoff for that scenario is - even if you win 10x the prize if that happens, or get executed if that happens, it makes no difference to whether you should switch or not in the scenario where you actually do get given a choice.


I agree you do not need a strategy if the rule is insta win/loss.

But if the rule were to reset the game until the host doesn't make any more mistakes, it matters.


No it doesn't. If the host chose randomly and revealed a non-prize, i.e. the point where you're making the choice, your odds are 1/2 for either door and your strategy doesn't matter.


There's a third rule set where switching always makes you lose:

Guest selects a door. If it has a goat behind it, the host opens that door and says "you lose". If the guest initially picks the car, then the host opens a different door and asks the guest if they want to switch.

For the player, if you've reached the point that the host is offering you to switch, there's no way to know whether you are in the case where switching improves your odds to 2/3, keeps them the same, or decreases them to zero.
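The three rule sets can be put side by side in a small Java sketch (in the style of the simulation upthread; the class name and the policy labels are just illustrative). It estimates the always-switch win rate conditioned on actually being offered a switch, under each host policy:

```java
import java.util.Random;

public class HostPolicies {
  // Conditional win rate for an always-switching guest, given the host opened
  // a goat door and offered a switch, under three host policies:
  //   "knows"  - host knowingly opens a goat door (classic rules)
  //   "random" - host opens an unpicked door at random; prize reveals are discarded
  //   "evil"   - host only offers a switch when the guest already picked the car
  static double switchWinRate(String policy, int games, Random random) {
    int wins = 0, offered = 0;
    for (int i = 0; i < games; i++) {
      int car = random.nextInt(3);
      int guestPick = random.nextInt(3);
      if (policy.equals("evil") && guestPick != car) {
        continue; // host just opens the guest's goat door: no offer is made
      }
      int hostPick;
      do {
        hostPick = random.nextInt(3);
      } while (hostPick == guestPick
          || (!policy.equals("random") && hostPick == car));
      if (hostPick == car) {
        continue; // random host revealed the prize: no offer is made
      }
      offered++;
      if (3 - guestPick - hostPick == car) {
        wins++; // switching to the one remaining door hit the car
      }
    }
    return (double) wins / offered;
  }

  public static void main(String[] args) {
    Random r = new Random();
    for (String policy : new String[] {"knows", "random", "evil"}) {
      System.out.printf("%s: %.3f%n", policy, switchWinRate(policy, 500_000, r));
    }
  }
}
```

The rates come out near 2/3, 1/2, and 0 respectively, which is exactly the point: the offer to switch looks the same to the guest in all three worlds.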


> My statement was that if you always switch, it doesn't matter whether the host is blind or not.

You responded to someone who said it's back to 50/50 once a random host reveals a goat.

I thought you were contradicting this or adding to it, but if you were saying it doesn't matter if you switch, then yeah, that's what 50/50 means.


Beautiful short explanation. I always struggle getting this point across in discussions.


My hopes for this explanation were too high. I just tested it on my coworkers. No luck. I reverted to suggesting they write a short simulation script to convince themselves with experimental data.


I personally find the generalized Monty Hall problem as unintuitive as the original one. But I believe I found a better way to make it intuitive (at least for people who have heard about classical and conditional probability). And it even worked on one of my friends :)

If your first choice was lucky and two goats remained behind the other doors, then Monty does not change anything by showing you the goat behind one of them (it's the same as if Monty had selected a random door).

Things become interesting if you were unlucky and selected a goat. Monty MUST show you the other goat (because there is only one available for him to select). And thus he introduces information into an otherwise random selection (it stops being random).

And by doing this he eliminates for you the conditional possibility of selecting the second goat after you selected one in the first round (and in this conditional scenario switching is a sure bet, which changes the overall odds to 2/3).


It's a 1/3 chance if you pick before the second door is opened and do not switch.

It's a 1/2 chance if you pick after the second door is opened, regardless of switching.

It's a 2/3 chance if you pick before the second door is open and switch after the second door is opened.

I recommend reading again, because I don't want to write up the logic.

A bigger concern is that this was not true on "Let's Make a Deal". The problem assumes that Monty always offered the option to switch. In reality, he did not, and the offers were not random. That corrected for the issue to a large degree in practice.


That doesn't add up. Literally: 1/3 + 1/2 does not add up to 1. What third option exists that has a 1/6 chance of happening?

The basic argument is: if you don't switch, you have a 1/3 chance of winning. So if you do switch, you have a 1/3 chance of losing. Hence if you switch you have a 2/3 chance of winning.

How the door you switched to went from a 1/3 chance to double that is the hard part to explain. But it must be the case.


It's a 1/2 chance at the point where you're considering the switch.

If you start the game with the intention of switching, you have a 2/3 chance of being successful because the winning strategy is to miss the car on your first pick.


No, I don’t think this is right.

Regardless of your intended plan, you only had a 1/3 chance of picking correctly the first time, so switching gets you a 2/3 chance.


No, it's a 2/3 chance at that point.

When you choose initially, you have a 1/3 probability of getting the right one, leaving a 2/3 probability that the car is on one of the other two.

The host reveals one of the other two. So that 2/3 probability applies to the remaining door. Here is a short C implementation that made it very clear to me...

  #include <stdio.h>
  #include <stdlib.h>

  int doround() {
    int car = rand() % 3;
    int firstchoice = rand() % 3;

    // host reveals one of the goat doors

    if (car == firstchoice) {
      // you changing to the other door after the reveal is a loss
      return 0;
    }

    // you changing to the other door after the reveal is a win
    return 1;
  }

  int main(int argc, char** argv) {
    int wins = 0;
    int rounds = 1000;
    for (int round = 0; round < rounds; round++) {
      wins += doround();
    }
    printf("Worked in %d of %d rounds\n", wins, rounds);
    return 0;
  }


It's been a while since I actively thought about the Monty Hall problem, I thought I understood it intuitively back then. I thought it became one of those "oh yeah, that's unintuitive, but once it clicks it's fine" things for me.

I thought through it again, and I'm angry now.

At least I'm not alone (got this link from the article): https://web.archive.org/web/20140413131827/http://www.decisi...


That would mean you have a 5/6 chance of finding the car if you were allowed to pick both doors. You can see the problem with that. In reality, it's 2/3 and 1/3 respectively.


If you started with 4 doors, where 3 of them had goats, wouldn't you have a 1/4 chance of picking the car the first time and a 1/3 chance of picking the car the second time?


1/4 chance if you keep your original door, 3/4 chance if you switch (assuming the host opens two doors to show goats). If the host opened one door (so you have your original door and two other doors to pick from) then it's a 3/8 chance if you switch, if that's what you're asking.
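That 3/8 figure can be checked with a quick Monte Carlo sketch (Java; the class name FourDoors and the assumption that the switcher picks one of the two remaining closed doors at random are mine). The 3/4 chance of having missed the car initially splits evenly over the two closed doors, giving 3/8 each:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class FourDoors {
  // Win rate when there are 4 doors, the host knowingly opens ONE goat door
  // among the three you didn't pick, and you then switch to one of the two
  // remaining closed doors at random.
  static double switchWinRate(int games, Random random) {
    int wins = 0;
    for (int i = 0; i < games; i++) {
      int car = random.nextInt(4);
      int guestPick = random.nextInt(4);
      // Host opens a random unpicked goat door.
      int hostPick;
      do {
        hostPick = random.nextInt(4);
      } while (hostPick == guestPick || hostPick == car);
      // Collect the two closed doors the guest can switch to.
      List<Integer> closed = new ArrayList<>();
      for (int door = 0; door < 4; door++) {
        if (door != guestPick && door != hostPick) {
          closed.add(door);
        }
      }
      int switchPick = closed.get(random.nextInt(closed.size()));
      if (switchPick == car) {
        wins++;
      }
    }
    return (double) wins / games;
  }

  public static void main(String[] args) {
    System.out.printf("Switch win rate: %.4f%n", switchWinRate(1_000_000, new Random()));
  }
}
```

The rate converges to about 0.375, i.e. 3/4 (initially wrong) times 1/2 (guessing between the two closed doors).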


Yes, I was asking about the latter, thanks!


When using the switch strategy, you win if you choose a goat initially. Since there are two goats and one car, this is a 2/3 chance.


  Door A | Door B | Door C
  -------+--------+-------
  Car    | Goat   | Goat
  Goat   | Car    | Goat
  Goat   | Goat   | Car

The table above shows all possible distributions of the car and goats. If the initial choice was Door A, then you only have a 1/3 probability. But after door B or C is opened, the probability increases to 2/3 if you switch.

The key is that the host will always open a door with a goat.



