# Probability question regarding the TV game "Deal or No Deal"

#### Daniel_Feldman

##### Full Member
Hi all,

I came across this question recently and am intrigued.

Consider the game show "Deal or No Deal." For those unfamiliar with the show, it essentially consists of 26 cases, each containing a different monetary value ranging from $0.01 to $1,000,000. The contestant chooses a case that they keep until the end of the show, and then, from the remaining 25 cases, chooses them one by one as they are subsequently opened. At certain points, they are offered a deal: an offer to sell their case for a given amount. The amount offered depends on the values eliminated by the contestant and the values still left on the board. Obviously, if a high amount is eliminated, the offer will be smaller, while if the contestant eliminates low amounts, the offer will be relatively high. If the contestant says "no deal" each and every time until only two cases remain (the one chosen by him/her at the beginning and the last one left unopened), he or she is allowed either to keep their case, open it, and take the money inside, or to exchange the case for the other one.

Consider a producer of the show who, prior to the show, has placed the monetary values inside the cases; that is, he knows how much money is in each case and, thus, which case holds the $1,000,000. This means that for any given contestant ("trial," if you will), the producer can allow them to select a case and then, of the remaining 25 cases, immediately open 24 that do not hold the million dollars (no guesses, no deals--just a selection of a case by the contestant followed immediately by the producer opening/removing 24 of the remaining 25 cases; since the million is in only one case, he can remove 24 every single time without revealing where the million is). The contestant is then asked if he wishes to exchange his case for the other one.

In this scenario, the contestant should exchange, correct? When he originally selected, his odds of picking the case containing the million were 1 in 26. After the other cases are removed, the odds of the million being in his case might look like 1/2 from the viewpoint of a neutral observer--someone who just walked into the room and saw two cases--but the odds that he picked the one with the million are still 1 in 26, and if this trial is run 26 times, with 24 cases removed each time, we still expect him to have picked correctly only about once out of the 26 times. Thus, on any given trial, when he is asked whether or not he wishes to switch, he should do so, because the probability that his case holds the million is 1/26, while the probability that the million is in the other case is therefore 1 - 1/26 = 25/26. Is this correct?

Now, consider someone actually going through the game in accordance with the rules: no producer, just the contestant selecting and removing cases at random, with each one being opened. Suppose they continuously say "no deal" and reach the point where only two cases remain (theirs, and one other).
The two monetary values left on the board are $200 and $1,000,000. Let us assume that the player says "no deal" again, and is given the option either to keep/open his case (selected at the beginning of the game) or to switch it for the other one. What, in this case, is the correct play? Or are the odds here 50/50?
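Both scenarios are easy to check by simulation. The following Monte Carlo sketch is mine, not from the thread (function names are my own); it runs the producer variant, where the host knowingly keeps the million in play, and the real-game variant, where the contestant eliminates cases at random and we condition on the games where the million happens to survive to the final two:

```python
import random

def producer_variant(trials=100_000):
    """Producer, who knows where the $1M is, opens 24 non-million cases.
    Returns the fraction of trials in which switching wins the million."""
    wins = 0
    for _ in range(trials):
        million = random.randrange(26)   # case holding the $1,000,000
        pick = random.randrange(26)      # contestant's initial pick
        # The producer removes 24 cases, never the million, never the pick,
        # so the lone remaining case holds the million whenever the pick missed.
        if pick != million:              # switching wins exactly when the
            wins += 1                    # initial pick was wrong
    return wins / trials

def random_variant(trials=100_000):
    """Contestant opens cases at random. Condition on the games where
    the $1M happens to survive to the final two cases."""
    switch_wins = 0
    reached_final_two = 0
    for _ in range(trials):
        million = random.randrange(26)
        pick = random.randrange(26)
        others = [c for c in range(26) if c != pick]
        random.shuffle(others)
        opened, last = others[:24], others[24]
        if million in opened:
            continue                     # million revealed early; different game
        reached_final_two += 1
        if last == million:
            switch_wins += 1
    return switch_wins / reached_final_two

print(producer_variant())  # close to 25/26, about 0.96
print(random_variant())    # close to 0.5
```

The producer variant comes out near 25/26, matching the Monty Hall-style argument, while the random variant comes out near 1/2: when the cases are opened blindly, surviving to the final two carries no information favoring either remaining case.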

Thanks a lot for your help.

#### stapel

##### Super Moderator
Staff member
At the start, there was a 1/26 chance that the case chosen had the desired maximum value. The other 25/26 was in the other cases. By the reasoning used for the "Monty Hall" game, that 25/26 is now concentrated in that one remaining case. So switching would be optimal.
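That reasoning settles the producer variant. The contestant's closing question (random elimination down to the last two cases) can be settled exactly with a short conditional-probability calculation; this sketch is mine, not from the thread:

```python
from fractions import Fraction

# Exact probabilities for the real game, where cases are opened at random.
p_pick_million = Fraction(1, 26)        # initial pick holds the $1M
# If the pick missed, the $1M sits among the other 25 cases; it survives
# 24 random removals only if it is the single case left unopened: 1/25.
p_other_survives = Fraction(25, 26) * Fraction(1, 25)

# Condition on reaching the final two with the $1M still in play.
p_reach = p_pick_million + p_other_survives
print(p_pick_million / p_reach)    # probability keeping wins:   1/2
print(p_other_survives / p_reach)  # probability switching wins: 1/2
```

Both conditional probabilities come out to exactly 1/2, so in the real game, with the cases opened blindly, switching neither helps nor hurts; only the producer's informed removals create the 25/26 advantage.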

I could be wrong, of course....

Eliz.