You sit down with a well-mixed deck containing A cards marked "+" and B cards marked "-". You may draw cards from this deck as long as you want, i.e., you can stop playing at any point. Each time you draw a + card you are given $1, and each time you draw a - card you have to pay $1. Cards are not replaced after having been drawn.
What would be a fair amount to pay for the right to play (i.e., what is the expected payoff) and under what circumstance should a player cease drawing?
(In reply to Winning Strategy by ed bottemiller)
If there are no more +'s left, you do not continue to play, nor do you start to play if A=0.
If there's a single card and it's a +, of course you continue to play. If the house were foolish enough to offer this as an initial A=1,B=0, then of course the fair amount to pay to play would be 1 dollar. It's really uninteresting as a game in itself as the dollar just goes to the house and then back to the player, but of course this position could come about near the end of a game that started with higher numbers of cards, both + and -.
If there are two cards, one + and one -, the expectation is computed this way: you have a 1/2 probability of drawing the +, for an immediate value of 1, which is multiplied by its 1/2 probability to give a 0.50 contribution to the total expectation; there's no further value, as that reduces the deck to zero +'s. If you drew the - instead, you'd lose a dollar, but the deck would then be one + and zero -'s, assuring that you get your dollar back, so this 1/2 probability has an expectation of zero. The overall expectation, then, for 1+,1- is 0.50 + 0.00 = 0.50, as shown in row 1, column 1 of Leming's post (counting is zero-based; that is, there is a row 0 and a column 0). So if the game had already started and it got down to 1+,1-, you'd continue to play, as this is a positive expectation. If the game started with A=1,B=1, the fair amount to pay up front to play is 50 cents. Any more than that would constitute the house advantage; any less than this, the house would be foolish to offer. That's the definition of the fair amount to pay to play this game.
If there were 1 + and 2 -'s, there would be a 1/3 chance of getting an immediate 1 dollar plus the value of playing a hand that has no +'s and 2 -'s, which is nothing; and a 2/3 chance of paying an immediate dollar but getting to play a 1+,1- hand, whose value is 0.50. So overall the expectation in this case is (1/3)*1 + (2/3)*(-1 + 0.50) = 1/3 - 1/3 = 0, so it's a tossup whether to continue to play. If this were the situation at the beginning of the game, that is A=1,B=2, the fair amount to pay to play is zero, as there is zero expected value.
With only one + card it goes downhill from there: 1+,3- gives you a 1/4 chance of an immediate +1 and then nothing further, and a 3/4 chance of an immediate -1 and then getting to play a 1+,2- hand, which has zero expectation as in the preceding paragraph. The total expectation is then 1/4 - 3/4 = -1/2, so you would not continue to play under this condition, and if offered A=1,B=3 at the beginning, you would not play (unless of course they paid you more than 0.50 to induce you to play, but no house will do that unless they've stacked the deck).
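The single-+ cases above can be sanity-checked with exact fractions; here is a quick sketch in Python (the variable names are mine, not from any poster's table):

```python
from fractions import Fraction

# A hand's worth is max(0, E), where E is the expectation of drawing once
# and then playing on optimally. From the text: 1+,2- is worth exactly 0.
v_1_2 = Fraction(0)

# Expectation of drawing from 1+,3-: 1/4 chance of +1 and nothing further,
# 3/4 chance of -1 followed by the worthless 1+,2- hand.
e_1_3 = Fraction(1, 4) * 1 + Fraction(3, 4) * (-1 + v_1_2)
print(e_1_3)  # -1/2, so you would decline to draw
```

Using Fraction instead of floats keeps every value exact, which matters when deciding whether an expectation is truly zero (stop or continue is a tossup) versus merely tiny.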
That brings us to situations with 2 +'s:
If there are no -'s, the expectation is of course 2. You either continue to play, or at the start, offer 2 dollars to play the game, which of course you immediately get back in this uninteresting "game".
If there is one -, you have a 2/3 chance of an immediate +1 with the added value of then continuing with a 1+,1- game, which we have seen to have a value of 0.50; this gives this 2/3 prob. part a value of 1.50. There's also the 1/3 prob. that you'd get a -, which has an immediate value of -1, but then you also get to continue with a deck that has two +'s and no -'s, for a value of 2, so the overall value of this 1/3 prob. part is 2-1=1. So the overall value of a 2+,1- hand is (2/3)*1.50 + (1/3)*1 = 1.33, in agreement with Leming's row 2, column 1.
Suppose there are 2 +'s and 2 -'s. There's a 1/2 probability of getting an immediate +1 and then playing a 1+,2- hand, worth zero. There's also a 1/2 prob. of an immediate -1 and then playing a 2+,1- hand, worth 1.33. So the total value of a 2+,2- hand is (1/2)*(1+0) + (1/2)*(-1+1.33) = 1/2 + 1/6 = 2/3, or the 0.67 listed on Leming's table.
Now the first paradoxical one: Suppose there are 2 +'s and 3 -'s. There's a 2/5 probability of getting an immediate +1 and then playing a 1+,3- hand, worth zero because you wouldn't continue with that, as mentioned above. There's also a 3/5 prob. of an immediate -1 and then playing a 2+,2- hand, worth 0.67. So the total value of a 2+,3- hand is (2/5)*(1+0) + (3/5)*(-1+0.67) = 2/5 - 1/5 = 1/5, or the 0.20 listed on Leming's table. This is in disagreement with FrankM's posted "official" solution, as he says there's zero value when B exceeds A or when, during a game, there gets to be more -'s than +'s. But we can see that it does have value, so one would continue to play. If offered A=2,B=3 at the beginning of the game, 20 cents is the fair value to play; any more would be a house advantage, and any less would be foolish on the part of the house.
As you can see, as we proceed in Leming's table, from top to bottom and left to right, for any given a,b (lower case to indicate it could be a position reached during the course of the game rather than the A,B at the start of the game; at the beginning, of course, a=A and b=B), there is a probability of a/(a+b) of getting an immediate +1 and going on to play (or decline to continue) a hand of (a-1)+,b-; and a probability of b/(a+b) of an immediate -1 and going on to play (or decline to continue) a hand of a+,(b-1)-. In other words, value(a,b) = max(0, (a/(a+b))*(1 + value(a-1,b)) + (b/(a+b))*(-1 + value(a,b-1))). Each of those reduced hands' values would have been calculated before, in the prior row or the prior column, and so would be available. That is how Leming produced his table, and I produced the extended table. Eigenray then went further and found, for each a, the highest b that still gives a positive expectation, and is therefore worthy of continued play, expressed as the allowable excess of b over a. The actual worth of the game at the beginning for a particular A,B would have to be worked out through the iterative procedure outlined above.
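The table-building procedure just described can be sketched in Python with exact fractions (the function name and table layout are my own; the base cases and recurrence are from the discussion above):

```python
from fractions import Fraction

def game_table(max_a, max_b):
    """Build hand values V(a, b) for a +'s and b -'s, bottom-up.

    V(a, b) = max(0, E), where E is the expectation of drawing once and
    then playing on optimally:
        E = a/(a+b) * (1 + V(a-1, b)) + b/(a+b) * (-1 + V(a, b-1))
    Each entry only needs the prior row and prior column, so a simple
    double loop fills the whole table.
    """
    v = {}
    for a in range(max_a + 1):
        for b in range(max_b + 1):
            if a == 0:
                v[a, b] = Fraction(0)   # no +'s left: stop; worth nothing
            elif b == 0:
                v[a, b] = Fraction(a)   # all +'s: draw them all, worth a
            else:
                e = (Fraction(a, a + b) * (1 + v[a - 1, b])
                     + Fraction(b, a + b) * (-1 + v[a, b - 1]))
                v[a, b] = max(Fraction(0), e)
    return v

v = game_table(2, 3)
print(v[1, 1], v[2, 1], v[2, 2], v[2, 3])  # 1/2 4/3 2/3 1/5
```

These reproduce the worked examples above: 0.50 for 1+,1-, 1.33 for 2+,1-, 0.67 for 2+,2-, and the paradoxical 0.20 for 2+,3-.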
There apparently is no closed-form solution to the problem, only the iterative one. (It could also be done top-down recursively, but naive recursion would recompute the same values many times; caching, or memoizing, the results as they are computed avoids that.)
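A minimal sketch of that recursive alternative, assuming Python's functools.lru_cache for the memoization so each (a, b) pair is evaluated only once:

```python
from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def value(a, b):
    """Hand value V(a, b), top-down with memoization."""
    if a == 0:
        return Fraction(0)   # no +'s left: stop
    if b == 0:
        return Fraction(a)   # all +'s: worth a dollars
    e = (Fraction(a, a + b) * (1 + value(a - 1, b))
         + Fraction(b, a + b) * (-1 + value(a, b - 1)))
    return max(Fraction(0), e)  # decline to draw when expectation is negative

print(value(2, 3))  # 1/5
```

With the cache in place this does exactly the same work as filling the table row by row, just in a different order.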
Edited on March 13, 2008, 12:58 pm
Posted by Charlie
on 2008-03-13 12:51:16