You have a normal deck of 52 playing cards. You draw cards one by one (cards drawn are not returned to the deck).
A red card pays you a dollar. A black one fines you a dollar.
You can stop any time you want.
a. What is the optimal stopping rule in terms of maximizing expected payoff?
b. What is the expected payoff following this optimal rule?
c. What amount in dollars (integer values only) are you willing to pay for one session (i.e., playing as long as you wish, not exceeding the deck), using your strategy?
Source will be disclosed after the solution is published.
Googling "red card black card", after some results leading to a drinking game, finally turns up a page which has this puzzle, and a spreadsheet showing the expected value throughout the game, with colored cells showing the expected value of continuing (where that is recommended) and white cells showing the actual value of stopping.
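For what it's worth, those spreadsheet values can be reproduced by backward induction over the states (reds remaining, blacks remaining). A minimal Python sketch, assuming this standard recursion (the code and names are mine, not anything from the linked page):

from functools import lru_cache

@lru_cache(maxsize=None)
def V(r, b):
    # Optimal expected additional gain with r red and b black cards left;
    # stopping is worth 0, so the fresh-deck value is V(26, 26).
    if r == 0:
        return 0.0                      # only black cards left: stop now
    if b == 0:
        return float(r)                 # only red cards left: draw them all
    n = r + b
    cont = (r / n) * (1 + V(r - 1, b)) + (b / n) * (-1 + V(r, b - 1))
    return max(0.0, cont)               # continue only when it has positive expectation

print(round(V(26, 26), 4))              # 2.6245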
Based on that spreadsheet, my transition values (after a starting stop threshold of 6, which seems to be correct) would ideally be, where draw is the number of cards drawn so far and v is the net gain at which to stop:
If draw = 11 Then v = 5
If draw = 23 Then v = 4
If draw = 35 Then v = 3
If draw = 44 Then v = 2
If draw = 49 Then v = 1
-- not too far from my empirical determination.
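If the recursion above is right, these transition points can be checked by scanning, for each number of cards drawn, for the smallest positive net gain at which stopping is (weakly) optimal. A sketch reusing the same V; a state after d draws at net gain v has (d + v) / 2 reds and (d - v) / 2 blacks gone:

from functools import lru_cache

@lru_cache(maxsize=None)
def V(r, b):
    if r == 0: return 0.0
    if b == 0: return float(r)
    n = r + b
    cont = (r / n) * (1 + V(r - 1, b)) + (b / n) * (-1 + V(r, b - 1))
    return max(0.0, cont)               # V == 0 exactly when stopping is optimal

for d in range(1, 52):
    stops = [v for v in range(2 - d % 2, d + 1, 2)   # positive v with the parity of d
             if (d + v) // 2 <= 26                   # cannot have drawn more than 26 reds
             and V(26 - (d + v) // 2, 26 - (d - v) // 2) == 0.0]
    if stops:
        print(d, min(stops))            # lowest gain at which to stop after d draws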
Using these transitions, the program produces average values close to the 2.624 shown on the spreadsheet for the value of starting the game with a fresh deck:
successive sets of 100,000 trials each:
wins   win fraction   average gain per trial
78331 0.78331 2.63545
78148 0.78148 2.6305
78312 0.78312 2.63418
78037 0.78037 2.62392
77975 0.77975 2.62203
78189 0.78189 2.62691
78047 0.78047 2.62458
78006 0.78006 2.62225
77964 0.77964 2.62399
77991 0.77991 2.61833
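Charlie's program itself isn't shown, so here is a Python Monte Carlo sketch under one plausible reading of the transitions above (each threshold taking effect from the stated draw count onward); each 100,000-trial run should land near the figures listed:

import random

def threshold(drawn):
    # Stop level as a function of cards drawn so far, per the table above.
    if drawn >= 49: return 1
    if drawn >= 44: return 2
    if drawn >= 35: return 3
    if drawn >= 23: return 4
    if drawn >= 11: return 5
    return 6

def one_session(rng):
    deck = [1] * 26 + [-1] * 26          # +1 for a red card, -1 for a black one
    rng.shuffle(deck)
    gain = 0
    for drawn, card in enumerate(deck, start=1):
        gain += card
        if gain >= threshold(drawn):     # stop as soon as the level is reached
            break
    return gain                          # 0 if the whole deck is exhausted

rng = random.Random()
trials = 100_000
gains = [one_session(rng) for _ in range(trials)]
wins = sum(g > 0 for g in gains)
print(wins, wins / trials, sum(gains) / trials)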
Posted by Charlie on 2015-09-24 16:26:17