The game of craps is played by rolling a pair of dice. If the total comes out to 7 or 11, the shooter wins immediately. If it comes out to 2, 3, or 12, the shooter loses immediately. If any other total shows on the first roll, the player continues to roll until either his original total comes up again, in which case he wins, or a 7 comes up, in which case he loses.
What is the probability the shooter will win?
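The standard answer (244/495 ≈ 0.493) can be computed exactly from the rules above; here is a minimal sketch in Python using exact rational arithmetic:

```python
from fractions import Fraction

# Count the ways each total 2..12 can occur with two fair dice (out of 36).
ways = {t: 0 for t in range(2, 13)}
for d1 in range(1, 7):
    for d2 in range(1, 7):
        ways[d1 + d2] += 1

p = {t: Fraction(w, 36) for t, w in ways.items()}

# Immediate win on 7 or 11; immediate loss on 2, 3, or 12.
win = p[7] + p[11]

# For any other first total (the "point"), the shooter rolls until the
# point recurs (win) or a 7 appears (lose).  Only those two outcomes
# end the game, so P(win | point) = p[point] / (p[point] + p[7]).
for point in (4, 5, 6, 8, 9, 10):
    win += p[point] * p[point] / (p[point] + p[7])

print(win, float(win))  # 244/495, approximately 0.4929
```

Note that each post-point roll is independent, which is why conditioning on "the roll is either the point or a 7" gives the simple ratio above.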
(In reply to re(4): True Solution (Flaw in common answer))
"First, the odds are not the same even though the die used are rolled independently for each roll, the first roll has different odds from the second, and therefore it alters the odds for the two rolls. "
But we're talking about different games, not successive rolls within one game; you had previously said you're not talking about terminating a game early.
"Even though each roll imparts even odds for each number on the die, the odds for the entire set are changed."
But that's within one game, the set of rolls constituting one and only one game.
"That said, if you notice a significant number of longer runs, then the next run length, while statistically with N=infinity is indeterminable, with N sufficiently small, you can reasonably assume is going to be less than 12, giving you a better chance of winning if you are the shooter."
This seems to be the crux of your misunderstanding. There is nothing that says a series of long games makes shorter games likelier, even in the short run. It's just like when a coin comes up heads five times in a row: that doesn't make tails any more likely on the next toss. Several long games of craps in a row are no more likely to be followed by a short one than at any other time. You previously said you have mathematical proof that these independent events (the lengths of successive games) are actually dependent (conditional); what do you purport to be that proof?
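The independence claim is easy to check empirically. Below is a rough simulation (not anything from the original discussion) that plays many games of craps and compares the average length of a game that immediately follows a "long" game (8 or more rolls, an arbitrary cutoff chosen here) with the overall average. If long games made short ones likelier, the conditional average would be noticeably smaller; it isn't.

```python
import random

def game_length(rng):
    """Play one game of craps; return the number of rolls it took."""
    roll = lambda: rng.randint(1, 6) + rng.randint(1, 6)
    total = roll()
    if total in (7, 11) or total in (2, 3, 12):
        return 1  # immediate win or loss
    point = total
    n = 1
    while True:
        t = roll()
        n += 1
        if t == point or t == 7:  # point made (win) or 7 out (lose)
            return n

rng = random.Random(1)
lengths = [game_length(rng) for _ in range(200_000)]

# Overall average length vs. average length of the game right after
# a long (8+ roll) game.  Dependence would make these differ.
after_long = [lengths[i + 1] for i in range(len(lengths) - 1)
              if lengths[i] >= 8]
overall = sum(lengths) / len(lengths)
conditional = sum(after_long) / len(after_long)
print(overall, conditional)  # both come out around 3.4 rolls
```

The two averages agree to within sampling noise, consistent with successive game lengths being independent.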
Posted by Charlie
on 2011-09-05 10:33:45