Mac is idly tossing two dice when he decides to see how many tosses it would take to go from 1 to 6 in order. The rules: toss the two dice until one or the other (or both) shows a 1, then toss until a 2 shows. However, if both a 1 and a 2 show at the same time, both can be used. He then tosses for a 3, and similarly for the remaining numbers. What is the expected number of tosses to go from 1 to 6?
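The procedure can be checked by direct simulation; the sketch below is a minimal reading of the rules (function and variable names are mine, not from the puzzle):

```python
import random

def tosses_to_finish(rng):
    """One trial: toss two dice until 1..6 have each appeared, in order.

    If the number currently needed and the next one show on the same
    toss, both are credited (the 'both can be used' rule).
    """
    need = 1
    tosses = 0
    while need <= 6:
        a, b = rng.randint(1, 6), rng.randint(1, 6)
        tosses += 1
        if need in (a, b):
            need += 1
            # the other die may already show the new number needed
            if need <= 6 and need in (a, b):
                need += 1
    return tosses

rng = random.Random(2025)
n = 100_000
avg = sum(tosses_to_finish(rng) for _ in range(n)) / n
print(f"average over {n} trials: {avg:.3f}")
```

With 100,000 trials the printed average should land close to 17.04, matching the much larger run reported below.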
(In reply to failed analytic attempt; switch to simulation by Charlie)
In 600,000,000 trials:

times two numbers    occurrences in       fraction     average tosses
found in one toss    600 million trials   of cases     when this happens
        0               219977649         0.3666294       19.6365482
        1               287888055         0.4798134       16.3639145
        2                88525684         0.1475428       13.0908114
        3                 3608612         0.0060144        9.8180433

overall avg = 17.0414662833333
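As a cross-check on the simulated average, the expectation can also be computed exactly by a small backward recursion over the next number needed (this derivation is mine, not from the post): a toss shows the needed number k with probability 11/36, and given that, with probability 2/11 the other die shows k+1, which is credited on the same toss.

```python
from fractions import Fraction

def exact_expectation():
    # E[k] = expected remaining tosses when the next number needed is k.
    # Waiting for k takes 36/11 tosses on average; on the successful toss,
    # with probability 2/11 the other die shows k+1 and is credited too.
    E = {7: Fraction(0), 8: Fraction(0)}
    for k in range(6, 0, -1):
        E[k] = Fraction(36, 11) + Fraction(9, 11) * E[k + 1] + Fraction(2, 11) * E[k + 2]
    return E[1]

e = exact_expectation()
print(e, "=", float(e))
```

This evaluates to 30189888/1771561 ≈ 17.04141, which agrees with the simulated 17.0414662833333 to within sampling noise.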
Posted by Charlie on 2025-01-28 11:20:50