Mac is idly tossing two dice when he decides to see how many tosses it would take to go from 1 to 6 in order. The rules are to toss the two dice until one or the other or both show a 1. Then, toss until a 2 shows. However, if both a 1 and a 2 show at the same time, both can be used. He then tosses for a 3 and similarly for other numbers. What is the expected number of tosses to go from 1 to 6?
I had a similar experience to Charlie's: I tried an analytic approach, but the sum of all the possibilities did not add to 1.
I tried it a few different ways: one time the sum came out less than 1, and another time it came out greater than 1.
So I did a simulation.
Experimental:
17.042544 One million reps
17.039745 One million reps
17.0384554 Ten million reps
I think I will give the analytic approach another try after I go over what I did and what Charlie did.
-----------
import random

score = []
reps = 10000000
for _ in range(reps):
    target = 1
    for n in range(1, 100):          # 100 tosses is far more than ever needed
        # a set handles doubles automatically; membership is all we need
        toss = {random.randint(1, 6), random.randint(1, 6)}
        if target in toss:
            target += 1
            if target in toss:       # e.g. a 1 and a 2 together: both count
                target += 1
        if target > 6:
            score.append(n)          # n = number of tosses used this rep
            break
print(sum(score) / reps)
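-----------
One analytic framing that could serve as a cross-check (just a sketch, assuming the "both can be used" rule applies to every consecutive pair, exactly as the simulation above treats it): with current target t, a single toss of two dice shows t with probability 11/36, shows both t and t+1 with probability 2/36, so it advances the target by exactly one with probability 9/36 and leaves it unchanged with probability 25/36; those pieces do sum to 1. Writing E[t] for the expected tosses still needed, E[t] = 1 + (25/36)E[t] + (9/36)E[t+1] + (2/36)E[t+2], with E[7] = 0 and E[6] = 36/11 since only a 6 matters at the last step.

from fractions import Fraction

# Expected tosses still needed, keyed by current target; E[7] = 0 means already done.
E = {7: Fraction(0)}
E[6] = Fraction(36, 11)        # last step: wait for a 6, chance 11/36 per toss
for t in range(5, 0, -1):
    # E[t] = 1 + 25/36*E[t] + 9/36*E[t+1] + 2/36*E[t+2], solved for E[t]
    E[t] = Fraction(36, 11) * (1 + Fraction(9, 36) * E[t + 1] + Fraction(2, 36) * E[t + 2])

print(E[1], float(E[1]))       # roughly 17.04, in line with the simulation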
Posted by Larry on 2025-01-28 10:52:01