You play a coin flipping game with 5 coins. On round 1 you flip all of them. On round 2, you pick up all the ones that came up tails (leaving all the heads alone) and flip them again. You continue to do this until all the coins are heads. For example:
Round 1: H T T H T
Round 2: - H T - H
Round 3: - - T - -
Round 4: - - T - -
Round 5: - - H - -
Done in 5 Rounds.
What is the expected number of rounds you'll need to finish the game?
What is the probability you will finish the game in 3 rounds or less?
Hi,
The expected number of rounds needed for m coins can be written recursively in terms of the expected number of rounds needed for fewer coins:
E(m)=(1/2^m)(1+Sum[Binomial[m,k]*(1+E[k]),{k,1,m}])
Take each possible scenario for the number of tails left after the first flip, multiply the probability of that situation by the expected number of rounds needed to get rid of those tails (plus one for the first round), and add them up. Some care is needed, but nothing too nasty. Note that the sum on the right contains a term with E(m): the k=m case, where all the coins come up tails on the first round. Expand that term (Binomial[m,m]*(1+E[m]) = 1+E[m]), move the E(m) to the left side, factor it out, and divide through by the resulting constant; the leftover 1 joins the existing 1 to give the 2 below:
E(m)=(1/2^m)(2+Sum[Binomial[m,k]*(1+E[k]),{k,1,m-1}])/(1-1/2^m)
Starting with E(1)=2, we can recursively compute the sequence of expected values:
E(2)=8/3
E(3)=22/7
E(4)=368/105
E(5)=2470/651
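These values can be checked with exact rational arithmetic. A minimal sketch in Python (function and variable names are my own); it uses the recurrence with the E(m) term already moved to the left, which simplifies to E(m) = (2^m + Sum[Binomial[m,k]*E(k), {k,1,m-1}]) / (2^m - 1):

```python
from fractions import Fraction
from math import comb

def expected_rounds(m, _cache={}):
    """E(m): expected number of rounds to get m fair coins all heads,
    reflipping only the tails each round.  Solving the recurrence for
    E(m) gives E(m) = (2^m + sum_{k=1}^{m-1} C(m,k) E(k)) / (2^m - 1)."""
    if m == 0:
        return Fraction(0)
    if m not in _cache:
        s = sum(comb(m, k) * expected_rounds(k) for k in range(1, m))
        _cache[m] = Fraction(2**m + s, 2**m - 1)
    return _cache[m]

for m in range(1, 6):
    print(m, expected_rounds(m))
```

Running this reproduces the exact fractions listed above, E(1)=2 through E(5)=2470/651.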
I do not know how to write an explicit formula, but this formulation has allowed me to see that the expected value grows logarithmically with respect to the number of starting coins; something like:
E(m)=Log(2, 2.52*m+1.28), where I suspect the 2.52 is actually 2^(4/3) (about 2.5198), but I'm not sure why. Any ideas?
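As a quick numeric sanity check on that fit (Log(2, x) meaning log base 2), one can compare the exact values against log2(2.52*m + 1.28) for small m; a sketch, reusing the same recursion:

```python
import math
from fractions import Fraction
from math import comb

def expected_rounds(m, _cache={}):
    # Exact E(m) from the recurrence, with the E(m) term solved to the left:
    # E(m) = (2^m + sum_{k=1}^{m-1} C(m,k) E(k)) / (2^m - 1)
    if m == 0:
        return Fraction(0)
    if m not in _cache:
        s = sum(comb(m, k) * expected_rounds(k) for k in range(1, m))
        _cache[m] = Fraction(2**m + s, 2**m - 1)
    return _cache[m]

for m in range(2, 11):
    exact = float(expected_rounds(m))
    fit = math.log2(2.52 * m + 1.28)
    print(f"m={m:2d}  E(m)={exact:.4f}  fit={fit:.4f}  diff={exact - fit:+.4f}")
```

For m up to 5 the fit agrees with the exact values to within about 0.01 of a round; the agreement loosens somewhat for larger m.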
By the way, I think this could all be pulled out nicely from an eigenvalue argument with the previously (and nicely) stated Markov matrix approach, but I haven't tried.
Posted by owl on 2004-10-24 01:53:19