Trivial Cases
We want to simulate an n-sided die by flipping a biased coin. Our goal is to simultaneously minimize the number of coins and flips. If n=1 we don't need any flips. If n=2^k we only need to flip a fair coin \log n times.
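For concreteness, here is a minimal Python sketch of the n = 2^k case (my own illustration, not part of the original argument; the function names are made up): read k fair flips as a k-bit binary number.

import random

def flip_fair():
    # one fair coin flip: 0 or 1, each with probability 1/2
    return random.randrange(2)

def roll_power_of_two(k):
    # interpret k fair flips as a k-bit number, giving a uniform
    # result in {1, ..., 2^k}
    result = 0
    for _ in range(k):
        result = 2 * result + flip_fair()
    return result + 1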
Using 1 Coin
Interestingly, you can simulate an n-sided die using only 1 coin and \,3 \log n\, flips. The only caveat is that I can't give you the bias of the coin explicitly. Suppose we have a coin with bias b. We have n numbered bins and we want to map each individual sequence of f flips to a bin such that each bin has probability 1/n. There are {f \choose k} sequences that have k heads. For each k, we map \left\lfloor {f \choose k}/(n-1) \right\rfloor of these to each of the first n-1 bins, and put the remainder in the last bin. By construction, the first n-1 bins all have the same probability, so it remains to show that the last bin is filled to exactly 1/n. Let R(b) be the probability of the last bin. Then
\begin{align}
R(b) &= \sum_{k=0}^{f} r_k \, b^k \, (1-b)^{f-k} \\
r_k &= {f \choose k} \bmod (n-1)
\end{align}
We have \,R(0) = 1\, and, if we take f large enough that \,R(1/2)<1/n, then by continuity and the intermediate value theorem there must be a b' such that \,R(b') = 1/n. Noting the bound \,R(1/2) < 2^{-f} f n,\, we find that \,f-\log f > 2 \log n\, implies \,R(1/2) < 1/n. Choosing \,f = 3\log n\, satisfies this inequality for all but the smallest n (which can be checked directly), and we are done.
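Here is a rough Python sketch of the whole construction (mine, not from the post; the function names are made up). It picks f = \lceil 3 \log_2 n \rceil, bisects on [0, 1/2] for a bias b' with R(b') = 1/n (assuming, as argued above, that this f makes R(1/2) < 1/n), and maps each observed flip sequence to a bin by its rank among sequences with the same number of heads.

import random
from math import comb, ceil, log2

def last_bin_probability(b, f, n):
    # R(b) = sum_k (C(f,k) mod (n-1)) b^k (1-b)^(f-k)
    return sum((comb(f, k) % (n - 1)) * b**k * (1 - b)**(f - k)
               for k in range(f + 1))

def find_bias(n):
    # R(0) = 1 > 1/n and R(1/2) < 1/n once f is large enough, so
    # bisection on [0, 1/2] converges to a b' with R(b') = 1/n (n >= 3)
    f = ceil(3 * log2(n))
    lo, hi = 0.0, 0.5
    for _ in range(100):
        mid = (lo + hi) / 2
        if last_bin_probability(mid, f, n) > 1.0 / n:
            lo = mid
        else:
            hi = mid
    return f, (lo + hi) / 2

def sequence_rank(seq):
    # lexicographic rank of seq among 0/1 sequences of the same
    # length with the same number of heads
    r, ones_left = 0, sum(seq)
    for i, bit in enumerate(seq):
        if bit:
            r += comb(len(seq) - i - 1, ones_left)
            ones_left -= 1
    return r

def roll(n, f, bias):
    # flip the biased coin f times and send the sequence to a bin:
    # the first (n-1)*floor(C(f,k)/(n-1)) sequences with k heads are
    # dealt round-robin to bins 1..n-1, the leftovers go to bin n
    seq = [1 if random.random() < bias else 0 for _ in range(f)]
    k = sum(seq)
    share = comb(f, k) // (n - 1)
    idx = sequence_rank(seq)
    if idx < share * (n - 1):
        return idx % (n - 1) + 1
    return n  # the last bin, which has total probability R(bias) = 1/n

For example, f, b = find_bias(6) followed by repeated calls to roll(6, f, b) should produce each face with frequency near 1/6.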
We can make this prettier if we don't care about minimizing f. Suppose n is one more than an odd prime and let f=n-1. Then the only remainders are r_0 = r_{n-1} = 1, since {n-1 \choose k} is a multiple of the prime n-1 for 0 < k < n-1. Thus R(b) = b^{n-1} + (1-b)^{n-1}, which clearly has a solution to R(b) = 1/n. For cases where n is not one more than an odd prime, use Dirichlet's Theorem to get a multiple of n that is, then roll that larger die and take the result mod n.
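As a small worked example (my own check), take the smallest odd prime p = 3, so n = 4 and f = 3:
\begin{align}
R(b) &= b^{3} + (1-b)^{3} = 1 - 3b + 3b^{2} \\
R(b) = \tfrac{1}{4} &\iff \left(b - \tfrac{1}{2}\right)^{2} = 0
\end{align}
so b = 1/2: for n = 4 the construction recovers the fair coin, just with three flips instead of two.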
Using 2 Coins
The best I can do with two coins is \sim 2\log n flips. Let one coin be fair and one coin have bias 1/n. Let f be such that 2^f \geq (n-1)^2. Flip the fair coin f times and the 1/n coin once. There are 2^{f+1} total outcomes, half of which have probability \tfrac{1}{ 2^fn} and the other half have probability \tfrac{n-1}{2^f n}. Label the former L and the latter H. We want to group these outcomes into bins of probability \tfrac{2^f}{2^fn}.
Add as many H outcomes as possible to the first bin and denote the number h_1; since 2^f \geq (n-1)^2, we have h_1 \geq n-1. Fill the remainder with t_1 < n-1 of the L outcomes. Repeat for the next bin, and then the next, and so on. Because h_i > t_i, we run out of H outcomes first and can fill the remaining bins with L outcomes.
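A short Python sketch of this greedy packing (mine, not from the post), measuring weights in units of 1/(2^f n), so an L outcome weighs 1, an H outcome weighs n-1, and each bin must total 2^f:

from math import ceil, log2

def two_coin_bins(n):
    # returns, for each of the n bins, the pair (h_i, t_i): how many
    # H outcomes and how many L outcomes the greedy packing puts in it
    f = ceil(2 * log2(n - 1))          # smallest f with 2^f >= (n-1)^2
    capacity = 2 ** f                   # target weight of each bin
    heads_left = lows_left = 2 ** f     # 2^f outcomes of each kind
    bins = []
    for _ in range(n):
        h = min(heads_left, capacity // (n - 1))   # as many H as fit
        t = capacity - h * (n - 1)                 # top off with L
        heads_left -= h
        lows_left -= t
        bins.append((h, t))
    assert heads_left == 0 and lows_left == 0      # everything gets used
    return f, bins

For n = 4 this gives f = 4 and bins (5, 1), (5, 1), (5, 1), (1, 13), each of total weight 16.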
Remarks
A few more interesting observations. Let F(n) be the minimum number of flips needed using a mutable coin (you can change the bias every flip). If n=ab, then we can "roll" an a-sided and a b-sided die, getting r_a and r_b, and take the result b(r_a - 1) + r_b. Also, if n = a+b, then we flip an a/n-biased coin and then roll either an a- or b-sided die depending on whether the coin flip came up heads or tails. So we get the recurrence relations
\begin{align}
F(ab) &\leq F(a) + F(b) \\
F(a+b) &\leq 1 + \max\{F(a), F(b)\}
\end{align}
The second relation lets us simulate an n = 2^k + 2^{\ell}-sided die (with k > \ell) using 2 coins (a fair coin and one with bias 2^k/n) and k+1 flips.
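These two recurrences, together with the base cases F(1) = 0 and F(2) = 1, already give a decent upper bound by memoized search. Here is a sketch in Python (mine; it bounds the number of flips only and ignores how many distinct biases are needed):

from functools import lru_cache
from math import isqrt

@lru_cache(maxsize=None)
def flips_upper_bound(n):
    # upper bound on F(n) using F(ab) <= F(a) + F(b) and
    # F(a+b) <= 1 + max(F(a), F(b)), with F(1) = 0 and F(2) = 1
    if n <= 2:
        return n - 1
    best = float("inf")
    for a in range(2, isqrt(n) + 1):                   # product rule
        if n % a == 0:
            best = min(best, flips_upper_bound(a) + flips_upper_bound(n // a))
    for a in range(1, n // 2 + 1):                     # sum rule
        best = min(best, 1 + max(flips_upper_bound(a), flips_upper_bound(n - a)))
    return best

For instance, flips_upper_bound(5) returns 3 = k+1 for n = 2^2 + 2^0, and flips_upper_bound(12) returns 4 via F(12) \leq F(4) + F(3).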