6.1


(a) Expected number of rounds
Let R be the number of rounds. For a single coin the probability it is still tails after t rounds is (½)t. Hence [math]\Pr(R>t)=1-\bigl(1-(\tfrac12)^t\bigr)^{n}.[/math] By the tail–sum formula [math]\displaystyle\; \mathbb{E}[R]=\sum_{t\ge 0}\Pr(R>t)=\sum_{t\ge 0}\Bigl[1-\bigl(1-2^{-t}\bigr)^{n}\Bigr].[/math] This exact series shows that [math]\mathbb{E}[R]=\Theta(\log n).[/math] Using standard extreme–value asymptotics for geometric variables one obtains the tighter estimate [math]\displaystyle\; \mathbb{E}[R]=\log_{2}n+\frac{\gamma}{\ln 2}+\frac12+o(1)\;\approx\;\log_{2}n+1.33.[/math] (Here γ≈0.5772 is Euler’s constant.) (b) Expected number of individual coin tosses
(b) Expected number of individual coin tosses
Each coin is tossed until its first head appears; for a fair coin this count is Geometric(½) with mean 2. By linearity of expectation (independence is not even required), the total number of tosses T satisfies [math]\mathbb{E}[T]=n\cdot\mathbb{E}[\text{Geom}(\tfrac12)]=2n.[/math]
Summary
(a) [math]\displaystyle\mathbb{E}[R]=\sum_{t\ge 0}\Bigl[1-\bigl(1-2^{-t}\bigr)^{n}\Bigr]=\log_{2}n+O(1)\approx \log_{2}n+1.33.[/math]
(b) [math]\displaystyle\mathbb{E}[\text{# tosses}]=2n.[/math]
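Both answers can also be sanity-checked by direct simulation. Below is a minimal Monte Carlo sketch in Python (the helper names and trial counts are arbitrary choices, not part of the original solution): it repeatedly plays the game, removing coins that come up heads each round, and averages the number of rounds and the total number of individual tosses. With a few thousand trials the simulated averages typically land close to the predictions above.

<syntaxhighlight lang="python">
import math
import random

def play_game(n: int) -> tuple[int, int]:
    """Toss n coins per round, removing those that show heads, until none remain.
    Returns (rounds played, total individual tosses)."""
    remaining, rounds, tosses = n, 0, 0
    while remaining > 0:
        rounds += 1
        tosses += remaining
        # each remaining coin comes up tails (and survives) with probability 1/2
        remaining = sum(1 for _ in range(remaining) if random.random() < 0.5)
    return rounds, tosses

def monte_carlo(n: int, trials: int = 5_000) -> tuple[float, float]:
    """Average rounds and tosses over many independent games."""
    results = [play_game(n) for _ in range(trials)]
    return (sum(r for r, _ in results) / trials,
            sum(t for _, t in results) / trials)

for n in (16, 256, 1024):
    avg_rounds, avg_tosses = monte_carlo(n)
    print(f"n={n}: rounds ≈ {avg_rounds:.2f} (theory ≈ {math.log2(n) + 1.33:.2f}), "
          f"tosses ≈ {avg_tosses:.1f} (theory = {2 * n})")
</syntaxhighlight>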


Back to Chapter 6