Starters
These questions should help you to gain confidence with the basics.
S1. Each morning a student rolls a die and starts studying if she gets a 6. Otherwise, she stays in bed. However, during the four months of exams, she tosses a coin instead of rolling a die and studies if she gets a Head. On a (randomly chosen) morning, the student is studying. What is the probability that it is exam period? (Assume for simplicity that each month has the same number of days!)
Let \(W=\{\text{student is working}\}\) and \(E=\{\text{exam period}\}\). Then Bayes’ theorem tells us that
\[
\begin{split}
\mathbb{P}\left(E\,|\,W\right)&=\frac{\mathbb{P}\left(W\,|\,E\right)\mathbb{P}\left(E\right)}{\mathbb{P}\left(W\,|\,E\right)\mathbb{P}\left(E\right)+\mathbb{P}\left(W\,|\,E^c\right)\mathbb{P}\left(E^c\right)}\\
&=\frac{\dfrac{1}{2}\cdot\dfrac{1}{3}}{\dfrac{1}{2}\cdot\dfrac{1}{3}+\dfrac{1}{6}\cdot\dfrac{2}{3}}
=\frac{3}{5}.
\end{split}
\]
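As a quick sanity check (not part of the marked solution), the same calculation can be reproduced with exact fractions in Python; the variable names are just for illustration:

```python
from fractions import Fraction

# Values taken from the problem: 4 exam months out of 12,
# P(working | exam) = 1/2 (coin), P(working | no exam) = 1/6 (die).
p_E = Fraction(4, 12)
p_W_E = Fraction(1, 2)
p_W_Ec = Fraction(1, 6)

# Law of total probability, then Bayes' theorem.
p_W = p_W_E * p_E + p_W_Ec * (1 - p_E)
p_E_W = p_W_E * p_E / p_W
print(p_E_W)  # 3/5
```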
S2. Consider a discrete random variable \(V\) taking values in \(\{1,2,3,4\}\) with mass function \(p_V\): \[
p_V(x) = \begin{cases}
xc^x & \quad\text{if $x=1,2$} \\
c(5-x) & \quad\text{if $x=3,4$}.
\end{cases}
\] What value of \(c\) makes \(p_V\) a mass function?
To be a mass function, we need \(p_V(x)\geq 0\) for all \(x\) and \(\sum_x p_V(x) = 1\). For the second of these constraints we need \(c + 2c^2 + 2c + c = 1\), and so \(c = -1 \pm \sqrt{6}/2\). But if we take \(c=-1-\sqrt{6}/2\) then we get (for example) \(p_V(3)<0\), which isn’t allowed. Therefore, we have to take \(c=-1+\sqrt{6}/2>0\).
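A short numerical check (not part of the solution) that the admissible root really does give a mass function:

```python
from math import sqrt

# The admissible root found above.
c = -1 + sqrt(6) / 2

def p_V(x):
    # Mass function from the problem statement.
    return x * c**x if x in (1, 2) else c * (5 - x)

# Non-negativity and normalisation.
assert all(p_V(x) > 0 for x in (1, 2, 3, 4))
assert abs(sum(p_V(x) for x in (1, 2, 3, 4)) - 1) < 1e-12
```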
S3. Hand-in
Consider a discrete random variable \(Y\) taking values in \(\{1,2,3,4,5\}\) with mass function \(p_Y\) given by \[
p_Y(y) = \begin{cases}
c y^2 & \quad\text{if $y=1,2$} \\
(6-y)/8 & \quad\text{if $y=3,4,5$}.
\end{cases}
\] What value of \(c\) makes \(p_Y\) a mass function?
To be a mass function, we need \(p_Y(y)\geq 0\) for all \(y\) and \(\sum_y p_Y(y) = 1\). [2 marks] The first is satisfied if \(c\ge 0\). For the second constraint we need \(c(1^2+2^2) + (6-3)/8+(6-4)/8+(6-5)/8= 1\), and so \(c = 1/20\). [3 marks]
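If you want to double-check the arithmetic, a one-line Python verification (not part of the marked solution):

```python
# Check that c = 1/20 normalises p_Y: c(1^2 + 2^2) + (3 + 2 + 1)/8 = 1.
c = 1 / 20
total = c * (1**2 + 2**2) + (3 + 2 + 1) / 8
assert abs(total - 1) < 1e-12
```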
S4. What is the probability that exactly 3 heads are obtained in 5 tosses of a fair coin?
Let \(X\) denote the number of heads obtained. Then \(X\sim\mbox{\textup{Bin}}(5,1/2)\), and \[
\mathbb{P}\left(X=3\right) = {5\choose 3}\left(\frac{1}{2}\right)^5 = 0.3125 \,.
\]
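A quick check of the binomial probability (not part of the solution):

```python
from math import comb

# P(X = 3) for X ~ Bin(5, 1/2): choose the 3 tosses that land Heads.
prob = comb(5, 3) * (1 / 2) ** 5
assert abs(prob - 0.3125) < 1e-12
```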
S5. A fair die is thrown until the sum of the results of the throws exceeds 6. The random variable \(X\) is the number of throws needed for this. Let \(F_X\) be the distribution function of \(X\). Determine \(F_X(1)\), \(F_X(2)\) and \(F_X(7)\).
The sum cannot exceed \(6\) on the first throw, because a single throw of the die shows at most \(6\). So \(F_X(1)=\mathbb{P}\left(X\leq 1\right)=0.\)
On the other hand, \(21\) of the \(36\) equally likely outcomes of the first two throws give a sum greater than \(6\), so \[
F_X(2)=\mathbb{P}\left(X\leq 2\right)=\mathbb{P}\left(X= 2\right)=\frac{21}{36}=\frac{7}{12}\,.
\] Finally, after seven throws the sum is at least \(7>6\), because each throw shows at least \(1\). So \(X\leq 7\) always, and \(F_X(7)=\mathbb{P}\left(X\leq 7\right)=1-\mathbb{P}\left(X>7\right)=1-0=1\).
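The value of \(F_X(2)\) can also be confirmed by brute-force enumeration (not part of the solution):

```python
from itertools import product

# Count the pairs of throws (each 1..6) whose sum exceeds 6.
over = sum(1 for a, b in product(range(1, 7), repeat=2) if a + b > 6)
assert over == 21
assert abs(over / 36 - 7 / 12) < 1e-12
```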
S6. The probability density function \(f_X\) of a continuous random variable \(X\) is given by \[
f_X(x)=\begin{cases}
x(3-x)+c &\text{ if $1\leq x\leq2$}\\
0&\text{ otherwise\,. }\end{cases}
\] Compute \(c\). Work out the distribution function of \(X\).
The constant \(c\) is determined by the requirement that the integral of the density function over the entire real line is equal to one: \[
1=\int_{-\infty}^\infty f_X(x)dx=\int_1^2 (x(3-x)+c) dx = 13/6+c\,.
\] This implies that \(c=-7/6\). Note that for this value of \(c\) the function \(f_X\) is non-negative everywhere, which is also required in order for it to be a density.
The distribution function is obtained by integrating the density, not over the whole real line but only from \(-\infty\) up to \(x\): \[
F_X(x)=\begin{cases}
0 & \text{ if $x<1$}\\
\int_1^x f_X(s)ds=\int_1^x (s(3-s)-7/6)ds= \frac{9x^2-2x^3-7x}{6} &\text{ if $1\leq x\leq2$}\\
1 & \text{if $2< x$.}
\end{cases}
\]
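A numerical double-check (not part of the solution) that the density integrates to one and that the formula for \(F_X\) behaves correctly at the endpoints:

```python
# Midpoint-rule approximation of the integral of f_X over [1, 2] with c = -7/6.
c = -7 / 6
n = 100_000
h = 1 / n
total = sum(((1 + (k + 0.5) * h) * (3 - (1 + (k + 0.5) * h)) + c) * h
            for k in range(n))
assert abs(total - 1) < 1e-8

# The distribution function derived above should satisfy F(1) = 0, F(2) = 1.
F = lambda x: (9 * x**2 - 2 * x**3 - 7 * x) / 6
assert abs(F(1)) < 1e-12 and abs(F(2) - 1) < 1e-12
```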
S7. Hand-in
The probability density function \(f_X\) of a continuous random variable \(X\) is given by \[
f_X(x)=\begin{cases}
c x^2 &\text{ if $0\leq x\leq 1$}\\
1/2 & \text{if $2\leq x\leq 3$}\\
0&\text{ otherwise\,. }\end{cases}
\] Compute \(c\), and hence calculate \(\mathbb{P}\left(X> 1/2\right)\).
The constant \(c\) is determined by the requirement that the integral of the density function over the entire real line is equal to one: \[
1=\int_{-\infty}^\infty f_X(x)dx=\int_0^1 cx^2 dx + \int_2^3 \frac{1}{2} dx = \frac{1}{2} + \frac{c}{3} \,.
\] This implies that \(c=3/2\). Note that for this value of \(c\) the function \(f_X\) is non-negative everywhere, which is also required in order for it to be a density. [3 marks]
We can then calculate \[
\mathbb{P}\left(X > 1/2\right) = 1 - \mathbb{P}\left(X\le 1/2\right) = 1- \int_0^{1/2} \frac{3x^2}{2} dx = \frac{15}{16}\,.
\] [2 marks]
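The two pieces of the probability can be checked exactly (not part of the marked solution):

```python
from fractions import Fraction

# P(X > 1/2) = ∫_{1/2}^1 (3/2)x^2 dx + ∫_2^3 (1/2) dx.
piece1 = Fraction(1, 2) * (1 - Fraction(1, 8))   # [x^3/2] from 1/2 to 1
piece2 = Fraction(1, 2)                          # (1/2) * (3 - 2)
assert piece1 + piece2 == Fraction(15, 16)
```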
S8. Let \(Y\sim\mbox{\textup{Poisson}}(2)\). Write down the mass function for \(Y\). What is \(\mathbb{P}\left(Y\geq 2\right)\)?
The probability mass function is given by \[
p_Y(k)=\frac{2^k}{k!}e^{-2}\,, \qquad k=0,1,2,\dots
\] We can then calculate \[
\mathbb{P}\left(Y\geq 2\right)=1-\mathbb{P}\left(Y<2\right)=1-(\mathbb{P}\left(Y=0\right)+\mathbb{P}\left(Y=1\right))=1-\left(1+2\right)e^{-2}.
\]
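A quick numerical confirmation of the tail probability (not part of the solution):

```python
from math import exp, factorial

# Poisson(2) mass function, and P(Y >= 2) = 1 - p(0) - p(1).
p = lambda k: 2**k / factorial(k) * exp(-2)
assert abs((1 - p(0) - p(1)) - (1 - 3 * exp(-2))) < 1e-12
```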
S9. A continuous random variable \(X\) has a distribution function \(F\) which satisfies \[
F(x) = \begin{cases}0 &\text{ for } x < 0\\\frac{4x^2+3x}{7} &\text{ for } 0 \leq x \leq 1\\ 1 &\text{ for } x > 1\,. \end{cases}
\] Determine \(\mathbb{E}\left[X\right]\).
The probability density function of \(X\) is given by \[
f(x)=\frac{d}{dx}F(x)=\frac{8x+3}{7}
\] for \(0\leq x \leq 1\), and \(f(x)=0\) elsewhere. So \[
\mathbb{E}\left[X\right]=\int_{-\infty}^\infty xf(x)dx=\int_0^1\left(\frac{8x^2+3x}{7}\right)dx=\frac{25}{42}.
\]
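The final integral can be evaluated exactly with fractions (not part of the solution):

```python
from fractions import Fraction

# E[X] = ∫_0^1 (8x^2 + 3x)/7 dx = (8/3 + 3/2)/7.
EX = (Fraction(8, 3) + Fraction(3, 2)) / 7
assert EX == Fraction(25, 42)
```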
Mains
These are important, and cover some of the most substantial parts of the course.
M1. Hand-in
Consider the probability space corresponding to throwing a fair die. Give an example of a random variable \(X\) whose image is \(\{1, 2, 3\}\) and for which \(F_X(2.7)=5/6\).
There are lots of possibilities. We need exactly one of the six equally likely outcomes (\(\Omega=\{1,2,3,4,5,6\}\)) to be mapped to \(3\), so that \(F_X(2.7)=\mathbb{P}\left(X\le 2.7\right)=1-\mathbb{P}\left(X=3\right)=1 - 1/6=5/6\), and each of the values \(1\) and \(2\) to be taken by at least one outcome, so that the image is \(\{1,2,3\}\). One possibility is \(X(1)=1, X(2)=1, X(3)=1, X(4)=2, X(5)=2,X(6)=3\). [3 marks for a correct example; 2 for some sort of reasoning]
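One such example, checked mechanically (not part of the marked solution):

```python
from fractions import Fraction

# The example map from outcomes 1..6 to values in {1, 2, 3}.
X = {1: 1, 2: 1, 3: 1, 4: 2, 5: 2, 6: 3}

# Image is {1, 2, 3}, and F_X(2.7) = (number of outcomes with X <= 2.7)/6.
assert set(X.values()) == {1, 2, 3}
F_27 = Fraction(sum(1 for w in X if X[w] <= 2.7), 6)
assert F_27 == Fraction(5, 6)
```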
M2. Hand-in
A box contains four coins: one has a Head on both sides; one has a Tail on both sides; the other two coins are normal and fair. A coin is chosen at random from the box and tossed three times. What is the probability that the two-Tailed coin was chosen, given that all three tosses are Tails?
Let \(A_1=\{\text{two-Headed coin chosen}\}\), \(A_2=\{\text{two-Tailed coin chosen}\}\), \(A_3 = \{\text{fair coin chosen}\}\), and \(B=\{\text{obtain three Tails}\}\) [1 mark for setting up some notation].
Then \[
\begin{split}
\mathbb{P}\left(B\right)&=\sum_{i=1}^3\mathbb{P}\left(B\,|\,A_i\right)\mathbb{P}\left(A_i\right) = 0 \cdot \frac{1}{4}+ 1 \cdot\frac{1}{4} + \left(\frac{1}{2}\right)^3\cdot \frac{1}{2}=\frac{5}{16}\,.
\end{split}
\] [2 marks] Finally, we can use Bayes’ theorem: \[
\begin{split}
\mathbb{P}\left(A_2\,|\,B\right)&=\frac{\mathbb{P}\left(B\,|\,A_2\right)\mathbb{P}\left(A_2\right)}{\mathbb{P}\left(B\right)}=\frac{1\cdot 1/4}{5/16}=\frac{4}{5}.
\end{split}
\] [2 marks]
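For completeness, the whole calculation with exact fractions (not part of the marked solution; the dictionary keys are just labels for the three cases):

```python
from fractions import Fraction

# Priors for the coin drawn, and likelihoods of three Tails given the coin.
priors = {"HH": Fraction(1, 4), "TT": Fraction(1, 4), "fair": Fraction(1, 2)}
lik = {"HH": Fraction(0), "TT": Fraction(1), "fair": Fraction(1, 8)}

# Law of total probability, then Bayes' theorem.
p_B = sum(lik[coin] * priors[coin] for coin in priors)
assert p_B == Fraction(5, 16)
assert lik["TT"] * priors["TT"] / p_B == Fraction(4, 5)
```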
M3. Let \(\lambda>0\) be some positive number. Find \(c\) such that the following function is a density function: \[
f(x) =ce^{-\lambda \left\vert x-1\right\vert }\,, \quad x\in \mathbb{R}.
\]
For \(f\) to be a density, it must be non-negative on \(\mathbb{R}\) and satisfy \(\int_{-\infty}^\infty f(x) dx = 1\). Since the exponential function is non-negative, \(f\) is non-negative everywhere as long as \(c\geq 0\).
Note that \[
\begin{split}
\int_{-\infty}^{\infty}e^{-\lambda |x-1|}dx &= \int_{-\infty}^1e^{-\lambda (1-x)}dx+\int_1^{\infty}e^{-\lambda (x-1)}dx\\
&=\frac{1}{\lambda}\left[e^{-\lambda(1- x)}\right]_{-\infty}^1-\frac{1}{\lambda}\left[e^{-\lambda (x-1)}\right]_1^{\infty}=\frac{2}{\lambda}\,.
\end{split}
\] So \(f(\cdot)\) is a density if and only if \(c=\lambda/2\).
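A numerical spot-check (not part of the solution) with an arbitrary choice of \(\lambda\); the integration window is wide enough that the truncated tails are negligible:

```python
from math import exp

# Midpoint-rule approximation of ∫ (λ/2) e^{-λ|x-1|} dx for λ = 0.7.
lam, lo, hi, n = 0.7, -40.0, 42.0, 200_000
h = (hi - lo) / n
total = sum(lam / 2 * exp(-lam * abs(lo + (k + 0.5) * h - 1)) * h
            for k in range(n))
assert abs(total - 1) < 1e-6
```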
M4. Recall the Chevalier de Méré’s problem: two dice are thrown 24 times – you win £1 if at least one double 6 is thrown, otherwise you lose £1. Show that you expect to lose money on average if you play this game.
The probability of throwing a double 6 with one throw of two dice is \(1/36\). The probability of there being no occurrences of a double 6 in 24 throws of two dice is therefore \((35/36)^{24} \approx 0.509\). Letting \(X\) denote your winnings from the game, this is a random variable satisfying \[
\mathbb{P}\left(X=1\right) \approx 0.491\,, \quad \text{and} \quad \mathbb{P}\left(X=-1\right) \approx 0.509 \,.
\]
Your expected winnings are hence given by \[
\mathbb{E}\left[X\right] \approx 0.491 - 0.509\approx -0.018 \ <0 \,.
\] Therefore you expect to lose on average in this game.
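The rounded figures above can be confirmed directly (not part of the solution):

```python
# P(no double six in 24 throws of two dice), and the expected winnings.
p_lose = (35 / 36) ** 24
EX = (1 - p_lose) - p_lose
assert abs(p_lose - 0.509) < 0.001
assert EX < 0
```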
M5. Suppose that \(A\) and \(B\) are independent events. Show that \(A^c\) and \(B^c\) are independent.
We need to show that \(\mathbb{P}\left(A^c\cap B^c\right) = \mathbb{P}\left(A^c\right)\mathbb{P}\left(B^c\right)\). \[
\begin{split}
\mathbb{P}\left(A^c\cap B^c\right)&=\mathbb{P}\left((A\cup B)^c\right) \qquad\text{(De Morgan)}\\
&=1-\mathbb{P}\left(A\cup B\right) \qquad\text{(P4)}\\
&=1-(\mathbb{P}\left(A\right)+\mathbb{P}\left(B\right)-\mathbb{P}\left(A\cap B\right)) \qquad\text{(P6)}\\
&=(1-\mathbb{P}\left(A\right))-\mathbb{P}\left(B\right)+\mathbb{P}\left(A\right)\mathbb{P}\left(B\right) \qquad\text{(independence of $A$ and $B$)}\\
&=\mathbb{P}\left(A^c\right)-\mathbb{P}\left(B\right)[1-\mathbb{P}\left(A\right)] \qquad\text{(P4)}\\
&=\mathbb{P}\left(A^c\right)-\mathbb{P}\left(B\right)\mathbb{P}\left(A^c\right) \qquad\text{(P4)}\\
&=\mathbb{P}\left(A^c\right)[1-\mathbb{P}\left(B\right)]\\
&=\mathbb{P}\left(A^c\right)\mathbb{P}\left(B^c\right). \qquad\text{(P4)}\\
\end{split}
\]
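The identity can of course also be illustrated with concrete (arbitrarily chosen) numbers, though this is no substitute for the proof:

```python
# If A, B are independent with P(A) = 0.3, P(B) = 0.6, then
# P(A^c ∩ B^c) = 1 - P(A ∪ B) should equal P(A^c) P(B^c).
pA, pB = 0.3, 0.6
p_union = pA + pB - pA * pB            # inclusion-exclusion with independence
assert abs((1 - p_union) - (1 - pA) * (1 - pB)) < 1e-12
```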
M6. Consider a roulette wheel, with numbers 00, 0, 1, 2, 3, …, 36 (38 numbers in total). A ball is thrown onto the wheel as it is spinning, and comes to rest by one of the numbers. You always bet that the ball will stop on one of the numbers 1, 2, …, 12. Let \(N\) be the random variable giving the number of bets that you lose before your first win. Calculate \(p_N(0)\), \(p_N(5)\) and \(F_N(5)\).
The probability of winning on a particular bet is \(12/38\) because 12 of the 38 outcomes lead to a win and all outcomes are equally likely. So the probability of winning on the first bet, \(p_N(0)\), equals \(12/38\).
The probability of losing on a particular bet is \(26/38\), and the individual spins of the roulette wheel are independent. So the probability of losing the first 5 bets and then winning the next is \(p_N(5)=\mathbb{P}\left(N=5\right)=(26/38)^5\cdot 12/38\approx 0.047\). Also \(F_N(5)=\mathbb{P}\left(N\leq 5\right) = 1-\mathbb{P}\left(N\geq 6\right)\) is the probability of not losing all of the first 6 bets, so \(F_N(5)=1-(26/38)^6\approx 0.90\). (Notice that \((N+1)\sim\mbox{\textup{Geom}}(12/38)\).)
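A quick check of the rounded values (not part of the solution):

```python
# p_N(5) = (26/38)^5 * (12/38) and F_N(5) = 1 - (26/38)^6.
p_win, p_lose = 12 / 38, 26 / 38
p_N5 = p_lose**5 * p_win
F_N5 = 1 - p_lose**6
assert abs(p_N5 - 0.047) < 0.001
assert abs(F_N5 - 0.90) < 0.01
```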
M7. Consider a discrete random variable \(Y\) taking values in the set \(\{0,1,\dots,8\}\) and with probability mass function \(p_Y\) given by \[
p_Y(k) = \begin{cases}
c\,a & \quad\text{if $k=0,1,2,3,4,5$} \\
c\,a^2 & \quad\text{if $k=6,7$} \\
c(1-a)^2 & \quad\text{if $k=8$,}
\end{cases}
\] where \(a\) is some fixed number between 0 and 1 and \(c\) is a constant to be determined. What value of \(c\) makes \(p_Y\) a mass function? (The answer is a function of \(a\).)
To be a mass function, we need \(\sum_k p_Y(k) = 1\). That is, \[
c\left[6a + 2a ^2 + (1-a )^2\right] = 1 \,,
\] and so the required value of \(c\) is \[
c= \frac{1}{(3a +1)(a +1)} \,.
\]
M8. Let \(X\sim\mbox{\textup{Bern}}(p)\). Let \(Y=1-X\) and \(V=X^2\). Show that
- \(Y\sim\mbox{\textup{Bern}}(1-p)\);
- \(V\sim\mbox{\textup{Bern}}(p)\).
The mass function of \(X\) is \[ p_X(x) = \begin{cases}
1-p & \quad\text{if $x=0$} \\
p & \quad\text{if $x=1$} \\
0 & \quad\text{if $x\neq 0,1$.}
\end{cases}
\]
Thus we see that \[\begin{split}
\mathbb{P}\left(Y=0\right) &= \mathbb{P}\left(1-X=0\right) = \mathbb{P}\left(X=1\right) = p \\
\text{and}\qquad \mathbb{P}\left(Y=1\right) &= \mathbb{P}\left(1-X=1\right) = \mathbb{P}\left(X=0\right) = 1-p \,.
\end{split}
\] Furthermore, \(\mathbb{P}\left(Y=x\right) = 0\) if \(x\neq 0,1\). Thus \(Y\sim\mbox{\textup{Bern}}(1-p)\), as required.
Similarly, \[\begin{split}
\mathbb{P}\left(V=0\right) &= \mathbb{P}\left(X^2=0\right) = \mathbb{P}\left(X=0\right) = 1-p \\
\text{and}\qquad \mathbb{P}\left(V=1\right) &= \mathbb{P}\left(X^2=1\right) = \mathbb{P}\left(X=1\right) = p \,,
\end{split}
\] and so \(V\sim\mbox{\textup{Bern}}(p)\).
M9. \(N\geq 3\) people go for coffee. Each person flips a fair coin: if all but one of the coins shows the same face (Head or Tail), then the odd person out pays for all the drinks; if not then the coins are tossed again until this event occurs. How many times on average must each person toss their coin before somebody is selected in this way?
We first calculate the probability that there is an odd person out on the first (or indeed, any) flip of the coins. This happens if
- \(N-1\) people get Tails and one person gets a Head; or
- \(N-1\) people get Heads and one person gets a Tail.
Clearly these two events have equal probability (since the coins are all fair). Thinking of the first case, there are \(N\) ways in which we can choose the person who gets a Head, and the probability that this person gets a Head and the rest get Tails equals \((1/2)^N\). So the probability of the first event is \(N/2^N\), and since the second has the same probability we obtain \[
p = \mathbb{P}\left(\text{odd person out on any particular flip}\right) = \frac{N}{2^{N-1}} \,.
\] Now, the number of flips needed for there to be an odd person out, \(X\), follows a \(\mbox{\textup{Geom}}(p)\) distribution, and so we obtain \[
\text{expected number of tosses} = \mathbb{E}\left[X\right] = \frac{2^{N-1}}{N} \,.
\] This gets pretty big as \(N\) gets large!
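The per-round probability can be confirmed by enumerating every pattern of coin faces, here for the (arbitrary) choice \(N=5\):

```python
from itertools import product

# An odd one out occurs iff exactly 1 or exactly N-1 of the coins show Heads.
N = 5
odd = sum(1 for flips in product((0, 1), repeat=N)
          if sum(flips) in (1, N - 1))
p = odd / 2**N
assert p == N / 2 ** (N - 1)
assert 1 / p == 2 ** (N - 1) / N    # expected number of tosses each
```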
M10. Prove the memoryless property of the exponential distribution which states that if \(X\sim\mbox{\textup{Exp}}(\lambda)\) then for any \(t,s\geq 0\), \[
\mathbb{P}\left(X>t+s\,|\,X>s\right)=\mathbb{P}\left(X>t\right).
\]
\[
\begin{split}
\mathbb{P}\left(X>t+s\,|\,X>s\right)&=\frac{\mathbb{P}\left(\{X>t+s\}\cap\{X>s\}\right)}{\mathbb{P}\left(X>s\right)}\\
&=\frac{\mathbb{P}\left(X>t+s\right)}{\mathbb{P}\left(X>s\right)}\\
&=\frac{1-\mathbb{P}\left(X\leq t+s\right)}{1-\mathbb{P}\left(X\leq s\right)}\\
&=\frac{1-F_X(t+s)}{1-F_X(s)}=\frac{e^{-\lambda(t+s)}}{e^{-\lambda s}}=e^{-\lambda t}\\
&=1-F_X(t)=\mathbb{P}\left(X>t\right).
\end{split}
\]
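A numerical illustration of the memoryless property, with arbitrarily chosen values of \(\lambda\), \(s\) and \(t\) (the proof above is what matters):

```python
from math import exp

# Survival function of Exp(λ): P(X > x) = e^{-λx}.
lam, s, t = 2.0, 0.5, 1.3
surv = lambda x: exp(-lam * x)

# P(X > t+s | X > s) should equal P(X > t).
assert abs(surv(t + s) / surv(s) - surv(t)) < 1e-12
```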
Desserts
Still hungry for more? Try these if you want to push yourself further. (These are mostly harder than I’d expect you to answer in an exam, or involve non-examinable material.)
D1. \(N\ge 3\) people go for coffee. Each person flips a coin, of which \(N-1\) are fair, and one has probability \(q\) of coming up Heads (for some \(q\in[0,1]\)): if all but one of the coins shows the same face (Head or Tail), then the odd person out pays for all the drinks; if not then the coins are tossed again until this event occurs. How does the expected number of coin tosses each person makes vary with \(q\)?
It doesn’t vary with \(q\) – the expected number of tosses is always \(2^{N-1}/N\)! Why? \[
\begin{split}
\mathbb{P}\left(\text{odd person out on any given flip}\right) &= q\left[{{N-1}\choose{N-2}}\left(\frac{1}{2}\right)^{N-2}\left(\frac{1}{2}\right) +\left(\frac{1}{2}\right)^{N-1}\right] \\
&\qquad+ (1-q)\left[{{N-1}\choose{N-2}}\left(\frac{1}{2}\right)^{N-2}\left(\frac{1}{2}\right) + \left(\frac{1}{2}\right)^{N-1} \right]\,,
\end{split}
\] where the first quantity is the probability that the person with the strange coin gets a Head, and either \(N-2\) people also get Heads (with one Tail), or everybody else gets a Tail, etc. So the probability of having one odd person out in any given round is \(p=N/2^{N-1}\), irrespective of the value of \(q\). The number of rounds needed until there’s one odd person out has a \(\mbox{\textup{Geom}}(p)\) distribution, with mean \(2^{N-1}/N\).
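The cancellation of \(q\) can be seen numerically too (not part of the solution; \(N=6\) is an arbitrary choice):

```python
from math import comb

# The common bracket: P(odd one out | the strange coin's face), for N = 6.
N = 6
bracket = comb(N - 1, N - 2) * (1 / 2) ** (N - 1) + (1 / 2) ** (N - 1)

# q * bracket + (1-q) * bracket = bracket = N / 2^(N-1), whatever q is.
for q in (0.0, 0.3, 1.0):
    p = q * bracket + (1 - q) * bracket
    assert abs(p - N / 2 ** (N - 1)) < 1e-12
```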
D2. Alice and Bob decide to duel, using just one six-shot revolver, and one bullet, between them. They decide to duel in the following way: with the bullet inserted into the revolver, Alice will spin the cylinder and shoot at Bob (killing him if the gun fires); if Alice misses, Bob will spin the cylinder and shoot at Alice. Assuming there is a \(1/6\) probability that the revolver will fire each time (since the revolver has 6 chambers), what is
- the distribution of the number of turns \(T\) until the gun fires?
- the probability that Alice wins the duel?
The number of turns until the gun fires follows a \(\mbox{\textup{Geom}}(1/6)\) distribution: \(T\sim\mbox{\textup{Geom}}(1/6)\).
Alice wins if:
- the gun fires first time; or
- the gun doesn’t fire on the first two goes, but fires on the third; or
- the gun doesn’t fire on the first four goes, but fires on the fifth; or…
That is, Alice wins if the gun fires at an odd time: \[
\mathbb{P}\left(\text{Alice wins}\right) = \mathbb{P}\left(T \text{ is odd}\right) \,.
\]
Since \(\mathbb{P}\left(T=n\right) = (5/6)^{n-1}(1/6)\), we require \[
\begin{split}
\mathbb{P}\left(T \text{ is odd}\right) &= \frac{1}{6}\left[ 1+ \left(\frac{5}{6}\right)^2 + \left(\frac{5}{6}\right)^4 + \dots\right] \\
&= \frac{1}{6} \left[\frac{1}{1-\left(\frac{5}{6}\right)^2}\right] \\
&= \frac{6}{11} \,,
\end{split}
\] where the middle equality follows from the usual formula for a geometric series. So Alice has the greater chance of winning (because she gets to take the first shot!).
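The geometric-series calculation can be verified by summing the series directly until the terms are negligible (not part of the solution):

```python
# P(Alice wins) = Σ over odd n of P(T = n), with T ~ Geom(1/6).
p = 1 / 6
alice = sum((1 - p) ** (n - 1) * p for n in range(1, 2001, 2))
assert abs(alice - 6 / 11) < 1e-12
```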