28 June 2021 · Maths

UniMelb Probability Competition

UniMelb (specifically, the Stochastic Processes Research Group) runs a competition every semester for students taking probability subjects, where "eight nice problems at different levels" are issued.

Question 1)

Your probability lecturer keeps \(n_r\) red socks and \(n_b\) black socks in the bottom drawer of a desk in the General Office. When the lecturer draws two socks out of the drawer at random, the probability that both are red is 0.5. What is the minimum number \(n=n_r+n_b\) for which this is possible?

Clearly the answer is three red socks and one black, for a total of four: drawing two without replacement gives \(\frac{3}{4}\cdot\frac{2}{3}=\frac{1}{2}\). Nothing much more to comment on here, beyond the strange way of storing socks in the maths department.
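As a quick sanity check (my own, not part of the solution), a brute-force search over small sock counts confirms that four is indeed the minimum; the function name and search limit are arbitrary choices of mine:

```python
from fractions import Fraction

def min_total_socks(limit=50):
    """Smallest n = n_r + n_b with P(both drawn socks are red) = 1/2."""
    for n in range(2, limit):
        for n_r in range(1, n + 1):
            # P(first red) * P(second red | first red), computed exactly
            p = Fraction(n_r, n) * Fraction(n_r - 1, n - 1)
            if p == Fraction(1, 2):
                return n, n_r, n - n_r
    return None

print(min_total_socks())  # (4, 3, 1): three red, one black
```

Removing the early return and letting the search continue also turns up the next solution, 15 red and 6 black (\(\frac{15}{21}\cdot\frac{14}{20}=\frac{1}{2}\)).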

Question 2)

There were 100 green and 300 red apples in Box A and 300 green and 100 red apples in Box B, respectively. Somebody moved half of the apples (chosen at random) from Box A to Box B, and then chose at random one apple from (the 600 apples that were then in) Box B. What is the probability that the apple was green?

There's no need to do any complicated conditioning over hundreds of cases. We simply notice that the apple chosen from Box B either originated from Box A (with probability \(\frac{200}{600}=\frac{1}{3}\)) or from Box B (with probability \(\frac{2}{3}\)). If it originated from Box A, it is green with probability \(\frac{100}{400}=\frac{1}{4}\), since the moved apples are a uniformly random subset of Box A. Similarly, if it originated from Box B, it is green with probability \(\frac{300}{400}=\frac{3}{4}\). Hence the overall probability of being green is \[\frac{1}{3}\cdot\frac{1}{4}+\frac{2}{3}\cdot\frac{3}{4}=\frac{7}{12}\]
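The answer is also easy to confirm by simulation; this Monte Carlo sketch (mine, not part of the solution) moves a random half of Box A into Box B and then samples from Box B:

```python
import random

def green_probability(trials=100_000, seed=1):
    """Estimate P(apple drawn from Box B is green) after the transfer."""
    rng = random.Random(seed)
    green = 0
    for _ in range(trials):
        box_a = ["g"] * 100 + ["r"] * 300
        rng.shuffle(box_a)
        # move a uniformly random half of Box A (200 apples) into Box B
        box_b = ["g"] * 300 + ["r"] * 100 + box_a[:200]
        if rng.choice(box_b) == "g":
            green += 1
    return green / trials

print(green_probability())  # close to 7/12 ≈ 0.583
```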

Question 3)

Let \(X\) be a random variable which is non-zero with positive probability and has a finite second moment. Prove that \(\mathbf{P}(X = 0) \leq \mathbb{V}(X)/\mathbb{E}(X^2)\).

My favourite of the problems! We define \(q=\mathbf{P}(X\neq 0)\), which is positive by assumption. We also define a new variable \(Y\) distributed as \(X\) conditioned on being non-zero (i.e. \(Y=X \mid X\neq 0\)). Assuming for simplicity that \(X\) has a density, the density functions of \(X\) and \(Y\) are related by
\[f_Y(x)=\begin{cases}\frac{f_X(x)}{q},& x\neq 0 \\ 0, & x=0 \end{cases}\]Note that this is only possible because \(q>0\). The desired result follows from the fact that all variances are non-negative (and also that the second moment of \(Y\) is finite), as follows:
\[\begin{align*}0&\leq \mathbb{V}(Y)\\&=\mathbb{E}(Y^2)-\mathbb{E}(Y)^2\\&=\int_{-\infty}^{\infty}x^2f_Y(x)dx-\left(\int_{-\infty}^{\infty}xf_Y(x)dx\right)^2\\&=\frac{1}{q}\int_{-\infty}^{\infty}x^2f_X(x)dx-\left(\frac{1}{q}\int_{-\infty}^{\infty}xf_X(x)dx\right)^2\\&=\frac{\mathbb{E}(X^2)}{q}-\frac{\mathbb{E}(X)^2}{q^2}\\\therefore \frac{\mathbb{E}(X)^2}{\mathbb{E}(X^2)} &\leq q\\\therefore 1-q &\leq 1-\frac{\mathbb{E}(X)^2}{\mathbb{E}(X^2)}\\&=\frac{\mathbb{V}(X)}{\mathbb{E}(X^2)}\end{align*}\]recalling of course that \(\mathbf{P}(X=0)=1-q\).
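Although the write-up above assumes a density, the bound itself holds for discrete variables too. Here's a quick exact check on an example pmf (the helper function is my own):

```python
from fractions import Fraction as F

def bound_holds(pmf):
    """Check P(X = 0) <= V(X)/E(X^2) exactly, for a finite pmf {value: prob}."""
    p_zero = pmf.get(0, F(0))
    mean = sum(x * p for x, p in pmf.items())
    second = sum(x * x * p for x, p in pmf.items())
    variance = second - mean * mean
    return p_zero <= variance / second

# X = 0, 1 or 2 with equal probability: P(X=0) = 1/3 <= (2/3)/(5/3) = 2/5
print(bound_holds({0: F(1, 3), 1: F(1, 3), 2: F(1, 3)}))  # True
```

A nice follow-up: for \(\mathbf{P}(X=0)=\mathbf{P}(X=3)=\frac{1}{2}\) the bound holds with equality.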

Question 4)

Let \(X_1\), \(X_2\) and \(X_3\) be random variables such that \(\text{Corr}(X_1, X_2) > 0\) and \(\text{Corr}(X_2, X_3) > 0\). Does it follow that \(\text{Corr}(X_1, X_3) > 0\) as well?

This is false; it's fairly straightforward to see that \(X_1\) and \(X_3\) can be independent (and non-degenerate, say i.i.d. standard normal), yet both are still positively correlated with \(X_2=X_1+X_3\), while \(\text{Corr}(X_1, X_3)=0\).
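A quick simulation of this counterexample (my own sketch, using i.i.d. standard normals for \(X_1\) and \(X_3\)):

```python
import random
import statistics

def corr(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

rng = random.Random(0)
x1 = [rng.gauss(0, 1) for _ in range(50_000)]
x3 = [rng.gauss(0, 1) for _ in range(50_000)]
x2 = [a + b for a, b in zip(x1, x3)]

# Corr(X1, X2) and Corr(X2, X3) come out near 1/sqrt(2); Corr(X1, X3) near 0
print(corr(x1, x2), corr(x2, x3), corr(x1, x3))
```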

Question 5)

Assume that \(X\) and \(Y\) are non-negative independent random variables with a common distribution, and \(\mathbb{E}(X^2)<\infty\). Is it always true that \(\mathbb{V}(\text{min}(X,Y))\leq\mathbb{V}(X)\)?

This is also false. A simple counter-example is if the mass function is
\[f(x)=\begin{cases}1/3,&x=0\\2/3,&x=1\end{cases}\]resulting in \(\mathbb{V}(\text{min}(X,Y))=\frac{20}{81}\) but \(\mathbb{V}(X)=\frac{18}{81}\).
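Both \(X\) and \(\min(X,Y)\) here are Bernoulli-type variables, so each variance is just \(p(1-p)\); the two values can be verified exactly (a small check of my own):

```python
from fractions import Fraction as F

p_one = F(2, 3)                # P(X = 1); P(X = 0) = 1/3
q_one = p_one * p_one          # P(min(X, Y) = 1): both must equal 1

var_x = p_one * (1 - p_one)    # (2/3)(1/3) = 2/9 = 18/81
var_min = q_one * (1 - q_one)  # (4/9)(5/9) = 20/81

print(var_min > var_x)  # True
```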

I haven't really tested this, but intuitively it seems that the stated inequality sometimes fails when the distribution is negatively skewed: taking the minimum "redistributes" the mass more evenly across the support, which can increase the variance.

Question 6)

Suppose that \((X,Y)\) is a random vector whose distribution has no atoms, i.e., \(\mathbf{P}((X,Y) = (x,y)) = 0\) for all \((x,y) \in \mathbb{R}^2\). Will that imply that its distribution function \(F(x, y) := P(X \leq x, Y \leq y)\) will be continuous?

I'm not 100% sure about the answer to this because I haven't really studied measure theory, but an apparent counterexample is if \(\mathbf{P}(X=0)=1\) and the marginal distribution of \(Y\) is standard normal. Then \(F(x,y)=\Phi(y)\) for \(x\geq 0\) and \(F(x,y)=0\) for \(x<0\), so \(F\) is discontinuous across the line \(x=0\). Yet there are no atoms in \(\mathbb{R}^2\), since \(\mathbf{P}((X,Y)=(0,y))=\mathbf{P}(Y=y)=0\) for every \(y\).

Question 7)

Assume that \(P\) is the distribution of a random variable \(X\) such that
\[\begin{align*}P([0, 1]) &= 1\\P([0,\tfrac{1}{3}]) = P([\tfrac{2}{3}, 1]) &= \tfrac{1}{2}\\P([0,\tfrac{1}{9}]) = P([\tfrac{2}{9},\tfrac{1}{3}]) = P([\tfrac{2}{3},\tfrac{7}{9}]) = P([\tfrac{8}{9}, 1]) &= \tfrac{1}{4}\end{align*}\]and so on. Compute \(\mathbb{V}(X)\).

This is a description of the Cantor distribution; several ways exist to show that its variance is \(\frac{1}{8}\). The distribution is singular, so strictly speaking it has no density, but it does exhibit self-similarity: the distribution on \([0,1]\) is reproduced, rescaled, on \([0,\tfrac{1}{3}]\) and on \([\tfrac{2}{3},1]\). In other words, \(X\) has the same law as \(\tfrac{X}{3}\) with probability \(\tfrac{1}{2}\) and as \(\tfrac{X}{3}+\tfrac{2}{3}\) with probability \(\tfrac{1}{2}\). Exploiting this self-referential description allows me to solve for the moments.
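Carrying out that moment computation: since \(X\) has the same distribution as \(\tfrac{X}{3}\) with probability \(\tfrac{1}{2}\) and \(\tfrac{X}{3}+\tfrac{2}{3}\) with probability \(\tfrac{1}{2}\),
\[\begin{align*}\mathbb{E}(X)&=\tfrac{1}{2}\cdot\tfrac{\mathbb{E}(X)}{3}+\tfrac{1}{2}\left(\tfrac{\mathbb{E}(X)}{3}+\tfrac{2}{3}\right)=\tfrac{\mathbb{E}(X)}{3}+\tfrac{1}{3}&&\implies\mathbb{E}(X)=\tfrac{1}{2}\\\mathbb{E}(X^2)&=\tfrac{1}{2}\cdot\tfrac{\mathbb{E}(X^2)}{9}+\tfrac{1}{2}\left(\tfrac{\mathbb{E}(X^2)}{9}+\tfrac{4\,\mathbb{E}(X)}{9}+\tfrac{4}{9}\right)=\tfrac{\mathbb{E}(X^2)}{9}+\tfrac{1}{3}&&\implies\mathbb{E}(X^2)=\tfrac{3}{8}\end{align*}\]giving \(\mathbb{V}(X)=\tfrac{3}{8}-\tfrac{1}{4}=\tfrac{1}{8}\).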

Question 8)

Suppose that \(X\) and \(Y\) are two random variables on a common probability space such that \(\mathbb{E}(X^2 + Y^2) < \infty\), \(\mathbb{E}(X | Y ) = Y\) and \(\mathbb{E}(Y | X) = X\) (a.s.). Prove that \(\mathbf{P}(X = Y ) = 1\).

My approach was to first use the tower property (the law of total expectation), which is that \[\mathbb{E}(\phi(X,Y))=\mathbb{E}(\mathbb{E}(\phi(X,Y)|X))=\mathbb{E}(\mathbb{E}(\phi(X,Y)|Y))\]for some reasonable function \(\phi(X,Y)\). Taking \(\phi(X,Y)=X\) leads to \(\mathbb{E}(X)=\mathbb{E}(Y)\), and taking \(\phi(X,Y)=XY\) leads to \(\mathbb{E}(X^2)=\mathbb{E}(XY)=\mathbb{E}(Y^2)\), since \(\mathbb{E}(XY|X)=X\,\mathbb{E}(Y|X)=X^2\) and symmetrically for \(Y\). This is sufficient to show that \(X,Y\) have identical variances and hence are perfectly correlated.

As perfect correlation implies a linear relationship (a.s.), we exploit the identical mean and variance of \(X,Y\) and conclude that indeed \(X=Y\) almost surely. (When the common variance is zero, both variables are a.s. equal to their common mean, so the conclusion still holds.)
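For completeness, the three second-moment equalities also give a one-line finish that avoids the correlation argument entirely:
\[\mathbb{E}\big((X-Y)^2\big)=\mathbb{E}(X^2)-2\,\mathbb{E}(XY)+\mathbb{E}(Y^2)=0\]and since \((X-Y)^2\geq 0\), this forces \(\mathbf{P}(X=Y)=1\).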