Exercises

ex-ch12-01

Easy

Let $Y \sim \text{Uniform}[0,1]$ and $X \mid Y = y \sim \text{Uniform}[0, y]$. Find $\mathbb{E}[X|Y]$ and verify the tower property by computing $\mathbb{E}[X]$ both directly and via $\mathbb{E}[\mathbb{E}[X|Y]]$.
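A quick Monte Carlo sanity check for your answer (not a substitute for the derivation): sample the pair hierarchically, exactly as the problem statement describes, and compare the empirical mean of $X$ with the value your tower-property calculation gives.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hierarchical sampling: Y ~ Uniform[0,1], then X | Y=y ~ Uniform[0,y],
# realized as X = Y*U with U ~ Uniform[0,1] independent of Y.
y = rng.uniform(0.0, 1.0, size=n)
x = y * rng.uniform(0.0, 1.0, size=n)

# Compare this with E[X] computed directly and via E[E[X|Y]].
print(f"Monte Carlo E[X] ~= {x.mean():.4f}")
```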

ex-ch12-02

Easy

Prove that $\text{Var}(\mathbb{E}[X|Y]) \leq \text{Var}(X)$.

ex-ch12-03

Easy

Let $X$ and $Y$ be independent with $\mathbb{E}[X] = 3$ and $\text{Var}(X) = 4$. What is $\mathbb{E}[X|Y]$? What is the MMSE?

ex-ch12-04

Medium

Let $(X, Y)$ be jointly Gaussian with $\mu_X = 1$, $\mu_Y = -1$, $\sigma_X^2 = 4$, $\sigma_Y^2 = 9$, and $\rho = 0.6$. (a) Find $\hat{X}_{\text{LMMSE}}$ given $Y = 2$. (b) Find the MSE of the LMMSE estimator. (c) Is the LMMSE estimator equal to the MMSE estimator here?
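Once you have closed-form answers for (a) and (b), a simulation check is easy: generate correlated Gaussian pairs with the stated moments, estimate the linear regression coefficients from sample moments, and compare the resulting estimate and empirical MSE with your values. The construction of the correlated pair below is a standard sketch, not part of the exercise.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Jointly Gaussian pair with the stated moments, built from two
# independent standard normals.
mu_x, mu_y, sx, sy, rho = 1.0, -1.0, 2.0, 3.0, 0.6
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
x = mu_x + sx * z1
y = mu_y + sy * (rho * z1 + np.sqrt(1 - rho**2) * z2)

# Empirical LMMSE coefficients from sample moments: a = Cov(X,Y)/Var(Y).
a = np.cov(x, y)[0, 1] / y.var()
b = x.mean() - a * y.mean()
xhat = a * y + b
print(f"estimate at Y=2: {a * 2 + b:.3f}")
print(f"empirical MSE:   {np.mean((x - xhat)**2):.3f}")
```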

ex-ch12-05

Medium

Let $X$ take values in $\{-1, +1\}$ with equal probability, and let $Y = X + Z$ with $Z \sim \mathcal{N}(0, 1)$ independent of $X$. (a) Find $\mathbb{E}[X|Y=y]$. (b) Find the LMMSE estimator $\hat{X}_{\text{LMMSE}}$. (c) Compare the MSE of both estimators numerically for $y = 0.5$.
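For part (c), a Monte Carlo harness like the one below compares the overall MSE of the two estimators; plug in your own derived expressions. The $\tanh(y)$ and $y/2$ forms used here are the standard BPSK-over-AWGN results at this noise level — verify they match what you obtained before trusting the numbers.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# X = +/-1 equiprobably, Y = X + Z with Z ~ N(0,1).
x = rng.choice([-1.0, 1.0], size=n)
y = x + rng.standard_normal(n)

# Plug in your derived estimators here; tanh(y) and y/2 are the standard
# results for this model -- check them against your own derivation.
mmse_est = np.tanh(y)
lmmse_est = y / 2.0

print(f"MSE (conditional mean): {np.mean((x - mmse_est)**2):.4f}")
print(f"MSE (LMMSE):            {np.mean((x - lmmse_est)**2):.4f}")
```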

ex-ch12-06

Medium

Prove the orthogonality principle: $\mathbb{E}[(X - \mathbb{E}[X|Y]) \cdot h(Y)] = 0$ for any measurable $h$ with $\mathbb{E}[h(Y)^2] < \infty$.

ex-ch12-07

Medium

Let $N \sim \text{Poisson}(\lambda)$ and let $X_1, X_2, \ldots$ be i.i.d. with mean $\mu$ and variance $\sigma^2$, independent of $N$. Let $S = \sum_{i=1}^N X_i$ (with $S = 0$ if $N = 0$). Use the law of total variance to find $\text{Var}(S)$.
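A simulation check for your formula: pick concrete parameter values (the ones below are arbitrary assumptions), simulate the random sum, and compare the empirical variance with what the law of total variance predicts. Any summand distribution with the right mean and variance works; Gaussian summands are used purely for convenience.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials = 100_000
lam, mu, sigma = 3.0, 2.0, 1.5   # example parameters (assumed for the check)

# Simulate S = sum_{i=1}^N X_i with N ~ Poisson(lam) and X_i ~ N(mu, sigma^2);
# an empty sum (N = 0) correctly contributes S = 0.
counts = rng.poisson(lam, size=n_trials)
s = np.array([rng.normal(mu, sigma, size=k).sum() for k in counts])

print(f"empirical Var(S): {s.var():.3f}  -- compare with your formula")
```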

ex-ch12-08

Medium

Derive the LMMSE estimator of $X$ given $\mathbf{Y} = (Y_1, Y_2)^\mathsf{T}$, where $X$, $Y_1$, $Y_2$ are zero-mean with $\text{Var}(X) = 1$, $\text{Var}(Y_1) = \text{Var}(Y_2) = 2$, $\text{Cov}(X, Y_1) = 0.8$, $\text{Cov}(X, Y_2) = 0.5$, and $\text{Cov}(Y_1, Y_2) = 0.3$.
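After working the weights out by hand, you can check the arithmetic numerically: in the zero-mean case the LMMSE weights solve the normal equations $\mathbf{C}_Y \mathbf{w} = \mathbf{c}_{XY}$, and the resulting MSE is $\text{Var}(X) - \mathbf{c}_{XY}^\mathsf{T}\mathbf{w}$. A minimal sketch with the stated moments:

```python
import numpy as np

# Normal equations C_Y w = c_XY for the zero-mean LMMSE problem
# (no intercept needed since all means are zero).
C_Y = np.array([[2.0, 0.3],
                [0.3, 2.0]])
c_XY = np.array([0.8, 0.5])

w = np.linalg.solve(C_Y, c_XY)
mse = 1.0 - c_XY @ w        # Var(X) - c_XY^T C_Y^{-1} c_XY
print(f"weights: {w}")
print(f"MSE: {mse:.4f}")
```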

ex-ch12-09

Hard

Show that for any estimator $\hat{X} = g(Y)$, the MSE can be decomposed as

$$\mathbb{E}[(X - g(Y))^2] = \text{MMSE} + \mathbb{E}[(\mathbb{E}[X|Y] - g(Y))^2].$$

Interpret each term.

ex-ch12-10

Hard

Let $\mathbf{Y} = \mathbf{H}\mathbf{x} + \mathbf{w}$, where $\mathbf{x} \sim \mathcal{CN}(\mathbf{0}, \sigma_x^2\mathbf{I}_n)$, $\mathbf{w} \sim \mathcal{CN}(\mathbf{0}, \sigma^2\mathbf{I}_m)$ is independent of $\mathbf{x}$, and $\mathbf{H} \in \mathbb{C}^{m \times n}$ is known. Derive the LMMSE estimator of $\mathbf{x}$ and its MSE matrix.
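A numerical spot check for your closed form: the LMMSE filter for this model can be written in two standard ways, related by the matrix inversion lemma, and the two must agree on any instance. The sketch below verifies that on a random complex channel; the dimensions and noise levels are arbitrary assumptions, and you should compare your own derived expression against either form.

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 6, 4
sigma_x2, sigma2 = 2.0, 0.5

# Random complex channel for the check.
H = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)

# Two algebraically equivalent forms of the LMMSE filter
# (equal by the matrix inversion lemma):
W1 = sigma_x2 * H.conj().T @ np.linalg.inv(
    sigma_x2 * H @ H.conj().T + sigma2 * np.eye(m))
W2 = np.linalg.inv(
    H.conj().T @ H + (sigma2 / sigma_x2) * np.eye(n)) @ H.conj().T

print("forms agree:", np.allclose(W1, W2))
```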

ex-ch12-11

Hard

Prove the conditional Jensen's inequality: if $\varphi$ is convex and $\mathbb{E}[|\varphi(X)|] < \infty$, then

$$\varphi(\mathbb{E}[X|Y]) \leq \mathbb{E}[\varphi(X)|Y] \quad \text{a.s.}$$

ex-ch12-12

Hard

Let $X \sim \text{Exp}(\lambda)$ and $Y = X + Z$, where $Z \sim \text{Exp}(\mu)$ is independent of $X$. (a) Find $\text{Cov}(X, Y)$. (b) Find $\hat{X}_{\text{LMMSE}}$. (c) Is the LMMSE estimator equal to the MMSE estimator? Justify your answer.

ex-ch12-13

Medium

A sensor measures temperature $T$ with additive noise: $Y = T + W$, where $T \sim \mathcal{N}(20, 4)$ (in degrees Celsius) and $W \sim \mathcal{N}(0, 1)$ is independent of $T$. (a) Find the LMMSE estimate of $T$ given $Y = 23$. (b) What fraction of the total variance of $Y$ is explained by $T$?
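Both parts can be sanity-checked by simulation: regress $T$ on $Y$ using sample moments and evaluate the fitted line at $Y = 23$, and compare $\text{Var}(T)/\text{Var}(Y)$ with your answer to (b). This is a numerical check, not the expected derivation.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000

# T ~ N(20, 4) (std dev 2), W ~ N(0, 1), Y = T + W.
t = rng.normal(20.0, 2.0, size=n)
y = t + rng.standard_normal(n)

# Empirical regression of T on Y checks the LMMSE coefficients;
# Var(T)/Var(Y) is the variance fraction asked for in part (b).
a = np.cov(t, y)[0, 1] / y.var()
b = t.mean() - a * y.mean()
print(f"estimate at Y=23: {a * 23 + b:.3f}")
print(f"fraction of Var(Y) due to T: {t.var() / y.var():.3f}")
```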

ex-ch12-14

Challenge

Let $\mathbf{X} \sim \mathcal{N}(\mathbf{0}, \mathbf{C}_X)$ in $\mathbb{R}^n$ and suppose we observe a noisy linear combination $Y = \mathbf{a}^\mathsf{T}\mathbf{X} + W$, where $\mathbf{a} \in \mathbb{R}^n$ is known and $W \sim \mathcal{N}(0, \sigma^2)$ is independent. Show that the MMSE estimate of $\mathbf{X}$ given $Y$ is

$$\hat{\mathbf{X}} = \frac{\mathbf{C}_X \mathbf{a}}{\mathbf{a}^\mathsf{T}\mathbf{C}_X\mathbf{a} + \sigma^2}\, Y$$

and find the MSE matrix.
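Once you have proved the formula, a Monte Carlo spot check is reassuring: for a concrete $\mathbf{C}_X$, $\mathbf{a}$, and $\sigma^2$ (the values below are arbitrary assumptions), the per-coordinate regression slope $\text{Cov}(X_i, Y)/\text{Var}(Y)$ estimated from samples should match the claimed gain vector.

```python
import numpy as np

rng = np.random.default_rng(7)
d, n = 3, 1_000_000
sigma2 = 0.3

# Assumed example covariance (positive definite) and known direction a.
C_X = np.array([[2.0, 0.5, 0.0],
                [0.5, 1.0, 0.3],
                [0.0, 0.3, 1.5]])
a = np.array([1.0, -0.5, 2.0])

# Sample X ~ N(0, C_X) via a Cholesky factor, then Y = a^T X + W.
L = np.linalg.cholesky(C_X)
X = rng.standard_normal((n, d)) @ L.T
Y = X @ a + np.sqrt(sigma2) * rng.standard_normal(n)

# Gain vector claimed in the exercise: C_X a / (a^T C_X a + sigma^2).
gain = C_X @ a / (a @ C_X @ a + sigma2)

# Empirical per-coordinate regression slope Cov(X_i, Y)/Var(Y).
emp = np.array([np.cov(X[:, i], Y)[0, 1] for i in range(d)]) / Y.var()
print("max |formula - empirical|:", np.abs(gain - emp).max())
```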

ex-ch12-15

Challenge

(Nonlinear MMSE vs. LMMSE gap) Let $X \sim \text{Uniform}[0, 1]$ and $Y = X^2 + Z$, where $Z \sim \mathcal{N}(0, 0.1)$ is independent of $X$. (a) Show that $\mathbb{E}[X|Y]$ depends on $Y$ nonlinearly (find it numerically if needed). (b) Compute the LMMSE estimate. What is $\text{Cov}(X, Y)$? (c) Compare the MSE of both estimators via Monte Carlo simulation.
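A sketch for part (c), assuming a $\text{Uniform}[0,1]$ prior: the conditional mean has no simple closed form, so it is approximated by numerical integration of $\mathbb{E}[X|Y=y] = \int_0^1 x\, f(y|x)\,dx \,/\, \int_0^1 f(y|x)\,dx$ on a grid, while the LMMSE line is fitted from sample moments. (With a prior symmetric about zero, such as $\text{Uniform}[-1,1]$, symmetry forces $\mathbb{E}[X|Y] \equiv 0$ and $\text{Cov}(X, Y) = 0$, and the two estimators coincide, leaving no gap to observe.)

```python
import numpy as np

rng = np.random.default_rng(6)
n = 20_000
sigma2 = 0.1

# X ~ Uniform[0, 1], Y = X^2 + Z with Z ~ N(0, 0.1).
x = rng.uniform(0.0, 1.0, size=n)
y = x**2 + rng.normal(0.0, np.sqrt(sigma2), size=n)

# Conditional mean by grid integration over the prior support:
# weights proportional to the Gaussian likelihood f(y|x).
grid = np.linspace(0.0, 1.0, 201)
lik = np.exp(-((y[:, None] - grid**2) ** 2) / (2 * sigma2))
mmse_est = (lik * grid).sum(axis=1) / lik.sum(axis=1)

# LMMSE line from sample moments (compare with your closed-form answer).
a = np.cov(x, y)[0, 1] / y.var()
b = x.mean() - a * y.mean()
lmmse_est = a * y + b

print(f"MSE (numerical E[X|Y]): {np.mean((x - mmse_est)**2):.4f}")
print(f"MSE (LMMSE):            {np.mean((x - lmmse_est)**2):.4f}")
```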