Exercises

ex-ch20-01

Easy

Compute the cumulant generating function $\Lambda(\theta)$ and rate function $I(x)$ for $X \sim \text{Exponential}(\lambda)$, i.e., $f_X(x) = \lambda e^{-\lambda x}$ for $x \geq 0$.
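
A quick numerical sanity check for this exercise (not a substitute for the derivation): evaluate the Legendre transform $\sup_\theta\,[\theta x - \Lambda(\theta)]$ on a grid and compare it to the closed form $I(x) = \lambda x - 1 - \log(\lambda x)$ that the computation should produce. The values $\lambda = 2$ and $x = 1.3$ are arbitrary choices.

```python
import numpy as np

lam = 2.0   # arbitrary rate parameter
x = 1.3     # arbitrary point at which to evaluate the rate function

# CGF of Exponential(lam): Lambda(theta) = log(lam / (lam - theta)), theta < lam
thetas = np.linspace(-10.0, lam - 1e-6, 200_001)
cgf = np.log(lam / (lam - thetas))

# Legendre transform on the grid: I(x) = sup_theta [theta * x - Lambda(theta)]
I_numeric = np.max(thetas * x - cgf)

# Closed form the exercise should yield
I_closed = lam * x - 1 - np.log(lam * x)

print(I_numeric, I_closed)
```

The supremum is attained at $\theta^* = \lambda - 1/x$, which lies inside the grid, so the two values agree to grid precision.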

ex-ch20-02

Easy

Compute the rate function $I(x)$ for $X \sim \text{Poisson}(\mu)$.

ex-ch20-03

Medium

Let $X_1, \ldots, X_n$ be i.i.d. $\text{Bernoulli}(1/2)$. Using Cramér's theorem, compute the exact rate function and determine the exponential rate of $\mathbb{P}(\bar{X}_n \geq 3/4)$.
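
As a check on the answer, the exact binomial tail can be evaluated in log-space and its empirical rate $-\frac{1}{n}\log\mathbb{P}(\bar{X}_n \geq 3/4)$ compared against the rate function value $I(3/4) = \log 2 - H(3/4)$ (natural logs). Convergence is slow, with $O(\log n / n)$ corrections, which the sketch makes visible.

```python
import math

def log_binom_tail(n, k):
    # log P(S_n >= k) for S_n ~ Binomial(n, 1/2), via log-sum-exp for stability
    logs = [math.lgamma(n + 1) - math.lgamma(j + 1) - math.lgamma(n - j + 1)
            - n * math.log(2.0) for j in range(k, n + 1)]
    m = max(logs)
    return m + math.log(sum(math.exp(l - m) for l in logs))

# Rate function of Bernoulli(1/2) at x = 3/4, in nats: I(x) = log 2 - H(x)
x = 0.75
I = math.log(2) + x * math.log(x) + (1 - x) * math.log(1 - x)

for n in (200, 800, 3200):
    rate = -log_binom_tail(n, math.ceil(x * n)) / n
    print(n, rate, I)
```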

ex-ch20-04

Medium

Show that $I(x) = \Lambda^*(x)$ is convex and satisfies $I(x) \geq 0$ with $I(\mu) = 0$, where $\mu = \Lambda'(0)$.

ex-ch20-05

Medium

Prove that if $X$ is $\sigma$-sub-Gaussian, then $\mathbb{P}(X \geq t) \leq e^{-t^2/(2\sigma^2)}$ for all $t > 0$.

ex-ch20-06

Medium

Show that if $X$ is $\sigma_1$-sub-Gaussian, $Y$ is $\sigma_2$-sub-Gaussian, and $X, Y$ are independent, then $X + Y$ is $\sqrt{\sigma_1^2 + \sigma_2^2}$-sub-Gaussian.

ex-ch20-07

Medium

Let $X \sim \mathcal{N}(0, 1)$. Show that $X^2$ is sub-exponential but not sub-Gaussian.

ex-ch20-08

Hard

Prove Hoeffding's lemma: if $\mathbb{E}[X] = 0$ and $a \leq X \leq b$, then $\mathbb{E}[e^{tX}] \leq \exp(t^2(b-a)^2/8)$.

ex-ch20-09

Hard

Let $X_1, \ldots, X_n$ be i.i.d. $\text{Ber}(p)$ with $p = 0.5$. Compare the upper bounds on $\mathbb{P}(\bar{X}_n \geq 0.8)$ from: (a) Markov's inequality, (b) Chebyshev's inequality, (c) Hoeffding's inequality, (d) the exact Cramér rate function. Evaluate numerically for $n = 100$.
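
The numerical part of this exercise can be sketched in a few lines; the four bounds below are the standard forms, and working them out by hand first is the point of the exercise.

```python
import math

n, p, a = 100, 0.5, 0.8

# (a) Markov on the nonnegative variable X̄_n: E[X̄_n] / a
markov = p / a

# (b) Chebyshev: Var(X̄_n) / (a - p)^2 with Var(X̄_n) = p(1-p)/n
chebyshev = p * (1 - p) / n / (a - p) ** 2

# (c) Hoeffding for [0, 1]-bounded variables: exp(-2 n (a - p)^2)
hoeffding = math.exp(-2 * n * (a - p) ** 2)

# (d) Chernoff/Cramér: exp(-n I(a)) with I(a) = KL(Ber(a) || Ber(p))
I = a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))
cramer = math.exp(-n * I)

print(markov, chebyshev, hoeffding, cramer)
```

The ordering Markov > Chebyshev > Hoeffding > Cramér at $n = 100$ illustrates how much is gained by exploiting the full MGF rather than just the first or second moment.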

ex-ch20-10

Hard

Prove Sanov's theorem for the binary alphabet $\mathcal{X} = \{0, 1\}$ with $P = \text{Ber}(p)$ and $\mathcal{E} = \{Q : Q(1) \geq a\}$, where $a > p$.

ex-ch20-11

Medium

Verify Stein's lemma for the specific case $P_0 = \mathcal{N}(0, 1)$ and $P_1 = \mathcal{N}(\mu, 1)$ with $\mu > 0$. Show that the optimal Type II error exponent (with the Type I error constrained) is $D(P_0 \| P_1) = \mu^2/2$.

ex-ch20-12

Hard

Compute the Chernoff information $C(P_0, P_1)$ for $P_0 = \text{Ber}(p)$ and $P_1 = \text{Ber}(q)$ with $p \neq q$.
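
A numerical check on the answer: minimize $\log\big(p^\lambda q^{1-\lambda} + (1-p)^\lambda(1-q)^{1-\lambda}\big)$ over $\lambda \in (0,1)$ on a grid, and verify the characterizing property that at the optimal $\lambda^*$ the tilted distribution $P_{\lambda^*}$ is equidistant from both hypotheses: $D(P_{\lambda^*} \| P_0) = D(P_{\lambda^*} \| P_1) = C$. The parameters $p = 0.2$, $q = 0.6$ are arbitrary.

```python
import math

p, q = 0.2, 0.6   # arbitrary distinct Bernoulli parameters

def kl(a, b):
    # KL divergence between Ber(a) and Ber(b), in nats
    return a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))

# Grid-minimize log Z(lambda) = log sum_x P0(x)^lambda P1(x)^(1-lambda)
best = None
for i in range(1, 10_000):
    lam = i / 10_000
    val = math.log(p**lam * q**(1 - lam) + (1 - p)**lam * (1 - q)**(1 - lam))
    if best is None or val < best[0]:
        best = (val, lam)

C, lam_star = -best[0], best[1]

# Tilted Bernoulli parameter at lambda*; both KLs should equal C
z = p**lam_star * q**(1 - lam_star) + (1 - p)**lam_star * (1 - q)**(1 - lam_star)
r = p**lam_star * q**(1 - lam_star) / z
print(C, kl(r, p), kl(r, q))
```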

ex-ch20-13

Challenge

(Gärtner-Ellis) Let $\{X_n\}$ be a stationary ergodic Markov chain on $\{0, 1\}$ with transition matrix $\mathbf{P} = \begin{pmatrix} 1-\alpha & \alpha \\ \beta & 1-\beta \end{pmatrix}$. Show that the limiting CGF $\Lambda(\theta) = \lim_{n \to \infty} \frac{1}{n} \log \mathbb{E}[e^{\theta S_n}]$ exists, where $S_n = \sum_{i=1}^n X_i$, and compute it.
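
The answer can be checked numerically: the limit is $\Lambda(\theta) = \log \rho(\mathbf{P}_\theta)$, the log Perron eigenvalue of the tilted matrix $\mathbf{P}_\theta$ obtained by multiplying the column of state $1$ by $e^\theta$. The sketch below compares this against a direct iterative evaluation of $\frac{1}{n}\log\mathbb{E}[e^{\theta S_n}]$; the parameters $\alpha = 0.3$, $\beta = 0.2$, $\theta = 0.7$ are arbitrary.

```python
import numpy as np

alpha, beta, theta = 0.3, 0.2, 0.7   # arbitrary chain and tilt parameters

P = np.array([[1 - alpha, alpha],
              [beta, 1 - beta]])

# Tilted matrix: weight the transition into state 1 by e^theta
P_theta = P * np.array([1.0, np.exp(theta)])

# Gärtner-Ellis prediction: Lambda(theta) = log of the Perron eigenvalue
Lam = np.log(np.max(np.linalg.eigvals(P_theta).real))

# Direct evaluation of (1/n) log E[e^{theta S_n}] by iterating v <- P_theta v,
# renormalizing each step to avoid overflow; start from the stationary law of P
pi = np.array([beta, alpha]) / (alpha + beta)
n = 2000
v = np.ones(2)
log_mgf = 0.0
for _ in range(n):
    v = P_theta @ v
    s = v.sum()
    log_mgf += np.log(s)
    v /= s
log_mgf += np.log(pi @ v)
print(Lam, log_mgf / n)
```

The direct evaluation differs from $\Lambda(\theta)$ only by an $O(1/n)$ boundary term, so the two printed values agree to a few decimal places.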

ex-ch20-14

Medium

Using the matrix Bernstein inequality, determine how many pilot transmissions $n$ are needed so that $\|\hat{\mathbf{R}} - \mathbf{R}\| \leq \epsilon \|\mathbf{R}\|$ with probability $\geq 1 - \delta$, where $\hat{\mathbf{R}} = \frac{1}{n}\sum_{k=1}^n \mathbf{h}_k \mathbf{h}_k^H$, $\mathbf{h}_k \sim \mathcal{CN}(\mathbf{0}, \mathbf{R})$, and the antenna dimension is $d$.
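
A simulation sketch (not the bound derivation itself) can be used to see the behavior the inequality predicts: the relative spectral-norm error of the sample covariance shrinks roughly like $\sqrt{d/n}$. The exponential correlation profile, dimension, and sample sizes below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8

# Arbitrary covariance: exponentially decaying correlation profile
R = np.array([[0.9 ** abs(i - j) for j in range(d)] for i in range(d)],
             dtype=complex)
L = np.linalg.cholesky(R)

def rel_err(n):
    # Draw n pilots h_k ~ CN(0, R) and measure the relative spectral error
    w = (rng.standard_normal((d, n)) + 1j * rng.standard_normal((d, n))) / np.sqrt(2)
    h = L @ w
    R_hat = (h @ h.conj().T) / n
    return np.linalg.norm(R_hat - R, 2) / np.linalg.norm(R, 2)

e_small, e_large = rel_err(200), rel_err(20_000)
print(e_small, e_large)
```

Increasing $n$ by a factor of 100 should cut the error by roughly a factor of 10, consistent with a sample-complexity requirement scaling like $n \gtrsim \epsilon^{-2}$ (up to dimension and log factors, which the exercise asks you to make precise).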

ex-ch20-15

Easy

Show that a Rademacher random variable ($\mathbb{P}(X = 1) = \mathbb{P}(X = -1) = 1/2$) is 1-sub-Gaussian.

ex-ch20-16

Hard

(Cramér's theorem - lower bound via tilting) Let $X_1, \ldots, X_n$ be i.i.d. with finite CGF $\Lambda(\theta)$ on $\mathbb{R}$, and let $a > \mu$. Using the exponentially tilted measure $dP_{\theta^*} = e^{\theta^* x - \Lambda(\theta^*)}\,dP$, where $\Lambda'(\theta^*) = a$, prove the lower bound $\liminf_{n \to \infty} \frac{1}{n} \log \mathbb{P}(\bar{X}_n \geq a) \geq -I(a)$.

ex-ch20-17

Challenge

(Connection to hypothesis testing) Using Cramér's theorem, derive Stein's lemma for the special case of testing $P_0 = \text{Ber}(p)$ vs. $P_1 = \text{Ber}(q)$ with $p < q$. Show that the optimal Type II error exponent is $D(P_0 \| P_1)$.

ex-ch20-18

Easy

Show that $D(Q \| P) \geq 0$ with equality iff $Q = P$, using Jensen's inequality.