Exercises

ex-ch01-01

Easy

Compute the entropy of a random variable $X$ uniformly distributed over $\mathcal{X} = \{1, 2, \ldots, 32\}$.
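As a quick numerical sanity check (a sketch, not a substitute for the calculation by hand), the entropy of a uniform distribution over 32 outcomes can be computed directly from the definition $H(X) = -\sum_x p(x) \log_2 p(x)$:

```python
import math

# Entropy (in bits) of the uniform distribution over 32 outcomes.
p = [1 / 32] * 32
H = -sum(pi * math.log2(pi) for pi in p)
print(H)  # 5.0, matching log2(32) = 5 bits
```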

ex-ch01-02

Easy

Let $X$ take values in $\{a, b, c, d\}$ with probabilities $(1/2, 1/4, 1/8, 1/8)$. Compute $H(X)$ and compare with $\log_2 4$.
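A numerical check of the answer (each probability $2^{-k}$ contributes $k$ bits with weight $2^{-k}$):

```python
import math

# Entropy of the distribution (1/2, 1/4, 1/8, 1/8) in bits.
probs = [1/2, 1/4, 1/8, 1/8]
H = -sum(p * math.log2(p) for p in probs)
print(H)  # 1.75 bits, strictly less than log2(4) = 2
```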

ex-ch01-03

Easy

Prove that $H(X, Y) \leq H(X) + H(Y)$, with equality iff $X$ and $Y$ are independent.

ex-ch01-04

Easy

Let $X \sim \text{Bernoulli}(1/3)$ and $Y = 1 - X$. Compute $H(X)$, $H(Y)$, and $I(X;Y)$.
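The key observation is that $Y$ is a deterministic, invertible function of $X$, so $H(X|Y) = 0$. A numerical check:

```python
import math

def hb(p):
    # Binary entropy function in bits.
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

HX = hb(1/3)   # H(X) for X ~ Bernoulli(1/3)
HY = hb(2/3)   # Y = 1 - X is Bernoulli(2/3); hb(2/3) = hb(1/3) by symmetry
I = HX         # Y determines X, so I(X;Y) = H(X) - H(X|Y) = H(X)
print(HX, HY, I)  # all approximately 0.9183 bits
```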

ex-ch01-05

Medium

Prove that for any function $g$: $H(g(X)) \leq H(X)$, with equality iff $g$ is one-to-one on the support of $X$.

ex-ch01-06

Medium

Let $(X, Y)$ have joint distribution:

| $X \backslash Y$ | $y=0$ | $y=1$ |
|---|---|---|
| $x=0$ | $1/4$ | $1/4$ |
| $x=1$ | $1/4$ | $1/4$ |

Compute $H(X)$, $H(Y)$, $H(X,Y)$, $H(X|Y)$, $H(Y|X)$, and $I(X;Y)$.
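All six quantities can be checked numerically from the joint table (here $X$ and $Y$ are independent, so the mutual information should vanish):

```python
import math

# Joint distribution of (X, Y): all four cells equal 1/4.
joint = {(0, 0): 1/4, (0, 1): 1/4, (1, 0): 1/4, (1, 1): 1/4}

def H(dist):
    # Shannon entropy in bits of a dict of probabilities.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginals computed from the joint.
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

HX, HY, HXY = H(px), H(py), H(joint)
H_X_given_Y = HXY - HY   # chain rule: H(X|Y) = H(X,Y) - H(Y)
H_Y_given_X = HXY - HX
I = HX + HY - HXY        # I(X;Y) = H(X) + H(Y) - H(X,Y)
print(HX, HY, HXY, H_X_given_Y, H_Y_given_X, I)  # 1, 1, 2, 1, 1, 0
```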

ex-ch01-07

Medium

Prove that $I(X;Y,Z) \geq I(X;Y)$, i.e., more observations cannot decrease mutual information.

ex-ch01-08

Medium

Show that $D(P \| Q) \geq \frac{1}{2\ln 2}\|P - Q\|_1^2$ (Pinsker's inequality), where $\|P - Q\|_1 = \sum_x |P(x) - Q(x)|$.
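A numerical spot-check of the inequality on a pair of Bernoulli distributions (this only illustrates the bound, it proves nothing; the chosen values $p = 0.3$, $q = 0.6$ are arbitrary):

```python
import math

def kl(p, q):
    # D(P || Q) in bits for P = Bernoulli(p), Q = Bernoulli(q).
    return p * math.log2(p / q) + (1 - p) * math.log2((1 - p) / (1 - q))

p, q = 0.3, 0.6
l1 = abs(p - q) + abs((1 - p) - (1 - q))  # ||P - Q||_1 over both symbols
lhs = kl(p, q)
rhs = l1 ** 2 / (2 * math.log(2))          # Pinsker lower bound in bits
print(lhs, rhs, lhs >= rhs)
```

Note the $\frac{1}{2\ln 2}$ constant is for divergence measured in bits; with natural-log divergence the bound is $\frac{1}{2}\|P - Q\|_1^2$.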

ex-ch01-09

Medium

Let $X_1, X_2, \ldots, X_n$ be i.i.d. $\sim P_X$. Show that $H(X_1, \ldots, X_n) = n \, H(X_1)$.

ex-ch01-10

Medium

A binary erasure channel (BEC) has input $X \in \{0,1\}$ and output $Y \in \{0, e, 1\}$. With probability $\delta$ the output is an erasure $e$; otherwise $Y = X$. For uniform input, compute $I(X;Y)$.
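Computing $I(X;Y) = H(X) + H(Y) - H(X,Y)$ directly from the joint distribution confirms the expected answer $1 - \delta$ (the value $\delta = 0.3$ below is an arbitrary choice for illustration):

```python
import math

def H(probs):
    # Shannon entropy in bits of a list of probabilities.
    return -sum(p * math.log2(p) for p in probs if p > 0)

delta = 0.3
# Joint probabilities P(x, y) for uniform X over the BEC.
pxy = [0.5 * (1 - delta), 0.5 * delta,   # x=0: y=0, y=e
       0.5 * (1 - delta), 0.5 * delta]   # x=1: y=1, y=e
px = [0.5, 0.5]
py = [0.5 * (1 - delta), delta, 0.5 * (1 - delta)]  # y = 0, e, 1
I = H(px) + H(py) - H(pxy)
print(I)  # 1 - delta = 0.7 bits
```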

ex-ch01-11

Hard

Prove that $I(X;Y|Z) \leq I(X;Y)$ when $X \to Y \to Z$ forms a Markov chain (i.e., the data processing inequality for conditional MI).

ex-ch01-12

Hard

Let $X$ and $Y$ be random variables with $|\mathcal{X}| = |\mathcal{Y}| = M$, and let $(X^n, Y^n)$ be $n$ i.i.d. draws from their joint distribution. Prove that if $P_e = \Pr(\hat{X}^n \neq X^n) \to 0$, where $\hat{X}^n = g(Y^n)$, then $\frac{1}{n} H(X^n \mid Y^n) \to 0$ (Fano's inequality applied to sequences).

ex-ch01-13

Hard

Prove Mrs. Gerber's Lemma: if $X \sim \text{Bernoulli}(p)$ and $Z \sim \text{Bernoulli}(\epsilon)$ is independent of $X$, then $h_b^{-1}(H(X \oplus Z)) \geq h_b^{-1}(H(X)) * \epsilon$, where $a * b = a(1-b) + (1-a)b$ and $h_b^{-1}$ is the inverse of $h_b$ on $[0, 1/2]$.

ex-ch01-14

Hard

Prove that for any discrete random variables $X, Y, Z$:

$$I(X;Y) \leq I(X;Y|Z) + I(Y;Z).$$

Interpret this bound in terms of the information that $Z$ reveals about the relationship between $X$ and $Y$.

ex-ch01-15

Hard

(Entropy of a function) Let $Y = g(X)$ for some deterministic function $g$. Show that $H(X|Y) = H(X) - H(Y)$ and interpret this as the "entropy lost" by applying $g$.

ex-ch01-16

Challenge

(Non-Shannon inequality) Show that for four random variables $X_1, X_2, X_3, X_4$, every "Shannon-type" inequality is a non-negative linear combination of the basic inequalities $I(A;B|C) \geq 0$. Then state (without proof) the Zhang-Yeung inequality, the first known non-Shannon inequality:

$$2I(X_3;X_4) \leq I(X_1;X_2) + I(X_1;X_3,X_4) + 3I(X_3;X_4|X_1) + I(X_3;X_4|X_2).$$

Verify that this cannot be derived as a non-negative combination of conditional MI non-negativity constraints.

ex-ch01-17

Challenge

(Strong Fano) Prove the following strengthening of Fano's inequality: if $\hat{X} = g(Y)$ and $P_e = \Pr(\hat{X} \neq X)$, then

$$H(X|Y) \leq h_b(P_e) + P_e \, H(X \mid \hat{X}, E = 1),$$

where $E = \mathbf{1}\{\hat{X} \neq X\}$. Show that this is always at least as tight as the standard Fano bound.

ex-ch01-18

Medium

Compute the capacity of the Z-channel: input $\mathcal{X} = \{0,1\}$, output $\mathcal{Y} = \{0,1\}$, with $P_{Y|X}(0|0) = 1$ (no error for input 0) and $P_{Y|X}(0|1) = p$ (input 1 received as 0 with probability $p$). Find the capacity $C = \max_{P_X} I(X;Y)$.
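The maximization can be checked numerically by a grid search over the input distribution $P_X(1) = \alpha$; for $p = 1/2$ the known answer is $C = \log_2(5/4) \approx 0.3219$ bits, attained at $\alpha = 0.4$ (this sketch just brute-forces the optimum; the exercise asks for the closed form):

```python
import math

def hb(x):
    # Binary entropy in bits, with hb(0) = hb(1) = 0.
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def I_z(alpha, p):
    # I(X;Y) for the Z-channel with P(X=1) = alpha and crossover p.
    # P(Y=1) = alpha*(1-p), and H(Y|X) = alpha*hb(p) (input 0 is noiseless).
    return hb(alpha * (1 - p)) - alpha * hb(p)

p = 0.5
C = max(I_z(a / 10000, p) for a in range(10001))  # grid search over alpha
print(round(C, 4))  # 0.3219 = log2(5/4)
```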

ex-ch01-19

Medium

Prove that for a stationary process, the entropy rate satisfies $H'(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, \ldots, X_n) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, \ldots, X_1)$, and that both limits exist.

ex-ch01-20

Easy

Verify that the binary entropy function $h_b(p)$ satisfies: (a) $h_b(0) = h_b(1) = 0$, (b) $h_b(1/2) = 1$, (c) $h_b(p) = h_b(1-p)$ for all $p$.
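All three properties can be spot-checked numerically (the check of (c) at $p = 0.3$ is one arbitrary sample, not a proof for all $p$):

```python
import math

def hb(p):
    # Binary entropy in bits, with the convention 0 * log 0 = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(hb(0.0), hb(1.0))                 # (a) endpoints vanish: 0.0 0.0
print(hb(0.5))                          # (b) maximum at 1/2: 1.0
print(abs(hb(0.3) - hb(0.7)) < 1e-12)   # (c) symmetry about 1/2: True
```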