Exercises

ex-ch03-01

Easy

A source has alphabet $\{a, b, c\}$ with probabilities $(1/2, 1/4, 1/4)$. Compute $H(X)$ and estimate the size of the typical set for $n = 100$.
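A quick numerical check of the scale involved (a sketch, not a full solution; it assumes the AEP estimate $|\mathcal{T}_\epsilon^{(n)}| \approx 2^{nH(X)}$):

```python
import math

# Entropy of the source {a, b, c} with probabilities (1/2, 1/4, 1/4)
probs = [0.5, 0.25, 0.25]
H = -sum(p * math.log2(p) for p in probs)  # = 1.5 bits

n = 100
# By the AEP, the typical set has roughly 2^{nH} elements
typical_log2_size = n * H  # = 150, i.e. about 2^150 sequences
```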

ex-ch03-02

Easy

True or false: every element of the typical set has the same probability.

ex-ch03-03

Easy

For a binary source with $P(0) = 0.9$, $P(1) = 0.1$, and $n = 1000$: (a) How many sequences are in $\{0,1\}^{1000}$? (b) Approximately how many are typical? (c) What fraction of sequences are typical?
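The orders of magnitude can be sanity-checked numerically (a sketch; it assumes the AEP count $|\mathcal{T}_\epsilon^{(n)}| \approx 2^{n h_b(p)}$):

```python
import math

p = 0.1
n = 1000
# Binary entropy h_b(0.1), about 0.469 bits
hb = -p * math.log2(p) - (1 - p) * math.log2(1 - p)

total_log2 = n                 # (a) |{0,1}^1000| = 2^1000
typical_log2 = n * hb          # (b) about 2^469 typical sequences
fraction_log2 = typical_log2 - total_log2  # (c) fraction about 2^{-531}
```

The striking point is part (c): the typical set carries almost all the probability yet is an exponentially vanishing fraction of all sequences.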

ex-ch03-04

Easy

For a fair coin ($P(0) = P(1) = 1/2$), what fraction of $\{0,1\}^n$ sequences are typical?
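An exhaustive check at a small blocklength illustrates what is special about the uniform case (a sketch using the weak-typicality condition $|-\tfrac{1}{n}\log_2 P(\mathbf{x}) - H| \le \epsilon$):

```python
import math
from itertools import product

n = 10
eps = 0.01
H = 1.0  # entropy of a fair coin, in bits

def sample_entropy(x):
    # -(1/n) log2 P(x); with P(0) = P(1) = 1/2, P(x) = 2^{-n} for every x
    p = 0.5 ** len(x)
    return -math.log2(p) / len(x)

typical = [x for x in product([0, 1], repeat=n)
           if abs(sample_entropy(x) - H) <= eps]
fraction = len(typical) / 2 ** n  # every sequence has the same probability
```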

ex-ch03-05

Medium

Prove that if $(\mathbf{x}, \mathbf{y}) \in \mathcal{T}_\epsilon^{(n)}(X,Y)$, then the conditional type $\hat{P}_{\mathbf{y}|\mathbf{x}}$ satisfies $|\hat{P}(b|a) - P_{Y|X}(b|a)| \leq \delta(\epsilon)$ for all $(a,b)$, where $\delta(\epsilon) \to 0$ as $\epsilon \to 0$.

ex-ch03-06

Medium

Show that the size of the conditional typical set satisfies $|\mathcal{T}_\epsilon^{(n)}(Y|\mathbf{x})| \doteq 2^{nH(Y|X)}$ for any $\mathbf{x} \in \mathcal{T}_\epsilon^{(n)}(X)$.

ex-ch03-07

Medium

Use the AEP to prove the source coding theorem: an i.i.d. source with entropy $H(X)$ can be losslessly compressed to $H(X) + \epsilon$ bits per symbol for any $\epsilon > 0$ and sufficiently large blocklength $n$.

ex-ch03-08

Medium

The number of binary sequences of length $n$ with exactly $k$ ones is $\binom{n}{k}$. Using Stirling's approximation, show that $\binom{n}{pn} \doteq 2^{n h_b(p)}$ for $0 < p < 1$.
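Before proving this, it can help to watch the per-symbol gap $|\tfrac{1}{n}\log_2 \binom{n}{pn} - h_b(p)|$ shrink numerically (a sketch; $p = 0.3$ is an arbitrary test value):

```python
import math

def hb(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.3
gaps = []
for n in (100, 1000, 10000):
    k = round(p * n)
    # log2 C(n, k) via lgamma to avoid huge integers
    log2_binom = (math.lgamma(n + 1) - math.lgamma(k + 1)
                  - math.lgamma(n - k + 1)) / math.log(2)
    gaps.append(abs(log2_binom / n - hb(p)))
# gaps shrink roughly like O(log(n) / n), consistent with C(n, pn) ≐ 2^{n h_b(p)}
```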

ex-ch03-09

Medium

Verify Sanov's theorem for the binary case: if $X_1, \ldots, X_n$ are i.i.d. Bernoulli($p$), show that $\Pr(\hat{P}(1) \geq q) \doteq 2^{-nD(q \| p)}$ for $q > p$.
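A numerical companion to the proof: the exact binomial tail can be compared against the predicted exponent $D(q \| p)$ (a sketch; $p = 0.3$, $q = 0.5$, $n = 200$ are arbitrary test values, and the $O(\tfrac{\log n}{n})$ polynomial correction means the match is only approximate at finite $n$):

```python
import math

def D(q, p):
    """Binary KL divergence D(q || p) in bits."""
    return q * math.log2(q / p) + (1 - q) * math.log2((1 - q) / (1 - p))

p, q, n = 0.3, 0.5, 200
# Exact tail probability Pr(#ones >= q*n) for i.i.d. Bernoulli(p)
tail = 0.0
for k in range(math.ceil(q * n), n + 1):
    log_c = math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
    tail += math.exp(log_c + k * math.log(p) + (n - k) * math.log(1 - p))
rate = -math.log2(tail) / n  # exponential rate; approaches D(q || p) as n grows
```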

ex-ch03-10

Hard

Use the packing lemma to prove the channel coding theorem: for a DMC with capacity $C = \max_{P_X} I(X;Y)$, any rate $R < C$ is achievable.

ex-ch03-11

Hard

Prove the Slepian-Wolf theorem: for correlated i.i.d. sources $(X,Y)$, separate encoding with joint decoding can achieve the rate region $R_X \geq H(X|Y)$, $R_Y \geq H(Y|X)$, $R_X + R_Y \geq H(X,Y)$.

ex-ch03-12

Hard

Show that the number of joint types $Q_{XY}$ for sequences of length $n$ over the alphabet $\mathcal{X} \times \mathcal{Y}$ satisfies $|\mathcal{P}_n(\mathcal{X} \times \mathcal{Y})| \leq (n+1)^{|\mathcal{X}||\mathcal{Y}|}$. Then show that the conditional type class $T_{Q_{Y|X}}(\mathbf{x}) = \{\mathbf{y}: (\mathbf{x}, \mathbf{y}) \text{ has joint type } Q_{XY}\}$ has size $|T_{Q_{Y|X}}(\mathbf{x})| \doteq 2^{nH(Y|X)_Q}$, where $H(Y|X)_Q = \sum_x Q_X(x) H(Y|X=x)_Q$.

ex-ch03-13

Challenge

(Blowing-up lemma) Prove that for any set $S \subseteq \mathcal{X}^n$ with $P^n(S) \geq 1/2$ and any $\delta > 0$, the "blown-up" set $S_\delta = \{\mathbf{x}: d_H(\mathbf{x}, S) \leq n\delta\}$ satisfies $P^n(S_\delta) \geq 1 - 2^{-cn\delta^2}$ for some constant $c > 0$ depending only on $|\mathcal{X}|$ and $P$.

ex-ch03-14

Medium

For the binary symmetric channel with crossover probability $\epsilon = 0.11$: (a) Compute the capacity $C = 1 - h_b(\epsilon)$. (b) If we use rate $R = 0.5$ bits/use, estimate the decoding error exponent $E = I(X;Y) - R$ at the uniform input distribution.
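The arithmetic for both parts is a one-liner each (a sketch; for the BSC, $I(X;Y)$ at the uniform input equals the capacity $C$):

```python
import math

def hb(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

eps = 0.11
C = 1 - hb(eps)   # (a) capacity, very close to 0.5 bits/use
R = 0.5
E = C - R         # (b) I(X;Y) - R at the uniform input, a tiny positive margin
```

The point of the numbers: $\epsilon = 0.11$ is chosen so that $C \approx 0.5$, putting the rate $R = 0.5$ right at the edge of achievability.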

ex-ch03-15

Easy

How many distinct types can arise from binary sequences of length $n = 100$?
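An exhaustive count at small $n$ confirms the pattern (a sketch; a binary type is determined by the number of ones, and brute force is only feasible for small $n$):

```python
from itertools import product

def num_types(n):
    """Count distinct types of binary sequences of length n by brute force."""
    # The type of a binary sequence is determined by its number of ones.
    return len({sum(x) for x in product([0, 1], repeat=n)})

# Exhaustive check for n = 1..10; the pattern extends to n = 100 by the
# same counting argument (one type per possible count of ones).
small = [num_types(n) for n in range(1, 11)]
```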