Exercises

ex-ch07-01

Easy

Let $X \sim \text{Bernoulli}(1/2)$ and $Y = X$. Compute the Slepian-Wolf rate region. What is the minimum sum rate $R_X + R_Y$?
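Several exercises in this set reduce to evaluating $H(X \mid Y)$, $H(Y \mid X)$, and $H(X, Y)$ from a joint pmf. A small numerical checker (a sketch; the helper name `slepian_wolf_corners` is ours, not from the text), applied to this exercise's degenerate joint distribution:

```python
import numpy as np

def slepian_wolf_corners(p_xy):
    """Return H(X|Y), H(Y|X), H(X,Y) in bits from a joint pmf matrix.
    The Slepian-Wolf region is R_X >= H(X|Y), R_Y >= H(Y|X),
    R_X + R_Y >= H(X,Y)."""
    p_xy = np.asarray(p_xy, dtype=float)
    nz = p_xy[p_xy > 0]
    h_xy = float(-np.sum(nz * np.log2(nz)))        # joint entropy H(X,Y)
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)
    h_x = float(-np.sum(p_x[p_x > 0] * np.log2(p_x[p_x > 0])))
    h_y = float(-np.sum(p_y[p_y > 0] * np.log2(p_y[p_y > 0])))
    return h_xy - h_y, h_xy - h_x, h_xy            # H(X|Y), H(Y|X), H(X,Y)

# This exercise: Y = X with X ~ Bernoulli(1/2), so the joint pmf is diag(1/2, 1/2).
hx_y, hy_x, h_joint = slepian_wolf_corners([[0.5, 0.0], [0.0, 0.5]])
print(hx_y, hy_x, h_joint)  # 0.0 0.0 1.0
```

So the corners collapse: $H(X \mid Y) = H(Y \mid X) = 0$, and the binding constraint is the sum rate $R_X + R_Y \geq H(X, Y) = 1$ bit.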

ex-ch07-02

Easy

Let $X$ and $Y$ be independent, each $\text{Bernoulli}(1/2)$. Compute the Slepian-Wolf rate region and compare with separate compression.

ex-ch07-03

Medium

Let $(X, Y)$ be jointly Gaussian with $X \sim \mathcal{N}(0, 1)$, $Y \sim \mathcal{N}(0, 1)$, and correlation coefficient $\rho$. Compute the Slepian-Wolf rate region as a function of $\rho$. (Hint: use the quantized version.)

ex-ch07-04

Medium

Prove that for the Slepian-Wolf problem, the sum-rate bound $R_X + R_Y \geq H(X, Y)$ cannot be improved, even if the encoders are allowed to use time-sharing (randomly switching between different coding strategies).

ex-ch07-05

Medium

Consider the Slepian-Wolf problem with three sources $(X, Y, Z)$ drawn i.i.d. from $P_{XYZ}$. Write the complete rate region for lossless distributed compression.

ex-ch07-06

Medium

In the CEO problem with $K$ agents, $X \sim \mathcal{N}(0, 1)$, and $Y_k = X + N_k$ with $N_k \sim \mathcal{N}(0, \sigma_N^2)$ i.i.d., compute the minimum distortion achievable as a function of the per-agent rate $R$ (assuming symmetric rates $R_k = R$ for all $k$).

ex-ch07-07

Hard

Prove the converse of the Slepian-Wolf theorem: if a rate pair $(R_X, R_Y)$ is achievable, then $R_X + R_Y \geq H(X, Y)$. (Hint: use Fano's inequality and the chain rule.)

ex-ch07-08

Hard

Show that for the binary symmetric source ($X \sim \text{Bernoulli}(1/2)$, $Y = X \oplus Z$, $Z \sim \text{Bernoulli}(p)$), the syndrome of an LDPC code of rate $1 - \mathcal{H}_2(p) - \epsilon$ can be used for Slepian-Wolf coding of $X$ given $Y$ at the decoder.
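The syndrome scheme in this exercise can be illustrated at toy scale by substituting a (7,4) Hamming code for the LDPC code (an illustrative substitution, not the LDPC construction itself): the encoder sends only the 3-bit syndrome of $x$, and the decoder recovers $x$ from $y$ whenever the correlation noise $z$ has weight at most 1.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column j is the binary
# representation of j+1, so a single-bit error's syndrome encodes its position.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]], dtype=int)

def sw_encode(x):
    """Encoder sends only the 3-bit syndrome of x (rate 3/7 instead of 1)."""
    return H @ x % 2

def sw_decode(syndrome, y):
    """Decoder knows y = x XOR z; if z has weight <= 1, x is recovered exactly."""
    s_err = (H @ y + syndrome) % 2                  # syndrome of the noise z
    x_hat = y.copy()
    if s_err.any():                                 # nonzero -> one bit flipped
        pos = int("".join(map(str, s_err)), 2) - 1  # column index of the error
        x_hat[pos] ^= 1
    return x_hat

x = np.array([1, 0, 1, 1, 0, 0, 1])
z = np.array([0, 0, 0, 0, 1, 0, 0])   # correlation noise, weight 1
y = (x + z) % 2
print(np.array_equal(sw_decode(sw_encode(x), y), x))  # True
```

The LDPC version of the exercise is the same idea at block length $n$: the syndrome has $n(\mathcal{H}_2(p) + \epsilon)$ bits, matching the Slepian-Wolf bound $R_X \geq H(X \mid Y) = \mathcal{H}_2(p)$.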

ex-ch07-09

Medium

In the multiple descriptions problem, show that the sum-rate constraint $R_1 + R_2 \geq I(X; U_1, U_2) + I(U_1; U_2)$ can be rewritten as $R_1 + R_2 \geq I(X; U_1) + I(X; U_2) + I(U_1; U_2 \mid X)$.

ex-ch07-10

Easy

A DMS with $H(V) = 2$ bits per symbol is to be transmitted over a BSC with crossover probability $p = 0.1$. Find the minimum ratio $\tau = n/k$ of channel uses to source symbols needed for reliable transmission.
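A quick numerical evaluation of the quantities involved (a sketch of the separation argument: reliable transmission needs $nC \geq kH(V)$, so $\tau \geq H(V)/C$):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

H_V = 2.0                  # source entropy, bits per source symbol
p = 0.1
C = 1 - h2(p)              # BSC(0.1) capacity, bits per channel use
tau_min = H_V / C          # minimum channel uses per source symbol
print(C, tau_min)          # ~0.531, ~3.766
```

So roughly 3.77 channel uses per source symbol are needed.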

ex-ch07-11

Medium

Prove that for a Gaussian source $V \sim \mathcal{N}(0, \sigma^2)$ over an AWGN channel with $\text{SNR} = P/N$, uncoded transmission ($X = \sqrt{P/\sigma^2}\, V$) achieves the optimal distortion $D^* = \sigma^2/(1 + \text{SNR})$ when the bandwidth ratio is $\tau = 1$.
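Before proving the claim, it can be checked by Monte Carlo simulation (a sketch; the values $\sigma^2 = 1$, $P = 4$, $N = 1$ are assumed for illustration). The receiver applies the linear MMSE estimator, which is optimal here since everything is jointly Gaussian:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2, P, N = 1.0, 4.0, 1.0               # assumed source variance, power, noise
snr = P / N
n = 1_000_000

V = rng.normal(0.0, np.sqrt(sigma2), n)
X = np.sqrt(P / sigma2) * V                # uncoded: scale to meet power constraint
Yc = X + rng.normal(0.0, np.sqrt(N), n)    # AWGN channel output

# Linear MMSE estimate of V from Yc: E[V|Yc] = a*sigma2/(a^2*sigma2 + N) * Yc
V_hat = (np.sqrt(P * sigma2) / (P + N)) * Yc
D_emp = np.mean((V - V_hat) ** 2)
D_opt = sigma2 / (1 + snr)                 # OPTA distortion at bandwidth ratio 1
print(D_emp, D_opt)                        # both close to 0.2
```

The empirical distortion matches $\sigma^2/(1 + \text{SNR}) = 0.2$, as the exercise asserts.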

ex-ch07-12

Hard

Consider two correlated binary sources $(X, Y)$ transmitted over a binary multiple-access channel (MAC) with output $Z = X \oplus Y$. Show that joint source-channel coding can achieve reliable communication while separate source and channel coding cannot.

ex-ch07-13

Hard

Derive the Berger-Tung rate constraints from the covering and binning conditions. Specifically, show that the covering lemma requires $\tilde{R}_X > I(X; U)$ and the binning constraint requires $\tilde{R}_X - R_X < I(U; V)$, and combine these to get $R_X > I(X; U) - I(U; V) + \delta(\epsilon) = I(X; U \mid V) + \delta(\epsilon)$.

ex-ch07-14

Challenge

Prove that the no-excess-rate condition in the multiple descriptions problem ($R_1 + R_2 = R(D_0)$) implies $I(U_1; U_2 \mid X) = 0$. Interpret this condition in terms of the structure of the optimal descriptions.

ex-ch07-15

Medium

Compute the Slepian-Wolf rate region for the following joint distribution: $\mathcal{X} = \mathcal{Y} = \{0, 1, 2\}$ with $P_{XY}(x, y) = 1/6$ for $(x, y) \in \{(0,0), (0,1), (1,1), (1,2), (2,2), (2,0)\}$ and $0$ otherwise.
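A numerical evaluation of the entropies that determine the region (a sketch to check the answer against):

```python
import numpy as np

# Joint pmf of the exercise: mass 1/6 on six (x, y) pairs over {0,1,2}^2
P = np.zeros((3, 3))
for x, y in [(0, 0), (0, 1), (1, 1), (1, 2), (2, 2), (2, 0)]:
    P[x, y] = 1 / 6

def entropy_bits(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

H_XY = entropy_bits(P.ravel())        # log2(6) ~ 2.585: uniform over 6 pairs
H_X = entropy_bits(P.sum(axis=1))     # marginal of X is uniform on 3 symbols
H_Y = entropy_bits(P.sum(axis=0))     # so is the marginal of Y
print(H_XY - H_Y, H_XY - H_X, H_XY)   # H(X|Y)=1, H(Y|X)=1, H(X,Y)=log2(6)
```

The region is therefore $R_X \geq 1$, $R_Y \geq 1$, $R_X + R_Y \geq \log_2 6 \approx 2.585$ bits.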