Exercises

ex-ch19-01

Easy

State the point-to-point source–channel separation theorem for a DMS with entropy $H(S)$ over a DMC with capacity $C$ at bandwidth ratio $\kappa = 1$. What is the necessary and sufficient condition for reliable (lossless) transmission?

ex-ch19-02

Easy

A binary source $S \sim \text{Bern}(0.3)$ is transmitted over a BSC with crossover probability $p = 0.1$ at bandwidth ratio $\kappa = 1$. Is reliable transmission possible under separation?
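A quick numerical check of this comparison (a sketch only, not the required argument; the binary-entropy helper `h_b` and the printout are mine):

```python
import math

def h_b(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

H_S = h_b(0.3)       # source entropy, ~0.881 bits/symbol
C = 1 - h_b(0.1)     # BSC(0.1) capacity, ~0.531 bits/use
print(f"H(S) = {H_S:.3f}, C = {C:.3f}, reliable at kappa = 1: {H_S <= C}")
```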

ex-ch19-03

Easy

Two independent sources $S_1, S_2$ with $H(S_1) = 0.8$ bits and $H(S_2) = 0.6$ bits are transmitted over a MAC with capacity region $\{(R_1, R_2): R_1 \leq 1,\ R_2 \leq 1,\ R_1 + R_2 \leq 1.5\}$. Is separate source and channel coding sufficient?
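A one-line containment check (assuming, as is standard for independent sources, that separation works iff the entropy point lies in the capacity region):

```python
H1, H2 = 0.8, 0.6
# Check whether the rate point (H1, H2) lies in the given MAC capacity region.
inside = H1 <= 1 and H2 <= 1 and H1 + H2 <= 1.5
print(f"sum rate needed = {H1 + H2}, inside region: {inside}")
```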

ex-ch19-04

Easy

For a Gaussian source $S \sim \mathcal{N}(0, 1)$ over a Gaussian channel with $\text{SNR} = 10$ (linear, not dB) and bandwidth ratio $\kappa = 1$, compute the minimum achievable distortion under (a) optimal coding, (b) uncoded transmission.
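A numeric sketch of both answers, using the standard formulas $D_{\text{opt}} = \sigma_S^2/(1+\text{SNR})^\kappa$ and, for uncoded linear transmission $Y = \sqrt{P}\,S + Z$, the linear-MMSE distortion $\sigma_S^2\sigma^2/(P+\sigma^2)$ (the specific power split is my choice of parameterization):

```python
P, sigma2 = 10.0, 1.0              # SNR = P / sigma2 = 10 (linear)
snr = P / sigma2
D_opt = 1.0 / (1.0 + snr)          # sigma_S^2 = 1, kappa = 1
D_uncoded = sigma2 / (P + sigma2)  # linear MMSE for Y = sqrt(P)*S + Z
print(f"D_opt = {D_opt:.4f}, D_uncoded = {D_uncoded:.4f}")  # both equal 1/11
```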

ex-ch19-05

Easy

Explain in one paragraph why Shannon's separation theorem does not extend to arbitrary multi-terminal networks. Give one specific example.

ex-ch19-06

Medium

Prove the converse of the point-to-point source–channel separation theorem: if a sequence of $(n, k)$ joint source–channel codes achieves $P_e^{(k)} \to 0$ for a DMS $S$ over a DMC with capacity $C$, then $H(S) \leq (n/k)\, C$.

ex-ch19-07

Medium

Consider two correlated binary sources with joint distribution $P_{S_1 S_2}(0,0) = P_{S_1 S_2}(1,1) = (1-p)/2$ and $P_{S_1 S_2}(0,1) = P_{S_1 S_2}(1,0) = p/2$ for $p \in [0, 1/2]$.

(a) Compute $H(S_1)$, $H(S_2)$, $H(S_1|S_2)$, and $H(S_1, S_2)$.

(b) These sources are transmitted over a Gaussian MAC with $\text{SNR}_1 = \text{SNR}_2 = 5$ dB. Determine whether separation is sufficient.
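A numerical sketch at a sample value $p = 0.11$ (my choice), assuming $\kappa = 1$ and the standard Gaussian MAC sum capacity $\tfrac{1}{2}\log_2(1 + \text{SNR}_1 + \text{SNR}_2)$:

```python
import math

def h_b(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.11
H1 = H2 = 1.0            # both marginals are Bern(1/2)
H_cond = h_b(p)          # H(S1|S2) = H_b(p)
H_joint = 1.0 + H_cond   # H(S1,S2) = H(S2) + H(S1|S2)

snr = 10 ** (5 / 10)     # 5 dB -> ~3.162 linear
C_sum = 0.5 * math.log2(1 + 2 * snr)
print(f"H(S1,S2) = {H_joint:.3f} bits, sum capacity = {C_sum:.3f} bits/use")
```

At this $p$ the joint entropy slightly exceeds the sum capacity, so the Slepian–Wolf sum-rate constraint already fails under separation.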

ex-ch19-08

Medium

A Gaussian source $S \sim \mathcal{N}(0, \sigma_S^2)$ is transmitted over a Gaussian channel with $\text{SNR} = P/\sigma^2$ at bandwidth ratio $\kappa = 2$.

(a) Compute the optimal distortion under separate coding.

(b) Compute the distortion under uncoded repetition coding: $X_1 = X_2 = \sqrt{P/(2\sigma_S^2)}\, S$ (the same symbol transmitted twice).

(c) Which is better?
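A numerical comparison sketch at one operating point (the power conventions are my assumption: per-use power $P$ for separate coding, while the stated repetition scaling puts power $P/2$ on each of the two uses):

```python
P, sigma2 = 10.0, 1.0
snr = P / sigma2
# (a) Separate coding at kappa = 2: R(D) = 2C gives D = sigma_S^2 / (1 + SNR)^2.
D_sep = 1.0 / (1.0 + snr) ** 2
# (b) Repetition: each use carries power P/2; MMSE-combining the two noisy
#     looks is equivalent to a single look at SNR_eff = 2 * (P/2) / sigma2.
snr_eff = 2 * (P / 2) / sigma2
D_rep = 1.0 / (1.0 + snr_eff)
print(f"D_sep = {D_sep:.5f}, D_rep = {D_rep:.5f}")  # separation wins here
```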

ex-ch19-09

Medium

Prove that for independent sources $S_1, S_2$ over any MAC, separate source and channel coding is optimal (i.e., joint coding provides no advantage).

ex-ch19-10

Medium

Explain the concept of hybrid digital–analog coding and describe a scenario where it outperforms both pure digital (separate) and pure analog (uncoded) transmission.

ex-ch19-11

Medium

Consider the lossy version of the separation theorem. A Gaussian source $S \sim \mathcal{N}(0, 4)$ is transmitted over an AWGN channel with $\text{SNR} = 20$ dB at bandwidth ratio $\kappa = 0.5$ (two source symbols per channel use).

(a) What is the minimum achievable distortion under separation?

(b) If we could increase $\kappa$ to 1 (one channel use per source symbol), by how many dB does the distortion improve?
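A numeric sketch of both parts, using $D = \sigma_S^2/(1+\text{SNR})^\kappa$:

```python
import math

sigma_S2 = 4.0
snr = 10 ** (20 / 10)   # 20 dB -> 100 (linear)

def D(kappa):
    """Separation-optimal distortion D = sigma_S^2 / (1 + SNR)^kappa."""
    return sigma_S2 / (1 + snr) ** kappa

gain_dB = 10 * math.log10(D(0.5) / D(1.0))  # = 5 * log10(1 + SNR), ~10 dB
print(f"D(0.5) = {D(0.5):.4f}, D(1.0) = {D(1.0):.4f}, gain = {gain_dB:.2f} dB")
```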

ex-ch19-12

Hard

Prove that for a degraded broadcast channel $X \to Y_1 \to Y_2$ with degraded side information $S \to T_2 \to T_1$, the minimum achievable distortion pair $(D_1, D_2)$ is characterized by the existence of rates $(R_1, R_2)$ such that:

  1. $R_k \geq R_{S|T_k}(D_k)$ for $k = 1, 2$
  2. $(R_1, R_2)$ lies in the degraded BC capacity region

(Hint: achievability uses successive refinement + superposition coding.)

ex-ch19-13

Hard

Show that for a Gaussian source $S \sim \mathcal{N}(0, \sigma_S^2)$ over a Gaussian channel $Y = X + Z$, $Z \sim \mathcal{N}(0, \sigma^2)$, at bandwidth ratio $\kappa = 1$ with power constraint $P$ and $\text{SNR} = P/\sigma^2$, uncoded linear transmission $X = \sqrt{P/\sigma_S^2}\, S$ achieves the optimal MSE distortion $D^* = \sigma_S^2 \sigma^2/(P + \sigma^2) = \sigma_S^2/(1+\text{SNR})$.

ex-ch19-14

Hard

Two sources $(S_1, S_2)$ are jointly Gaussian with zero mean, unit variance, and correlation coefficient $\rho$. They are transmitted over a Gaussian MAC $Y = X_1 + X_2 + Z$ with equal power constraints $P_1 = P_2 = P$ and $Z \sim \mathcal{N}(0, 1)$.

Derive the MSE distortion achieved by uncoded transmission $X_k = \sqrt{P}\, S_k$ for each source using the MMSE decoder. Show that the distortion decreases as $\rho$ increases (the MMSE decoder exploits correlation).
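A closed-form check of the claimed monotonicity, using the scalar linear-MMSE formula $D = \operatorname{var}(S_1) - \operatorname{cov}(S_1, Y)^2/\operatorname{var}(Y)$ (the sample power value is mine; by symmetry the same distortion holds for $S_2$):

```python
def distortion(P, rho):
    """MMSE distortion for S1 from Y = sqrt(P)*(S1 + S2) + Z, Z ~ N(0, 1)."""
    cov_SY = P ** 0.5 * (1 + rho)    # E[S1 * Y]
    var_Y = 2 * P * (1 + rho) + 1    # E[Y^2]
    return 1.0 - cov_SY ** 2 / var_Y

P = 10.0
ds = [distortion(P, rho) for rho in (0.0, 0.3, 0.6, 0.9)]
print([round(d, 4) for d in ds])     # strictly decreasing in rho
```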

ex-ch19-15

Hard

Consider the Cover–El Gamal–Salehi sufficient condition for transmitting correlated sources $(S_1, S_2)$ over a MAC $P_{Y|X_1 X_2}$:

The source pair is transmissible if there exists a joint distribution $P_{U_1 U_2 X_1 X_2}$ such that

  1. $H(S_1 | S_2) < I(X_1; Y | X_2, U_1, U_2)$
  2. $H(S_2 | S_1) < I(X_2; Y | X_1, U_1, U_2)$
  3. $H(S_1, S_2) < I(X_1, X_2; Y | U_1, U_2)$
  4. $H(S_1 | S_2) + H(S_2) < I(X_1; Y | X_2, U_2) + I(X_2, U_2; Y)$

Show that when $U_1 = U_2 = \emptyset$ (no auxiliary random variables), this reduces to the simple condition that the Slepian–Wolf region fits inside the MAC capacity region.

ex-ch19-16

Hard

A Gaussian source $S \sim \mathcal{N}(0, 1)$ is transmitted over a Gaussian channel at bandwidth ratio $\kappa = 3$. Show that the distortion ratio $D_{\text{uncoded}}/D_{\text{opt}}$ grows exponentially with SNR (in dB). Specifically, show that at high SNR, $D_{\text{uncoded}}/D_{\text{opt}} \approx \text{SNR}^{\kappa - 1}$.
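A numeric sketch of the high-SNR scaling, taking repetition with MMSE combining as the uncoded scheme (one natural choice, not the only one): it gives $D_{\text{uncoded}} = 1/(1+\kappa\,\text{SNR})$ versus $D_{\text{opt}} = 1/(1+\text{SNR})^\kappa$.

```python
kappa = 3
for snr_dB in (10, 20, 30):
    snr = 10 ** (snr_dB / 10)
    D_opt = 1 / (1 + snr) ** kappa   # separation-optimal distortion
    D_unc = 1 / (1 + kappa * snr)    # repetition + MMSE combining
    ratio = D_unc / D_opt
    # At high SNR, ratio ~ SNR^(kappa-1) / kappa, i.e. order SNR^(kappa-1).
    print(f"{snr_dB:2d} dB: ratio = {ratio:.3e}, SNR^(kappa-1) = {snr ** (kappa - 1):.3e}")
```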

ex-ch19-17

Challenge

(Research-flavored) Consider a sensor network where $K$ sensors observe correlated Gaussian sources $S_1, \ldots, S_K$ with covariance matrix $\mathbf{K}_S$ and transmit over a Gaussian MAC with channel vector $\mathbf{h} = [h_1, \ldots, h_K]^T$ and noise $Z \sim \mathcal{N}(0, 1)$.

(a) Derive the MMSE distortion for estimating $S_k$ from the MAC output $Y = \sum_{i=1}^K h_i X_i + Z$ when each sensor uses uncoded transmission $X_i = \sqrt{P_i/\sigma_{S_i}^2}\, S_i$.

(b) Show that this distortion depends on the full covariance matrix $\mathbf{K}_S$, not just the marginal variances, confirming that the MAC decoder inherently exploits source correlation.

(c) For $K = 3$ sensors with equicorrelation ($\rho_{ij} = \rho$ for all $i \neq j$), unit variances, and equal channels ($h_k = 1$, $P_k = P$), express the distortion in closed form and plot it as a function of $\rho$.
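For part (c), a closed-form sketch derived by hand from the scalar MMSE formula under the stated symmetry (printing a small table in place of the requested plot; the sample power value is mine):

```python
def D(P, rho):
    # Y = sqrt(P)*(S1 + S2 + S3) + Z; cov(S_k, S1+S2+S3) = 1 + 2*rho and
    # var(S1+S2+S3) = 3*(1 + 2*rho), so
    #   D = 1 - P*(1 + 2*rho)^2 / (3*P*(1 + 2*rho) + 1).
    s = 1 + 2 * rho
    return 1.0 - P * s ** 2 / (3 * P * s + 1)

P = 1.0
for rho in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"rho = {rho:.2f}: D = {D(P, rho):.4f}")
```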

ex-ch19-18

Challenge

(Open-ended) The separation theorem fails for certain multi-terminal settings, but modern standards (5G NR, Wi-Fi 7) still use separation. Write a critical analysis (1–2 pages) of when this practical choice is justified and when it may not be. Consider:

(a) The finite-blocklength penalty of separation

(b) The role of source correlation in IoT/mMTC scenarios

(c) The emergence of deep joint source–channel coding (DeepJSCC)

(d) The tradeoff between theoretical optimality and implementation complexity