Exercises
ex-ch07-01
Easy
Let $X = Y \sim \mathrm{Bern}(1/2)$. Compute the Slepian-Wolf rate region. What is the minimum sum rate $R_X + R_Y$?
When $X = Y$, what are the conditional entropies $H(X|Y)$ and $H(Y|X)$?
The sum-rate bound is $R_X + R_Y \geq 1$ since $H(X,Y) = H(X) = 1$ bit.
Compute entropies
Since $X = Y$: $H(X|Y) = 0$, $H(Y|X) = 0$, and $H(X,Y) = H(X) = 1$ bit.
Rate region
$R_X \geq 0$, $R_Y \geq 0$, $R_X + R_Y \geq 1$. The minimum sum rate is 1 bit. One encoder can send 0 bits (be silent) while the other sends 1 bit per symbol: perfect correlation means one copy suffices.
ex-ch07-02
Easy
Let $X$ and $Y$ be independent, each $\mathrm{Bern}(1/2)$. Compute the Slepian-Wolf rate region and compare with separate compression.
When $X$ and $Y$ are independent, $H(X|Y) = H(X)$ and $H(Y|X) = H(Y)$.
Compute entropies
$H(X) = H(Y) = 1$ bit, $H(X,Y) = 2$ bits, $H(X|Y) = H(Y|X) = 1$ bit.
Rate region
$R_X \geq 1$, $R_Y \geq 1$, $R_X + R_Y \geq 2$. The Slepian-Wolf region collapses to the single corner point $(1, 1)$, the same as separate compression. Joint decoding provides no advantage when the sources are independent, as expected.
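The entropies in these exercises can be checked mechanically. A minimal sketch (the function name `slepian_wolf_region` is illustrative, not from the text) that computes the three Slepian-Wolf bounds from any joint pmf:

```python
import numpy as np

def slepian_wolf_region(p_xy):
    """Return (H(X|Y), H(Y|X), H(X,Y)) in bits for a joint pmf matrix."""
    p_xy = np.asarray(p_xy, dtype=float)
    px, py = p_xy.sum(axis=1), p_xy.sum(axis=0)
    H = lambda q: float(-(q[q > 0] * np.log2(q[q > 0])).sum())
    H_joint = H(p_xy)
    # H(X|Y) = H(X,Y) - H(Y), H(Y|X) = H(X,Y) - H(X)
    return H_joint - H(py), H_joint - H(px), H_joint

# independent fair bits: the region collapses to the corner (1, 1)
print(slepian_wolf_region([[0.25, 0.25], [0.25, 0.25]]))  # (1.0, 1.0, 2.0)
```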
ex-ch07-03
Medium
Let $(X, Y)$ be jointly Gaussian with $X \sim \mathcal{N}(0, \sigma^2)$, $Y \sim \mathcal{N}(0, \sigma^2)$, and correlation coefficient $\rho$. Compute the Slepian-Wolf rate region as a function of $\rho$. (Hint: use the quantized version.)
For jointly Gaussian $(X, Y)$, $h(X|Y) = \frac{1}{2}\log 2\pi e \sigma^2 (1 - \rho^2)$ using differential entropy.
The Slepian-Wolf theorem applies to discrete sources. For continuous sources, consider the operational rate region with vanishing excess distortion.
Differential entropy approach
For continuous sources, the Slepian-Wolf rate region generalizes to the rate region where the excess rate over the joint differential entropy is minimized. The conditional differential entropies are $h(X|Y) = h(Y|X) = \frac{1}{2}\log 2\pi e \sigma^2 (1 - \rho^2)$, and the joint differential entropy is $h(X,Y) = \frac{1}{2}\log (2\pi e)^2 \sigma^4 (1 - \rho^2)$.
The rate region (in the differential entropy sense) is: $R_X \geq h(X|Y)$, $R_Y \geq h(Y|X)$, $R_X + R_Y \geq h(X,Y)$.
Interpretation
As $|\rho| \to 1$, $h(X|Y) \to -\infty$ and the conditional rates decrease (more correlation means less to send). At $\rho = 0$, the conditional rate equals the marginal rate, and joint decoding provides no benefit.
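A quick numeric sketch of the $\rho$-dependence (formula evaluation only; the name `h_cond` is illustrative):

```python
import math

def h_cond(sigma2, rho):
    """Conditional differential entropy h(X|Y) in bits for jointly Gaussian
    (X, Y) with equal variance sigma2 and correlation rho."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma2 * (1 - rho ** 2))

# at rho = 0 the conditional entropy equals the marginal h(X) ~ 2.047 bits;
# it decreases without bound as |rho| -> 1
for rho in (0.0, 0.5, 0.9, 0.99):
    print(rho, h_cond(1.0, rho))
```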
ex-ch07-04
Medium
Prove that for the Slepian-Wolf problem, the sum-rate bound $R_X + R_Y \geq H(X,Y)$ cannot be improved, even if the encoders are allowed to use time-sharing (randomly switching between different coding strategies).
Time-sharing between corner points lies on the dominant face of the rate region.
Show that any convex combination of achievable rate pairs satisfies the sum-rate bound.
Time-sharing argument
Let $(R_X^{(1)}, R_Y^{(1)})$ and $(R_X^{(2)}, R_Y^{(2)})$ be two achievable rate pairs. A time-sharing strategy uses the first code for a fraction $\lambda$ of the time and the second for the remaining fraction $1 - \lambda$: $R_X = \lambda R_X^{(1)} + (1-\lambda) R_X^{(2)}$, $R_Y = \lambda R_Y^{(1)} + (1-\lambda) R_Y^{(2)}$.
Sum-rate preservation
Since both rate pairs satisfy $R_X^{(i)} + R_Y^{(i)} \geq H(X,Y)$ for $i = 1, 2$:
$R_X + R_Y = \lambda\big(R_X^{(1)} + R_Y^{(1)}\big) + (1-\lambda)\big(R_X^{(2)} + R_Y^{(2)}\big) \geq \lambda H(X,Y) + (1-\lambda) H(X,Y) = H(X,Y)$.
Time-sharing preserves the sum-rate bound. The Slepian-Wolf region is already convex, so time-sharing does not enlarge it.
ex-ch07-05
Medium
Consider the Slepian-Wolf problem with three sources $(X_1, X_2, X_3)$ drawn i.i.d. from $p(x_1, x_2, x_3)$. Write the complete rate region for lossless distributed compression.
Extend the two-source region by considering all subsets of sources.
There are $2^3 - 1 = 7$ constraints: on individual rates, on pairs, and on the triple.
The three-source Slepian-Wolf region
The achievable rate region is the set of $(R_1, R_2, R_3)$ satisfying, for every non-empty subset $S \subseteq \{1, 2, 3\}$:
$\sum_{i \in S} R_i \geq H(X_S \mid X_{S^c})$.
This follows from the same random binning argument, extended to three encoders. For example, $R_1 \geq H(X_1 \mid X_2, X_3)$, $R_1 + R_2 \geq H(X_1, X_2 \mid X_3)$, and $R_1 + R_2 + R_3 \geq H(X_1, X_2, X_3)$.
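The seven constraints can be enumerated programmatically. A sketch, assuming the joint pmf is given as a 3-D array (names illustrative):

```python
import itertools
import numpy as np

def sw3_constraints(p):
    """All 7 Slepian-Wolf constraints for a joint pmf p[x1, x2, x3]:
    for each non-empty S, sum_{i in S} R_i >= H(X_S | X_{S^c})."""
    p = np.asarray(p, dtype=float)

    def H(axes_kept):
        # entropy (bits) of the marginal over the kept axes
        drop = tuple(i for i in range(3) if i not in axes_kept)
        m = p.sum(axis=drop) if drop else p
        m = m[m > 0]
        return float(-(m * np.log2(m)).sum())

    full = H((0, 1, 2))
    cons = {}
    for r in (1, 2, 3):
        for S in itertools.combinations(range(3), r):
            Sc = tuple(i for i in range(3) if i not in S)
            # H(X_S | X_{S^c}) = H(X_1, X_2, X_3) - H(X_{S^c})
            cons[S] = full - (H(Sc) if Sc else 0.0)
    return cons

# three i.i.d. fair bits: every constraint is |S| bits
print(sw3_constraints(np.full((2, 2, 2), 1 / 8)))
```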
ex-ch07-06
Medium
In the CEO problem with $K$ agents, $X \sim \mathcal{N}(0, \sigma_X^2)$, and $Y_i = X + N_i$ with $N_i \sim \mathcal{N}(0, \sigma_N^2)$ i.i.d., compute the minimum distortion $D$ achievable as a function of the per-agent rate $R$ (assuming symmetric rates $R_i = R$ for all $i$).
Use the sum-rate formula and invert to find $D$ as a function of $R$.
The minimum distortion decreases with $R$ but is bounded below by $\big(1/\sigma_X^2 + K/\sigma_N^2\big)^{-1}$.
Invert the sum-rate formula
From the Gaussian CEO sum-rate formula with symmetric agents, we need to solve for $D$ in terms of the per-agent rate $R$. For the standard case $\sigma_X^2 = 1$, one route is the Berger-Tung scheme with symmetric Gaussian test channels $U_i = Y_i + Q_i$, $Q_i \sim \mathcal{N}(0, \sigma_Q^2)$, whose MMSE satisfies $\frac{1}{D} = 1 + \frac{K}{\sigma_N^2 + \sigma_Q^2}$.
Simplifying: eliminating $\sigma_Q^2$ between this expression and the per-agent rate constraint expresses $D$ as a decreasing function of $R$ (after algebraic manipulation).
Solve for distortion
As $R \to \infty$, $D \to (1 + K/\sigma_N^2)^{-1}$; as $K \to \infty$, $D \to 0$ (infinitely many agents can reconstruct $X$ perfectly).
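A hedged numeric sketch of the two limits, assuming the Berger-Tung achievable form with symmetric Gaussian test channels $U_i = Y_i + Q_i$ (the quantization-noise variance `sq2` stands in for the rate parameter; all names illustrative):

```python
def ceo_distortion(sigma_X2, sigma_N2, K, sq2):
    """MMSE distortion when the decoder sees K independent looks at X,
    each with effective noise variance sigma_N2 + sq2 (test-channel noise)."""
    return 1.0 / (1.0 / sigma_X2 + K / (sigma_N2 + sq2))

# R -> infinity corresponds to sq2 -> 0: D -> (1/sigma_X2 + K/sigma_N2)^-1
print(ceo_distortion(1.0, 0.5, 10, 1e-9))      # close to 1/21
# K -> infinity at any fixed sq2: D -> 0
print(ceo_distortion(1.0, 0.5, 10**6, 1.0))    # tiny
```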
ex-ch07-07
Hard
Prove the converse of the Slepian-Wolf theorem: if a rate pair $(R_X, R_Y)$ is achievable, then $R_X + R_Y \geq H(X,Y)$. (Hint: use Fano's inequality and the chain rule.)
Start from $n(R_X + R_Y) \geq H(M_X) + H(M_Y) \geq H(M_X, M_Y)$, where $M_X, M_Y$ are the encoder outputs.
Use Fano's inequality: $H(X^n, Y^n \mid M_X, M_Y) \leq n\epsilon_n$ where $\epsilon_n \to 0$.
Apply Fano's inequality
Since $P_e^{(n)} \to 0$, by Fano's inequality: $H(X^n, Y^n \mid M_X, M_Y) \leq 1 + P_e^{(n)}\, n \log|\mathcal{X}||\mathcal{Y}| \equiv n\epsilon_n$, where $\epsilon_n \to 0$.
Bound the sum rate
$n(R_X + R_Y) \geq H(M_X) + H(M_Y) \geq H(M_X, M_Y) \geq I(X^n, Y^n; M_X, M_Y) = H(X^n, Y^n) - H(X^n, Y^n \mid M_X, M_Y) \geq nH(X,Y) - n\epsilon_n$. Dividing by $n$ and letting $n \to \infty$: $R_X + R_Y \geq H(X,Y)$.
ex-ch07-08
Hard
Show that for the binary symmetric source ($X \sim \mathrm{Bern}(1/2)$, $Y = X \oplus Z$, $Z \sim \mathrm{Bern}(p)$), the syndrome $s = Hx$ of an LDPC code with rate $1 - H(p)$ can be used for Slepian-Wolf coding of $X$ given $Y$ at the decoder.
The syndrome has length $n(1 - R) = nH(p)$ bits.
The decoder must solve $Hx = s$ given $y$, which is equivalent to decoding $z = x \oplus y$ from its syndrome $Hz = s \oplus Hy$.
Setup
Let $H$ be the $m \times n$ parity-check matrix of an LDPC code with rate $R = 1 - H(p)$. The syndrome $s = Hx$ has $m = n(1 - R) = nH(p)$ bits.
Encoding and decoding
Encoder: Computes $s = Hx$ and sends it (rate $H(p)$ bits per source bit).
Decoder: Has $s$ and $y$. Computes $s \oplus Hy = H(x \oplus y) = Hz$. Since $z = x \oplus y$ has i.i.d. $\mathrm{Bern}(p)$ entries, decoding $z$ from its syndrome is equivalent to decoding a BSC($p$) error pattern with the LDPC code.
Capacity argument
The BSC($p$) has capacity $1 - H(p)$, and our LDPC code has rate $1 - H(p)$. Since capacity-approaching LDPC codes exist, the decoder can recover $z$ reliably, and hence $x = y \oplus z$. The Slepian-Wolf rate is $H(p) = H(X|Y)$.
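A toy end-to-end sketch of syndrome-based Slepian-Wolf coding, with the (7,4) Hamming code standing in for the LDPC code (valid whenever $x$ and $y$ differ in at most one position per 7-bit block; all names illustrative):

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column j holds the binary
# digits of j+1, so the syndrome of a single-bit error spells its position.
H = np.array([[(j + 1) >> b & 1 for j in range(7)] for b in range(3)])

def sw_encode(x):
    """Slepian-Wolf encoder: send only the 3-bit syndrome (rate 3/7)."""
    return H @ x % 2

def sw_decode(s, y):
    """Recover x from its syndrome s and the side information y."""
    sz = (s + H @ y) % 2                 # syndrome of z = x xor y
    pos = sz[0] + 2 * sz[1] + 4 * sz[2]  # position + 1 of the single flip
    z = np.zeros(7, dtype=int)
    if pos:
        z[pos - 1] = 1
    return (y + z) % 2                   # x = y xor z

x = np.array([1, 0, 1, 1, 0, 0, 1])
y = x.copy(); y[4] ^= 1                  # side info differs in one bit
assert np.array_equal(sw_decode(sw_encode(x), y), x)
```

The encoder never sees $y$, yet 3 transmitted bits suffice instead of 7; this mirrors the rate $H(X|Y)$ rather than $H(X)$.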
ex-ch07-09
Medium
In the multiple descriptions problem, show that the sum-rate constraint $R_1 + R_2 \geq I(X; U_1, U_2) + I(U_1; U_2)$ can be rewritten as $R_1 + R_2 \geq I(X; U_1) + I(X; U_2) + I(U_1; U_2 \mid X)$.
Use the chain rule for mutual information: $I(X; U_1, U_2) = I(X; U_1) + I(X; U_2 \mid U_1)$.
Apply the Markov chain $U_1 - X - U_2$ (when it holds) to simplify.
Expand the mutual information terms
Group $I(U_1; U_2) + I(X; U_2 \mid U_1) = I(U_1, X; U_2)$, then expand $I(U_1, X; U_2)$ with the chain rule in the other order.
Use the chain rule and Markov property
By the chain rule: $I(X; U_1, U_2) = I(X; U_1) + I(X; U_2 \mid U_1)$.
Therefore: $I(X; U_1, U_2) + I(U_1; U_2) = I(X; U_1) + I(U_1, X; U_2) = I(X; U_1) + I(X; U_2) + I(U_1; U_2 \mid X)$ (by the mutual information chain rule on three variables).
So: $R_1 + R_2 \geq I(X; U_1) + I(X; U_2) + I(U_1; U_2 \mid X)$.
When the Markov chain $U_1 - X - U_2$ holds, $I(U_1; U_2 \mid X) = 0$, so: $R_1 + R_2 \geq I(X; U_1) + I(X; U_2)$.
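The identity can be sanity-checked numerically for an arbitrary joint pmf over $(X, U_1, U_2)$, with no Markov assumption needed (names illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
p = rng.random((2, 3, 3))
p /= p.sum()                      # random joint pmf over (X, U1, U2)

def Hm(q):
    """Entropy in bits of a (possibly multi-dimensional) pmf."""
    q = q[q > 0]
    return float(-(q * np.log2(q)).sum())

H_x, H_u1, H_u2 = Hm(p.sum(axis=(1, 2))), Hm(p.sum(axis=(0, 2))), Hm(p.sum(axis=(0, 1)))
H_xu1, H_xu2, H_u12 = Hm(p.sum(axis=2)), Hm(p.sum(axis=1)), Hm(p.sum(axis=0))
H_all = Hm(p)

I_x_u12  = H_x + H_u12 - H_all           # I(X; U1, U2)
I_u1_u2  = H_u1 + H_u2 - H_u12           # I(U1; U2)
I_x_u1   = H_x + H_u1 - H_xu1            # I(X; U1)
I_x_u2   = H_x + H_u2 - H_xu2            # I(X; U2)
I_u1u2_x = H_xu1 + H_xu2 - H_all - H_x   # I(U1; U2 | X)

# both sides of the rewritten sum-rate bound agree
print(I_x_u12 + I_u1_u2, I_x_u1 + I_x_u2 + I_u1u2_x)
```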
ex-ch07-10
Easy
A DMS with $H(V) = 2$ bits per symbol is to be transmitted over a BSC with crossover probability $p = 0.1$. Find the minimum compression ratio (channel uses per source symbol) needed for reliable transmission.
Use the separation theorem: reliable transmission requires $\tau C > H(V)$.
The BSC capacity is $C = 1 - H(p)$.
Compute BSC capacity
$C = 1 - H(0.1) = 1 - 0.469 = 0.531$ bits per channel use.
Find minimum compression ratio
The minimum is $\tau \geq H(V)/C = 2/0.531 \approx 3.77$ channel uses per source symbol. Rounding up to an integer block design, $\lceil 3.77 \rceil = 4$: with $\tau = 4$, the channel carries $4 \times 0.531 = 2.124 > 2 = H(V)$ bits per source symbol, so reliable transmission is possible.
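The arithmetic can be checked in a few lines (a sketch; names illustrative):

```python
import math

def Hb(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

C = 1 - Hb(0.1)      # BSC(0.1) capacity, about 0.531 bits per channel use
H_V = 2.0            # source entropy in bits per symbol
tau = H_V / C        # minimum channel uses per source symbol
print(round(C, 3), round(tau, 2), math.ceil(tau))  # 0.531 3.77 4
```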
ex-ch07-11
Medium
Prove that for a Gaussian source $X \sim \mathcal{N}(0, \sigma_X^2)$ over an AWGN channel with power constraint $P$ and noise variance $\sigma_Z^2$, uncoded transmission (scaling $X$ to meet the power constraint) achieves the optimal distortion when the bandwidth ratio is 1.
Compute the MMSE of $X$ given the channel output $Y$.
Compare with the separation-based distortion from $R(D) = C$.
MMSE of uncoded transmission
Let the transmitted signal be $\alpha X$ with $\alpha = \sqrt{P/\sigma_X^2}$, so the power constraint is met. Then $Y = \alpha X + Z$ where $Z \sim \mathcal{N}(0, \sigma_Z^2)$.
The MMSE estimate is $\hat{X} = \dfrac{\alpha \sigma_X^2}{\alpha^2 \sigma_X^2 + \sigma_Z^2}\, Y = \dfrac{\sqrt{P \sigma_X^2}}{P + \sigma_Z^2}\, Y$.
The MMSE is: $D_{\text{uncoded}} = \sigma_X^2 \cdot \dfrac{\sigma_Z^2}{P + \sigma_Z^2} = \dfrac{\sigma_X^2}{1 + P/\sigma_Z^2}$.
Separation-based bound
From $R(D) = C$: $\frac{1}{2}\log\frac{\sigma_X^2}{D} = \frac{1}{2}\log\left(1 + \frac{P}{\sigma_Z^2}\right)$.
Solving: $D = \dfrac{\sigma_X^2}{1 + P/\sigma_Z^2}$. This matches the uncoded distortion exactly! Uncoded transmission is optimal for bandwidth ratio 1.
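The match between the uncoded MMSE and the separation-optimal distortion is an algebraic identity; a sketch verifying it at a few parameter settings (function names illustrative):

```python
import math

def uncoded_distortion(sigma_X2, P, sigma_Z2):
    """MMSE of scaled uncoded transmission over AWGN."""
    return sigma_X2 * sigma_Z2 / (P + sigma_Z2)

def separation_distortion(sigma_X2, P, sigma_Z2):
    """D solving R(D) = C: 0.5*log(sigma_X2/D) = 0.5*log(1 + P/sigma_Z2)."""
    return sigma_X2 / (1 + P / sigma_Z2)

for (sx2, P, sz2) in [(1, 1, 1), (4, 2, 0.5), (1, 10, 3)]:
    assert math.isclose(uncoded_distortion(sx2, P, sz2),
                        separation_distortion(sx2, P, sz2))
print("uncoded matches separation-optimal distortion")
```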
ex-ch07-12
Hard
Consider two correlated binary sources $(U_1, U_2)$ transmitted over a binary multiple access channel (MAC) with output $Y = X_1 \oplus X_2$. Show that joint source-channel coding can achieve reliable communication while separated coding cannot.
With separated coding, the MAC capacity region limits the individual rates.
With joint coding, the decoder can exploit the correlation between $U_1$ and $U_2$.
The MAC capacity region for the modulo-2 adder channel is $R_1 + R_2 \leq 1$.
Separated coding analysis
The modulo-2 adder MAC has capacity region $R_1 + R_2 \leq 1$ (and the individual constraints $R_1 \leq 1$, $R_2 \leq 1$). With separated coding, we need $R_1 + R_2 \geq H(U_1, U_2)$ for lossless (Slepian-Wolf) source coding, plus $R_1 + R_2 \leq 1$ for channel coding.
If $H(U_1, U_2) > 1$, then $R_1 + R_2 \geq H(U_1, U_2) > 1$, requiring a sum rate above the MAC sum capacity. Separated coding fails!
Joint coding
With joint source-channel coding, User 1 sends $X_1 = U_1$ directly (uncoded) and User 2 sends $X_2 = U_2$ directly. The receiver observes $Y = U_1 \oplus U_2$.
If $U_1$ and $U_2$ are correlated with $\Pr(U_1 \neq U_2) = p < 1/2$, then $U_1 \oplus U_2 \sim \mathrm{Bern}(p)$, so $H(U_1 \oplus U_2) = H(p) < 1$. The receiver can recover $U_1 \oplus U_2$ (the "difference" between $U_1$ and $U_2$) and, combined with the MAC structure, reconstruct both sources.
This is a case where joint source-channel coding strictly outperforms separated coding by exploiting the correlation structure at the physical layer.
ex-ch07-13
Hard
Derive the Berger-Tung rate constraints from the covering and binning conditions. Specifically, show that the covering lemma requires $\tilde{R}_1 \geq I(Y_1; U_1)$, the binning constraint requires $\tilde{R}_1 - R_1 \leq I(U_1; U_2)$, and combine these to get $R_1 \geq I(Y_1; U_1 \mid U_2)$.
The covering lemma gives a lower bound on the codebook rate $\tilde{R}_1$.
The Slepian-Wolf binning gives an upper bound on $\tilde{R}_1 - R_1$.
Use the Markov chain $U_1 - Y_1 - Y_2 - U_2$.
Covering constraint
By the covering lemma, to find a $U_1^n$ jointly typical with $Y_1^n$ in a codebook of size $2^{n\tilde{R}_1}$ drawn i.i.d. $\sim p(u_1)$: $\tilde{R}_1 \geq I(Y_1; U_1)$.
Binning constraint
The $2^{n\tilde{R}_1}$ sequences are partitioned into $2^{nR_1}$ bins. By the Slepian-Wolf argument, the decoder can recover $U_1^n$ from the bin index and $U_2^n$ if the number of sequences per bin is less than the conditional typical set size: $\tilde{R}_1 - R_1 \leq I(U_1; U_2)$.
Combine
From covering: $\tilde{R}_1 \geq I(Y_1; U_1)$. From binning: $R_1 \geq \tilde{R}_1 - I(U_1; U_2)$.
Therefore: $R_1 \geq I(Y_1; U_1) - I(U_1; U_2)$.
Now $I(Y_1; U_1) - I(U_1; U_2) = H(U_1 \mid U_2) - H(U_1 \mid Y_1)$. Using the Markov chain $U_1 - Y_1 - U_2$, $H(U_1 \mid Y_1) = H(U_1 \mid Y_1, U_2)$, so $I(Y_1; U_1) - I(U_1; U_2) = H(U_1 \mid U_2) - H(U_1 \mid Y_1, U_2) = I(Y_1; U_1 \mid U_2)$.
Hence $R_1 \geq I(Y_1; U_1 \mid U_2)$, the Berger-Tung rate constraint.
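The Markov-chain step can be checked numerically: build a joint distribution with the structure $U_1 - Y_1 - Y_2 - U_2$ and compare $I(Y_1; U_1) - I(U_1; U_2)$ against $I(Y_1; U_1 \mid U_2)$ (a sketch; all names illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
p_y = rng.random((2, 2)); p_y /= p_y.sum()                          # p(y1, y2)
p_u1 = rng.random((2, 3)); p_u1 /= p_u1.sum(axis=1, keepdims=True)  # p(u1|y1)
p_u2 = rng.random((2, 3)); p_u2 /= p_u2.sum(axis=1, keepdims=True)  # p(u2|y2)

# joint p(y1, y2, u1, u2) with the Markov structure U1 - Y1 - Y2 - U2
p = np.einsum('ab,ac,bd->abcd', p_y, p_u1, p_u2)

def Hm(q):
    q = q[q > 0]
    return float(-(q * np.log2(q)).sum())

H_y1, H_u1, H_u2 = Hm(p.sum(axis=(1, 2, 3))), Hm(p.sum(axis=(0, 1, 3))), Hm(p.sum(axis=(0, 1, 2)))
H_y1u1, H_u1u2, H_y1u2 = Hm(p.sum(axis=(1, 3))), Hm(p.sum(axis=(0, 1))), Hm(p.sum(axis=(1, 2)))
H_y1u1u2 = Hm(p.sum(axis=1))

I_y1_u1 = H_y1 + H_u1 - H_y1u1
I_u1_u2 = H_u1 + H_u2 - H_u1u2
I_y1_u1_g_u2 = H_y1u2 + H_u1u2 - H_y1u1u2 - H_u2   # I(Y1; U1 | U2)

# under the Markov chain the two expressions coincide
print(I_y1_u1 - I_u1_u2, I_y1_u1_g_u2)
```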
ex-ch07-14
Challenge
Prove that the no-excess-rate condition in the multiple descriptions problem ($R_1 + R_2 = R(D_0)$) implies $I(U_1; U_2) = 0$. Interpret this condition in terms of the structure of the optimal descriptions.
From the El Gamal-Cover region: $R_1 + R_2 \geq I(X; U_1, U_2) + I(U_1; U_2)$.
The rate-distortion bound gives $I(X; U_1, U_2) \geq R(D_0)$ when the joint reconstruction achieves distortion $D_0$.
No excess rate means the sum-rate constraint is tight at $R(D_0)$.
Apply no-excess-rate condition
No excess rate means $R_1 + R_2 = R(D_0)$, where $R(D_0)$ is the centralized rate-distortion function at the joint distortion $D_0$.
From the El Gamal-Cover sum-rate constraint: $R_1 + R_2 \geq I(X; U_1, U_2) + I(U_1; U_2) \geq R(D_0) + I(U_1; U_2)$.
For this to equal $R(D_0)$ (no excess), we need $I(U_1; U_2) = 0$.
Interpret using the Markov chain
$I(U_1; U_2) = 0$ means the two descriptions are independent. Combined with the Markov chain $U_1 - X - U_2$, this means the descriptions carry completely complementary information about $X$ with zero redundancy. No excess rate is achievable only when the descriptions can be made independent while still jointly achieving distortion $D_0$.
ex-ch07-15
Medium
Compute the Slepian-Wolf rate region for the following joint distribution: $X, Y \in \{0, 1, 2\}$ with $p(x, y) = 1/6$ for $x \neq y$ and 0 otherwise.
Compute $H(X)$, $H(Y)$, $H(X,Y)$, $H(X|Y)$, $H(Y|X)$.
Note that $X$ and $Y$ are both uniform on $\{0, 1, 2\}$.
Compute marginals and entropies
Both $X$ and $Y$ are uniform on $\{0, 1, 2\}$: each value appears in 2 of the 6 equally likely pairs. So $H(X) = H(Y) = \log_2 3 \approx 1.585$ bits.
$H(X,Y) = \log_2 6 \approx 2.585$ bits (6 equally likely pairs).
$H(X|Y) = H(X,Y) - H(Y) = \log_2 6 - \log_2 3 = 1$ bit. Similarly $H(Y|X) = 1$ bit.
Write the rate region
The Slepian-Wolf region is: $R_X \geq 1$, $R_Y \geq 1$, $R_X + R_Y \geq \log_2 6 \approx 2.585$.
The sum-rate constraint is binding: $1 + 1 = 2 < 2.585$, so the sum-rate is the active constraint for rate pairs near the diagonal. The corner points are $(\log_2 3, 1) \approx (1.585, 1)$ and $(1, \log_2 3) \approx (1, 1.585)$.
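The numbers can be verified directly from the pmf (a sketch; names illustrative):

```python
import numpy as np

# joint pmf: p(x, y) = 1/6 for x != y on {0, 1, 2}, 0 on the diagonal
p = (1 - np.eye(3)) / 6

px, py = p.sum(axis=1), p.sum(axis=0)
H = lambda q: float(-(q[q > 0] * np.log2(q[q > 0])).sum())

H_xy = H(p)                   # log2(6) ~ 2.585 bits
H_x_given_y = H_xy - H(py)    # 1 bit
H_y_given_x = H_xy - H(px)    # 1 bit
print(H_x_given_y, H_y_given_x, H_xy)
```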