Chapter 7 Summary

Key Points

1. The Slepian-Wolf theorem shows that separate encoding of correlated sources achieves the same rate region as joint encoding, provided decoding is joint: $R_X \geq H(X|Y)$, $R_Y \geq H(Y|X)$, and $R_X + R_Y \geq H(X,Y)$.

2. Random binning is the key proof technique for distributed source coding achievability, analogous to random coding for channel coding.

3. The Berger-Tung inner bound extends Slepian-Wolf to the lossy setting by combining rate-distortion codebooks with random binning; it is tight for the quadratic Gaussian CEO problem.

4. The CEO problem models distributed sensing: $K$ agents observe noisy versions of a source and compress them separately for a central decoder. The minimum sum rate has a clean closed-form expression in the Gaussian case.

5. The multiple descriptions problem captures the redundancy-efficiency tradeoff when descriptions may be partially lost. The El Gamal-Cover region is the best known achievable region.

6. The source-channel separation theorem guarantees that separate source and channel coding is optimal for point-to-point systems, but this result fails in multiuser settings, where joint coding can exploit source-channel structure.
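The Slepian-Wolf region in point 1 can be checked numerically. The sketch below, assuming a toy doubly symmetric binary source (all names and the example pmf are illustrative, not from the chapter), computes the relevant entropies and tests whether a rate pair lies in the region:

```python
import math

# Hypothetical joint pmf: X ~ Bernoulli(1/2), Y is X passed through
# a binary symmetric channel with crossover probability 0.1.
pmf = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

def entropy(probs):
    """Entropy in bits of a collection of probabilities."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

H_XY = entropy(list(pmf.values()))  # joint entropy H(X,Y)
H_X = entropy([sum(v for (xv, _), v in pmf.items() if xv == a) for a in (0, 1)])
H_Y = entropy([sum(v for (_, yv), v in pmf.items() if yv == b) for b in (0, 1)])
H_X_given_Y = H_XY - H_Y            # chain rule: H(X|Y) = H(X,Y) - H(Y)
H_Y_given_X = H_XY - H_X

def in_slepian_wolf_region(Rx, Ry):
    """Test the three Slepian-Wolf inequalities for a rate pair (Rx, Ry)."""
    return Rx >= H_X_given_Y and Ry >= H_Y_given_X and Rx + Ry >= H_XY

print(round(H_X_given_Y, 3))             # 0.469, the binary entropy h(0.1)
print(in_slepian_wolf_region(0.5, 1.0))  # True: satisfies all three bounds
print(in_slepian_wolf_region(0.3, 0.3))  # False: sum rate below H(X,Y)
```

The corner point $(H(X|Y), H(Y))$ of this region is what syndrome-based schemes such as DISCUS (discussed below) operate at.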

Looking Ahead

Chapter 8 extends the source coding story with additional topics: source coding with a helper (Wyner's common information), practical aspects of distributed video coding, and the surprising connections between rate-distortion theory and machine learning (the information bottleneck principle). Chapter 9 then shifts to the channel coding side, developing the channel coding theorem for discrete memoryless channels β€” the other pillar of Shannon's theory.

Distributed source coding

Source coding where multiple encoders compress correlated sources independently and a single decoder reconstructs them jointly. The fundamental theorem (Slepian-Wolf) shows that the lossless rate region is unaffected by the lack of encoder cooperation.

Related: Slepian-Wolf coding, Berger-Tung inner bound, Random binning

Compression ratio (source-channel)

The ratio $\tau = n/k$ of channel uses $n$ to source symbols $k$ in a joint source-channel code. The separation theorem states that reliable transmission requires $H(V) < \tau C$.

Related: Source-Channel Separation Theorem, Channel capacity, Shannon Entropy

Transmissibility

A source is transmissible over a channel at compression ratio $\tau$ if there exists a sequence of joint source-channel codes with vanishing error probability. By the separation theorem, transmissibility holds iff $H(V) < \tau C$.

Related: Source-Channel Separation Theorem, Joint Source-Channel Coding
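The condition $H(V) < \tau C$ shared by the two entries above is easy to evaluate. A minimal sketch, assuming a Bernoulli source and a binary symmetric channel (both chosen here purely for illustration):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Source: Bernoulli(0.11) bits, so H(V) = h2(0.11), roughly 0.5 bit/symbol.
H_V = h2(0.11)

# Channel: BSC with crossover 0.1, so C = 1 - h2(0.1), roughly 0.531 bits/use.
C = 1 - h2(0.1)

def transmissible(tau):
    """Separation-theorem test: transmissible iff H(V) < tau * C."""
    return H_V < tau * C

print(transmissible(1.0))   # True:  0.5 < 0.531
print(transmissible(0.5))   # False: 0.5 > 0.266
```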

🎓 CommIT Contribution (2003)

Practical Slepian-Wolf Coding via LDPC Syndromes

S. S. Pradhan, K. Ramchandran β€” IEEE Transactions on Information Theory

Pradhan and Ramchandran showed that the syndrome of a good channel code (specifically, LDPC codes) provides an efficient binning scheme for Slepian-Wolf coding. The encoder transmits the LDPC syndrome of the source sequence, and the decoder uses belief propagation with the side information to recover the source. This practical framework, called DISCUS, bridged the gap between the theoretical Slepian-Wolf result and implementable distributed source codes, enabling applications in sensor networks and distributed video coding.
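A toy analogue of the syndrome approach can be sketched with the (7,4) Hamming parity-check matrix standing in for a large LDPC matrix, and brute-force minimum-distance search standing in for belief propagation (all names here are illustrative, not from the paper):

```python
import itertools
import numpy as np

# Parity-check matrix of the (7,4) Hamming code -- a small stand-in for the
# LDPC matrices used in practical syndrome-based Slepian-Wolf coding.
Hmat = np.array([[1, 0, 1, 0, 1, 0, 1],
                 [0, 1, 1, 0, 0, 1, 1],
                 [0, 0, 0, 1, 1, 1, 1]])

def encode(x):
    """Slepian-Wolf encoder: transmit only the 3-bit syndrome of x (rate 3/7)."""
    return Hmat @ x % 2

def decode(s, y):
    """Decoder: among all length-7 sequences with syndrome s, return the one
    closest in Hamming distance to the side information y. Brute force here;
    real systems run belief propagation on the LDPC factor graph instead."""
    best, best_d = None, None
    for cand in itertools.product([0, 1], repeat=7):
        cand = np.array(cand)
        if np.array_equal(Hmat @ cand % 2, s):
            d = int(np.sum(cand ^ y))
            if best is None or d < best_d:
                best, best_d = cand, d
    return best

x = np.array([1, 0, 1, 1, 0, 0, 1])  # source sequence at the encoder
y = np.array([1, 0, 1, 1, 0, 1, 1])  # side information: x with one bit flipped
x_hat = decode(encode(x), y)
print(np.array_equal(x_hat, x))      # True: a single discrepancy is resolved
```

Because the Hamming code corrects any single error, the decoder recovers $x$ exactly whenever $x$ and $y$ differ in at most one position, while sending only 3 bits instead of 7.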

Tags: slepian-wolf, LDPC, distributed-source-coding