Prerequisites & Notation

Before You Begin

This chapter collects additional topics in source coding that extend the core material of Chapters 5-7. We explore source coding with a helper (Section 8.1), practical distributed video coding (Section 8.2), and the deep connections between rate-distortion theory and machine learning (Section 8.3). The reader should be comfortable with the Slepian-Wolf theorem, Wyner-Ziv coding, and rate-distortion theory.

  • Slepian-Wolf coding and random binning (Chapter 7)

    Self-check: Can you state the Slepian-Wolf rate region and explain the random binning proof? (The region is restated after this list.)

  • Rate-distortion function and Wyner-Ziv coding (Chapter 6)

    Self-check: Can you write the Wyner-Ziv rate-distortion function and explain why it equals the conditional rate-distortion function $R_{X \mid Y}(D)$ for Gaussian sources? (The formula is restated after this list.)

  • Mutual information, chain rules, and data processing inequality (Chapter 1)

    Self-check: Can you apply the chain rule for mutual information and state the data processing inequality? (Both appear after this list.)

  • LDPC codes and belief propagation (familiarity helpful)

    Self-check: Do you know what a parity-check matrix is and how belief propagation decodes LDPC codes? (A short sketch follows this list.)
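As a refresher for the first item, the Slepian-Wolf theorem states that separate encoders for correlated sources $X$ and $Y$, with rates labeled $R_X$ and $R_Y$ here, admit reliable joint decoding exactly when

$$
R_X \ge H(X \mid Y), \qquad R_Y \ge H(Y \mid X), \qquad R_X + R_Y \ge H(X, Y).
$$

Random binning achieves any point in this region: each encoder assigns its sequence to a uniformly random bin, and the decoder searches the indexed bins for a jointly typical pair.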
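For the second item, the Wyner-Ziv rate-distortion function in its standard single-letter form is

$$
R_{\mathrm{WZ}}(D) \;=\; \min_{p(u \mid x),\; \hat{x}(u, y)} \big[\, I(X; U) - I(U; Y) \,\big],
$$

where $U \to X \to Y$ forms a Markov chain and the reconstruction $\hat{X} = \hat{x}(U, Y)$ must satisfy $\mathbb{E}[d(X, \hat{X})] \le D$. For jointly Gaussian sources with squared-error distortion, $R_{\mathrm{WZ}}(D) = R_{X \mid Y}(D)$, so having the side information $Y$ only at the decoder incurs no rate loss.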
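For the third item, the two facts in their standard forms: the chain rule

$$
I(X; Y, Z) = I(X; Z) + I(X; Y \mid Z),
$$

and the data processing inequality, which says that whenever $X \to Y \to Z$ forms a Markov chain,

$$
I(X; Z) \le I(X; Y).
$$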
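For the last item, a minimal Python sketch of the parity-check idea; the matrix `H` and the codeword below are toy values chosen for illustration, not taken from any chapter. A vector is a codeword exactly when every parity check is satisfied, and belief propagation decodes by iteratively reconciling bit estimates against these checks.

```python
import numpy as np

# Toy parity-check matrix H (illustrative; real LDPC matrices are large and sparse).
# A vector c is a codeword iff H @ c = 0 (mod 2).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def syndrome(word):
    """Compute H @ word mod 2; an all-zeros result means every check is satisfied."""
    return H @ word % 2

c = np.array([1, 0, 1, 1, 1, 0])   # a valid codeword of this toy code
print(syndrome(c))                 # [0 0 0] -> parity checks satisfied

r = c.copy()
r[2] ^= 1                          # flip one bit to simulate a channel error
print(syndrome(r))                 # nonzero syndrome flags the error; belief
                                   # propagation uses the checks iteratively
                                   # to locate and correct such errors
```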

Notation for This Chapter

The table below lists notation introduced in this chapter; all symbols from previous chapters carry their established meanings.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $W$ | Helper's encoded description of side information $Y$ | Section 8.1 |
| $C(X; Y)$ | Wyner's common information between $X$ and $Y$ | Section 8.1 |
| $T$ | Compressed representation in the information bottleneck | Section 8.3 |
| $\beta$ | Lagrange multiplier (inverse temperature) in the information bottleneck | Section 8.3 |
| $D(P \Vert Q)$ | KL divergence from $P$ to $Q$ | Section 8.3 |
| $\mathcal{L}_{\text{ELBO}}$ | Evidence lower bound in variational inference | Section 8.3 |
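For reference, $T$ and $\beta$ meet in the information bottleneck objective, stated here in its standard Lagrangian form ahead of Section 8.3:

$$
\min_{p(t \mid x)} \; I(X; T) - \beta \, I(T; Y),
$$

where larger $\beta$ favors preserving more information about the relevance variable $Y$ at the cost of less compression of $X$.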