Prerequisites & Notation
Before You Begin
This chapter collects additional topics in source coding that extend the core material of Chapters 5-7. We explore source coding with a helper (Section 8.1), practical distributed video coding (Section 8.2), and the deep connections between rate-distortion theory and machine learning (Section 8.3). The reader should be comfortable with the Slepian-Wolf theorem, Wyner-Ziv coding, and rate-distortion theory; the key formulas are restated in a short refresher after the list below.
- Slepian-Wolf coding and random binning (Chapter 7)
Self-check: Can you state the Slepian-Wolf rate region and explain the random binning proof?
- Rate-distortion function and Wyner-Ziv coding (Chapter 6)
Self-check: Can you write the Wyner-Ziv rate-distortion function and explain why it equals the conditional rate-distortion function $R_{X|Y}(D)$ for jointly Gaussian sources?
- Mutual information, chain rules, and data processing inequality (Chapter 1)
Self-check: Can you apply the chain rule for mutual information and state the data processing inequality?
- LDPC codes and belief propagation (familiarity helpful)
Self-check: Do you know what a parity-check matrix is and how belief propagation decodes LDPC codes?
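For convenience, here is a brief refresher on the results behind these self-checks, stated in their standard form (notation as in Chapters 1, 6, and 7):

```latex
% Slepian-Wolf: achievable rate region for separate encoding of correlated (X, Y)
R_X \ge H(X \mid Y), \qquad R_Y \ge H(Y \mid X), \qquad R_X + R_Y \ge H(X, Y)

% Wyner-Ziv: rate-distortion with side information Y at the decoder only;
% U satisfies the Markov chain U -> X -> Y, and \hat{X} = g(U, Y) meets distortion D
R_{\mathrm{WZ}}(D) = \min_{p(u \mid x),\, g \,:\, \mathbb{E}[d(X, g(U, Y))] \le D} I(X; U \mid Y)

% Chain rule and data processing inequality
I(X; Y, Z) = I(X; Y) + I(X; Z \mid Y), \qquad
X \to Y \to Z \;\Rightarrow\; I(X; Z) \le I(X; Y)
```

For the last self-check, the sketch below illustrates what a parity-check matrix does. It is a minimal example assuming NumPy, with a hypothetical 3x6 matrix `H`; a real LDPC matrix would be much larger and sparser. Note the connection to this chapter: in syndrome-based Slepian-Wolf coding, the syndrome $s = Hx$ itself serves as the bin index the encoder transmits.

```python
import numpy as np

# Hypothetical toy parity-check matrix (3 checks on 6 bits).
# Real LDPC matrices are large and sparse; this is only for illustration.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]], dtype=np.uint8)

def syndrome(H, x):
    """Return s = H x (mod 2); x satisfies all parity checks iff s is all-zero."""
    return (H @ x) % 2

x = np.array([1, 0, 1, 1, 1, 0], dtype=np.uint8)
print(syndrome(H, x))  # [0 0 0]: every check is satisfied, so x is a codeword

# Belief propagation refines bitwise beliefs by passing messages between
# bit nodes and the check nodes defined by the rows of H until the syndrome is zero.
```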
Notation for This Chapter
The table below lists additional notation introduced in this chapter. All symbols from previous chapters carry their established meanings.
| Symbol | Meaning | Introduced |
|---|---|---|
| $U$ | Helper's encoded description of the side information | Section 8.1 |
| $C(X;Y)$ | Wyner's common information between $X$ and $Y$ | Section 8.1 |
| $T$ | Compressed representation in the information bottleneck | Section 8.3 |
| $\beta$ | Lagrange multiplier (inverse temperature) in the information bottleneck | Section 8.3 |
| $D_{\mathrm{KL}}(p \Vert q)$ | KL divergence from $p$ to $q$ | Section 8.3 |
| $\mathcal{L}$ | Evidence lower bound (ELBO) in variational inference | Section 8.3 |
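Two of these symbols are easiest to remember through the objectives they appear in. The following are the standard formulations, given here as a refresher rather than this chapter's derivations: the information bottleneck trades compression of $X$ against relevance to $Y$ via $\beta$, and the ELBO lower-bounds the log evidence of a latent-variable model.

```latex
% Information bottleneck: minimize over encoders p(t|x)
\mathcal{L}_{\mathrm{IB}} = I(X; T) - \beta\, I(T; Y)

% Evidence lower bound for model p(x, z) with variational posterior q(z|x)
\mathcal{L}(x) = \mathbb{E}_{q(z \mid x)}\!\left[ \log p(x \mid z) \right]
               - D_{\mathrm{KL}}\!\left( q(z \mid x) \Vert p(z) \right) \;\le\; \log p(x)
```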