Prerequisites & Notation
Before You Begin
This chapter develops the central technical machinery for proving coding theorems. We assume a solid understanding of the discrete and continuous information measures from Chapters 1-2.
- Entropy, mutual information, KL divergence (Chapter 1; review ita/ch01)
  - Self-check: Can you state Fano's inequality and the data processing inequality?
- Differential entropy and the Gaussian maximum-entropy property (Chapter 2; review ita/ch02)
  - Self-check: Can you compute the differential entropy of a Gaussian vector?
- Weak law of large numbers (WLLN)
  - Self-check: Can you state the WLLN and explain what convergence in probability means?
- Union bound and basic combinatorics
  - Self-check: Can you bound the probability of a union of events using the union bound?
- Empirical distribution (histogram) of a sequence
  - Self-check: Given a binary sequence, can you compute the empirical frequency of each symbol?
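The last self-check above can be answered with a few lines of code. This is a minimal sketch (the function name `empirical_distribution` is our own, not from the chapter): it computes the type of a sequence, i.e. the fraction of positions holding each symbol.

```python
from collections import Counter

def empirical_distribution(seq):
    """Empirical distribution (type) of a sequence:
    the fraction of positions taking each value."""
    n = len(seq)
    return {symbol: count / n for symbol, count in Counter(seq).items()}

# A binary sequence with five 0s and three 1s out of n = 8 positions.
print(empirical_distribution([0, 1, 0, 0, 1, 0, 1, 0]))
# {0: 0.625, 1: 0.375}
```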
Notation for This Chapter
This section introduces the new notation used for typicality; prior notation from Chapters 1-2 carries over unchanged.
| Symbol | Meaning | Introduced |
|---|---|---|
| $X^n$ | A sequence of random variables $(X_1, \ldots, X_n)$ | s01 |
| $x^n$ | A realization (specific sequence) | s01 |
| $T_\epsilon^{(n)}(P)$ | Strongly typical set for distribution $P$ | s02 |
| $\hat{p}_{x^n}(a)$ | Empirical distribution (type) of sequence $x^n$: fraction of positions with value $a$ | s02 |
| $\doteq$ | Equality to first order in the exponent: $a_n \doteq 2^{nb}$ means $\lim_{n \to \infty} \frac{1}{n} \log a_n = b$ | s01 |
| $T(P)$ | Type class: set of all sequences $x^n$ with empirical distribution $P$ | s05 |
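To connect the empirical distribution to the strongly typical set, here is a small sketch of a membership test. It assumes one common definition of strong typicality (the empirical frequency of every symbol $a$ lies within $\epsilon \, P(a)$ of $P(a)$, and symbols outside the support of $P$ do not appear); the chapter's exact definition may differ in details, and the function name `is_strongly_typical` is our own.

```python
from collections import Counter

def is_strongly_typical(seq, P, eps):
    """Test membership of seq in the strongly typical set for P,
    under the assumed definition: |freq(a) - P(a)| <= eps * P(a)
    for every symbol a, and no symbol with P(a) = 0 occurs."""
    n = len(seq)
    counts = Counter(seq)
    for a, p in P.items():
        freq = counts.get(a, 0) / n
        if abs(freq - p) > eps * p:
            return False
    # Symbols outside the support of P must not occur at all.
    return all(a in P for a in counts)

P = {0: 0.5, 1: 0.5}
print(is_strongly_typical([0, 1, 0, 1, 0, 1, 0, 1], P, eps=0.1))  # True
print(is_strongly_typical([0, 0, 0, 0, 0, 0, 0, 1], P, eps=0.1))  # False
```

Note the multiplicative tolerance $\epsilon \, P(a)$: it automatically forbids zero-probability symbols, which is what distinguishes strong from weak typicality in many treatments.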