Prerequisites & Notation
Before You Begin
This chapter assumes familiarity with discrete entropy and its properties from Chapter 1. The method of types (Chapter 4) provides the error exponent perspective but is not required for the main results.
- Discrete entropy and its properties (review Ch. 1)
  Self-check: Can you prove that $H(X) \le \log \lvert \mathcal{X} \rvert$, with equality iff $X$ is uniform?
- KL divergence and the information inequality (review Ch. 1)
  Self-check: Can you state and prove $D(p \,\|\, q) \ge 0$?
- Basic probability: expectation, Jensen's inequality
  Self-check: Can you apply Jensen's inequality to show that $\mathbb{E}[\log X] \le \log \mathbb{E}[X]$, using that $\log$ is concave?
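As a quick numerical sanity check on these self-checks, the sketch below verifies the entropy bound $H(X) \le \log \lvert \mathcal{X} \rvert$ (with equality for the uniform distribution) and the Jensen inequality for $\log$ on a small example; the distributions are illustrative, not from the text.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]
H = entropy(p)
bound = math.log2(len(p))  # log |X| with |X| = 4
assert H <= bound + 1e-12  # H(X) <= log |X|

uniform = [0.25] * 4
assert abs(entropy(uniform) - bound) < 1e-12  # equality iff uniform

# Jensen: E[log X] <= log E[X], since log is concave
xs = [1.0, 2.0, 8.0]
w = [1 / 3, 1 / 3, 1 / 3]
E_log = sum(wi * math.log2(xi) for wi, xi in zip(w, xs))
log_E = math.log2(sum(wi * xi for wi, xi in zip(w, xs)))
assert E_log <= log_E
```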
Notation for This Chapter
Symbols introduced in this chapter.
| Symbol | Meaning | Introduced |
|---|---|---|
| $\mathcal{X}$ | Source alphabet (finite, $\lvert \mathcal{X} \rvert < \infty$) | s01 |
| $\mathcal{D}$ | Code alphabet (typically $\{0,1\}$ for binary codes) | s01 |
| $C$ | Source code: maps each source symbol to a codeword | s01 |
| $\ell(x)$ or $\ell_i$ | Codeword length for symbol $x$ (or $x_i$) | s01 |
| $L(C)$ | Expected codeword length | s01 |
| $L^*$ | Minimum achievable expected codeword length | s02 |
| $H(\mathcal{X})$ | Entropy rate of a stationary process | s05 |
| $\mu$ | Stationary distribution of a Markov chain | s05 |
| $C_n$ | Block code: maps length-$n$ source sequences to binary strings | s01 |
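To make the notation concrete, here is a minimal sketch computing $L(C) = \mathbb{E}[\ell(X)]$ for a toy binary prefix code; the source distribution and codewords are illustrative choices, not from the text.

```python
import math

# Illustrative source distribution p(x) and binary prefix code C
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

# L(C) = sum_x p(x) * l(x), the expected codeword length
L = sum(p[x] * len(code[x]) for x in p)

# H(X), for comparison with the bound H(X) <= L(C) < H(X) + 1
H = -sum(px * math.log2(px) for px in p.values())

assert H <= L < H + 1
# For this dyadic source the code is optimal: L(C) = H(X) = 1.75 bits
```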