## Prerequisites & Notation

### Before You Begin
This chapter develops the channel coding theorem, one of the two pillars of Shannon's theory (the other being the source coding theorem of Chapter 5). We need the information measures from Chapter 1, notably Fano's inequality for the converse, and the typicality machinery from Chapter 3, notably the packing lemma for the achievability proof.
- Mutual information, chain rules, and convexity properties (Chapter 1)
  - Self-check: Can you compute $I(X;Y)$ from a joint PMF, and state why $I(X;Y)$ is concave in $p(x)$ for fixed $p(y \mid x)$? (A numeric sketch follows this list.)
- Fano's inequality (Chapter 1)
  - Self-check: Can you state Fano's inequality and explain how it upper-bounds the conditional entropy $H(W \mid \hat{W})$ in terms of the error probability? (A numeric check follows this list.)
- Strong typicality and joint typicality (Chapter 3)
  - Self-check: Can you define the jointly typical set $\mathcal{T}_\epsilon^{(n)}$ and state the packing lemma? (A membership test is sketched after this list.)
- The packing lemma and covering lemma (Chapter 3)
  - Self-check: Can you explain why the packing lemma implies that roughly $2^{nI(X;Y)}$ codewords can coexist without confusion?
- Basic probability: union bounds, law of large numbers
  - Self-check: Can you apply a union bound to bound the probability of a union of events? (A Monte Carlo check closes this list.)
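The first self-check can be made concrete in a few lines. Here is a minimal sketch, assuming NumPy; the joint PMF is illustrative, not from the text.

```python
# Compute I(X; Y) in bits directly from a joint PMF given as a 2-D array.
import numpy as np

def mutual_information(p_xy: np.ndarray) -> float:
    """I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )."""
    p_x = p_xy.sum(axis=1, keepdims=True)    # marginal of X (column vector)
    p_y = p_xy.sum(axis=0, keepdims=True)    # marginal of Y (row vector)
    mask = p_xy > 0                          # convention: 0 log 0 = 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

p_xy = np.array([[0.4, 0.1],                 # illustrative joint PMF
                 [0.1, 0.4]])
print(mutual_information(p_xy))              # ~0.278 bits
```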
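For the Fano self-check, this sketch evaluates both sides of $H(X \mid \hat{X}) \le H_b(P_e) + P_e \log_2(|\mathcal{X}| - 1)$ on a small joint PMF of (message, estimate); the PMF is invented for the example.

```python
# Numerically verify Fano's inequality on an illustrative joint PMF
# over (X, Xhat), with Pe = P(X != Xhat).
import numpy as np

def entropy(p):
    p = p[p > 0]                             # convention: 0 log 0 = 0
    return float(-(p * np.log2(p)).sum())

def conditional_entropy(p_joint):            # H(X | Xhat); rows = X, cols = Xhat
    return entropy(p_joint.flatten()) - entropy(p_joint.sum(axis=0))

p = np.array([[0.30, 0.03, 0.02],            # illustrative p(x, xhat)
              [0.02, 0.30, 0.03],
              [0.03, 0.02, 0.25]])
pe = 1.0 - np.trace(p)                       # error probability (off-diagonal mass)
bound = entropy(np.array([pe, 1 - pe])) + pe * np.log2(p.shape[0] - 1)
print(conditional_entropy(p), "<=", bound)   # ~0.755 <= ~0.760: the bound holds
```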
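For the typicality self-check, here is a sketch of a membership test for the strongly (robustly) typical set, assuming the convention that every empirical frequency $\pi(x,y)$ satisfies $|\pi(x,y) - p(x,y)| \le \epsilon\, p(x,y)$; the exact definition should be taken from Chapter 3.

```python
# Test whether (x^n, y^n) lies in the jointly typical set T_eps^(n),
# using the robust (strong) typicality convention stated above.
import numpy as np

def jointly_typical(xs, ys, p_xy, eps):
    counts = np.zeros_like(p_xy)
    for x, y in zip(xs, ys):
        counts[x, y] += 1
    pi = counts / len(xs)                    # empirical joint type
    return bool(np.all(np.abs(pi - p_xy) <= eps * p_xy))

p_xy = np.array([[0.4, 0.1],                 # illustrative joint PMF
                 [0.1, 0.4]])

# Draw (X^n, Y^n) i.i.d. from p(x, y); by the LLN the pair is typical w.h.p.
rng = np.random.default_rng(0)
n = 100_000
flat = rng.choice(4, size=n, p=p_xy.flatten())
xs, ys = np.divmod(flat, 2)                  # flat index = 2x + y
print(jointly_typical(xs, ys, p_xy, eps=0.05))   # True with high probability
```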
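Finally, the union bound $P(\bigcup_i A_i) \le \sum_i P(A_i)$ can be checked by simulation; the overlapping events below are arbitrary illustrative sets.

```python
# Monte Carlo check of the union bound for overlapping events A_i = {U in S_i}.
import numpy as np

rng = np.random.default_rng(1)
u = rng.random(100_000)                      # U ~ Uniform[0, 1]
events = [u < 0.2, (u > 0.1) & (u < 0.35), u > 0.9]

p_union = np.any(np.column_stack(events), axis=1).mean()
sum_of_probs = sum(e.mean() for e in events)
print(p_union, "<=", sum_of_probs)           # ~0.45 <= ~0.55
```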
### Notation for This Chapter
This chapter introduces the channel coding framework. All logarithms are base 2 (capacity in bits per channel use) unless stated otherwise.
| Symbol | Meaning | Introduced |
|---|---|---|
| $(\mathcal{X}, p(y \mid x), \mathcal{Y})$ | Discrete memoryless channel (input alphabet, transition law, output alphabet) | s01 |
| $(2^{nR}, n)$ | Block code: a collection of $2^{nR}$ codewords and encoding/decoding functions | s01 |
| $R$ | Code rate (bits per channel use) | s01 |
| $n$ | Block length (number of channel uses) | s01 |
| $W$, $\hat{W}$ | Message and decoded message estimate | s01 |
| $P_e^{(n)}$, $\lambda^{(n)}$ | Average and maximal probability of error | s01 |
| $C$ | Channel capacity: $C = \max_{p(x)} I(X;Y)$ | s02 |
| $C(B)$ | Capacity-cost function under input cost constraint | s06 |
| $\mathcal{T}_\epsilon^{(n)}$ | Jointly typical set (from Chapter 3) | s03 |
| $H_b(p)$ | Binary entropy function: $H_b(p) = -p \log_2 p - (1-p) \log_2 (1-p)$ | s05 |
| $\log$ | Logarithm base 2 (bits) unless noted otherwise | s01 |
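To see the notation in action, the sketch below (assuming NumPy; the crossover probability is illustrative) computes the capacity of a binary symmetric channel BSC($p$) and checks the closed form $C = 1 - H_b(p)$ against a brute-force maximization of $I(X;Y)$ over Bernoulli($q$) input distributions.

```python
# Capacity of BSC(p): closed form 1 - H_b(p) vs. brute-force max over inputs.
import numpy as np

def h_b(p):
    """Binary entropy H_b(p) = -p log2 p - (1-p) log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def mi_bsc(q, p):
    """I(X;Y) for X ~ Bern(q) through BSC(p): H(Y) - H(Y|X), with H(Y|X) = H_b(p)."""
    p_y1 = q * (1 - p) + (1 - q) * p         # P(Y = 1)
    return h_b(p_y1) - h_b(p)

p = 0.11                                     # illustrative crossover probability
brute = max(mi_bsc(q, p) for q in np.linspace(0.0, 1.0, 1001))
print(f"1 - H_b(p) = {1 - h_b(p):.6f}, brute force = {brute:.6f}")
# Both are ~0.5001 bits per channel use; the uniform input q = 1/2 achieves it.
```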