Prerequisites & Notation

Before You Begin

This chapter develops the channel coding theorem, one of the two pillars of Shannon's theory (the other being the source coding theorem of Chapter 5). We need the information measures from Chapter 1 and the typicality machinery from Chapter 3, particularly the packing lemma for the achievability proof and Fano's inequality for the converse.

  • Mutual information, chain rules, and convexity properties (Chapter 1)

    Self-check: Can you compute $I(X;Y)$ from a joint PMF and state why $I$ is concave in $P_X$ for fixed $P_{Y|X}$?

  • Fano's inequality (Chapter 1)

    Self-check: Can you state Fano's inequality and explain how it upper-bounds $H(X|Y)$ in terms of the error probability?

  • Strong typicality and joint typicality (Chapter 3)

    Self-check: Can you define the jointly typical set and state the packing lemma?

  • The packing lemma and covering lemma (Chapter 3)

    Self-check: Can you explain why the packing lemma implies that $2^{n(I(X;Y) - \delta)}$ codewords can coexist without confusion?

  • Basic probability: union bounds, law of large numbers

    Self-check: Can you bound the probability of a union of events by the sum of their individual probabilities?
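The first self-check above can be tried numerically. The sketch below (illustrative code, not from the chapter; the function name `mutual_information` and the use of NumPy are assumptions) computes $I(X;Y)$ in bits from a joint PMF, here for a uniform input into a binary symmetric channel with crossover probability 0.1, where the answer should match $1 - \mathcal{H}_2(0.1)$:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint PMF given as a 2-D NumPy array."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal P_X as a column
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal P_Y as a row
    mask = p_xy > 0                         # skip zero-probability cells
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

# Uniform input over a binary symmetric channel with crossover p = 0.1:
# the joint PMF is 0.5 * P_{Y|X}, and I(X;Y) should equal 1 - H2(0.1).
p = 0.1
joint = 0.5 * np.array([[1 - p, p],
                        [p, 1 - p]])
h2 = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
print(round(mutual_information(joint), 6))  # ~ 0.531004, i.e. 1 - H2(0.1)
```

Checking a hand computation against a small script like this is a quick way to confirm the Chapter 1 prerequisites before starting the achievability proof.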

Notation for This Chapter

This chapter introduces the channel coding framework. All logarithms are base 2 (capacity in bits per channel use) unless stated otherwise.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $(\mathcal{X}, P_{Y\mid X}, \mathcal{Y})$ | Discrete memoryless channel (input alphabet, transition law, output alphabet) | s01 |
| $\mathcal{C}$ | Block code: a collection of codewords and encoding/decoding functions | s01 |
| $R$ | Code rate (bits per channel use) | s01 |
| $n$ | Block length (number of channel uses) | s01 |
| $M, \hat{M}$ | Message and decoded message estimate | s01 |
| $P_e^{(n)}, P_{e,\max}$ | Average and maximal probability of error | s01 |
| $C$ | Channel capacity: $\max_{P_X} I(X;Y)$ | s02 |
| $C(B)$ | Capacity-cost function under input cost constraint $B$ | s06 |
| $\mathcal{T}_\epsilon^{(n)}(X,Y)$ | Jointly typical set (from Chapter 3) | s03 |
| $\mathcal{H}_2(p)$ | Binary entropy function: $-p\log p - (1-p)\log(1-p)$ | s05 |
| $\log$ | Logarithm base 2 (bits) unless noted otherwise | s01 |
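The capacity notation $C = \max_{P_X} I(X;Y)$ and the binary entropy $\mathcal{H}_2(p)$ can be illustrated together with a small sketch (assumptions: Python with NumPy; the names `binary_entropy` and `bsc_capacity_grid` are hypothetical). For a binary symmetric channel with crossover probability $p$, a grid search over input distributions $P_X = (a, 1-a)$ should recover the closed form $C = 1 - \mathcal{H}_2(p)$, attained at the uniform input:

```python
import numpy as np

def binary_entropy(q):
    """H2(q) = -q log2(q) - (1-q) log2(1-q), with H2(0) = H2(1) = 0."""
    if q <= 0.0 or q >= 1.0:
        return 0.0
    return float(-q * np.log2(q) - (1 - q) * np.log2(1 - q))

def bsc_capacity_grid(p, points=10001):
    """Approximate C = max over P_X of I(X;Y) for a BSC with crossover p
    by grid search over input distributions P_X = (a, 1 - a)."""
    best = 0.0
    for a in np.linspace(0.0, 1.0, points):
        p_y0 = a * (1 - p) + (1 - a) * p               # P(Y = 0)
        mi = binary_entropy(p_y0) - binary_entropy(p)  # I(X;Y) = H(Y) - H(Y|X)
        best = max(best, mi)
    return best

p = 0.11
# The grid maximum should agree with the closed form 1 - H2(p).
print(abs(bsc_capacity_grid(p) - (1 - binary_entropy(p))) < 1e-9)  # True
```

The decomposition $I(X;Y) = H(Y) - H(Y|X)$ used in the loop is convenient here because $H(Y|X) = \mathcal{H}_2(p)$ does not depend on the input distribution, so only $H(Y)$ needs to be maximized.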