Prerequisites & Notation
Before You Begin
This chapter requires a solid grasp of channel coding theory (capacity, achievability, converse) and Gaussian channel fundamentals. The shift from asymptotic to finite-blocklength thinking is conceptually significant.
- DMC capacity theorem: achievability (random coding) and converse (Fano's inequality) (Review ch09)
Self-check: Can you state and prove both directions of Shannon's channel coding theorem?
- Gaussian channel capacity (Review ch10)
Self-check: Can you derive the AWGN capacity and explain the role of the power constraint?
- Typicality and the AEP (Review ch03)
Self-check: Can you state the weak AEP and explain how it enables random coding arguments?
- Hypothesis testing (Neyman-Pearson lemma)
Self-check: Can you state the Neyman-Pearson lemma and compute the optimal test for two Gaussians? (A sketch follows this list.)
- Central limit theorem and Berry-Esseen bounds
Self-check: Can you state the CLT and the Berry-Esseen bound on the approximation error? (A second sketch follows this list.)
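For the hypothesis-testing self-check, here is a minimal Python sketch, not from this chapter, of the Neyman-Pearson test for two equal-variance Gaussians (the means, variance, and significance level below are illustrative). Because the likelihood ratio is monotone in the observation, the optimal test reduces to a threshold on the observation itself:

```python
# Neyman-Pearson test for H0: N(mu0, sigma^2) vs H1: N(mu1, sigma^2), mu1 > mu0.
# The log-likelihood ratio is increasing in x, so the optimal test thresholds x.
import numpy as np
from scipy.stats import norm

def np_test(mu0, mu1, sigma, alpha):
    """Threshold and minimum type-II error at significance level alpha."""
    t = mu0 + sigma * norm.isf(alpha)    # chosen so that P(X > t | H0) = alpha
    beta = norm.cdf((t - mu1) / sigma)   # type-II error: P(X <= t | H1)
    return t, beta

t, beta = np_test(mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05)
print(f"threshold = {t:.3f}, type-II error beta = {beta:.4f}")

# Monte Carlo check of both error probabilities
rng = np.random.default_rng(0)
x0 = rng.normal(0.0, 1.0, 100_000)
x1 = rng.normal(1.0, 1.0, 100_000)
print("empirical alpha:", np.mean(x0 > t), " empirical beta:", np.mean(x1 <= t))
```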
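And for the CLT self-check, a small Monte Carlo sketch (again illustrative; it assumes the i.i.d. Berry-Esseen constant C0 = 0.4748 from Shevtsova's refinement) comparing the empirical CDF of a standardized Bernoulli sum against the standard normal CDF:

```python
# Empirical max deviation |F_n - Phi| for a standardized Binomial(n, p) sum,
# compared with the Berry-Esseen bound C0 * rho / (sigma^3 * sqrt(n)).
import numpy as np
from scipy.stats import norm

p, n = 0.11, 500
sigma = np.sqrt(p * (1 - p))
rho = p * (1 - p) * ((1 - p) ** 2 + p ** 2)   # E|X - p|^3 for Bernoulli(p)

rng = np.random.default_rng(1)
sums = rng.binomial(n, p, size=200_000)
z = np.sort((sums - n * p) / (sigma * np.sqrt(n)))   # standardized sums

grid = np.linspace(-4, 4, 801)
ecdf = np.searchsorted(z, grid, side="right") / z.size
max_dev = np.max(np.abs(ecdf - norm.cdf(grid)))
bound = 0.4748 * rho / (sigma ** 3 * np.sqrt(n))     # assumed constant C0
print(f"max |F_n - Phi| ~ {max_dev:.4f}, Berry-Esseen bound = {bound:.4f}")
```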
Notation for This Chapter
Symbols introduced in this chapter. The finite-blocklength framework introduces several quantities that have no direct analog in classical information theory.
| Symbol | Meaning | Introduced |
|---|---|---|
| $R^*(n, \epsilon)$ | Maximum coding rate at blocklength $n$ and error probability $\epsilon$ | s01 |
| $V$ | Channel dispersion (variance of the information density) | s01 |
| $i(x; y)$ | Information density: $i(x; y) = \log \frac{P_{Y\mid X}(y \mid x)}{P_Y(y)}$ | s01 |
| $Q^{-1}(\cdot)$ | Inverse of the Gaussian Q-function | s01 |
| $\beta$ | The $\beta$ bound (meta-converse parameter) | s02 |
| $\beta_\alpha(P, Q)$ | Minimum type-II error in hypothesis testing $P$ vs $Q$ at significance level $\alpha$ | s02 |
| RCU | Random coding union bound on error probability | s02 |
| $T$ | Third absolute moment of the information density (Berry-Esseen parameter) | s01 |
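To see how these symbols combine, here is a short sketch (illustrative, not the chapter's code; the crossover probability $p = 0.11$ and target $\epsilon$ are assumptions) evaluating the two-term normal approximation $R^*(n, \epsilon) \approx C - \sqrt{V/n}\, Q^{-1}(\epsilon)$ for a binary symmetric channel:

```python
# Normal approximation to the maximum coding rate of a BSC(p):
# R*(n, eps) ~ C - sqrt(V/n) * Q^{-1}(eps), rates in bits per channel use.
import numpy as np
from scipy.stats import norm

def bsc_capacity_dispersion(p):
    """Capacity C (bits/use) and dispersion V (bits^2) of a BSC(p)."""
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)   # binary entropy h(p)
    C = 1.0 - h
    V = p * (1 - p) * np.log2((1 - p) / p) ** 2      # Var of information density
    return C, V

def normal_approx_rate(n, eps, p):
    """Two-term normal approximation to R*(n, eps)."""
    C, V = bsc_capacity_dispersion(p)
    return C - np.sqrt(V / n) * norm.isf(eps)        # norm.isf is Q^{-1}

for n in (100, 1000, 10000):
    print(n, round(normal_approx_rate(n, eps=1e-3, p=0.11), 4))
```

Note how slowly the gap to capacity closes: it shrinks only like $1/\sqrt{n}$, which is exactly the finite-blocklength effect this chapter quantifies.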