Prerequisites & Notation

Before You Begin

This chapter requires a solid grasp of channel coding theory (capacity, achievability, converse) and Gaussian channel fundamentals. The shift from asymptotic to finite-blocklength thinking is conceptually significant.

  • DMC capacity theorem: achievability (random coding) and converse (Fano's inequality) (Review ch09)

    Self-check: Can you state and prove both directions of Shannon's channel coding theorem?

  • Gaussian channel capacity $C = \frac{1}{2}\log(1 + \text{SNR})$ (Review ch10)

    Self-check: Can you derive the AWGN capacity and explain the role of the power constraint?

  • Typicality and the AEP (Review ch03)

    Self-check: Can you state the weak AEP and explain how it enables random coding arguments?

  • Hypothesis testing (Neyman-Pearson lemma)

    Self-check: Can you state the Neyman-Pearson lemma and compute the optimal test for two Gaussians? (A code sketch of the two-Gaussian test follows this list.)

  • Central limit theorem and Berry-Esseen bounds

    Self-check: Can you state the CLT and the Berry-Esseen bound on the approximation error? (A numerical check of the bound also follows this list.)
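
For the hypothesis-testing self-check, here is a minimal sketch of the Neyman-Pearson test for two Gaussians with common variance (Python; the function name and example numbers are ours, not from this chapter). The likelihood ratio is monotone in the sample mean, so the optimal size-$\alpha$ test simply thresholds the mean:

```python
import numpy as np
from scipy.stats import norm

def np_test_two_gaussians(mu0, mu1, sigma, n, alpha):
    """Size-alpha Neyman-Pearson test of H0: N(mu0, sigma^2) vs.
    H1: N(mu1, sigma^2) from n i.i.d. samples, assuming mu1 > mu0.

    The likelihood ratio is increasing in the sample mean, so the
    optimal test declares H1 when the mean exceeds a threshold.
    Returns (threshold, beta), where beta is the type-II error.
    """
    se = sigma / np.sqrt(n)            # standard error of the sample mean
    tau = mu0 + se * norm.isf(alpha)   # norm.isf(alpha) computes Q^{-1}(alpha)
    beta = norm.cdf((tau - mu1) / se)  # P(sample mean < tau under H1)
    return tau, beta

# Example: N(0, 1) vs. N(0.5, 1), 50 samples, type-I error 0.05.
tau, beta = np_test_two_gaussians(0.0, 0.5, 1.0, n=50, alpha=0.05)
print(f"threshold = {tau:.3f}, type-II error = {beta:.3e}")
```

The returned beta is exactly $\beta_{1-\alpha}(P^n, Q^n)$ from the notation table below, specialized to these two Gaussian product measures.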
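
For the Berry-Esseen self-check, a small numerical sanity check may help. The sketch below (also Python; the helper name is ours) computes the exact Kolmogorov distance between a standardized Binomial$(n, p)$ CDF and the standard normal CDF and compares it with the bound $C_0\,\rho/(\sigma^3\sqrt{n})$, where $\rho = \mathbb{E}|X - \mu|^3$; we use $C_0 = 0.4748$, the sharpest constant we are aware of (any valid Berry-Esseen constant works):

```python
import numpy as np
from scipy.stats import binom, norm

def berry_esseen_check(p, n, c0=0.4748):
    """Exact sup-norm gap between the standardized Binomial(n, p) CDF
    and the standard normal CDF, next to the Berry-Esseen bound
    c0 * rho / (sigma^3 * sqrt(n)) for i.i.d. Bernoulli(p) summands."""
    sigma = np.sqrt(p * (1 - p))
    rho = p * (1 - p) * ((1 - p) ** 2 + p ** 2)  # E|X - p|^3 for Bernoulli(p)
    k = np.arange(n + 1)
    z = (k - n * p) / (sigma * np.sqrt(n))       # standardized atom locations
    # The step CDF jumps at the atoms, so the sup is attained at a jump:
    # compare the normal CDF against both sides of each jump.
    gap = max(np.max(np.abs(binom.cdf(k, n, p) - norm.cdf(z))),
              np.max(np.abs(binom.cdf(k - 1, n, p) - norm.cdf(z))))
    return gap, c0 * rho / (sigma ** 3 * np.sqrt(n))

gap, bound = berry_esseen_check(p=0.3, n=200)
print(f"Kolmogorov gap = {gap:.4f}, Berry-Esseen bound = {bound:.4f}")
```

The gap decays like $1/\sqrt{n}$, matching the bound's rate; the same estimate, applied to sums of information densities, is what drives the normal approximation in this chapter.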

Notation for This Chapter

The symbols below are introduced in this chapter; the finite-blocklength framework uses several quantities that have no direct analog in classical asymptotic information theory.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $R^*(n, \epsilon)$ | Maximum coding rate at blocklength $n$ and error probability $\epsilon$ | s01 |
| $V$ | Channel dispersion (variance of the information density) | s01 |
| $\iota(x; y)$ | Information density: $\log \frac{p_{Y|X}(y|x)}{p_Y(y)}$ | s01 |
| $Q^{-1}(\cdot)$ | Inverse of the Gaussian Q-function | s01 |
| $\kappa_\beta$ | The $\kappa\beta$ bound (hypothesis-testing achievability bound) | s02 |
| $\beta_{1-\epsilon}(P, Q)$ | Minimum type-II error in testing $P$ vs. $Q$ with type-I error at most $\epsilon$ | s02 |
| $\mathrm{RCU}(n, M)$ | Random coding union bound on error probability | s02 |
| $T$ | Third absolute moment of the information density (Berry-Esseen parameter) | s01 |
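
To see how these quantities fit together, here is a minimal sketch (Python; the function name is ours) of the two-term normal approximation $R^*(n, \epsilon) \approx C - \sqrt{V/n}\, Q^{-1}(\epsilon)$, evaluated for the AWGN channel with the standard capacity and dispersion expressions $C = \frac{1}{2}\log_2(1 + \mathrm{SNR})$ and $V = \frac{\mathrm{SNR}(\mathrm{SNR} + 2)}{2(\mathrm{SNR} + 1)^2}\log_2^2 e$ (in bits); the neglected third-order term is $O(\log n / n)$:

```python
import numpy as np
from scipy.stats import norm

LOG2E = np.log2(np.e)

def awgn_normal_approx(snr, n, eps):
    """Two-term normal approximation to R*(n, eps) for the AWGN channel,
    in bits per channel use:
        R*(n, eps) ~ C - sqrt(V / n) * Q^{-1}(eps),
    with C = (1/2) log2(1 + SNR) and
         V = SNR (SNR + 2) / (2 (SNR + 1)^2) * (log2 e)^2.
    The omitted third-order term is O(log n / n)."""
    C = 0.5 * np.log2(1 + snr)
    V = snr * (snr + 2) / (2 * (snr + 1) ** 2) * LOG2E ** 2
    return C - np.sqrt(V / n) * norm.isf(eps)  # norm.isf(eps) = Q^{-1}(eps)

# Example: at SNR = 1 (0 dB), capacity is 0.5 bit/use; watch the
# backoff from capacity shrink as the blocklength grows.
for n in (100, 1000, 10000):
    print(n, round(awgn_normal_approx(snr=1.0, n=n, eps=1e-3), 4))
```

Even at $n = 10{,}000$ the achievable rate sits visibly below capacity, which is precisely the gap this chapter quantifies.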