Prerequisites & Notation

Before You Begin

This chapter develops the central technical machinery for proving coding theorems. We assume a solid understanding of the discrete and continuous information measures from Chapters 1-2.

  • Entropy, mutual information, KL divergence (Chapter 1; review ita/ch01)

    Self-check: Can you state Fano's inequality and the data processing inequality?

  • Differential entropy and the Gaussian maximum entropy property (Chapter 2; review ita/ch02)

    Self-check: Can you compute the differential entropy of a Gaussian vector?

  • Weak law of large numbers

    Self-check: Can you state the WLLN and explain what convergence in probability means?

  • Union bound and basic combinatorics

    Self-check: Can you bound $\Pr(A_1 \cup \cdots \cup A_M)$ using the union bound?

  • Empirical distribution (histogram) of a sequence

    Self-check: Given a binary sequence 01101000110100, can you compute the empirical frequency of each symbol?
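The last self-check can be verified in a few lines. A minimal sketch using the example sequence above (variable names are illustrative):

```python
from collections import Counter

seq = "01101000110100"  # the binary sequence from the self-check
n = len(seq)
counts = Counter(seq)  # occurrences of each symbol
# Empirical distribution: fraction of positions taking each value
freqs = {symbol: count / n for symbol, count in counts.items()}
print(freqs)  # 8 zeros and 6 ones out of 14 symbols
```

This is exactly the empirical distribution (type) notation introduced in the table below.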

Notation for This Chapter

The table below lists new notation for typicality; prior notation from Chapters 1-2 carries over.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $X^n = (X_1, \ldots, X_n)$ | A sequence of $n$ random variables | s01 |
| $\mathbf{x} = (x_1, \ldots, x_n)$ | A realization (specific sequence) | s01 |
| $\mathcal{T}_\epsilon^{(n)}(X)$ | Strongly typical set for distribution $P_X$ | s02 |
| $\hat{P}_{\mathbf{x}}(a)$ | Empirical distribution (type) of sequence $\mathbf{x}$: fraction of positions with value $a$ | s02 |
| $\dot{=}$ | Equality to first order in the exponent: $f(n) \dot{=} g(n)$ means $\frac{1}{n}\log\frac{f(n)}{g(n)} \to 0$ | s01 |
| $T_Q$ | Type class: set of all sequences with empirical distribution $Q$ | s05 |
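The typical-set membership test implied by $\mathcal{T}_\epsilon^{(n)}(X)$ can be made concrete. Conventions for strong typicality vary across texts; the sketch below assumes the "robust typicality" convention, $|\hat{P}_{\mathbf{x}}(a) - P_X(a)| \le \epsilon \, P_X(a)$ for every symbol $a$ (function names are illustrative):

```python
from collections import Counter

def empirical_distribution(seq, alphabet):
    """Type of seq: P_hat(a) = (number of positions with value a) / n."""
    n = len(seq)
    counts = Counter(seq)
    return {a: counts.get(a, 0) / n for a in alphabet}

def is_strongly_typical(seq, p, eps):
    """Robust-typicality check: |P_hat(a) - p(a)| <= eps * p(a) for all a
    (in particular, P_hat(a) must be 0 wherever p(a) = 0)."""
    p_hat = empirical_distribution(seq, p.keys())
    return all(abs(p_hat[a] - pa) <= eps * pa for a, pa in p.items())

# A balanced sequence is typical for the fair-coin source; a constant one is not.
p_fair = {"0": 0.5, "1": 0.5}
print(is_strongly_typical("0110100011", p_fair, 0.1))
print(is_strongly_typical("0000000000", p_fair, 0.1))
```

Note that under this convention two sequences belong to the same type class $T_Q$ exactly when `empirical_distribution` returns the same dictionary for both.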