Prerequisites & Notation

Before You Begin

This chapter applies the continuous-alphabet channel coding theorem (Chapter 9) to the most important channel model in communications: the additive white Gaussian noise (AWGN) channel. We rely heavily on differential entropy, the theorem that the Gaussian maximizes differential entropy, and the connection between MMSE and differential entropy. Convex optimization (Lagrange multipliers, KKT conditions) features prominently in the water-filling solution.

  • Differential entropy h(X) and its properties (Chapter 2)

    Self-check: Can you compute h(X) for X ~ N(μ, σ²)?

  • Gaussian maximizes differential entropy under a covariance constraint (Chapter 2)

    Self-check: Can you state and prove why a Gaussian input maximizes h(Y) for a given variance?

  • Mutual information for continuous random variables: I(X;Y) = h(Y) − h(Y|X) (Chapter 2)

    Self-check: Can you express I(X;Y) in terms of differential entropies?

  • Channel coding theorem for continuous-alphabet channels (Chapter 9)

    Self-check: Can you state the capacity formula C = max_{p_X} I(X;Y) and explain what achievability and converse mean?

  • Lagrange multipliers and KKT conditions for constrained optimization

    Self-check: Can you solve max f(x) subject to g(x) ≤ 0 using the KKT conditions?

  • Basic Fourier analysis: DFT, DTFT, circulant matrices

    Self-check: Do you know that circulant matrices are diagonalized by the DFT matrix?
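The first self-check has a closed form, h(X) = ½ log₂(2πeσ²) bits, which does not depend on the mean. A minimal NumPy sketch (the function name `gaussian_diff_entropy` is illustrative, not from the text) cross-checks it against a Monte Carlo estimate of −E[log₂ p(X)]:

```python
import numpy as np

def gaussian_diff_entropy(sigma2):
    """Closed form: h(X) = 0.5 * log2(2*pi*e*sigma^2) bits, independent of the mean."""
    return 0.5 * np.log2(2 * np.pi * np.e * sigma2)

# Monte Carlo cross-check: h(X) = -E[log2 p(X)] with X ~ N(mu, sigma^2)
rng = np.random.default_rng(0)
mu, sigma2 = 1.0, 4.0
x = rng.normal(mu, np.sqrt(sigma2), size=500_000)
log2_pdf = (-0.5 * np.log(2 * np.pi * sigma2) - (x - mu) ** 2 / (2 * sigma2)) / np.log(2)
h_mc = -log2_pdf.mean()

print(gaussian_diff_entropy(sigma2))  # ≈ 3.047 bits for sigma^2 = 4
print(h_mc)                           # agrees with the closed form
```

Shifting the mean leaves both numbers unchanged, which matches the fact that differential entropy is translation invariant.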

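The last self-check can also be verified numerically: conjugating a circulant matrix by the unitary DFT matrix produces a diagonal matrix whose entries are the DFT of the first column. A small NumPy sketch (the helper `make_circulant` is an illustrative name, not from the text):

```python
import numpy as np

def make_circulant(c):
    """Circulant matrix with first column c: C[i, j] = c[(i - j) mod n]."""
    n = len(c)
    return np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)], dtype=complex)

c = np.array([2.0, 1.0, 0.5, 1.0])           # e.g. a short channel impulse response
n = len(c)
C = make_circulant(c)
F = np.fft.fft(np.eye(n)) / np.sqrt(n)       # unitary DFT matrix
Lam = F @ C @ F.conj().T                     # C = F^H Lam F, so this is diagonal

diag_ok = np.allclose(Lam, np.diag(np.diag(Lam)))   # off-diagonal entries vanish
eigs_ok = np.allclose(np.diag(Lam), np.fft.fft(c))  # eigenvalues are the DFT of c
print(diag_ok, eigs_ok)
```

This is the algebraic fact behind treating a (circularized) linear channel as parallel Gaussian sub-channels, one per DFT frequency.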
Notation for This Chapter

The table below lists the symbols introduced in this chapter. All logarithms are base 2 (bits) unless stated otherwise. We use both real and complex channel models; the context will always make clear which is meant.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| Y = X + Z | Scalar AWGN channel model | s01 |
| Z ~ N(0, N) | Additive Gaussian noise with variance N | s01 |
| P | Average transmit power constraint: (1/n) Σ_i x_i² ≤ P | s01 |
| SNR = P/N | Signal-to-noise ratio | s01 |
| C(SNR) | AWGN channel capacity as a function of SNR | s01 |
| E_b/N_0 | Energy per bit to noise spectral density ratio | s02 |
| W | Channel bandwidth in Hz | s02 |
| G_k | Channel gain on sub-channel k (parallel Gaussian model) | s03 |
| ν | Water-filling level (Lagrange multiplier) | s03 |
| [x]_+ = max(x, 0) | Positive part operator | s03 |
| N_0 | One-sided noise power spectral density | s05 |
| G(ξ) | DTFT of the channel impulse response | s04 |