Prerequisites & Notation

Before You Begin

This chapter applies the relay coding strategies from Chapter 22 to Gaussian channels and extends them to large networks. We assume familiarity with the Gaussian channel capacity results from Chapter 10.

  • The relay channel model, cut-set bound, decode–forward (DF), and compress–forward (CF) (Chapter 22)

    Self-check: Can you state the cut-set bound and the DF/CF achievable rates for the general relay channel?

  • Gaussian channel capacity and water-filling (Chapter 10)

    Self-check: Can you derive the capacity of the AWGN channel $Y = X + Z$ with power constraint $P$?

  • Rate-distortion theory for Gaussian sources (Chapter 6)

    Self-check: Can you state the rate-distortion function $R(D) = \frac{1}{2}\log(\sigma^2/D)$ for a Gaussian source?

  • Wyner-Ziv coding and decoder side information (Chapter 6)

    Self-check: Can you explain how side information at the decoder reduces the compression rate?
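The self-checks above can be verified numerically. The following is a minimal sketch; the function names and the specific parameter values ($P = 10$, $\sigma^2 = 1$, etc.) are illustrative assumptions, not from the text:

```python
import math

def awgn_capacity(P, sigma2=1.0):
    """Capacity of Y = X + Z, Z ~ N(0, sigma2), power constraint P (bits/use)."""
    return 0.5 * math.log2(1 + P / sigma2)

def gaussian_rd(sigma2, D):
    """Rate-distortion function R(D) = (1/2) log(sigma^2 / D) for D <= sigma^2."""
    return 0.5 * math.log2(sigma2 / D) if D < sigma2 else 0.0

def wyner_ziv_gaussian(sigma2_cond, D):
    """Wyner-Ziv rate for a jointly Gaussian source with decoder side
    information Y: only the conditional variance Var(X|Y) matters, so
    side information replaces sigma_X^2 by the smaller sigma_{X|Y}^2."""
    return gaussian_rd(sigma2_cond, D)

print(awgn_capacity(10.0))              # 0.5*log2(11) ~ 1.730 bits/use
print(gaussian_rd(4.0, 1.0))            # 0.5*log2(4) = 1.0 bit
print(wyner_ziv_gaussian(1.0, 0.5))     # 0.5*log2(2) = 0.5 bit, less than
                                        # gaussian_rd(4.0, 0.5) without side info
```

The comparison in the last two lines quantifies the Wyner-Ziv self-check: for the (no-rate-loss) Gaussian case, decoder side information shrinks the effective source variance from $\sigma_X^2$ to $\sigma_{X|Y}^2$, reducing the rate needed for the same distortion $D$.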

Notation for This Chapter

Key symbols for the Gaussian relay channel and network scaling results.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $P$ | Source (or per-node) transmit power | Section 23.1 |
| $P_r$ | Relay transmit power | Section 23.1 |
| $\sigma^2$ | Noise variance (often normalized to 1) | Section 23.1 |
| $\text{SNR}$ | Signal-to-noise ratio $P/\sigma^2$ | Section 23.1 |
| $\Delta$ | Quantization distortion in CF | Section 23.1 |
| $n$ | Number of nodes in a network | Section 23.2 |
| $\Theta(\cdot)$ | Order notation (tight asymptotic bound) | Section 23.2 |
| $\hat{Y}_{r_k}$ | Compressed observation of relay $k$ in noisy network coding | Section 23.3 |
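To see the symbols $P$, $P_r$, and $\sigma^2$ in use, the Chapter 22 cut-set bound can be specialized to the full-duplex Gaussian relay channel and evaluated numerically. The sketch below assumes a standard channel model with gains $g_{sr}$ (source to relay), $g_{sd}$ (source to destination), and $g_{rd}$ (relay to destination); the gain values are hypothetical, chosen only for illustration:

```python
import math

def C(x):
    """Gaussian capacity function C(x) = (1/2) log2(1 + x), x an SNR."""
    return 0.5 * math.log2(1 + x)

def cutset_bound(P, Pr, g_sr, g_sd, g_rd, sigma2=1.0, steps=10001):
    """Cut-set upper bound for the full-duplex Gaussian relay channel,
    maximized by grid search over the source/relay input correlation
    rho in [0, 1]. Gains g_* are illustrative assumptions."""
    best = 0.0
    for i in range(steps):
        rho = i / (steps - 1)
        # Broadcast cut: source -> {relay, destination}
        bc = C((1 - rho**2) * (g_sd**2 + g_sr**2) * P / sigma2)
        # Multiple-access cut: {source, relay} -> destination
        mac = C((g_sd**2 * P + g_rd**2 * Pr
                 + 2 * rho * g_sd * g_rd * math.sqrt(P * Pr)) / sigma2)
        best = max(best, min(bc, mac))
    return best

# Hypothetical parameters: P = Pr = 10, unit noise variance.
print(cutset_bound(P=10.0, Pr=10.0, g_sr=2.0, g_sd=1.0, g_rd=2.0))
```

By construction the bound can never fall below the direct-link capacity $C(g_{sd}^2 P/\sigma^2)$, which is a quick sanity check on any implementation.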