Prerequisites & Notation

Before You Begin

This chapter builds on single-user channel coding (Chapter 9), typicality (Chapter 3), and rate-distortion theory (Chapter 6). We also draw on the multiple access channel (MAC) and broadcast channel (BC) capacity results from Chapters 14 and 15. If any of the following feels unfamiliar, revisit the linked material first.

  • DMC capacity and the coding theorem (achievability via random coding, converse via Fano's inequality) (Review ch09)

    Self-check: Can you state the channel coding theorem and sketch both directions of the proof?

  • Joint typicality and the joint AEP (Review ch03)

    Self-check: Can you define the jointly typical set $\mathcal{T}_\epsilon^{(n)}(X, Y)$ and state the joint AEP?

  • Rate-distortion theory and Wyner-Ziv coding (Review ch06)

    Self-check: Can you state the Wyner-Ziv theorem for lossy compression with decoder side information?

  • Multiple access channel (MAC) capacity region (Review ch14)

    Self-check: Can you describe the pentagon capacity region of the two-user MAC?

  • Broadcast channel (BC) and superposition coding (Review ch15)

    Self-check: Can you explain superposition coding for the degraded broadcast channel?

  • Markov chains and the data processing inequality (Review ch01)

    Self-check: Can you state the data processing inequality and explain when equality holds?
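As a quick numerical companion to the first self-check, the sketch below estimates the capacity $C = \max_{p(x)} I(X;Y)$ of a discrete memoryless channel with the Blahut-Arimoto algorithm and compares it against the closed form $1 - H_2(\epsilon)$ for a binary symmetric channel. This is our own illustrative sketch (function name, iteration count, and the example channel are ours, not the chapter's).

```python
import numpy as np

def blahut_arimoto(W, iters=500):
    """Estimate the capacity (bits/channel use) of a DMC with
    transition matrix W[x, y] = p(y | x) via Blahut-Arimoto."""
    nx = W.shape[0]
    p = np.full(nx, 1.0 / nx)          # start from the uniform input law
    for _ in range(iters):
        q = p @ W                      # induced output distribution p(y)
        # d[x] = D( W(.|x) || q ) in bits, guarding the log where W[x,y] = 0
        d = np.sum(W * np.log2(np.where(W > 0, W / q, 1.0)), axis=1)
        p = p * np.exp2(d)             # multiplicative Blahut-Arimoto update
        p /= p.sum()
    q = p @ W
    d = np.sum(W * np.log2(np.where(W > 0, W / q, 1.0)), axis=1)
    return float(p @ d)                # I(X; Y) at the final input law

# Sanity check against the closed form C = 1 - H_2(eps) for a BSC
eps = 0.1
bsc = np.array([[1 - eps, eps], [eps, 1 - eps]])
h2 = -eps * np.log2(eps) - (1 - eps) * np.log2(1 - eps)
print(blahut_arimoto(bsc), 1 - h2)    # the two values should agree closely
```

For the symmetric BSC the uniform input is a fixed point of the update, so convergence is immediate; for asymmetric channels (e.g. a Z-channel) more iterations are needed.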

Notation for This Chapter

Symbols introduced or heavily used in this chapter. See also the global notation table.

Symbol | Meaning | Introduced
$X$ | Source (transmitter) channel input | s01
$X_r$ | Relay channel input | s01
$Y$ | Destination received signal | s01
$Y_r$ | Relay received signal | s01
$\hat{Y}_r$ | Compressed version of relay observation (Wyner-Ziv) | s04
$C$ | Channel capacity | s02
$R$ | Communication rate (bits per channel use) | s01
$I$ | Mutual information $I(X;Y)$ | s02
$P_{X, X_r}$ | Joint input distribution (source and relay) | s02
$\Lambda$ | Lattice in $\mathbb{R}^n$ for compute-and-forward | s05
$\mathbf{a}$ | Integer coefficient vector for lattice equations | s05