Prerequisites & Notation
Before You Begin
This chapter builds on single-user channel coding (Chapter 9), typicality (Chapter 3), and rate-distortion theory (Chapter 6). We also draw on the MAC and BC capacity results from Chapters 14 and 15. If any of the following feels unfamiliar, revisit the linked material first.
- DMC capacity and the coding theorem (achievability via random coding, converse via Fano's inequality) (Review ch09)
  Self-check: Can you state the channel coding theorem and sketch both directions of the proof?
- Joint typicality and the joint AEP (Review ch03)
  Self-check: Can you define the jointly typical set and state the joint AEP?
- Rate-distortion theory and Wyner-Ziv coding (Review ch06)
  Self-check: Can you state the Wyner-Ziv theorem for lossy compression with decoder side information?
- Multiple access channel (MAC) capacity region (Review ch14)
  Self-check: Can you describe the pentagon capacity region of the two-user MAC?
- Broadcast channel (BC) and superposition coding (Review ch15)
  Self-check: Can you explain superposition coding for the degraded broadcast channel?
- Markov chains and the data processing inequality (Review ch01)
  Self-check: Can you state the data processing inequality and explain when equality holds?
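If you want a quick numerical warm-up for the first self-check, the sketch below computes the capacity of a binary symmetric channel in closed form, $C = 1 - H_2(p)$, and confirms it against a brute-force maximization of $I(X;Y)$ over input distributions. The crossover probability and grid resolution are illustrative choices, not values from this chapter.

```python
import math

def h2(p):
    """Binary entropy in bits; h2(0) = h2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information_bsc(q, p):
    """I(X;Y) for a BSC with crossover p and input P(X=1) = q."""
    # I(X;Y) = H(Y) - H(Y|X) = h2(P(Y=1)) - h2(p),
    # where P(Y=1) = q(1-p) + (1-q)p.
    py1 = q * (1 - p) + (1 - q) * p
    return h2(py1) - h2(p)

p = 0.11  # illustrative crossover probability
closed_form = 1 - h2(p)
# Brute-force search over a grid of input distributions q = P(X=1).
brute = max(mutual_information_bsc(q / 1000, p) for q in range(1001))
print(round(closed_form, 4), round(brute, 4))  # both print 0.5001
```

Because the BSC is symmetric, the maximizing input is uniform ($q = 1/2$), which the grid hits exactly, so the two values agree.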
Notation for This Chapter
Symbols introduced or heavily used in this chapter. See also the global notation table.
| Symbol | Meaning | Introduced |
|---|---|---|
| $X$ | Source (transmitter) channel input | s01 |
| $X_r$ | Relay channel input | s01 |
| $Y$ | Destination received signal | s01 |
| $Y_r$ | Relay received signal | s01 |
| $\hat{Y}_r$ | Compressed version of relay observation (Wyner-Ziv) | s04 |
| $C$ | Channel capacity | s02 |
| $R$ | Communication rate (bits per channel use) | s01 |
| $I(\cdot\,;\cdot)$ | Mutual information | s02 |
| $p(x, x_r)$ | Joint input distribution (source and relay) | s02 |
| $\Lambda$ | Lattice in $\mathbb{R}^n$ for compute-and-forward | s05 |
| $\mathbf{a}$ | Integer coefficient vector for lattice equations | s05 |
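To make the joint-input notation $p(x, x_r)$ concrete, here is a minimal sketch that evaluates $I(X, X_r; Y)$ for a toy two-input channel where the destination observes the XOR of both inputs through a noisy flip. The joint distribution and noise level are invented illustrative numbers, not part of the chapter's models.

```python
import math
from itertools import product

# Illustrative joint input distribution p(x, x_r) over {0,1}^2.
p_in = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

eps = 0.1  # illustrative noise level

def p_y_given(y, x, xr):
    """P(Y = y | X = x, X_r = xr): Y is (X XOR X_r) flipped w.p. eps."""
    clean = x ^ xr
    return 1 - eps if y == clean else eps

# Output marginal p(y) = sum over (x, x_r) of p(x, x_r) P(y | x, x_r).
p_y = {y: sum(p_in[x, xr] * p_y_given(y, x, xr) for x, xr in p_in)
       for y in (0, 1)}

# I(X, X_r; Y) = E[ log2( P(y | x, x_r) / p(y) ) ]
mi = sum(p_in[x, xr] * p_y_given(y, x, xr)
         * math.log2(p_y_given(y, x, xr) / p_y[y])
         for (x, xr), y in product(p_in, (0, 1)))
print(round(mi, 4))
```

Since $Y$ depends on the inputs only through $X \oplus X_r$, the result reduces to $H_2(P(Y{=}1)) - H_2(\varepsilon)$, which is a handy sanity check on the expectation above.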