Prerequisites & Notation
Before You Begin
This chapter surveys the frontiers of information theory. It draws on concepts from across the book — entropy, channel coding, multi-user theory, relay channels, and network coding. The reader should have a working knowledge of the main achievability and converse techniques (random coding, typicality, Fano's inequality) and the capacity results for the DMC, Gaussian channel, MAC, and broadcast channel.
- Channel capacity and the coding theorem (Review ita/ch09)
Self-check: Can you sketch the achievability and converse proofs for the DMC?
- Gaussian channel capacity (Review ita/ch10)
Self-check: Can you derive the water-filling solution for parallel Gaussian channels?
- Broadcast channel (Review ita/ch15)
Self-check: Can you state the superposition coding achievable region for the degraded BC?
- Interference channel basics (Review ita/ch17)
Self-check: Can you state the Han-Kobayashi inner bound?
- Relay channel (Review ita/ch22)
Self-check: Can you describe decode-forward and compress-forward relaying?
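As a refresher for the water-filling self-check above, here is a minimal numerical sketch. It finds the water level ν for parallel Gaussian channels with noise variances N_i by bisection, so that the allocations P_i = max(ν − N_i, 0) sum to the power budget. The function name and the bisection approach are illustrative choices, not from the text.

```python
def water_fill(noise, power, tol=1e-9):
    """Water-filling power allocation over parallel Gaussian channels.

    noise: list of noise variances N_i; power: total power budget P.
    Returns P_i = max(nu - N_i, 0), where the water level nu is found
    by bisection so that sum_i P_i = P.
    """
    # nu is bracketed: at min(noise) nothing is allocated,
    # at max(noise) + power at least `power` is allocated.
    lo, hi = min(noise), max(noise) + power
    while hi - lo > tol:
        nu = (lo + hi) / 2
        used = sum(max(nu - n, 0.0) for n in noise)
        if used > power:
            hi = nu
        else:
            lo = nu
    nu = (lo + hi) / 2
    return [max(nu - n, 0.0) for n in noise]
```

For example, with noise variances (1, 2, 4) and budget 3, the water level settles at ν = 3, giving allocations (2, 1, 0): the noisiest subchannel receives no power.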
Notation for This Chapter
This chapter uses notation from across the book. Key symbols are listed here for reference.
| Symbol | Meaning | Introduced |
|---|---|---|
| C | Channel capacity | ch09 |
| I(X;Y) | Mutual information | ch01 |
| H(X) | Shannon entropy | ch01 |
| SNR | Signal-to-noise ratio | ch10 |
| DoF | Degrees of freedom (pre-log factor of capacity at high SNR) | s01 |
| 𝒞 | Capacity region (set of achievable rate tuples) | ch14 |
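The SNR and DoF entries can be made concrete with a small numerical check, assuming the standard real Gaussian capacity formula C = ½ log₂(1 + SNR) from ch10 (the function name below is illustrative). The degrees of freedom are the pre-log factor C(SNR)/log₂(SNR), which tends to ½ per real dimension as SNR grows.

```python
import math

def gaussian_capacity(snr):
    # C = (1/2) log2(1 + SNR), bits per real channel use (ch10)
    return 0.5 * math.log2(1.0 + snr)

# DoF as the pre-log factor: C(SNR) / log2(SNR) -> 1/2 at high SNR
for snr_db in (20, 40, 60):
    snr = 10 ** (snr_db / 10)
    print(snr_db, gaussian_capacity(snr) / math.log2(snr))
```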