Prerequisites
Before You Begin
This chapter develops the information-theoretic foundations of multiuser communication. It builds heavily on linear algebra (Chapter 1), probability theory (Chapter 2), optimisation (Chapter 3), single-user information theory (Chapter 11), and MIMO capacity (Chapter 17). The material is at the level of El Gamal and Kim's Network Information Theory and requires comfort with measure-theoretic probability, epsilon-delta proofs, and convex optimisation.
- Linear algebra: eigenvalue decomposition, positive semidefinite matrices (Review ch01)
Self-check: Can you state and prove the spectral theorem for Hermitian matrices, and show that the set of PSD matrices forms a convex cone?
- Probability: joint distributions, conditional expectations, Markov chains (Review ch02)
Self-check: Can you state the data processing inequality for the Markov chain $X \to Y \to Z$ and prove it using the chain rule for mutual information?
- Optimisation: convex sets, KKT conditions, Lagrange duality (Review ch03)
Self-check: Can you formulate a convex optimisation problem, derive the KKT conditions, and solve the water-filling problem subject to a total power constraint $\sum_i P_i \le P$?
- Information theory: entropy, mutual information, channel capacity, AEP (Review ch11)
Self-check: Can you state the channel coding theorem, define the capacity of a DMC, and sketch the achievability proof using random coding and joint typicality?
- MIMO capacity and precoding (Review ch17)
Self-check: Can you derive the MIMO capacity and show that the optimal input covariance follows from water-filling on the channel singular values?
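The data processing inequality in the probability self-check can be verified numerically on a small discrete chain. A minimal Python sketch (the ternary alphabets and random Dirichlet distributions are illustrative choices, not from the text):

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in bits from a joint pmf matrix pxy[x, y]."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(1)
px = rng.dirichlet(np.ones(3))            # P(X)
Wyx = rng.dirichlet(np.ones(3), size=3)   # P(Y|X), one row per x
Wzy = rng.dirichlet(np.ones(3), size=3)   # P(Z|Y), one row per y

pxy = px[:, None] * Wyx                   # joint P(X, Y)
pxz = pxy @ Wzy                           # joint P(X, Z); valid because X - Y - Z

# Data processing inequality: I(X;Z) <= I(X;Y)
assert mutual_information(pxz) <= mutual_information(pxy) + 1e-12
```

The final assertion is exactly the inequality the self-check asks you to prove; processing $Y$ through the channel $P(Z \mid Y)$ can only reduce the information about $X$.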
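The water-filling problem from the optimisation and MIMO self-checks can be sketched together: the KKT conditions give $p_i = \max(0, \mu - 1/g_i)$ for a water level $\mu$, which is then applied to the squared singular values of a MIMO channel. A minimal Python sketch (the bisection solver, the 4×4 Gaussian channel, unit noise power, and the power budget are illustrative assumptions):

```python
import numpy as np

def water_filling(gains, total_power):
    """Allocate total_power across parallel channels with gains g_i.

    Solves max sum_i log2(1 + g_i p_i) s.t. sum_i p_i <= P, p_i >= 0.
    The KKT conditions give p_i = max(0, mu - 1/g_i); the water level mu
    is found by bisection on the total-power constraint.
    """
    inv = 1.0 / np.asarray(gains, dtype=float)
    lo, hi = 0.0, float(inv.max()) + total_power
    for _ in range(200):                      # bisect until mu is pinned down
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - inv, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - inv, 0.0)

# MIMO application: water-fill over the squared singular values of H
# (illustrative 4x4 Gaussian channel, unit noise power).
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 4))
g = np.linalg.svd(H, compute_uv=False) ** 2   # effective parallel-channel gains
p = water_filling(g, total_power=10.0)
capacity = float(np.sum(np.log2(1.0 + g * p)))  # bits per channel use
```

The SVD step is what the MIMO self-check refers to: the channel decomposes into parallel scalar channels, and the optimal input covariance puts power on them by water-filling.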
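For the channel-capacity self-check, the capacity of a DMC can also be computed numerically with the Blahut-Arimoto algorithm. A minimal sketch, assuming every output symbol is reachable (so the induced output distribution is strictly positive), checked against the closed-form BSC capacity $1 - H_2(\epsilon)$:

```python
import numpy as np

def dmc_capacity(W, iters=1000):
    """Capacity (bits) of a DMC with row-stochastic W[x, y] = P(y|x),
    via the Blahut-Arimoto iteration starting from the uniform input."""
    W = np.asarray(W, dtype=float)
    p = np.full(W.shape[0], 1.0 / W.shape[0])
    Wsafe = np.where(W > 0, W, 1.0)           # avoid log(0); those terms vanish
    for _ in range(iters):
        q = p @ W                              # induced output distribution
        d = (W * np.log2(Wsafe / q)).sum(axis=1)   # D(W(.|x) || q) per input
        p = p * np.exp2(d)                     # multiplicative update
        p /= p.sum()
    q = p @ W
    d = (W * np.log2(Wsafe / q)).sum(axis=1)
    return float(p @ d)

# Sanity check: binary symmetric channel with crossover 0.1
eps = 0.1
W = np.array([[1 - eps, eps], [eps, 1 - eps]])
h2 = -eps * np.log2(eps) - (1 - eps) * np.log2(1 - eps)
cap = dmc_capacity(W)
assert abs(cap - (1 - h2)) < 1e-6              # C_BSC = 1 - H2(eps)
```

For the symmetric BSC the uniform input is already optimal, so the iteration lands on the exact answer; for general channels it converges to the capacity-achieving input distribution.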
Chapter 26 Notation
Key symbols introduced or heavily used in this chapter.
| Symbol | Meaning | Introduced |
|---|---|---|
| $\mathcal{C}$ | Capacity region (set of achievable rate tuples) | s01 |
| $\mathcal{R}$ | Achievable rate region (inner bound on capacity) | s01 |
| $P_1, P_2$ | Transmit power constraints for users 1 and 2 | s01 |
| $\sigma^2$ | Noise variance (power) at the receiver | s01 |
| $\alpha$ | Power allocation fraction in superposition coding (BC) | s02 |
| $\mathrm{INR}$ | Interference-to-noise ratio | s03 |
| $\mathrm{SNR}$ | Signal-to-noise ratio | s03 |
| $d$ | Degrees of freedom: $d = \lim_{\mathrm{SNR}\to\infty} C(\mathrm{SNR})/\log_2 \mathrm{SNR}$ | s05 |
| $V$ | Channel dispersion (variance of information density) | s06 |
| $Q^{-1}$ | Inverse of the Gaussian Q-function | s06 |
| $\epsilon$ | Block error probability (finite blocklength) | s06 |
| $n$ | Blocklength (number of channel uses) | s06 |
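The finite-blocklength quantities in the table (dispersion, inverse Q-function, error probability, and blocklength) combine in the commonly used normal approximation $\log_2 M^*(n,\epsilon) \approx nC - \sqrt{nV}\,Q^{-1}(\epsilon) + \tfrac{1}{2}\log_2 n$. A minimal Python sketch for the real AWGN channel (the SNR and error-probability values are illustrative assumptions):

```python
from math import log, log2, sqrt
from statistics import NormalDist

def awgn_normal_approx(n, eps, snr):
    """Normal approximation to the maximal rate (bits per channel use)
    of the real AWGN channel at blocklength n and error probability eps:
        R ~ C - sqrt(V/n) * Qinv(eps) + log2(n) / (2n).
    """
    log2e = 1.0 / log(2.0)
    C = 0.5 * log2(1.0 + snr)                                 # capacity, bits
    V = snr * (snr + 2) / (2 * (1 + snr) ** 2) * log2e ** 2   # dispersion, bits^2
    q_inv = NormalDist().inv_cdf(1.0 - eps)                   # Q^{-1}(eps)
    return C - sqrt(V / n) * q_inv + log2(n) / (2 * n)

# The backoff from capacity shrinks like 1/sqrt(n) (SNR = 10, eps = 1e-3)
rates = [awgn_normal_approx(n, 1e-3, 10.0) for n in (100, 1000, 10000)]
```

As the blocklength grows the achievable rate climbs toward the capacity $C$, with the $\sqrt{V/n}\,Q^{-1}(\epsilon)$ term quantifying the finite-blocklength penalty.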