Prerequisites

Before You Begin

This chapter develops the information-theoretic foundations of multiuser communication. It builds heavily on linear algebra (Chapter 1), probability theory (Chapter 2), optimisation (Chapter 3), single-user information theory (Chapter 11), and MIMO capacity (Chapter 17). The material is at the level of El Gamal and Kim's Network Information Theory and requires comfort with measure-theoretic probability, epsilon-delta proofs, and convex optimisation.

  • Linear algebra: eigenvalue decomposition, positive semidefinite matrices (Review ch01)

    Self-check: Can you state and prove the spectral theorem for Hermitian matrices, and show that the set of PSD matrices forms a convex cone?

  • Probability: joint distributions, conditional expectations, Markov chains (Review ch02)

    Self-check: Can you state the data processing inequality $I(X;Z) \leq I(X;Y)$ for the Markov chain $X \to Y \to Z$ and prove it using the chain rule for mutual information?

  • Optimisation: convex sets, KKT conditions, Lagrange duality (Review ch03)

    Self-check: Can you formulate a convex optimisation problem, derive the KKT conditions, and solve the water-filling problem $\max \sum_i \log(1 + p_i \lambda_i)$ subject to $\sum_i p_i \leq P$?

  • Information theory: entropy, mutual information, channel capacity, AEP (Review ch11)

    Self-check: Can you state the channel coding theorem, define the capacity $C = \max_{p(x)} I(X;Y)$ of a DMC, and sketch the achievability proof using random coding and joint typicality?

  • MIMO capacity and precoding (Review ch17)

    Self-check: Can you derive the MIMO capacity $C = \max_{\mathbf{K}_x:\, \mathrm{tr}(\mathbf{K}_x) \leq P} \log\det(\mathbf{I} + \mathbf{H}\mathbf{K}_x\mathbf{H}^{H}/\sigma^2)$ and show that the optimal $\mathbf{K}_x$ follows from water-filling on the channel singular values?
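The water-filling problem in the self-checks above can be sketched numerically. The sketch below assumes the standard closed-form solution $p_i = \max(0, \mu - 1/\lambda_i)$, with the water level $\mu$ found by bisection so that the powers sum to $P$; the names `water_fill` and `lam` are illustrative, not from the chapter.

```python
import math

def water_fill(lam, P, tol=1e-12):
    """Allocate powers p_i = max(0, mu - 1/lam_i) with sum p_i = P.

    lam : list of positive channel gains (lambda_i)
    P   : total power budget
    The water level mu is found by bisection: the total used power is
    continuous and nondecreasing in mu, so a bracket suffices.
    """
    lo, hi = 0.0, P + max(1.0 / l for l in lam)  # mu is guaranteed in [lo, hi]
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        used = sum(max(0.0, mu - 1.0 / l) for l in lam)
        if used > P:
            hi = mu
        else:
            lo = mu
    mu = 0.5 * (lo + hi)
    return [max(0.0, mu - 1.0 / l) for l in lam]

# Three parallel channels: the weakest (lambda = 0.25) gets no power here.
lam = [2.0, 1.0, 0.25]
p = water_fill(lam, P=3.0)
rate = sum(math.log(1.0 + pi * li) for pi, li in zip(p, lam))
```

Note that the bisection only exploits monotonicity of the used power in $\mu$; for large problems a sort-based exact solution is also common.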

Chapter 26 Notation

Key symbols introduced or heavily used in this chapter.

Symbol | Meaning | Introduced
$\mathcal{C}$ | Capacity region (set of achievable rate tuples) | s01
$\mathcal{R}$ | Achievable rate region (inner bound on capacity) | s01
$P_1, P_2$ | Transmit power constraints for users 1 and 2 | s01
$N$ | Noise variance (power) at the receiver | s01
$\alpha$ | Power allocation fraction in superposition coding (BC) | s02
$\mathrm{INR}$ | Interference-to-noise ratio | s03
$\mathrm{SNR}$ | Signal-to-noise ratio | s03
$d(\mathrm{SNR})$ | Degrees of freedom: $\lim_{\mathrm{SNR} \to \infty} R(\mathrm{SNR}) / \log(\mathrm{SNR})$ | s05
$V$ | Channel dispersion (variance of the information density) | s06
$Q^{-1}(\cdot)$ | Inverse of the Gaussian Q-function | s06
$\epsilon$ | Block error probability (finite blocklength) | s06
$n$ | Blocklength (number of channel uses) | s06
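
The finite-blocklength symbols $V$, $Q^{-1}(\cdot)$, $\epsilon$, and $n$ combine in the normal approximation $R \approx C - \sqrt{V/n}\, Q^{-1}(\epsilon)$. A minimal numerical sketch for the real AWGN channel, assuming the standard dispersion expression $V = \frac{\mathrm{SNR}(\mathrm{SNR}+2)}{2(\mathrm{SNR}+1)^2} \log_2^2 e$; the function names and the numbers are illustrative only.

```python
import math
from statistics import NormalDist

def q_inv(eps):
    """Inverse Gaussian Q-function: Q(x) = P(Z > x), so Q^{-1}(eps) = Phi^{-1}(1 - eps)."""
    return NormalDist().inv_cdf(1.0 - eps)

def awgn_normal_approx(snr, n, eps):
    """Normal approximation R ~ C - sqrt(V/n) * Q^{-1}(eps), in bits per channel use."""
    C = 0.5 * math.log2(1.0 + snr)  # AWGN capacity
    V = (snr * (snr + 2.0)) / (2.0 * (snr + 1.0) ** 2) * math.log2(math.e) ** 2
    return C - math.sqrt(V / n) * q_inv(eps)

# At SNR = 1 (0 dB), C = 0.5 bit/use; the backoff shrinks as n grows.
R_short = awgn_normal_approx(snr=1.0, n=1000, eps=1e-3)
R_long = awgn_normal_approx(snr=1.0, n=10**6, eps=1e-3)
```

The backoff term $\sqrt{V/n}\, Q^{-1}(\epsilon)$ makes the $O(1/\sqrt{n})$ gap to capacity explicit: at $n = 1000$ the achievable rate sits noticeably below $C$, while at $n = 10^6$ it is nearly closed.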