Prerequisites

Before You Begin

This chapter brings together linear algebra (Chapter 1), probability theory (Chapter 2), fading channel models (Chapter 6), antenna arrays (Chapter 7), and information-theoretic foundations (Chapter 11). MIMO theory is the synthesis of all these threads: linear algebra provides the SVD and matrix decompositions; probability gives us the random channel model; fading channels define the propagation environment; array theory establishes the spatial dimension; and information theory supplies the capacity formulas.

  • Singular value decomposition (SVD) and eigenvalue decomposition (Review ch01)

    Self-check: Can you compute the SVD $\mathbf{A} = \mathbf{U}\boldsymbol{\Sigma}\mathbf{V}^H$ of a $3 \times 2$ matrix and identify its rank from the singular values?

  • Matrix rank, null space, and condition number (Review ch01)

    Self-check: Can you determine the rank of a matrix from its singular values and explain what a large condition number implies about numerical stability?

  • Complex Gaussian random vectors and covariance matrices (Review ch02)

    Self-check: Can you write the PDF of a circularly symmetric complex Gaussian vector $\mathbf{z} \sim \mathcal{CN}(\mathbf{0}, \mathbf{R})$ and compute expectations like $\mathbb{E}[\mathbf{z}\mathbf{z}^H]$?

  • Rayleigh and Ricean fading models (Review ch06)

    Self-check: Can you describe how Rayleigh fading arises from the sum of many scattered paths and state the distribution of the fading envelope and instantaneous SNR?

  • Antenna array response vectors and beamforming (Review ch07)

    Self-check: Can you write the array steering vector $\mathbf{a}(\theta) = [1, e^{j2\pi d\sin\theta/\lambda}, \ldots]^T$ for a uniform linear array and explain spatial filtering?

  • Channel capacity, mutual information, and water-filling (Review ch11)

    Self-check: Can you state the capacity of a scalar AWGN channel, $C = \log_2(1 + \text{SNR})$, and explain the water-filling principle for parallel Gaussian channels?

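The linear-algebra and probability self-checks above can be verified numerically. The sketch below (assuming NumPy; the matrix, covariance, and variable names are illustrative, not from the chapter) computes the SVD of a $3 \times 2$ complex matrix, reads off its rank and condition number from the singular values, and confirms $\mathbb{E}[\mathbf{z}\mathbf{z}^H] \approx \mathbf{R}$ by sample averaging for a circularly symmetric complex Gaussian vector.

```python
import numpy as np

rng = np.random.default_rng(0)

# SVD of a 3x2 complex matrix: A = U @ diag(s) @ Vh (reduced form)
A = rng.normal(size=(3, 2)) + 1j * rng.normal(size=(3, 2))
U, s, Vh = np.linalg.svd(A)               # s holds singular values, descending
assert np.allclose(U[:, :2] * s @ Vh, A)  # reconstruct A from the reduced SVD

# Rank = number of singular values above a numerical tolerance
tol = max(A.shape) * np.finfo(float).eps * s[0]
rank = int(np.sum(s > tol))

# Condition number kappa(A) = sigma_max / sigma_min
kappa = s[0] / s[-1]

# z ~ CN(0, R): color unit-variance CN(0, I) samples so Cov(z) = L L^H = R
R = np.array([[2.0, 0.5], [0.5, 1.0]])    # an example covariance matrix
L = np.linalg.cholesky(R)
w = (rng.normal(size=(2, 100_000)) + 1j * rng.normal(size=(2, 100_000))) / np.sqrt(2)
z = L @ w
R_hat = z @ z.conj().T / z.shape[1]       # sample estimate of E[z z^H]
```

A generic $3 \times 2$ matrix has two nonzero singular values, so `rank` comes out 2 here; a `kappa` far above 1 would signal an ill-conditioned (nearly rank-deficient) channel.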
Chapter 15 Notation

Key symbols introduced or heavily used in this chapter. Bold uppercase denotes matrices; bold lowercase denotes column vectors.

| Symbol | Meaning | Introduced |
|---|---|---|
| $\mathbf{H}$ | MIMO channel matrix ($n_r \times n_t$) | s01 |
| $n_t$ | Number of transmit antennas | s01 |
| $n_r$ | Number of receive antennas | s01 |
| $\mathbf{x}$ | Transmitted signal vector ($n_t \times 1$) | s01 |
| $\mathbf{y}$ | Received signal vector ($n_r \times 1$) | s01 |
| $\mathbf{n}$ | Additive noise vector, $\mathbf{n} \sim \mathcal{CN}(\mathbf{0}, \sigma^2 \mathbf{I})$ | s01 |
| $\sigma_i$ | $i$-th singular value of $\mathbf{H}$ | s01 |
| $\kappa(\mathbf{H})$ | Condition number $\sigma_{\max}/\sigma_{\min}$ of the channel matrix | s01 |
| $\mathbf{R}_t, \mathbf{R}_r$ | Transmit and receive spatial correlation matrices | s02 |
| $\mathbf{Q}$ | Input covariance matrix $\mathbb{E}[\mathbf{x}\mathbf{x}^H]$ | s03 |
| $P$ | Total transmit power constraint | s03 |
| $C_{\mathrm{MIMO}}$ | MIMO channel capacity (bits/s/Hz) | s03 |
| $r$ | Multiplexing rate (in DMT context) | s06 |
| $d(r)$ | Diversity gain as a function of multiplexing rate | s06 |
| $\text{SNR}$ | Signal-to-noise ratio $P/\sigma^2$ | s01 |
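To see how these symbols fit together before the chapter proper, the following sketch (assuming NumPy; the antenna counts and SNR value are illustrative) draws an i.i.d. Rayleigh channel $\mathbf{H}$, evaluates the standard log-det capacity under equal power allocation, and compares it against water-filling over the eigenmodes $\sigma_i^2$ of $\mathbf{H}$ — the allocation reviewed in the last self-check above.

```python
import numpy as np

rng = np.random.default_rng(1)
n_t, n_r = 2, 3
snr = 10.0                      # SNR = P / sigma^2; take sigma^2 = 1 so P = snr

# i.i.d. Rayleigh channel: entries ~ CN(0, 1)
H = (rng.normal(size=(n_r, n_t)) + 1j * rng.normal(size=(n_r, n_t))) / np.sqrt(2)

# Equal-power capacity: C = log2 det(I + (SNR / n_t) H H^H)  [bits/s/Hz]
C_eq = np.log2(np.linalg.det(np.eye(n_r) + (snr / n_t) * H @ H.conj().T).real)

# Water-filling over the eigenmode gains g_i = sigma_i^2
g = np.linalg.svd(H, compute_uv=False) ** 2
# Bisect on the water level mu so that sum_i max(mu - 1/g_i, 0) = P
lo, hi = 0.0, snr + 1.0 / g.min()
for _ in range(100):
    mu = (lo + hi) / 2
    if np.maximum(mu - 1.0 / g, 0.0).sum() > snr:
        hi = mu
    else:
        lo = mu
p = np.maximum(mu - 1.0 / g, 0.0)            # per-mode power allocation
C_wf = np.log2(1.0 + p * g).sum()            # water-filling capacity
```

Since equal power split across the eigenmodes is one admissible allocation, `C_wf` can never fall below `C_eq`; the two coincide at high SNR, where water-filling floods every mode.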

Notation Note: Noise Vector

This chapter uses $\mathbf{n}$ for the additive noise vector, following the convention of Tse & Viswanath (2005) and most MIMO literature. The Ferkans library standard (see notation.yaml) designates $\mathbf{w}$ for noise to avoid collision with the antenna counts $n_t$ and $n_r$. In this chapter, the subscripts on the antenna counts provide sufficient disambiguation.