Prerequisites & Notation
Prerequisites for Chapter 21
This chapter introduces random matrix theory from the perspective of wireless communications. The reader needs linear algebra (eigenvalues, SVD), probability at the level of FSP Chapters 5--11, and some familiarity with complex Gaussian vectors (Ch. 8). No prior exposure to random matrix theory is assumed.
- Eigenvalue decomposition and singular value decomposition
Self-check: Can you compute the eigenvalues of a Hermitian matrix and state the relationship between its eigenvalues and its singular values?
- Complex Gaussian vectors and covariance matrices (Review Ch. 8)
Self-check: Can you write the density of a circularly symmetric complex Gaussian vector $\mathbf{x} \sim \mathcal{CN}(\mathbf{0}, \boldsymbol{\Sigma})$ and compute $\mathbb{E}[\mathbf{x}\mathbf{x}^H]$?
- Convergence in distribution, weak law of large numbers (Review Ch. 11)
Self-check: Do you know the difference between convergence in distribution and convergence in probability?
- Moment generating functions and characteristic functions (Review Ch. 9)
Self-check: Can you compute the MGF of a Gaussian random variable?
- Logarithm and determinant of positive definite matrices
Self-check: Do you know that $\log\det\mathbf{A} = \sum_i \log\lambda_i(\mathbf{A})$ for Hermitian PD $\mathbf{A}$?
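The last self-check above can be verified numerically. The following NumPy sketch (the matrix construction and variable names are illustrative choices, not part of the chapter) checks that the log-determinant of a Hermitian positive definite matrix equals the sum of the logs of its eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random Hermitian positive definite matrix A = B B^H + I.
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B @ B.conj().T + np.eye(4)

# log det A = sum_i log lambda_i for Hermitian PD A.
eigvals = np.linalg.eigvalsh(A)          # real eigenvalues of a Hermitian matrix
lhs = np.log(np.linalg.det(A).real)      # direct log-determinant
rhs = np.log(eigvals).sum()              # sum of log-eigenvalues
assert np.isclose(lhs, rhs)

# slogdet is the numerically stable route for large or ill-conditioned A.
sign, logabsdet = np.linalg.slogdet(A)
assert np.isclose(logabsdet, rhs)
```

For large matrices, working with `slogdet` (or the sum of log-eigenvalues) avoids the overflow that a direct determinant can produce.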
Notation for This Chapter
The following notation is used throughout Chapter 21. We write $\mathbf{H}$ for a generic random matrix (not necessarily a channel matrix in this chapter). The notation $F^{\mathbf{A}}$ denotes the empirical spectral distribution of a Hermitian matrix $\mathbf{A}$.
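The ESD is simply the step function that jumps by $1/n$ at each eigenvalue. As a minimal sketch (the function name `esd` is mine, not the chapter's):

```python
import numpy as np

def esd(A: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Empirical spectral distribution of a Hermitian matrix A,
    evaluated at the points x: F(x) = (1/n) * #{i : lambda_i <= x}."""
    lam = np.linalg.eigvalsh(A)                    # real eigenvalues, ascending
    return np.searchsorted(lam, x, side="right") / lam.size

# Example: the ESD of the identity jumps from 0 to 1 at x = 1.
print(esd(np.eye(3), np.array([0.5, 1.0, 1.5])))   # [0. 1. 1.]
```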
| Symbol | Meaning | Introduced |
|---|---|---|
| $\mathbf{H}$ | Random matrix (generic), often with i.i.d. entries | s01 |
| $\mathbf{W}$ | Sample covariance (Wishart-type) matrix | s01 |
| $\lambda_1 \ge \cdots \ge \lambda_n$ | Ordered eigenvalues of a Hermitian matrix | s01 |
| $F^{\mathbf{A}}$ | Empirical spectral distribution (ESD) of Hermitian $\mathbf{A}$ | s01 |
| $c$ | Aspect ratio of the matrix | s02 |
| $f_c$ | Marchenko-Pastur density with parameter $c$ | s02 |
| $\lambda_-, \lambda_+$ | Endpoints of the Marchenko-Pastur support | s02 |
| $m_\mu(z)$ | Stieltjes transform of a distribution $\mu$ | s03 |
| $\sigma^2$ | Noise variance / noise power | |
| $\mathcal{N}(\mu, \sigma^2)$ | Gaussian distribution with mean $\mu$ and variance $\sigma^2$ | |
| $\mathcal{CN}(0, 1)$ | Standard circularly symmetric complex Gaussian | |
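The Marchenko-Pastur notation above can be previewed with a small simulation. The sketch below assumes the common conventions $c = p/n$ for an $n \times p$ matrix with unit-variance i.i.d. entries, $\mathbf{W} = \frac{1}{n}\mathbf{H}^H\mathbf{H}$, and support endpoints $(1 \pm \sqrt{c})^2$; the chapter's precise definitions appear in s02, and the exact scaling used there may differ:

```python
import numpy as np

rng = np.random.default_rng(1)

# n x p matrix with i.i.d. CN(0, 1) entries; aspect ratio c = p/n (assumed convention).
n, p = 2000, 1000
c = p / n
H = (rng.standard_normal((n, p)) + 1j * rng.standard_normal((n, p))) / np.sqrt(2)

# Sample covariance (Wishart-type) matrix W = (1/n) H^H H.
W = H.conj().T @ H / n
eigs = np.linalg.eigvalsh(W)

# Marchenko-Pastur support endpoints (1 +/- sqrt(c))^2 for unit-variance entries.
lam_minus, lam_plus = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2

# With n and p this large, essentially all eigenvalues fall inside the support
# (a small margin absorbs finite-size edge fluctuations).
inside = np.mean((eigs >= lam_minus - 0.05) & (eigs <= lam_plus + 0.05))
print(f"fraction of eigenvalues inside MP support: {inside:.3f}")
```

Plotting a histogram of `eigs` against the density $f_c$ makes the agreement visible; the endpoints alone already show how sharply the spectrum concentrates.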