# Prerequisites & Notation

## Before You Begin
This chapter brings together ideas from several earlier chapters. We need joint distributions (Chapter 7), covariance and expectation (Chapters 5–6), and basic linear algebra (eigendecomposition, positive semi-definiteness). If any item below is unfamiliar, revisit the linked material first.
- Joint PDFs, marginal and conditional distributions (review ch07)
  - Self-check: Given a joint PDF $f_{X,Y}(x, y)$, can you compute the marginal $f_X(x)$ and the conditional $f_{Y|X}(y \mid x)$?
- Covariance, variance, and the correlation coefficient (review ch07)
  - Self-check: Can you compute $\operatorname{Cov}(X, Y)$ from a joint distribution?
- Eigenvalue decomposition of symmetric matrices
  - Self-check: Given a symmetric matrix, can you find its eigenvalues and eigenvectors?
- Positive semi-definite matrices
  - Self-check: Can you verify that $\mathbf{x}^\top \mathbf{A} \mathbf{x} \ge 0$ for all $\mathbf{x}$?
- Matrix multiplication, transpose, inverse, and determinant
  - Self-check: Can you compute $\mathbf{A}^{-1}$ and $\det \mathbf{A}$ for a given matrix?
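Several of these self-checks can be verified numerically. The sketch below uses NumPy on a hypothetical $2 \times 2$ symmetric matrix (the matrix `A` is illustrative, not taken from the chapter): it computes the eigendecomposition, tests positive semi-definiteness both via the eigenvalues and via the quadratic form, and computes the inverse and determinant.

```python
import numpy as np

# Hypothetical symmetric matrix used only for these self-checks.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalue decomposition of a symmetric matrix: A = V diag(w) V^T.
w, V = np.linalg.eigh(A)

# Positive semi-definiteness: all eigenvalues >= 0, which is
# equivalent to x^T A x >= 0 for every vector x.
is_psd = bool(np.all(w >= 0))

# Spot-check the quadratic form x^T A x on random vectors.
rng = np.random.default_rng(0)
quad_forms = [x @ A @ x for x in rng.standard_normal((100, 2))]

# Inverse and determinant.
A_inv = np.linalg.inv(A)
det_A = np.linalg.det(A)
```

For this particular `A`, the eigenvalues are 1 and 3, so the matrix is in fact positive definite and every quadratic form value is non-negative.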
## Notation for This Chapter
Symbols introduced or heavily used in this chapter. Bold lowercase denotes vectors; bold uppercase denotes matrices.
| Symbol | Meaning | Introduced |
|---|---|---|
| $\mathbf{x}$ | Random vector | s01 |
| $\boldsymbol{\mu}$ | Mean vector | s01 |
| $\boldsymbol{\Sigma}$ | Covariance matrix | s01 |
| $\mathbf{R}$ | Correlation matrix | s01 |
| $\mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ | Multivariate Gaussian distribution | s02 |
| $\boldsymbol{\Lambda} = \boldsymbol{\Sigma}^{-1}$ | Precision (information) matrix | s02 |
| $\mathcal{CN}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ | Proper complex Gaussian distribution | s08 |
| $\chi^2_k$ | Chi-squared distribution with $k$ degrees of freedom | s07 |
| $Q(x)$ | Gaussian tail probability | s00 |
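To make the notation concrete, here is a short NumPy sketch with an illustrative mean vector and covariance matrix (the numeric values are assumptions, not from the chapter). It forms the precision matrix as the inverse of the covariance matrix, derives the correlation matrix from the covariance matrix, and evaluates the Gaussian tail probability (the standard $Q$-function) via the complementary error function.

```python
import math
import numpy as np

# Hypothetical mean vector and covariance matrix of a 2-D random vector.
mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# Precision (information) matrix: the inverse of the covariance matrix.
Lam = np.linalg.inv(Sigma)

# Correlation matrix: scale out the standard deviations,
# R[i, j] = Sigma[i, j] / (sigma_i * sigma_j), so diag(R) = 1.
d = np.sqrt(np.diag(Sigma))
R = Sigma / np.outer(d, d)

def Q(x):
    """Gaussian tail probability Q(x) = P(Z > x), Z ~ N(0, 1)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))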