Prerequisites & Notation
Before You Begin
This chapter assumes comfort with random vectors, Gaussian distributions, and the basic properties of estimators introduced in Chapters 5 and 6. If any item feels unfamiliar, revisit the referenced material first.
- Joint, marginal, and conditional distributions; Bayes' rule
Self-check: Can you write $p(\theta \mid x)$ in terms of $p(x \mid \theta)$ and $p(\theta)$?
- Multivariate Gaussian distribution: density, marginals, conditionals
Self-check: Given $\theta$ and $x$ jointly Gaussian, do you know the formula for the conditional $p(\theta \mid x)$?
- Covariance matrices, positive semidefiniteness, matrix inversion (Review ch01 (telecom))
Self-check: Can you state when a covariance matrix is strictly positive definite?
- Classical estimation: bias, variance, MSE, CRLB, MLE (Review ch05, ch06)
Self-check: Can you state the CRLB and explain when the MLE attains it?
- Orthogonal projection onto a subspace in an inner product space (Review ch01 (telecom))
Self-check: Can you characterize the projection of a vector $x$ onto a subspace $\mathcal{S}$ by an orthogonality condition?
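If the Gaussian-conditioning self-check above feels rusty, the following minimal sketch may help. It verifies the scalar conditional-mean formula $\mathbb{E}[\theta \mid x] = \mu_\theta + C_{\theta x} C_x^{-1}(x - \mu_x)$ by Monte Carlo; all numerical values are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

# Illustrative joint Gaussian for (theta, x); numbers are assumptions.
rng = np.random.default_rng(0)
mu = np.array([1.0, 2.0])            # [mu_theta, mu_x]
C = np.array([[2.0, 0.8],
              [0.8, 1.0]])           # joint covariance of (theta, x)

# Closed-form conditional mean of theta given x = x0
x0 = 2.5
cond_mean = mu[0] + C[0, 1] / C[1, 1] * (x0 - mu[1])

# Monte Carlo check: average theta over samples whose x lands near x0
samples = rng.multivariate_normal(mu, C, size=500_000)
near = np.abs(samples[:, 1] - x0) < 0.02
mc_mean = samples[near, 0].mean()

print(cond_mean, mc_mean)  # the two should agree to roughly two decimals
```

If the two printed numbers disagree substantially, revisit the multivariate Gaussian material before continuing.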
Notation for This Chapter
Symbols used throughout Chapter 7. The Bayesian viewpoint treats the parameter as a random variable, so the parameter itself carries a prior distribution.
| Symbol | Meaning | Introduced |
|---|---|---|
| $\theta$ | Scalar or vector parameter to be estimated (random in the Bayesian framework) | s01 |
| $x$ | Scalar or vector observation | s01 |
| $p(\theta)$ | Prior density of $\theta$ | s01 |
| $p(x \mid \theta)$ | Likelihood: conditional density of $x$ given $\theta$ | s01 |
| $p(\theta \mid x)$ | Posterior density of $\theta$ given the observation | s01 |
| $\hat{\theta}_{\mathrm{MAP}}$ | Maximum a posteriori (MAP) estimator | s02 |
| $\hat{\theta}_{\mathrm{MMSE}}$ | Minimum mean-square error estimator | s02 |
| $\hat{\theta}_{\mathrm{LMMSE}}$ | Linear MMSE estimator (affine function of $x$) | s04 |
| $K$ | LMMSE gain matrix | s04 |
| $C_\theta$ | Prior covariance of $\theta$ | s04 |
| $C_x$ | Covariance of the observation | s04 |
| $C_{\theta x}$ | Cross-covariance of $\theta$ and $x$ | s04 |
| $\mu_\theta,\ \mu_x$ | Mean vectors of $\theta$ and $x$ | s04 |
| $C_{\theta \mid x}$ | Posterior (MMSE-error) covariance | s04 |
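To see how the s04 symbols fit together, here is a hedged sketch of the LMMSE estimate $\hat{\theta}_{\mathrm{LMMSE}} = \mu_\theta + K(x - \mu_x)$ with gain $K = C_{\theta x} C_x^{-1}$ and error covariance $C_{\theta \mid x} = C_\theta - K C_{\theta x}^{\mathsf{T}}$. The specific matrices are illustrative assumptions chosen only so the shapes line up.

```python
import numpy as np

# Illustrative second-order statistics (assumptions, not chapter data):
# theta is 2-dimensional, x is a scalar observation.
mu_t = np.array([0.0, 0.0])          # mu_theta
mu_x = np.array([1.0])               # mu_x
C_t  = np.array([[1.0, 0.3],
                 [0.3, 1.0]])        # C_theta: prior covariance
C_x  = np.array([[2.0]])             # C_x: covariance of the observation
C_tx = np.array([[0.5],
                 [0.2]])             # C_{theta x}: cross-covariance

K = C_tx @ np.linalg.inv(C_x)        # LMMSE gain matrix
x = np.array([1.8])                  # one observed value
theta_hat = mu_t + K @ (x - mu_x)    # affine in x, as the table states
C_post = C_t - K @ C_tx.T            # posterior (MMSE-error) covariance

print(theta_hat)
print(C_post)
```

Note that the gain scales the innovation $x - \mu_x$, and the error covariance never exceeds the prior covariance: observing $x$ can only reduce uncertainty about $\theta$.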