Prerequisites & Notation

Before You Begin

This chapter builds the bridge between probability and estimation theory. The tools developed here (conditional expectation as a random variable, the MMSE estimator, the LMMSE estimator) are the workhorses of Bayesian inference and signal processing. Make sure the following are solid before proceeding.

  • Expectation, variance, and covariance (review ch04)

    Self-check: Can you compute E[g(X,Y)] for a given joint density f_{X,Y}(x,y)?

  • Joint and conditional distributions (review ch03)

    Self-check: Can you derive f(y|x) from the joint density f_{X,Y}(x,y)?

  • Gaussian random vectors (review ch06)

    Self-check: Do you know the conditional distribution of a sub-vector of a jointly Gaussian vector?

  • Matrix inversion and positive definiteness

    Self-check: Can you invert a 2 × 2 matrix and check whether a matrix is positive definite?
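The last self-check can be verified numerically. Below is a minimal sketch (assuming NumPy is available; the matrix values are hypothetical) that inverts a 2 × 2 matrix with the closed-form formula and tests positive definiteness via eigenvalues:

```python
import numpy as np

# A symmetric 2x2 matrix (hypothetical example values).
C = np.array([[2.0, 0.5],
              [0.5, 1.0]])

# Closed-form 2x2 inverse: inv([[a, b], [c, d]]) = (1/det) * [[d, -b], [-c, a]].
a, b = C[0]
c, d = C[1]
det = a * d - b * c
C_inv = np.array([[d, -b], [-c, a]]) / det

# Cross-check against NumPy's general-purpose inverse.
assert np.allclose(C_inv, np.linalg.inv(C))

# A symmetric matrix is positive definite iff all its eigenvalues are positive.
is_pd = np.all(np.linalg.eigvalsh(C) > 0)
print(is_pd)  # True for this C
```

If the eigenvalue check feels slow for large matrices, attempting a Cholesky factorization (`np.linalg.cholesky`) and catching the failure is a common alternative test.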

Notation for This Chapter

Symbols introduced or heavily used in this chapter.

Symbol      Meaning                                                                     Introduced
---------   -------------------------------------------------------------------------   ----------
E[X|Y]      Conditional expectation of X given Y (a random variable, a function of Y)   s01
X̂_MMSE      MMSE estimator: X̂_MMSE = E[X|Y]                                             s02
X̂_LMMSE     Linear MMSE estimator                                                       s03
C_XY        Cross-covariance matrix Cov(X, Y)                                           s03
C_YY        Covariance matrix of Y                                                      s03
Var(X|Y)    Conditional variance of X given Y (a random variable)                       s04
MSE(g)      Mean square error of estimator g: E[(X - g(Y))^2]                           s02
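To preview how these symbols fit together, here is a minimal NumPy sketch (the toy model Y = X + noise and all sample sizes are illustrative assumptions) of the scalar LMMSE estimator X̂_LMMSE = μ_X + C_XY C_YY⁻¹ (Y − μ_Y), compared against the constant estimator g(Y) = μ_X under the MSE(g) criterion:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (hypothetical): scalar X observed through noisy Y = X + N.
n = 100_000
X = rng.normal(loc=1.0, scale=2.0, size=n)   # Var(X) = 4
Y = X + rng.normal(scale=1.0, size=n)        # observation noise, variance 1

# Sample versions of the chapter's quantities.
mu_X, mu_Y = X.mean(), Y.mean()
C_XY = np.cov(X, Y)[0, 1]   # cross-covariance Cov(X, Y)
C_YY = Y.var()              # (co)variance of Y; a scalar here

# LMMSE estimator: X_hat = mu_X + C_XY * C_YY^{-1} * (Y - mu_Y).
X_hat = mu_X + (C_XY / C_YY) * (Y - mu_Y)

# MSE(g) = E[(X - g(Y))^2]; the LMMSE estimate should beat the constant mu_X.
mse_lmmse = np.mean((X - X_hat) ** 2)
mse_const = np.mean((X - mu_X) ** 2)
print(mse_lmmse < mse_const)  # True
```

In this Gaussian toy model the LMMSE and MMSE estimators coincide, so the empirical MSE should land near the theoretical value Var(X) − C_XY²/C_YY = 4 − 16/5 = 0.8, versus 4 for the constant estimator.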