Chapter Summary
Key Points
1. The conditional expectation $E[X \mid Y]$ is a random variable, a function of $Y$, not a number. Its key properties are linearity, the tower property ($E[E[X \mid Y]] = E[X]$), pulling out what is known, and invariance under independence.
2. The MMSE estimator of $X$ given $Y$ is the conditional expectation: $\hat{X} = E[X \mid Y]$. It minimizes the mean square error $E[(X - g(Y))^2]$ over all measurable functions $g$ of $Y$.
3. The orthogonality principle states that the MMSE estimation error $X - E[X \mid Y]$ is orthogonal to every function of $Y$. Geometrically, $E[X \mid Y]$ is the projection of $X$ onto the subspace of functions of $Y$.
4. The LMMSE estimator restricts to affine functions and requires only first and second moments: $\hat{X}_L = E[X] + \frac{\mathrm{Cov}(X, Y)}{\mathrm{Var}(Y)}\,(Y - E[Y])$. For jointly Gaussian data, LMMSE equals MMSE.
5. The law of total variance decomposes $\mathrm{Var}(X) = E[\mathrm{Var}(X \mid Y)] + \mathrm{Var}(E[X \mid Y])$ into unexplained and explained components. The MMSE equals the average conditional variance $E[\mathrm{Var}(X \mid Y)]$.
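The key points above can be checked numerically. The following Monte Carlo sketch (our own illustration, not from the chapter; the model $X = Y + W$ and all variable names are assumptions) uses a jointly Gaussian pair, so the LMMSE estimator coincides with the MMSE estimator $E[X \mid Y] = Y$, and verifies the orthogonality of the estimation error and the law-of-total-variance decomposition:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Jointly Gaussian pair: Y ~ N(0, 1), X = Y + W with W ~ N(0, 0.25),
# so E[X | Y] = Y and Var(X | Y) = 0.25.
Y = rng.normal(0.0, 1.0, n)
X = Y + rng.normal(0.0, 0.5, n)

# LMMSE estimator: E[X] + Cov(X, Y) / Var(Y) * (Y - E[Y])
a = np.cov(X, Y, bias=True)[0, 1] / np.var(Y)
X_hat = X.mean() + a * (Y - Y.mean())
err = X - X_hat

# Orthogonality: the error is uncorrelated with Y; because the pair is
# jointly Gaussian (LMMSE = MMSE), it is also nearly uncorrelated with
# nonlinear functions of Y such as Y**2, up to Monte Carlo noise.
print(np.mean(err * Y))       # ~ 0
print(np.mean(err * Y**2))    # ~ 0

# Law of total variance: Var(X) = E[Var(X|Y)] + Var(E[X|Y]) ~ 0.25 + 1
print(np.var(X), 0.25 + np.var(Y))
```

The sample variance of the error, `np.var(err)`, also approximates the MMSE, i.e. the average conditional variance 0.25 in this model.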
Looking Ahead
Chapter 13 introduces stochastic processes — random functions of time. The conditional expectation and LMMSE tools from this chapter become the foundation for Wiener filtering (optimal linear prediction of stationary processes) and Kalman filtering (recursive state estimation for dynamical systems), treated in the FSI book.