Chapter Summary
Key Points
1. Bias-variance identity. For any estimator $\hat{\theta}$ of a scalar $\theta$, $\mathrm{MSE}(\hat{\theta}) = \mathbb{E}[(\hat{\theta}-\theta)^2] = \mathrm{Var}(\hat{\theta}) + \mathrm{Bias}(\hat{\theta})^2$. Tuning an estimator is a trade between these two terms; biased estimators can beat unbiased ones on MSE.
2. Fisher information. Under regularity, $I(\theta) = \mathbb{E}\bigl[(\partial_\theta \ln p(x;\theta))^2\bigr] = -\mathbb{E}\bigl[\partial_\theta^2 \ln p(x;\theta)\bigr]$. For independent observations it is additive; for i.i.d. samples, $I_N(\theta) = N\,I_1(\theta)$.
3. CRB (scalar). Any unbiased estimator satisfies $\mathrm{Var}(\hat{\theta}) \ge 1/I(\theta)$. The proof is Cauchy--Schwarz on the centered estimator and the score; equality (efficiency) holds iff the score is affine in $\hat{\theta}$, i.e., $\partial_\theta \ln p(x;\theta) = I(\theta)\,(\hat{\theta}-\theta)$.
4. CRB (vector). $\mathrm{Cov}(\hat{\boldsymbol{\theta}}) \succeq \mathbf{I}(\boldsymbol{\theta})^{-1}$. The componentwise bound $[\mathbf{I}(\boldsymbol{\theta})^{-1}]_{ii}$ is generally larger than $1/[\mathbf{I}(\boldsymbol{\theta})]_{ii}$, the gap measuring the price of joint estimation.
5. Fisher--Neyman factorization. $T(\mathbf{x})$ is sufficient iff $p(\mathbf{x};\theta) = g(T(\mathbf{x}),\theta)\,h(\mathbf{x})$. In practice you identify the $\theta$-dependence in the likelihood and read off $T$. The exponential family $p(x;\theta) = h(x)\exp\{\eta(\theta)\,T(x) - A(\theta)\}$ makes $T$ automatically sufficient --- and, when the natural parameter image has full dimension, complete.
6. Rao--Blackwell. Conditioning any unbiased estimator on a sufficient statistic, $\tilde{\theta} = \mathbb{E}[\hat{\theta} \mid T]$, produces an unbiased estimator with no larger variance. It is a statistical projection: equality holds iff the original estimator was already a function of the sufficient statistic.
7. Lehmann--Scheffé. When $T$ is a complete sufficient statistic, any unbiased function of $T$ is the unique MVUE. This gives a constructive MVUE recipe: find a complete sufficient $T$, find any unbiased function of $T$, done. Efficiency implies MVUE, but not conversely (e.g., the Bessel-corrected sample variance is MVUE but not efficient).
8. Engineering relevance. The matched filter output is a sufficient statistic. Pilot SNR directly controls the CRB on channel estimates. In ISAC, the CRB on target parameters defines one side of the sensing--communication Pareto frontier.
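Key point 1 can be checked numerically. The sketch below, with illustrative parameter values, verifies the decomposition $\mathrm{MSE} = \mathrm{Var} + \mathrm{Bias}^2$ on Monte Carlo draws and shows a deliberately biased shrinkage estimator (a hypothetical choice, $0.7\,\bar{x}$) beating the unbiased sample mean on MSE:

```python
import numpy as np

# Monte Carlo check of MSE = Var + Bias^2, and of a biased estimator
# beating an unbiased one on MSE. All parameter values are illustrative.
rng = np.random.default_rng(0)
theta, sigma, N, trials = 1.0, 2.0, 10, 200_000

x = rng.normal(theta, sigma, size=(trials, N))
xbar = x.mean(axis=1)        # unbiased sample mean: MSE = sigma^2/N = 0.4
shrunk = 0.7 * xbar          # deliberately biased shrinkage estimator

mse_unbiased = np.mean((xbar - theta) ** 2)
mse_shrunk = np.mean((shrunk - theta) ** 2)

# bias-variance identity, computed from the same draws
bias_sq = (shrunk.mean() - theta) ** 2
decomposed = shrunk.var() + bias_sq
```

The identity holds exactly on the empirical distribution, so `decomposed` matches `mse_shrunk` to floating-point precision, while `mse_shrunk` lands below `mse_unbiased`.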
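Key points 2 and 3 can be illustrated together on the Gaussian mean with known noise variance, where $I_1(\mu) = 1/\sigma^2$. A minimal sketch, assuming illustrative parameters: the variance of the full-sample score recovers $I_N = N I_1$, and the sample mean attains the bound $1/I_N = \sigma^2/N$:

```python
import numpy as np

# For N i.i.d. Gaussian samples with known sigma, I_1(mu) = 1/sigma^2,
# so I_N = N/sigma^2, and the sample mean attains the CRB sigma^2/N.
rng = np.random.default_rng(1)
mu, sigma, N, trials = 0.5, 1.5, 8, 400_000

x = rng.normal(mu, sigma, size=(trials, N))

# score of the whole sample: sum_i (x_i - mu)/sigma^2; its variance is I_N
score = ((x - mu) / sigma**2).sum(axis=1)
I_N = score.var()
I_1 = 1.0 / sigma**2

crb = 1.0 / (N * I_1)            # = sigma^2 / N
var_mean = x.mean(axis=1).var()  # efficiency: equals the CRB here
```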
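The gap in key point 4 is easy to see on a small Fisher information matrix with off-diagonal coupling. The matrix entries below are illustrative, not taken from any specific model:

```python
import numpy as np

# Diagonal of the inverse FIM vs. reciprocal of the FIM diagonal:
# the former is the joint-estimation bound and is never smaller.
I = np.array([[4.0, 1.5],
              [1.5, 2.0]])

crb_joint = np.diag(np.linalg.inv(I))   # [I^{-1}]_{ii}: both parameters unknown
crb_oracle = 1.0 / np.diag(I)           # 1/I_{ii}: the other parameter known
gap = crb_joint - crb_oracle            # price of joint estimation, >= 0
```

With these numbers $\det \mathbf{I} = 5.75$, so the joint bound on the first component is $2/5.75 \approx 0.348$ versus $0.25$ when the second parameter is known.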
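Key point 5 can be made concrete for Bernoulli data, where the factorization gives $T = \sum_i x_i$: two samples with the same $T$ but different orderings yield identical likelihood functions of $p$. A small sketch (the likelihood is computed sample-by-sample, without using $T$ anywhere):

```python
import numpy as np

# Sufficiency of T = sum(x) for Bernoulli(p): same T => same likelihood of p.
def loglik(x, p):
    # naive per-sample log-likelihood over a grid of p values
    return np.sum(np.where(x[:, None] == 1, np.log(p), np.log(1 - p)), axis=0)

x1 = np.array([1, 0, 1, 1, 0])   # T = 3
x2 = np.array([0, 1, 1, 0, 1])   # same T = 3, different sequence
x3 = np.array([1, 0, 0, 1, 0])   # T = 2

p_grid = np.linspace(0.05, 0.95, 19)
same_T_same_lik = np.allclose(loglik(x1, p_grid), loglik(x2, p_grid))
diff_T_diff_lik = not np.allclose(loglik(x1, p_grid), loglik(x3, p_grid))
```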
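Rao--Blackwellization (key point 6) can be demonstrated with the classic toy case: start from the wasteful unbiased estimator $X_1$ and condition on the sufficient statistic, which for i.i.d. Gaussian data gives $\mathbb{E}[X_1 \mid \bar{x}] = \bar{x}$. A minimal Monte Carlo sketch with illustrative parameters:

```python
import numpy as np

# Crude unbiased estimator (first sample only) vs. its Rao--Blackwellization.
rng = np.random.default_rng(2)
mu, sigma, N, trials = 1.0, 1.0, 10, 200_000

x = rng.normal(mu, sigma, size=(trials, N))
crude = x[:, 0]          # unbiased but wasteful: Var = sigma^2
rb = x.mean(axis=1)      # E[X_1 | xbar] = xbar: Var = sigma^2 / N

var_crude, var_rb = crude.var(), rb.var()
```

Both estimators are unbiased, but conditioning cuts the variance by a factor of $N$.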
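The parenthetical example in key point 7 can be verified numerically: under Gaussianity the Bessel-corrected sample variance has $\mathrm{Var}(S^2) = 2\sigma^4/(N-1)$, strictly above the CRB $2\sigma^4/N$, so the MVUE is not efficient. A sketch with illustrative parameters:

```python
import numpy as np

# S^2 is MVUE for sigma^2 under Gaussianity, yet sits above the CRB.
rng = np.random.default_rng(3)
sigma2, N, trials = 2.0, 10, 400_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(trials, N))
s2 = x.var(axis=1, ddof=1)           # Bessel-corrected sample variance

var_s2 = s2.var()
exact = 2 * sigma2**2 / (N - 1)      # exact Var(S^2) = 8/9 here
crb = 2 * sigma2**2 / N              # CRB on sigma^2 = 0.8 here
```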
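The pilot-SNR claim in key point 8 can be sketched on a hypothetical flat-fading model $y_n = h s_n + w_n$ with known real pilots $s_n$ and Gaussian noise (this model and all values are illustrative, not a specific system): the CRB on $h$ is $\sigma^2 / \sum_n |s_n|^2$, so doubling pilot energy (+3 dB SNR) halves the bound, and the least-squares estimate attains it:

```python
import numpy as np

# CRB on a real channel gain h from N known pilot symbols.
rng = np.random.default_rng(4)
h, sigma2, N, trials = 0.8, 0.5, 32, 100_000

s = np.ones(N)                              # hypothetical unit-amplitude pilots
y = h * s + rng.normal(0, np.sqrt(sigma2), size=(trials, N))

h_hat = (y @ s) / (s @ s)                   # least-squares (= ML) estimate
crb = sigma2 / (s @ s)                      # sigma^2 / (pilot energy)
var_hat = h_hat.var()                       # attains the CRB in this model

crb_double_energy = sigma2 / (2 * (s @ s))  # +3 dB pilot SNR halves the bound
```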
Looking Ahead
Chapter 6 turns our attention from the benchmark to a general-purpose procedure: maximum likelihood. We will prove that the MLE is asymptotically unbiased, consistent, and efficient --- so it reaches the CRB in the limit of large data --- and work through its closed-form solutions (Gaussian linear model) and iterative ones (Newton--Raphson, Fisher scoring). The CRB we built here is the yardstick against which every MLE derivation in Chapter 6 will be measured.