Prerequisites & Notation
Before You Begin
This chapter builds the frequentist backbone of estimation theory: bias, variance, sufficiency, the Cramér--Rao bound, and the Rao--Blackwell procedure for constructing minimum-variance unbiased estimators. Before you begin, make sure the following are second nature.
- Multivariate Gaussian distribution, covariance matrices, PSD ordering (Review ch01)
Self-check: Can you write the density of $\mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ and derive the log-likelihood up to constants?
- Conditional expectation as a projection (Review ch02)
Self-check: Do you remember why $\mathbb{E}[X \mid Y]$ minimizes the MSE over functions of $Y$, and what the tower property says about its variance?
- Cauchy--Schwarz inequality and equality conditions
Self-check: Given $(\mathbb{E}[UV])^2$ vs. $\mathbb{E}[U^2]\,\mathbb{E}[V^2]$, can you state when equality holds?
- Log-likelihood, score, and regularity conditions for differentiation under the integral sign
Self-check: Why does $\partial_\theta \int p(x;\theta)\,dx = \int \partial_\theta\, p(x;\theta)\,dx$ fail when the support depends on $\theta$?
- Gaussian hypothesis testing and likelihood ratios (Review ch01)
Self-check: Can you connect the score function to an infinitesimal log-likelihood ratio?
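If the score and regularity items above feel rusty, a quick simulation can rebuild intuition. The sketch below (assuming a simple $\mathcal{N}(\theta, 1)$ model, chosen here for illustration) checks numerically that the score has mean zero and that its variance equals the Fisher information $I(\theta) = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0            # true mean of the N(theta, 1) model
n = 200_000
x = rng.normal(theta, 1.0, size=n)

# Score of N(theta, 1) with respect to theta:
#   s(theta; x) = d/dtheta log p(x; theta) = x - theta
score = x - theta

# Regularity identities: E[s(theta; X)] = 0 and
# Var[s(theta; X)] = I(theta), which is 1 for this model.
print(score.mean())    # close to 0
print(score.var())     # close to 1
```

Replacing the Gaussian with any family satisfying the regularity conditions gives the same two identities; when the support depends on $\theta$ (e.g. Uniform$(0,\theta)$), the mean-zero property breaks down.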
Notation for This Chapter
Symbols introduced or emphasized in this chapter. For global conventions (vectors, matrices, probability), see the master notation page of this book.
| Symbol | Meaning | Introduced |
|---|---|---|
| $\theta$ | Unknown parameter (scalar or vector) with $\theta \in \Theta$ | s01 |
| $\Theta$ | Parameter domain, $\Theta \subseteq \mathbb{R}^d$ | s01 |
| $\{p(x;\theta)\}_{\theta \in \Theta}$ | Parametric family of densities / pmfs | s01 |
| $\hat{\theta}(\cdot)$, $\hat{\theta}(x)$ | Estimator function and its realized estimate | s01 |
| $b(\hat{\theta})$ | Bias of an estimator: $b(\hat{\theta}) = \mathbb{E}_\theta[\hat{\theta}] - \theta$ | s01 |
| $\mathrm{MSE}(\hat{\theta};\theta)$ | Mean-squared error at parameter $\theta$ | s01 |
| $s(\theta;x)$ | Score function $s(\theta;x) = \partial_\theta \log p(x;\theta)$ | s02 |
| $I(\theta)$ | Scalar Fisher information | s02 |
| $\mathbf{I}(\theta)$ | Fisher information matrix (FIM) | s02 |
| $T(X)$ | Statistic / sufficient statistic for $\theta$ | s03 |
| $g(T(x);\theta)$, $h(x)$ | Factors in the Fisher--Neyman factorization | s03 |
| $\eta$, $A(\eta)$ | Natural parameter and log-partition function of an exponential family | s03 |
| $\hat{\theta}_{\mathrm{RB}}$ | Rao--Blackwellized estimator | s04 |
| MVUE | Minimum-variance unbiased estimator | s04 |