References & Further Reading
References
- R. A. Fisher, Theory of Statistical Estimation, 1925
Introduces Fisher information, sufficiency, and the score function.
- H. Cramér, Mathematical Methods of Statistics, Princeton University Press, 1946
Classical derivation of the CRLB and its regularity conditions.
- C. R. Rao, Information and the Accuracy Attainable in the Estimation of Statistical Parameters, 1945
Independent derivation of the CRLB and the Rao-Blackwell construction.
- D. Blackwell, Conditional Expectation and Unbiased Sequential Estimation, 1947
Establishes the variance-reduction property later called Rao-Blackwellization.
- E. L. Lehmann and H. Scheffé, Completeness, Similar Regions, and Unbiased Estimation, 1950
The Lehmann-Scheffé theorem: any unbiased estimator that is a function of a complete sufficient statistic is the unique UMVUE.
- S. M. Kay, Fundamentals of Statistical Signal Processing, Volume I: Estimation Theory, Prentice Hall, 1993
Engineering-oriented treatment of CRLB, sufficiency, and MVUE with signal-processing examples.
- E. L. Lehmann and G. Casella, Theory of Point Estimation, Springer, 2nd ed., 1998
Authoritative modern reference on unbiased estimation, sufficiency, and completeness.
- G. Casella and R. L. Berger, Statistical Inference, Duxbury, 2nd ed., 2002
Graduate text with exponential-family completeness and UMVUE construction.
- H. L. Van Trees, Detection, Estimation, and Modulation Theory, Part I, Wiley, 2001
Signal-processing treatment of Fisher information matrix and vector CRLB.
- H. V. Poor, An Introduction to Signal Detection and Estimation, Springer, 2nd ed., 1994
Graduate text covering sufficiency and exponential families.
- D. Basu, On Statistics Independent of a Complete Sufficient Statistic, 1955
Basu's theorem linking completeness and ancillarity.
- F. Liu, G. Caire, On the Fundamental Tradeoff of Integrated Sensing and Communications Under Gaussian Channels, 2023
Characterizes the tradeoff between communication rate and sensing accuracy (CRLB) in integrated sensing and communications.
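As a concrete illustration of the bound these references develop, the sketch below runs a minimal Monte Carlo check that the sample mean of i.i.d. Gaussian data attains the CRLB for the mean: with per-sample Fisher information 1/sigma^2, the bound is sigma^2/n. The parameter values and trial count are arbitrary choices for the demonstration, not from any of the cited works.

```python
import random
import statistics

# Monte Carlo check: for X_1..X_n i.i.d. N(mu, sigma^2), the sample mean
# is unbiased with variance sigma^2/n, which equals the CRLB for mu
# (per-sample Fisher information is 1/sigma^2).
random.seed(0)
mu, sigma, n, trials = 2.0, 3.0, 50, 20000  # illustrative values

estimates = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    estimates.append(sum(sample) / n)  # sample-mean estimator of mu

empirical_var = statistics.pvariance(estimates)
crlb = sigma**2 / n
print(f"empirical variance: {empirical_var:.4f}, CRLB: {crlb:.4f}")
```

With 20,000 trials the empirical variance should land within a few percent of the bound; an estimator beating it here would contradict the Cramér-Rao inequality (under its regularity conditions).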
Further Reading
Estimation fundamentals branch in three directions: deeper asymptotic theory, tighter non-asymptotic bounds, and modern high-dimensional versions of the same ideas.
Rigorous asymptotic theory
- A. W. van der Vaart, Asymptotic Statistics, Cambridge University Press, 1998
Modern treatment of the CRLB, efficiency, and Le Cam's asymptotic theory under minimal regularity conditions.
Tighter bounds beyond CRLB
- H. L. Van Trees and K. L. Bell (eds.), Bayesian Bounds for Parameter Estimation and Nonlinear Filtering/Tracking, Wiley-IEEE Press, 2007
Collects the Bhattacharyya, Barankin, and Ziv-Zakai bounds, which tighten the CRLB at low SNR.
Information geometry
- S. Amari, Information Geometry and Its Applications, Springer, 2016
Geometric view of Fisher information as a Riemannian metric on parameter space.
Minimax estimation
- A. B. Tsybakov, Introduction to Nonparametric Estimation, Springer, 2009
Minimax risk, Le Cam's method, and Fano's inequality in estimation theory.