References & Further Reading

References

  1. R. A. Fisher, Theory of Statistical Estimation, Proceedings of the Cambridge Philosophical Society, 1925

    Introduces Fisher information, sufficiency, and the score function.

  2. H. Cramér, Mathematical Methods of Statistics, Princeton University Press, 1946

    Classical derivation of the CRLB and its regularity conditions.

  3. C. R. Rao, Information and the Accuracy Attainable in the Estimation of Statistical Parameters, Bulletin of the Calcutta Mathematical Society, 1945

    Independent derivation of the CRLB and the Rao-Blackwell construction.

  4. D. Blackwell, Conditional Expectation and Unbiased Sequential Estimation, Annals of Mathematical Statistics, 1947

    Establishes the variance-reduction property later called Rao-Blackwellization.

  5. E. L. Lehmann and H. Scheffé, Completeness, Similar Regions, and Unbiased Estimation, Sankhyā, 1950

    The Lehmann-Scheffé theorem: any unbiased estimator that is a function of a complete sufficient statistic is the UMVUE.

  6. S. M. Kay, Fundamentals of Statistical Signal Processing, Volume I: Estimation Theory, Prentice Hall, 1993

    Engineering-oriented treatment of CRLB, sufficiency, and MVUE with signal-processing examples.

  7. E. L. Lehmann and G. Casella, Theory of Point Estimation, Springer, 2nd ed., 1998

    Authoritative modern reference on unbiased estimation, sufficiency, and completeness.

  8. G. Casella and R. L. Berger, Statistical Inference, Duxbury, 2nd ed., 2002

    Graduate text with exponential-family completeness and UMVUE construction.

  9. H. L. Van Trees, Detection, Estimation, and Modulation Theory, Part I, Wiley, 2001

    Signal-processing treatment of Fisher information matrix and vector CRLB.

  10. H. V. Poor, An Introduction to Signal Detection and Estimation, Springer, 2nd ed., 1994

    Graduate text covering sufficiency and exponential families.

  12. D. Basu, On Statistics Independent of a Complete Sufficient Statistic, Sankhyā, 1955

    Basu's theorem linking completeness and ancillarity.

  13. F. Liu, G. Caire, On the Fundamental Tradeoff of Integrated Sensing and Communications Under Gaussian Channels, IEEE Transactions on Information Theory, 2023

    Characterizes the tradeoff between sensing performance, measured via the CRLB, and communication rate in integrated sensing and communication systems.
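
A quick numerical check of the CRLB theme running through references 1, 2, 3, and 6: for n i.i.d. samples from N(mu, sigma^2), the per-sample Fisher information for mu is 1/sigma^2, so the CRLB for any unbiased estimator of mu is sigma^2/n, and the sample mean attains it. A minimal Monte Carlo sketch (parameter values are illustrative, not taken from any cited work):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative values: estimate the mean mu of N(mu, sigma^2) from
# n i.i.d. samples.  Per-sample Fisher information is 1/sigma^2, so
# the CRLB for an unbiased estimator of mu is sigma^2 / n.
mu, sigma, n, trials = 2.0, 1.5, 50, 20000

crlb = sigma**2 / n                                # = 0.045
samples = rng.normal(mu, sigma, size=(trials, n))
est = samples.mean(axis=1)                         # sample mean, one per trial
emp_var = est.var()                                # empirical variance across trials

print(crlb, emp_var)                               # empirical variance matches the bound
```

The empirical variance of the sample mean lands within Monte Carlo error of sigma^2/n, consistent with the efficiency results in the references above.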

Further Reading

Estimation fundamentals branch in several directions: deeper asymptotic theory, tighter non-asymptotic and Bayesian bounds, the geometry of Fisher information, and minimax theory.

  • Rigorous asymptotic theory

    A. W. van der Vaart, Asymptotic Statistics, 1998

    Modern CRLB, efficiency, and Le Cam's theory with minimal regularity.

  • Tighter bounds beyond CRLB

    H. L. Van Trees and K. L. Bell, Bayesian Bounds for Parameter Estimation and Nonlinear Filtering/Tracking, 2007

    Bhattacharyya, Barankin, and Ziv-Zakai bounds, which tighten the CRLB in low-SNR and threshold regimes.

  • Information geometry

    S. Amari, Information Geometry and Its Applications, 2016

    Geometric view of Fisher information as a Riemannian metric on parameter space.

  • Minimax estimation

    A. B. Tsybakov, Introduction to Nonparametric Estimation, 2009

    Minimax risk, Le Cam's method, and Fano's inequality in estimation theory.
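
As a concrete illustration of the Rao-Blackwell and Lehmann-Scheffé ideas that run through the references above, consider the textbook example of estimating theta = P(X = 0) = exp(-lam) from i.i.d. Poisson(lam) data. A minimal Monte Carlo sketch (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Crude unbiased estimator: the indicator 1{X_1 = 0}.  Conditioning on
# the complete sufficient statistic T = sum(X) gives (1 - 1/n)**T,
# which the Lehmann-Scheffe theorem identifies as the UMVUE.
lam, n, trials = 1.0, 10, 50000

x = rng.poisson(lam, size=(trials, n))
crude = (x[:, 0] == 0).astype(float)   # unbiased but high variance
t = x.sum(axis=1)
rb = (1.0 - 1.0 / n) ** t              # Rao-Blackwellized estimator

print(crude.var(), rb.var())           # conditioning reduces variance
```

Both estimators are unbiased for exp(-lam), but the Rao-Blackwellized version has markedly smaller variance, as guaranteed by the conditioning argument in references 3 and 4.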