References & Further Reading

References

  1. H. V. Poor, *An Introduction to Signal Detection and Estimation*, 2nd ed., Springer, 1994.

    The canonical graduate reference for detection and estimation. Chapter II on hypothesis testing is the primary source for our treatment of Bayes and Neyman-Pearson theory.

  2. H. L. Van Trees, K. L. Bell, and Z. Tian, *Detection, Estimation, and Modulation Theory, Part I: Detection, Estimation, and Filtering Theory*, 2nd ed., Wiley, 2013.

    The classical text (first edition 1968), expanded and updated. Chapter 2 gives the detection-theoretic foundations with extensive signal-processing examples.

  3. E. L. Lehmann and J. P. Romano, *Testing Statistical Hypotheses*, 3rd ed., Springer, 2005.

    The definitive mathematical reference for hypothesis testing. Chapter 3 contains the full Neyman-Pearson lemma with measure-theoretic rigor.

  4. T. M. Cover and J. A. Thomas, *Elements of Information Theory*, 2nd ed., Wiley-Interscience, 2006.

    Chapter 11 develops Chernoff information and Stein's lemma as the information-theoretic counterparts of detection-theoretic error exponents.

  5. S. M. Kay, *Fundamentals of Statistical Signal Processing, Volume II: Detection Theory*, Prentice Hall, 1998.

    Engineering-flavored companion to Poor. Strong on worked examples and the interface between theory and implementation.

  6. J. Neyman and E. S. Pearson, 'On the problem of the most efficient tests of statistical hypotheses', Phil. Trans. R. Soc. Lond. A 231, 289-337, 1933.

    The original Neyman-Pearson paper introducing the lemma that bears their names.

  7. H. Chernoff, 'A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations', Ann. Math. Statist. 23(4), 493-507, 1952.

    The original paper introducing the Chernoff bound and the error exponent $C(f_0, f_1)$ that bears his name (summarised in the display after this list).

  8. T. Kailath, 'The divergence and Bhattacharyya distance measures in signal selection', IEEE Trans. Commun. Technol. 15(1), 52-60, 1967.

    Classic paper comparing the Bhattacharyya distance and the Kullback-Leibler divergence as criteria for signal selection (see the display after this list).

  9. G. Caire, G. Taricco, and E. Biglieri, 'Bit-interleaved coded modulation', IEEE Trans. Inf. Theory 44(3), 927-946, 1998.

    Applies Chernoff-style pairwise error analysis to coded modulation, establishing the BICM capacity. Cited in the contribution callout of Section 1.5.

  10. R. G. Gallager, *Information Theory and Reliable Communication*, Wiley, 1968.

    The classical reference for random-coding exponents. Chapter 5 develops the Gallager exponent, which is a Chernoff-type bound on the codeword error probability.

  11. B. C. Levy, *Principles of Signal Detection and Parameter Estimation*, Springer, 2008.

    Modern treatment with emphasis on robust and distributed detection. Clear exposition of ROC geometry and randomised tests.
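
The two exponents named in entries 7 and 8 are worth restating in the chapter's notation. The following summary assumes, consistent with the inline notation $C(f_0, f_1)$ above, that $f_0$ and $f_1$ are densities with respect to a common dominating measure:

$$
C(f_0, f_1) = -\min_{0 \le s \le 1} \log \int f_0(x)^{1-s} f_1(x)^{s} \, dx,
\qquad
B(f_0, f_1) = -\log \int \sqrt{f_0(x) f_1(x)} \, dx.
$$

The Bhattacharyya distance $B(f_0, f_1)$ is the Chernoff exponent with $s$ fixed at $1/2$, so $B(f_0, f_1) \le C(f_0, f_1)$ always holds.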

Further Reading

Beyond the references cited above, the following resources are excellent for further study of binary hypothesis testing and its extensions.

  • Measure-theoretic hypothesis testing

    E. L. Lehmann and J. P. Romano, *Testing Statistical Hypotheses*, 3rd ed., Springer, 2005, Chapters 3-4.

    For a fully rigorous treatment with randomised tests and uniformly most powerful (UMP) extensions.

  • Large deviations and Sanov's theorem

    A. Dembo and O. Zeitouni, *Large Deviations Techniques and Applications*, 2nd ed., Springer, 1998.

    Places Chernoff's exponent within the general large-deviations framework.

  • Sequential detection (Wald's SPRT)

    A. Wald, *Sequential Analysis*, Wiley, 1947; reprinted by Dover.

    The foundational text for sequential hypothesis testing: the LRT applied online with a variable sample size (see the sketch after this list).

  • Robust hypothesis testing

    P. J. Huber, 'A robust version of the probability ratio test', Ann. Math. Statist. 36(6), 1753-1758, 1965.

    Extension of the LRT to models with distributional uncertainty; essential for constant false-alarm rate (CFAR) design.

  • Distributed and quantised detection

    P. K. Varshney, *Distributed Detection and Data Fusion*, Springer, 1997.

    Extension of binary testing to sensor networks where each node sees partial data.
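
As a companion to the SPRT entry above, here is a minimal sketch of Wald's test in this chapter's notation; the i.i.d. observation model and the error targets $\alpha$ (false alarm) and $\beta$ (miss) are assumptions made for illustration:

$$
\Lambda_n = \sum_{k=1}^{n} \log \frac{f_1(x_k)}{f_0(x_k)};
\qquad
\text{decide } H_1 \text{ if } \Lambda_n \ge b,
\quad
\text{decide } H_0 \text{ if } \Lambda_n \le a,
\quad
\text{otherwise take another sample.}
$$

Wald's approximations set the thresholds as $a \approx \log\frac{\beta}{1-\alpha}$ and $b \approx \log\frac{1-\beta}{\alpha}$; they are nearly exact when the overshoot of $\Lambda_n$ past the boundaries is small.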