Chapter Summary
Key Points
1. Van Trees is the Bayesian CRLB. For a random parameter $\theta \sim p(\theta)$ observed through the likelihood $p(y \mid \theta)$, every estimator $\hat\theta(y)$ (biased or not) satisfies $\mathbb{E}[(\hat\theta - \theta)^2] \ge (J_D + J_P)^{-1}$, with data information $J_D = \mathbb{E}_\theta[I(\theta)]$ (the Fisher information averaged over the prior) and prior information $J_P = \mathbb{E}[(\partial_\theta \log p(\theta))^2]$. Data information and prior information add literally.
2. Ziv-Zakai captures the threshold effect. The ZZB bounds the MSE by an integral of minimum error probabilities between neighbouring parameter values, valley-filled for monotonicity. It asymptotes to the CRLB at high SNR and to the a priori variance at low SNR, tracing the threshold transition that the CRLB misses.
3. I-MMSE ties MMSE to mutual information. On the canonical channel $Y = \sqrt{\mathrm{snr}}\, X + N$ with $N \sim \mathcal{N}(0,1)$, $\frac{d}{d\,\mathrm{snr}} I(X;Y) = \tfrac{1}{2}\,\mathrm{mmse}(\mathrm{snr})$ for every input distribution. Mutual information is the integral of an MMSE curve.
4. ISAC replaces either-or with a tradeoff region. The rate-CRB region is traced by SDP-based beampattern design: the non-convex minimisation becomes a convex SDP via the Schur complement, and the Pareto boundary morphs continuously between comm-only and sensing-only beams.
5. Every bound has a proof pattern. Van Trees = Cauchy-Schwarz on the joint score. Ziv-Zakai = pairwise hypothesis testing + valley-filling. I-MMSE = heat equation + Stein's identity. SDP beampattern = Schur complement. Recognising these patterns is how one extends each bound to new settings.
Looking Ahead
Chapter 25 closes the book by surveying open problems that the bounds of this chapter touch but do not fully resolve: computation-estimation tradeoffs (when is the CRLB reachable in polynomial time?), sequential-estimation bounds (how do these bounds scale as data arrive?), and distributed estimation (how do Van Trees and I-MMSE extend when the prior is held by one agent and the likelihood by another?).