References & Further Reading
References
- J. Kaipio and E. Somersalo, *Statistical and Computational Inverse Problems*, Springer, 2005
The foundational text on statistical (Bayesian) approaches to inverse problems. Covers Gaussian priors, MAP and MMSE estimates, hyperparameter estimation, and MCMC sampling. Our treatment of the Bayesian framework (Sections s01-s02) follows this reference closely.
- A. M. Stuart, *Inverse Problems: A Bayesian Perspective*, Acta Numerica, 19:451-559, 2010
The seminal paper establishing the well-posedness theory for Bayesian inverse problems in infinite dimensions. Introduces Gaussian measures on function spaces and the Cameron-Martin framework for posterior analysis. Section s04 is based primarily on this work.
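For orientation, the central object of this framework is the posterior measure $\mu^y$, defined as a density with respect to the prior measure $\mu_0$ through a Radon-Nikodym derivative (here $\Phi(u; y)$ is the negative log-likelihood potential):

$$
\frac{d\mu^y}{d\mu_0}(u) = \frac{1}{Z(y)} \exp\bigl(-\Phi(u; y)\bigr), \qquad Z(y) = \int \exp\bigl(-\Phi(u; y)\bigr)\, \mu_0(du).
$$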
- M. Dashti and A. M. Stuart, *The Bayesian Approach to Inverse Problems*, Handbook of Uncertainty Quantification, Springer, pp. 311-428, 2017
A comprehensive survey extending Stuart (2010) with additional results on non-Gaussian priors, posterior consistency, and computational methods. Provides the theoretical backbone for Sections s04 and s05.
- A. Gelman, J. B. Carlin, H. S. Stern, D. B. Dunson, A. Vehtari, and D. B. Rubin, *Bayesian Data Analysis*, Chapman and Hall/CRC, 3rd ed., 2013
The standard graduate-level textbook on Bayesian statistics. Covers hierarchical models, empirical Bayes, MCMC diagnostics ($\hat{R}$, ESS), model comparison, and practical workflow. Essential background for Sections s03 and s05.
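As a concrete illustration of the diagnostics covered there, the following is a minimal NumPy sketch of the split-$\hat{R}$ statistic in the spirit of BDA3, Ch. 11 (the function name and interface are our own):

```python
import numpy as np

def split_rhat(chains):
    """Split R-hat diagnostic for one scalar parameter (after BDA3, Ch. 11).

    chains: array of shape (n_chains, n_samples). Each chain is split in
    half so within-chain nonstationarity also inflates the statistic.
    Values near 1.0 indicate the chains have mixed.
    """
    n_chains, n_samples = chains.shape
    half = n_samples // 2
    splits = np.concatenate([chains[:, :half], chains[:, half:2 * half]])
    n = splits.shape[1]
    B = n * splits.mean(axis=1).var(ddof=1)   # between-chain variance
    W = splits.var(axis=1, ddof=1).mean()     # mean within-chain variance
    var_plus = (n - 1) / n * W + B / n        # pooled posterior variance estimate
    return np.sqrt(var_plus / W)

# Four well-mixed chains should give R-hat very close to 1
rng = np.random.default_rng(0)
print(split_rhat(rng.normal(size=(4, 1000))))
```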
- M. E. Tipping, *Sparse Bayesian Learning and the Relevance Vector Machine*, Journal of Machine Learning Research, 1:211-244, 2001
Introduces sparse Bayesian learning, automatic relevance determination, and the relevance vector machine. The SBL EM algorithm in Section s03 is derived from this paper. Essential reading for understanding the connection between ARD and LASSO-type sparsity.
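To make the EM updates concrete, here is a minimal NumPy sketch of ARD-style SBL for a linear model with known noise variance; it is a simplification of Tipping's full RVM, which also re-estimates the noise level and prunes basis functions:

```python
import numpy as np

def sbl_em(Phi, y, sigma2, n_iter=200, alpha_max=1e6):
    """EM updates for ARD / sparse Bayesian learning (after Tipping, 2001).

    Model: y = Phi @ w + e, e ~ N(0, sigma2 * I), prior w_i ~ N(0, 1/alpha_i).
    Noise variance sigma2 is held fixed here for simplicity.
    """
    M = Phi.shape[1]
    alpha = np.ones(M)
    for _ in range(n_iter):
        # E-step: Gaussian posterior over the weights
        Sigma = np.linalg.inv(Phi.T @ Phi / sigma2 + np.diag(alpha))
        mu = Sigma @ Phi.T @ y / sigma2
        # M-step (EM form): alpha_i = 1 / E[w_i^2] = 1 / (mu_i^2 + Sigma_ii)
        alpha = 1.0 / (mu**2 + np.diag(Sigma))
        alpha = np.minimum(alpha, alpha_max)  # cap so pruned weights stay finite
    return mu, alpha

# Toy usage: recover a 3-sparse vector from 50 random projections
rng = np.random.default_rng(1)
Phi = rng.standard_normal((50, 100))
w = np.zeros(100); w[[3, 30, 70]] = [2.0, -1.5, 1.0]
y = Phi @ w + 0.05 * rng.standard_normal(50)
mu, alpha = sbl_em(Phi, y, sigma2=0.05**2)
```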
- D. P. Wipf and B. D. Rao, *Sparse Bayesian Learning for Basis Selection*, IEEE Transactions on Signal Processing, 52:2153-2164, 2004
Proves that the global minimum of the SBL cost coincides with the maximally sparse solution under conditions on the sensing matrix, and establishes connections between SBL and $\ell_0$ minimization. The pitfall discussion on SBL local optima in Section s03 is informed by this analysis.
- C. M. Carvalho, N. G. Polson, and J. G. Scott, *The Horseshoe Estimator for Sparse Signals*, Biometrika, 97:465-480, 2010
Introduces the horseshoe prior and proves its near-minimax properties for sparse estimation. The shrinkage profile analysis and comparison with Laplace and spike-and-slab priors in Section s02 follows this paper.
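In their notation, the prior is the scale mixture

$$
x_i \mid \lambda_i, \tau \sim \mathcal{N}(0, \lambda_i^2 \tau^2), \qquad \lambda_i \sim \mathcal{C}^{+}(0, 1),
$$

and (with unit global scale and noise) the shrinkage weight $\kappa_i = 1/(1 + \lambda_i^2)$ has a $\mathrm{Beta}(1/2, 1/2)$ density: the horseshoe shape, with mass piled near $\kappa_i \approx 0$ (signals survive unshrunk) and $\kappa_i \approx 1$ (noise is shrunk away).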
- R. Tibshirani, *Regression Shrinkage and Selection via the Lasso*, Journal of the Royal Statistical Society: Series B, 58:267-288, 1996
The original LASSO paper. While primarily a statistics paper, it establishes the connection between $\ell_1$ regularization and the Laplace prior MAP estimate — the foundational link in Section s02.
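The link is one line: under a Gaussian likelihood $y = Ax + e$ with $e \sim \mathcal{N}(0, \sigma^2 I)$ and an i.i.d. Laplace prior $p(x_i) \propto \exp(-\lambda |x_i|)$, maximizing the posterior gives

$$
\hat{x}_{\mathrm{MAP}} = \arg\min_x \; \frac{1}{2\sigma^2} \|y - Ax\|_2^2 + \lambda \|x\|_1,
$$

which is exactly the LASSO objective with regularization parameter $\sigma^2 \lambda$ after rescaling.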
- E. I. George and R. E. McCulloch, *Variable Selection via Gibbs Sampling*, Journal of the American Statistical Association, 88:881-889, 1993
Introduces the spike-and-slab prior and Gibbs sampling for variable selection. The definition of spike-and-slab in Section s02 follows this formulation.
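In the George-McCulloch formulation (a continuous spike rather than a point mass), the prior is a two-component normal mixture driven by binary inclusion indicators:

$$
p(x_i \mid z_i) = (1 - z_i)\, \mathcal{N}(x_i; 0, \tau^2) + z_i\, \mathcal{N}(x_i; 0, c^2 \tau^2), \qquad z_i \sim \mathrm{Bernoulli}(\pi),
$$

with $\tau$ small (the spike) and $c \gg 1$ (the slab); their Gibbs sampler alternates between the indicators $z_i$ and the coefficients.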
- A. P. Dempster, N. M. Laird, and D. B. Rubin, *Maximum Likelihood from Incomplete Data via the EM Algorithm*, Journal of the Royal Statistical Society: Series B, 39:1-38, 1977
The foundational EM algorithm paper. The SBL EM algorithm in Section s03 is a specific instance of this general framework applied to the ARD hyperparameter optimization problem.
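For observed data $y$, latent variables $z$, and parameters $\theta$, the general iteration alternates

$$
Q(\theta \mid \theta^{(t)}) = \mathbb{E}_{z \sim p(z \mid y,\, \theta^{(t)})}\bigl[\log p(y, z \mid \theta)\bigr], \qquad \theta^{(t+1)} = \arg\max_\theta \, Q(\theta \mid \theta^{(t)});
$$

in SBL, $z$ plays the role of the weights $w$ and $\theta$ the ARD hyperparameters $\alpha$.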
- C. P. Robert and G. Casella, *Monte Carlo Statistical Methods*, Springer, 2nd ed., 2004
Comprehensive reference on MCMC methods: Metropolis-Hastings, Gibbs sampling, convergence diagnostics, and advanced techniques. Our MCMC treatment in Section s05 draws on their framework.
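For reference, a minimal random-walk Metropolis-Hastings sketch in NumPy (names and interface are ours; the symmetric Gaussian proposal makes the Hastings correction cancel):

```python
import numpy as np

def rw_metropolis(log_post, x0, step, n_samples, rng=None):
    """Random-walk Metropolis-Hastings with a Gaussian proposal (sketch).

    log_post returns the unnormalized log posterior. Tune `step` for
    roughly 20-40% acceptance in low dimensions.
    """
    rng = rng or np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    chain = np.empty((n_samples, x.size))
    for t in range(n_samples):
        prop = x + step * rng.standard_normal(x.size)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept w.p. min(1, ratio)
            x, lp = prop, lp_prop
        chain[t] = x
    return chain

# Sampling a standard 2D Gaussian:
# chain = rw_metropolis(lambda x: -0.5 * x @ x, np.zeros(2), 0.5, 5000)
```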
- M. Pereyra, *Proximal Markov Chain Monte Carlo Algorithms*, Statistics and Computing, 26:745-760, 2016
Develops scalable MCMC methods for log-concave posteriors in imaging, combining proximal operators with Langevin dynamics. Connects the optimization algorithms of Chapter 4 with the Bayesian sampling framework of this chapter.
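To connect the two chapters concretely, here is a minimal sketch of a Moreau-Yosida regularized unadjusted Langevin step of the kind Pereyra develops, targeting $\pi(x) \propto \exp(-f(x) - g(x))$ with $f$ smooth and $g$ convex but possibly non-smooth. The step size `delta` and smoothing parameter `lam` are tuning assumptions, and no Metropolis correction is applied, so the chain carries discretization bias:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def myula(grad_f, prox_g, x0, delta, lam, n_samples, rng=None):
    """MYULA-style proximal Langevin sampler (sketch).

    Targets pi(x) ~ exp(-f(x) - g(x)): f smooth with gradient grad_f,
    g convex, possibly non-smooth, with proximal map prox_g(x, lam).
    """
    rng = rng or np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    chain = np.empty((n_samples, x.size))
    for k in range(n_samples):
        # Gradient of f plus gradient of the Moreau-Yosida envelope of g
        drift = -delta * grad_f(x) - (delta / lam) * (x - prox_g(x, lam))
        x = x + drift + np.sqrt(2.0 * delta) * rng.standard_normal(x.size)
        chain[k] = x
    return chain

# Example target: Gaussian likelihood + l1 prior, as in sparse imaging
# grad_f = lambda x: A.T @ (A @ x - y) / sigma2
# prox_g = lambda x, lam: soft_threshold(x, lam * theta)  # prox of lam*theta*||.||_1
```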
Further Reading
Selected resources for readers who want to go deeper into specific topics from this chapter.
Bayesian inverse problems — comprehensive monograph
J. Kaipio and E. Somersalo, *Statistical and Computational Inverse Problems*, Springer, 2005
The most complete treatment of Bayesian inverse problems from the applied mathematics perspective. Covers everything from Gaussian priors to Monte Carlo methods for nonlinear inverse problems. The standard graduate reference for this chapter's topics.
Infinite-dimensional well-posedness theory
M. Dashti and A. M. Stuart, *The Bayesian Approach to Inverse Problems*, Handbook of Uncertainty Quantification, Springer, 2017
The most accessible entry point to Stuart's function-space framework, with detailed proofs of the Cameron-Martin theorem and posterior well-posedness. Essential for readers who will work on continuum imaging problems.
Sparse Bayesian Learning in compressed sensing
D. P. Wipf and B. D. Rao, *An Empirical Bayesian Strategy for Solving the Simultaneous Sparse Approximation Problem*, IEEE Transactions on Signal Processing, 55:3704-3716, 2007
Extends SBL to multiple measurement vectors (MMV) — the case where multiple snapshots share the same support. Directly applicable to multi-frequency or multi-snapshot radar imaging.
Horseshoe and global-local shrinkage priors
N. G. Polson and J. G. Scott, *Shrink Globally, Act Locally: Sparse Bayesian Regularization and Prediction*, Bayesian Statistics 9, Oxford University Press, 2010
A thorough exposition of global-local shrinkage priors (horseshoe, Bayesian bridge, generalized double Pareto) in a unified framework. Clarifies when the horseshoe is preferred over LASSO and provides computational strategies.
Hamiltonian Monte Carlo
M. Betancourt, *A Conceptual Introduction to Hamiltonian Monte Carlo*, arXiv:1701.02434, 2017
An excellent tutorial on the geometric foundations of HMC, including the No-U-Turn Sampler (NUTS). Provides the intuition behind the leapfrog integrator and mass matrix adaptation discussed in Section s05.
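The leapfrog step Betancourt builds intuition for is simple enough to state in a few lines of NumPy (identity mass matrix assumed; `grad_log_post` returns $\nabla \log \pi(q)$):

```python
import numpy as np

def leapfrog(grad_log_post, q, p, eps, n_steps):
    """Leapfrog integrator for HMC with identity mass matrix (sketch).

    q: position (parameters), p: momentum. The interleaved half/full
    steps make the map symplectic and time-reversible, which is why
    HMC can take long trajectories while keeping acceptance high.
    """
    q, p = np.array(q, dtype=float), np.array(p, dtype=float)
    p += 0.5 * eps * grad_log_post(q)     # initial half step for momentum
    for _ in range(n_steps - 1):
        q += eps * p                      # full step for position
        p += eps * grad_log_post(q)       # full step for momentum
    q += eps * p
    p += 0.5 * eps * grad_log_post(q)     # final half step for momentum
    return q, p
```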
Uncertainty quantification for imaging at scale
M. Pereyra, *Proximal Markov Chain Monte Carlo Algorithms*, Statistics and Computing, 26:745-760, 2016
Develops scalable MCMC methods for non-smooth imaging problems (TV regularization, $\ell_1$ sparsity) using proximal Langevin dynamics. The bridge between the optimization algorithms of Chapter 4 and Bayesian sampling.