References & Further Reading

References

  1. H. W. Engl, M. Hanke, and A. Neubauer, Regularization of Inverse Problems, Kluwer Academic Publishers, 1996

    The standard graduate-level reference for regularization theory. Covers the full spectrum from Hadamard well-posedness through Tikhonov, spectral, and iterative regularization with rigorous convergence analysis. Chapters 2–6 are the primary source for this chapter.

  2. P. C. Hansen, Discrete Inverse Problems: Insight and Algorithms, SIAM, 2010

    Excellent computational perspective on inverse problems. Particularly strong on the SVD, the L-curve, and practical implementation of regularization methods. The Regularization Tools MATLAB toolbox accompanies this book.

  3. J. Kaipio and E. Somersalo, Statistical and Computational Inverse Problems, Springer, 2005

    Bridges the classical (deterministic) regularization theory with the Bayesian approach. The Bayesian interpretation of Tikhonov regularization and the treatment of the posterior covariance follow this reference.

  4. A. N. Tikhonov and V. Y. Arsenin, Solutions of Ill-Posed Problems, Winston & Sons, 1977

    The foundational monograph by Tikhonov himself. Introduces the variational formulation of regularization and the concept of regularizing families. Of historical importance and still surprisingly readable.

  5. A. Kirsch, An Introduction to the Mathematical Theory of Inverse Problems, Springer, 2nd ed., 2011

    Mathematically rigorous treatment aimed at graduate students. Covers compact operators, singular value decomposition, regularization theory, and inverse scattering.

  6. V. A. Morozov, Methods for Solving Incorrectly Posed Problems, Springer, 1984

    The original monograph introducing the discrepancy principle for parameter selection. The theoretical treatment of the principle's optimality is developed here.

  7. F. Natterer and F. Wübbeling, Mathematical Methods in Image Reconstruction, SIAM, 2001

    Rigorous treatment of the Radon transform, Fourier slice theorem, and tomographic reconstruction. The limited-angle tomography example in Section 2.1 follows this reference.

  8. M. Benning and M. Burger, Modern Regularisation Methods for Inverse Problems, Acta Numerica, 2018

    A comprehensive survey covering classical regularization, variational methods including TV and sparsity-promoting penalties, and learning-based approaches. An excellent bridge to the advanced topics in Chapters 3–4.

  9. R. Tibshirani, Regression Shrinkage and Selection via the Lasso, Journal of the Royal Statistical Society B, 1996

    The original LASSO paper. Introduces the $\ell_1$ penalised regression estimator and discusses its sparsity-inducing properties.
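
    For reference, the estimator introduced here, in its Lagrangian form, is

    $$\hat{\beta} = \arg\min_{\beta} \tfrac{1}{2}\|y - X\beta\|_2^2 + \lambda\|\beta\|_1,$$

    where the nondifferentiability of the $\ell_1$ norm at the origin is what drives individual coefficients exactly to zero.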

  10. S. S. Chen, D. L. Donoho, and M. A. Saunders, Atomic Decomposition by Basis Pursuit, SIAM Journal on Scientific Computing, 1998

    Introduces Basis Pursuit (the noiseless LASSO) and Basis Pursuit Denoising in the signal-processing context. A parallel, independent development of the LASSO idea.

  11. E. J. Candes and T. Tao, Near-Optimal Signal Recovery From Random Projections, IEEE Transactions on Information Theory, 2006

    Proves the exact recovery guarantee for $\ell_1$ minimization under the Restricted Isometry Property (RIP). The foundation of modern compressed sensing theory.

  12. D. L. Donoho, Compressed Sensing, IEEE Transactions on Information Theory, 2006

    Companion paper to Candes–Tao, establishing the compressed sensing framework and the role of sparsity in dimensionality reduction.

  13. L. I. Rudin, S. Osher, and E. Fatemi, Nonlinear Total Variation Based Noise Removal Algorithms, Physica D, 1992

    The original ROF model paper introducing total variation regularization for image denoising. One of the most influential papers in computational imaging.
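
    For reference, the ROF model recovers a denoised image $u$ from noisy data $f$ by solving

    $$\min_{u} \int_{\Omega} |\nabla u|\,dx + \frac{\lambda}{2}\|u - f\|_2^2,$$

    where the total variation term suppresses oscillatory noise while preserving edges.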

  14. A. B. Bakushinsky and M. Yu. Kokurin, Iterative Methods for Approximate Solution of Inverse Problems, Springer, 2004

    Comprehensive treatment of iterative regularization methods for nonlinear inverse problems, including convergence theory for IRGNM and Levenberg–Marquardt.

  15. M. Hanke, A Regularizing Levenberg–Marquardt Scheme, Inverse Problems, 1997

    The original convergence analysis of the Levenberg–Marquardt method as a regularization scheme for nonlinear ill-posed problems.

  16. M. Pastorino, Microwave Imaging, Wiley, 2010

    Comprehensive reference for microwave tomography including the Born iterative method, distorted-Born iterative method, and practical implementation aspects.

  17. G. H. Golub, M. Heath, and G. Wahba, Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter, Technometrics, 1979

    The original GCV paper in the ridge regression context. The GCV criterion for Tikhonov regularization parameter selection is developed here.
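
    For reference, writing $A(\lambda)$ for the influence (hat) matrix mapping the data $y$ to the fitted values, the GCV choice of $\lambda$ minimizes

    $$V(\lambda) = \frac{\tfrac{1}{n}\,\|(I - A(\lambda))\,y\|_2^2}{\left[\tfrac{1}{n}\,\mathrm{tr}\big(I - A(\lambda)\big)\right]^2}.$$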

  18. G. Wahba, Spline Models for Observational Data, SIAM, 1990

    Develops the theory of GCV in the spline smoothing context. Chapter 4 covers the regularization parameter selection problem in depth.

  19. C. M. Stein, Estimation of the Mean of a Multivariate Normal Distribution, Annals of Statistics, 1981

    The original SURE paper. Proves the unbiasedness of the risk estimator for linear estimators of a Gaussian mean.

  20. Y. C. Eldar and M. Mishali, Robust Recovery of Signals From a Structured Union of Subspaces, IEEE Transactions on Information Theory, 2010

    Develops the group sparsity framework and recovery guarantees for joint sparse recovery from multiple measurements.

  21. I. Stojanovic, W. C. Karl, and M. Unser, Compressed Sensing of Monostatic and Multistatic SAR, IEEE Journal of Selected Topics in Signal Processing, 2010

    Applies compressed sensing and LASSO to SAR image formation, demonstrating super-resolution for sparse scenes.

  22. P. C. Hansen and D. P. O'Leary, The Use of the L-Curve in the Regularization of Discrete Ill-Posed Problems, SIAM Journal on Scientific Computing, 1993

    The definitive paper on the L-curve criterion for regularization parameter selection, including the curvature formula and its computational implementation.

Further Reading

  • Computational regularization tools for MATLAB

    P. C. Hansen, "Regularization Tools: A MATLAB Package for Analysis and Solution of Discrete Ill-Posed Problems," Numerical Algorithms, vol. 6, pp. 1–35, 1994.

    Provides a widely used MATLAB toolbox implementing TSVD, Tikhonov regularization, the L-curve, GCV, and other regularization methods. Invaluable for hands-on experimentation. The toolbox is freely available online.
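    The toolbox itself is MATLAB-based; as a minimal NumPy sketch of the SVD filter-factor form of Tikhonov regularization that it implements (the function name and toy problem below are illustrative, not part of the toolbox):

```python
import numpy as np

def tikhonov_svd(A, b, lam):
    """Tikhonov solution x = argmin ||Ax - b||^2 + lam^2 ||x||^2,
    via the SVD filter factors f_i = s_i^2 / (s_i^2 + lam^2)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    f = s**2 / (s**2 + lam**2)           # filter factors damp small s_i
    return Vt.T @ (f / s * (U.T @ b))    # x = sum_i f_i (u_i.b / s_i) v_i

# Toy ill-conditioned problem: columns scaled over five orders of magnitude.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5)) @ np.diag([1.0, 1e-1, 1e-2, 1e-3, 1e-4])
b = A @ np.ones(5)
x_reg = tikhonov_svd(A, b, 1e-3)         # mildly regularized reconstruction
```

    Sweeping `lam` over a logarithmic grid and plotting the residual norm against the solution norm reproduces the L-curve discussed in reference 22.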

  • Statistical perspectives on inverse problems

    A. M. Stuart, "Inverse Problems: A Bayesian Perspective," Acta Numerica, vol. 19, pp. 451–559, 2010.

    Rigorous foundation for the Bayesian approach to inverse problems in function spaces. Bridges the gap between the finite-dimensional MAP estimates of this chapter and the infinite-dimensional posterior distributions of Chapter 3.

  • Compressed sensing theory for RF imaging

    M. A. Herman and T. Strohmer, "High-Resolution Radar via Compressed Sensing," IEEE Transactions on Signal Processing, vol. 57, no. 6, 2009.

    Applies the RIP and compressed sensing guarantees to radar waveform design, demonstrating that random phase-coding waveforms yield sensing matrices satisfying the RIP. Connects Section 2.6 to radar systems design.

  • Total variation and convex optimization for imaging

    A. Chambolle, "An Algorithm for Total Variation Minimization and Applications," Journal of Mathematical Imaging and Vision, vol. 20, 2004.

    Introduces the Chambolle projection algorithm for TV denoising via the dual formulation. This efficient algorithm, and its extension to general imaging problems, is the computational backbone of TV-based reconstruction.

  • Nonlinear inverse scattering — mathematical theory

    D. Colton and R. Kress, Inverse Acoustic and Electromagnetic Scattering Theory, 4th ed., Springer, 2019.

    The mathematical reference for inverse scattering theory. Chapters 5–7 develop the regularization theory for inverse scattering operators, directly relevant to the nonlinear problems of Section 2.7 and the electromagnetic forward models of Chapter 5.

  • Deep learning meets regularization theory

    S. Mukherjee et al., "Learned Convex Regularizers for Inverse Problems," arXiv:2008.02839, 2020.

    Proposes learning the convex regularizer $R(x)$ from training data rather than hand-designing it. Connects the variational framework of Section 2.6 to the learned regularizers of Chapter 29, showing that the convex structure ensures convergence guarantees.