References & Further Reading
References
- D. L. Donoho, A. Maleki, and A. Montanari, Message-Passing Algorithms for Compressed Sensing, 2009
The original AMP paper. Derives the iteration via TAP, introduces the Onsager correction, and demonstrates the phase-transition behaviour.
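To make the iteration concrete, here is a minimal NumPy sketch of AMP with the soft-thresholding denoiser used in the sparse setting; the threshold multiplier `alpha` and the iteration count are illustrative choices, not values from the paper, which tunes them via state-evolution arguments.

```python
import numpy as np

def soft_threshold(x, t):
    """Component-wise soft thresholding, the sparsity-promoting denoiser."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def amp(y, A, alpha=1.5, n_iter=30):
    """Minimal AMP sketch for y = A x + w with A i.i.d. N(0, 1/m).

    `alpha` is an illustrative threshold multiplier (not from the paper).
    """
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()                                  # Onsager-corrected residual
    for _ in range(n_iter):
        tau = np.linalg.norm(z) / np.sqrt(m)      # empirical noise-level estimate
        x = soft_threshold(x + A.T @ z, alpha * tau)
        # Onsager correction: old residual times the average denoiser
        # derivative (fraction of surviving coordinates), scaled by n/m
        z = y - A @ x + (z / m) * np.count_nonzero(x)
    return x
```

The Onsager term is the only difference from plain iterative soft thresholding; it is what keeps the pseudo-data `x + A.T @ z` statistically Gaussian around the truth.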
- M. Bayati and A. Montanari, The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing, 2011
Rigorous proof of state evolution for AMP. Establishes Gaussianity of the pseudo-data via a conditioning argument for i.i.d. Gaussian matrices. The canonical theoretical reference.
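State evolution reduces the high-dimensional iteration to a single scalar recursion on the effective noise variance. A Monte Carlo sketch of that recursion, under an assumed Bernoulli–Rademacher prior (the prior, sparsity `eps`, and tuning `alpha` below are illustrative, not from the paper):

```python
import numpy as np

def state_evolution(delta=0.5, sigma2=0.01, eps=0.05, alpha=1.5,
                    n_iter=20, n_mc=200_000, seed=0):
    """Monte Carlo sketch of the scalar state-evolution recursion

        tau_{t+1}^2 = sigma^2 + (1/delta) * E[(eta(X + tau_t Z) - X)^2]

    with soft thresholding eta at level alpha * tau_t.
    """
    rng = np.random.default_rng(seed)
    # Sample the scalar prior: X = +-1 w.p. eps/2 each, 0 otherwise
    x = rng.choice([-1.0, 0.0, 1.0], size=n_mc, p=[eps / 2, 1 - eps, eps / 2])
    z = rng.standard_normal(n_mc)
    tau2 = sigma2 + np.mean(x ** 2) / delta       # start from the trivial estimate x = 0
    for _ in range(n_iter):
        tau = np.sqrt(tau2)
        r = x + tau * z                           # the effective scalar Gaussian channel
        eta = np.sign(r) * np.maximum(np.abs(r) - alpha * tau, 0.0)
        tau2 = sigma2 + np.mean((eta - x) ** 2) / delta
    return tau2
```

The fixed point of this recursion is exactly the asymptotic per-iteration MSE that the theorem guarantees for the matrix iteration.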
- M. Bayati and A. Montanari, The LASSO Risk for Gaussian Matrices, 2012
Shows that AMP's state-evolution fixed point predicts the LASSO risk exactly in the proportional asymptotic regime, and derives the AMP-to-LASSO calibration equation.
- D. L. Donoho and J. Tanner, Counting Faces of Randomly Projected Polytopes When the Projection Radically Lowers Dimension, 2009
Combinatorial-geometry derivation of the Donoho--Tanner phase transition, later recovered from AMP state evolution.
- D. L. Donoho, A. Maleki, and A. Montanari, The Noise-Sensitivity Phase Transition in Compressed Sensing, 2011
Introduces minimax (parameter-free) AMP; shows that AMP with the minimax-tuned threshold minimises the worst-case risk over the class of sparse signals.
- S. Rangan, Generalized Approximate Message Passing for Estimation with Random Linear Mixing, 2011
Extends AMP to generalised linear models with non-Gaussian likelihoods. Provides a second derivation of the Onsager term from the relaxed-BP viewpoint.
- S. Rangan, P. Schniter, A. K. Fletcher, and S. Sarkar, On the Convergence of Approximate Message Passing with Arbitrary Matrices, 2019
Comprehensive analysis of AMP divergence for non-i.i.d. matrices. Constructs counter-examples, analyses damping, and motivates VAMP.
- J. Ma and L. Ping, Orthogonal AMP, 2017
Introduces orthogonal AMP (OAMP), which imposes a divergence-free constraint to restore Gaussianity on rotationally-invariant matrices.
- S. Rangan, P. Schniter, and A. K. Fletcher, Vector Approximate Message Passing, 2019
Derives VAMP via expectation consistency. State evolution for VAMP holds under the right-rotationally-invariant matrix ensemble.
- C. A. Metzler, A. Maleki, and R. G. Baraniuk, From Denoising to Compressed Sensing, 2016
Introduces D-AMP: AMP wrapped around general (including learned) denoisers, with the denoiser divergence estimated by a Monte Carlo rule.
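The Monte Carlo divergence rule is easy to sketch: probe the black-box denoiser with a random sign vector and read off the average derivative needed for the Onsager term. A minimal version (the step size `eps` is an illustrative choice):

```python
import numpy as np

def mc_divergence(denoiser, r, eps=1e-3, seed=0):
    """Monte Carlo estimate of the divergence sum_i d(denoiser)_i / dr_i,
    the quantity D-AMP needs for the Onsager term when the denoiser is a
    black box (e.g. BM3D or a neural network)."""
    rng = np.random.default_rng(seed)
    b = rng.choice([-1.0, 1.0], size=r.shape)     # random +-1 probe vector
    return b @ (denoiser(r + eps * b) - denoiser(r)) / eps
```

For a linear denoiser f(r) = c r the estimate is exact (c times the dimension); for nonlinear denoisers it is an unbiased finite-difference approximation.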
- M. Borgerding, P. Schniter, and S. Rangan, AMP-Inspired Deep Networks for Sparse Linear Inverse Problems, 2017
Proposes LAMP: a learned unrolling of AMP/VAMP into a deep network trained end-to-end. Foundation for the deep-unfolding literature in Chapter 21.5.
- G. Reeves and H. D. Pfister, The Replica-Symmetric Prediction for Random Linear Estimation with Gaussian Matrices Is Exact, 2019
Proves that the replica-symmetric MMSE prediction, which Bayes-AMP's state evolution matches, is in fact the true Bayes MMSE, confirming AMP's asymptotic optimality.
- A. Montanari, Graphical Models Concepts in Compressed Sensing, Cambridge University Press, 2012
Tutorial-level introduction to AMP, state evolution, and their connection to graphical models. Highly recommended as a first read.
- G. Caire, Plug-and-Play / D-AMP variants for large-scale communications and imaging problems, 2023
Further Reading
Supplementary resources for deepening understanding of AMP, state evolution, and their modern extensions.
Free probability and random matrix theory for AMP
R. Couillet and M. Debbah, Random Matrix Methods for Wireless Communications (Cambridge, 2011)
The Marchenko--Pastur law and free independence are the background machinery that makes state evolution work. Chapters 3 and 7 cover the prerequisites.
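As a quick empirical illustration of the Marchenko–Pastur law, the eigenvalues of a sample covariance matrix concentrate on the predicted support (the dimensions below are illustrative):

```python
import numpy as np

# Empirical check of the Marchenko--Pastur support edges (1 +- sqrt(c))^2
# for the sample covariance S = X X^T / n with aspect ratio c = p/n.
rng = np.random.default_rng(1)
p, n = 200, 1000
c = p / n
X = rng.standard_normal((p, n))
eigs = np.linalg.eigvalsh(X @ X.T / n)
lo, hi = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
```

At these sizes the extreme eigenvalues already sit within a small fluctuation of the edges `lo` and `hi`.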
AMP tutorial with code
P. Schniter, GAMP/VAMP MATLAB/Python toolbox (GitHub: GAMPmatlab)
Clean reference implementations of AMP/GAMP/VAMP with state-evolution tracking and automatic parameter tuning.
Deep unfolding perspective
V. Monga, Y. Li, and Y. C. Eldar, 'Algorithm unrolling: Interpretable, efficient deep learning for signal and image processing', IEEE SPM (2021)
Comprehensive survey of deep unfolding, of which LAMP/LVAMP are prominent examples.
Statistical physics origins
M. Mézard and A. Montanari, Information, Physics, and Computation (Oxford, 2009)
The cavity method, replica symmetry, and the TAP equations are presented in a self-contained way, with direct applications to coding and compressed sensing.