References & Further Reading

References

  1. J. Ma and L. Ping, Orthogonal AMP, 2017

    Introduces OAMP. Enforces a divergence-free condition on both the linear and denoising steps so that state evolution survives for the right-rotationally-invariant matrix class.

  2. S. Rangan, P. Schniter, and A. K. Fletcher, Vector Approximate Message Passing, 2019

    Derives VAMP via expectation consistency. Proves state evolution for right-rotationally-invariant sensing matrices and establishes Bayes-optimality under matched priors.

  3. M. Opper and O. Winther, Expectation Consistent Approximate Inference, 2005

    The statistical-physics origin of the expectation-consistent framework that VAMP later rediscovered.

  4. S. Rangan, Generalized Approximate Message Passing for Estimation with Random Linear Mixing, 2011

    Extends AMP to generalized linear models. Introduces the two-variance (input / output) state evolution that handles non-Gaussian likelihoods such as Poisson and 1-bit CS.

  5. L. Jacques, J. N. Laska, P. T. Boufounos, and R. G. Baraniuk, Robust 1-Bit Compressive Sensing via Binary Stable Embeddings of Sparse Vectors, 2013

    Formalises 1-bit compressed sensing, proves stable-embedding guarantees, and motivates the output-denoiser family used by 1-bit GAMP.

  6. J. P. Vila and P. Schniter, Expectation-Maximization Gaussian-Mixture Approximate Message Passing, 2013

    EM-GAMP: automatic hyperparameter tuning for GAMP via expectation-maximisation updates of the prior and noise parameters. Makes GAMP parameter-free in practice.

  7. K. Takeuchi, On the Convergence of Orthogonal/Vector AMP: Long-Memory Message-Passing Strategy, 2022

    Rigorous analysis of mismatch in OAMP/VAMP: how incorrectly assumed prior and noise parameters propagate through state evolution, and how long-memory variants mitigate the damage.

  8. M. F. Hutchinson, A Stochastic Estimator of the Trace of the Influence Matrix for Laplacian Smoothing Splines, 1990

    The original stochastic trace estimator, used in OAMP/VAMP to estimate $\mathrm{tr}(\mathbf{W}_{\mathrm{sens}})$ without forming the full matrix product.

  9. K. Gregor and Y. LeCun, Learning Fast Approximations of Sparse Coding, 2010

    Introduces LISTA — the first deep-unfolding paper. Trains a fixed-depth network whose layers mimic ISTA iterations.

  10. M. Borgerding, P. Schniter, and S. Rangan, AMP-Inspired Deep Networks for Sparse Linear Inverse Problems, 2017

    LAMP and LDVAMP: unrolled AMP / VAMP networks with learnable feedback matrices, denoiser parameters, and Onsager coefficients. Benchmarks against fixed-parameter baselines.

  11. C. A. Metzler, A. Mousavi, and R. G. Baraniuk, Learned D-AMP: Principled Neural Network Based Compressive Image Recovery, 2017

    Learned D-AMP: wraps a CNN denoiser inside an AMP loop and uses Monte-Carlo divergence estimation for the Onsager correction, applied to compressive image recovery.

  12. P. Schniter, S. Rangan, and A. K. Fletcher, Vector Approximate Message Passing for the Generalized Linear Model, 2016

    Extends VAMP to the generalized-linear-model setting, merging the strengths of VAMP (RRI robustness) and GAMP (non-Gaussian likelihoods).

  13. K. Takeuchi, Rigorous Dynamics of Expectation-Propagation-Based Signal Recovery from Unitarily Invariant Measurements, 2017

    Rigorous proof of OAMP/VAMP state evolution for unitarily-invariant matrices using the conditional-distribution argument.

  14. A. Javanmard and A. Montanari, State Evolution for General Approximate Message Passing Algorithms, with Applications to Spatial Coupling, 2013

    Rigorous state-evolution proof for generalised AMP iterations, covering GAMP and spatially coupled constructions.

  15. M. Dehkordi, P. Jung, and G. Caire, Unrolled OAMP with Kronecker Structure for Distributed RF Imaging, 2023
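The Hutchinson estimator of reference 8 is simple enough to sketch directly. The following is a minimal illustration, not the chapter's implementation; the matrix `W` here is a hypothetical PSD test matrix standing in for whatever matrix product OAMP/VAMP needs the trace of:

```python
import numpy as np

rng = np.random.default_rng(0)

def hutchinson_trace(matvec, n, num_probes=200, rng=rng):
    """Estimate tr(A) using only matrix-vector products A @ z.

    Hutchinson (1990): for Rademacher probes z (entries +/-1),
    E[z^T A z] = tr(A), so averaging z^T (A z) over probes gives an
    unbiased trace estimate without ever forming A explicitly.
    """
    total = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe
        total += z @ matvec(z)
    return total / num_probes

# Hypothetical example: a random 50x50 PSD matrix accessed only via matvecs.
n = 50
B = rng.standard_normal((n, n))
W = B @ B.T
est = hutchinson_trace(lambda z: W @ z, n, num_probes=2000)
```

With 2000 probes the estimate typically lands within a percent or two of `np.trace(W)`; the variance falls as 1/num_probes.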
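For reference 9, it may help to see the ISTA iteration that LISTA's layers mimic. A minimal sketch, with a made-up sparse recovery instance (the matrix sizes and regularisation weight are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

def soft_threshold(v, lam):
    """Proximal operator of lam * ||.||_1 (the ISTA nonlinearity)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def ista(A, y, lam=0.1, num_iters=200):
    """Plain ISTA for min_x 0.5 * ||y - A x||^2 + lam * ||x||_1.

    LISTA keeps exactly this layer structure but replaces the fixed
    matrices (1/L) A^T and (I - (1/L) A^T A) with learned weights,
    and lam/L with learned per-layer thresholds.
    """
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
    return x

# Hypothetical instance: recover a 3-sparse signal from 40 Gaussian measurements.
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [2.0, -1.5, 3.0]
y = A @ x_true
x_hat = ista(A, y, lam=0.05, num_iters=500)
```

A trained LISTA network reaches comparable accuracy in an order of magnitude fewer layers than ISTA needs iterations, which is the paper's central point.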
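The Monte-Carlo divergence estimation mentioned for reference 11 also admits a short sketch. The idea is a randomized finite difference: probe the black-box denoiser once in a random direction. Here it is checked against soft-thresholding, whose divergence is known in closed form (a stand-in for the CNN denoiser of Learned D-AMP):

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_divergence(denoiser, x, eps=1e-3, rng=rng):
    """Monte-Carlo estimate of div D(x) = sum_i dD_i/dx_i.

    Approximates the divergence of a black-box denoiser D via
    z^T (D(x + eps*z) - D(x)) / eps with a Gaussian probe z,
    the same trick D-AMP uses to form its Onsager correction.
    """
    z = rng.standard_normal(x.shape)
    return z @ (denoiser(x + eps * z) - denoiser(x)) / eps

def soft_threshold(x, lam=1.0):
    # Exact divergence: the number of entries with |x_i| > lam.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

x = rng.standard_normal(10_000) * 3.0
est = mc_divergence(soft_threshold, x)
exact = np.count_nonzero(np.abs(x) > 1.0)
```

A single probe already gives a usable estimate in high dimensions because the relative error scales like $1/\sqrt{n}$ over the active coordinates.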

Further Reading

Additional resources on OAMP/VAMP/GAMP and learned message passing.

  • Algorithm unrolling survey

    V. Monga, Y. Li, and Y. C. Eldar, 'Algorithm Unrolling: Interpretable, Efficient Deep Learning for Signal and Image Processing', IEEE Signal Processing Magazine, 2021

    Comprehensive survey connecting LISTA/LAMP/LDVAMP to the broader deep-unfolding literature in imaging and signal processing.

  • GAMP and its extensions in practice

    P. Schniter, GAMP/VAMP toolbox (GitHub: GAMPmatlab)

    Production-quality implementations of GAMP, VAMP, EM-GAMP, and their variants, with state-evolution tracking.

  • Sparse representations textbook

    M. Elad, Sparse and Redundant Representations (Springer, 2010)

    Background on sparse modelling, ISTA/FISTA, and the proximal viewpoint that LISTA/LAMP unroll.

  • VAMP in MIMO detection

    A. K. Fletcher, S. Rangan, and P. Schniter, 'Inference in Deep Networks in High Dimensions', Proc. IEEE ISIT, 2018

    Applies VAMP to MIMO detection and layered inference, a direct bridge from Chapter 21 to practical wireless receivers.