References & Further Reading

References

  1. G. Caire and A. Rezaei, On the Illumination and Sensing Model for RF Imaging, 2026

    Comprehensive treatment of the RF imaging forward model, Kronecker-structured sensing matrices, and the CommIT simulator implementation. The Kronecker LMMSE factorization in Section 17.3 follows this work.

  2. D. L. Donoho, A. Maleki, and A. Montanari, Message-Passing Algorithms for Compressed Sensing, PNAS, 2009

    Introduces AMP for compressed sensing and demonstrates its connection to belief propagation on dense factor graphs. Section 17.1 recaps this derivation.

  3. M. Bayati and A. Montanari, The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing, IEEE Trans. IT, 2011

    Rigorous proof of state evolution for AMP with Lipschitz denoisers. The state evolution theorem in Section 17.1 is based on this work.

  4. S. Rangan, P. Schniter, and A. K. Fletcher, Vector Approximate Message Passing, IEEE Trans. IT, 2019

    Introduces VAMP for right-rotationally invariant matrices. The OAMP algorithm and state evolution in Section 17.2 are based on this paper, as is the AMP-vs-VAMP comparison table.

  5. J. Ma and L. Ping, Orthogonal AMP, IEEE Access, 2017

    Independent derivation of OAMP via the divergence-free condition. Under right-rotational invariance, the OAMP algorithm in Section 17.2 is equivalent to VAMP.

  6. P. Schniter and S. Rangan, Compressive Phase Retrieval via Generalized Approximate Message Passing, IEEE Trans. Signal Process., 2015

    Extends GAMP to phase retrieval and discusses the OAMP/VAMP state evolution framework. Section 17.2 draws on the state evolution presentation.

  7. Y. Kabashima, A CDMA Multiuser Detection Algorithm on the Basis of Belief Propagation, J. Phys. A, 2003

    Early derivation of AMP-like algorithms for CDMA via statistical physics (TAP equations). Historical context for the Onsager correction in Section 17.1.

  8. A. K. Fletcher and S. Rangan, Iterative Reconstruction of Rank-One Signals in Noise, 2018

    Mismatched state evolution analysis for VAMP. The mismatch bounds in Section 17.5 are based on this work.

  9. C. A. Metzler, A. Maleki, and R. G. Baraniuk, From Denoising to Compressed Sensing, IEEE Trans. IT, 2016

    Introduces D-AMP, which plugs off-the-shelf denoisers such as BM3D into the AMP iteration. The bridge to learned denoisers in Section 17.4 builds on this work.

  10. K. Zhang, Y. Li, W. Zuo, L. Zhang, L. Van Gool, and R. Timofte, Plug-and-Play Image Restoration with Deep Denoiser Prior, IEEE TPAMI, 2021

    Introduces DRUNet, a noise-level-conditioned U-Net denoiser for plug-and-play algorithms. Section 17.4 discusses this architecture for noise-aware OAMP.

  11. G. Caire and A. Rezaei, Learned OAMP for Near-Field RF Imaging, 2024

    CommIT contribution: replacing BG-MMSE with a trained DnCNN in OAMP for RF imaging, demonstrating a 3–5 dB improvement on non-sparse scenes. The commit_contribution block in Section 17.4 is based on this paper.

Further Reading

  • Generalized AMP (GAMP) for non-Gaussian likelihoods

    S. Rangan, Generalized Approximate Message Passing for Estimation with Random Linear Mixing, IEEE ISIT, 2011

    Extends AMP to non-Gaussian output channels (quantized observations, Poisson noise). Chapter 18 develops GAMP and EM-GAMP for RF imaging.

  • Deep unfolding of OAMP

    H. He, S. Jin, C.-K. Wen, F. Gao, G. Y. Li, and Z. Xu, Model-Driven Deep Learning for Physical Layer Communications, IEEE WCM, 2019

    Unrolling the OAMP iterations into a trainable neural network. The deep-unfolding perspective on OAMP is developed in Chapter 27.

  • Rigorous state evolution via the adaptive interpolation method

    J. Barbier, F. Krzakala, N. Macris, L. Miolane, and L. Zdeborová, Optimal Errors and Phase Transitions in High-Dimensional Generalized Linear Models, PNAS, 2019

    The information-theoretic foundation for why OAMP with the Bayes-optimal denoiser achieves the MMSE, using techniques from statistical physics.

  • Expectation propagation and its connection to OAMP

    T. P. Minka, Expectation Propagation for Approximate Bayesian Inference, UAI, 2001

    OAMP can be derived from expectation propagation (EP) applied to the linear model. Understanding EP provides a unified view of message-passing algorithms.

  • Finite-size corrections to OAMP state evolution

    C. Rush and R. Venkataramanan, Finite Sample Analysis of Approximate Message Passing Algorithms, IEEE Trans. IT, 2018

    Addresses the gap between the large-system-limit predictions of state evolution and the finite-dimensional reality of practical imaging problems.