Prerequisites & Notation
Before You Begin
Approximate Message Passing (AMP) sits at the intersection of statistical physics, compressed sensing, and high-dimensional statistics. To follow the derivation and the state-evolution analysis comfortably, ensure the following background is fresh.
- Compressed sensing: sparse recovery, RIP, basis pursuit, LASSO (Review ch17)
Self-check: Can you state the LASSO problem and its KKT conditions?
- ISTA / proximal gradient methods and soft-thresholding (Review ch18)
Self-check: Can you derive the soft-threshold operator as the proximal map of $\lambda \lVert x \rVert_1$? (A numerical sanity check appears in the first sketch after this list.)
- Gaussian i.i.d. random matrices and their concentration properties
Self-check: Do you know why RIP-based worst-case analysis fails in the proportional asymptotic regime $m, n \to \infty$ with $m/n \to \delta \in (0,1)$?
- MMSE estimation and scalar denoising in AWGN (Review ch06)
Self-check: Can you compute $\mathbb{E}[X \mid Y]$ for a Bernoulli-Gaussian prior observed in Gaussian noise? (The second sketch after this list works this out.)
- Belief propagation on factor graphs and the Gaussian approximation
Self-check: Can you explain why sum-product messages on dense factor graphs become asymptotically Gaussian?
- Basic statistical physics: mean-field theory, Bethe free energy (helpful, not required)
Self-check: Are you familiar with the cavity method?
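For the soft-thresholding self-check, here is a minimal sketch (plain NumPy, with names of my choosing) of the operator, together with a brute-force check that it minimizes the prox objective $\tfrac12 (x - v)^2 + \lambda |x|$:

```python
import numpy as np

def soft_threshold(v, lam):
    """Proximal map of lam * ||x||_1: shrink toward zero, elementwise."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# Sanity check against the prox definition
#   prox(v) = argmin_x 0.5 * (x - v)**2 + lam * |x|
# (the objective is separable, so one coordinate suffices).
v, lam = 1.3, 0.5
xs = np.linspace(-3.0, 3.0, 200001)
objective = 0.5 * (xs - v) ** 2 + lam * np.abs(xs)
assert abs(xs[objective.argmin()] - soft_threshold(v, lam)) < 1e-4
print(soft_threshold(v, lam))  # 0.8 = sign(1.3) * max(1.3 - 0.5, 0)
```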
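For the MMSE self-check, a short sketch of the conditional mean under a Bernoulli-Gaussian prior: the posterior is a two-component mixture, so $\mathbb{E}[X \mid Y = y]$ is the posterior probability of the active component times the usual Gaussian (Wiener) shrinkage of $y$. The parameter names (`eps`, `sigma_x`, `tau`) are illustrative choices, not this chapter's notation:

```python
import numpy as np

def gauss_pdf(y, var):
    """Density of N(0, var) evaluated at y."""
    return np.exp(-y**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

def bg_conditional_mean(y, eps=0.1, sigma_x=1.0, tau=0.5):
    """E[X | Y = y] for X ~ (1 - eps) * delta_0 + eps * N(0, sigma_x^2),
    observed as Y = X + W with W ~ N(0, tau^2)."""
    p_off = (1.0 - eps) * gauss_pdf(y, tau**2)        # component X = 0
    p_on = eps * gauss_pdf(y, sigma_x**2 + tau**2)    # component X ~ N(0, sigma_x^2)
    pi = p_on / (p_on + p_off)                   # posterior prob. that X is active
    wiener = sigma_x**2 / (sigma_x**2 + tau**2)  # Gaussian shrinkage given active
    return pi * wiener * y

print(bg_conditional_mean(np.array([0.0, 0.5, 2.0])))
```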
Notation for This Chapter
AMP operates on the linear observation model $y = A x_0 + w$, with $A \in \mathbb{R}^{m \times n}$, $x_0 \in \mathbb{R}^n$, and $m, n$ large. We track iterates indexed by $t = 0, 1, 2, \dots$:
| Symbol | Meaning | Introduced |
|---|---|---|
| $A$ | Sensing / measurement matrix ($m \times n$, typically i.i.d. Gaussian with $\mathcal{N}(0, 1/m)$ entries) | s01 |
| $x_0$ | Unknown signal to recover (assumed sparse or structured) | s01 |
| $y$ | Observation vector, $y = A x_0 + w$ | s01 |
| $w$ | Additive Gaussian noise, $w \sim \mathcal{N}(0, \sigma^2 I_m)$ | s01 |
| $\delta = m/n$ | Undersampling ratio (measurements per unknown) | s01 |
| $\rho$ | Signal sparsity fraction (fraction of nonzero entries, $\rho = k/n$) | s02 |
| $x^t$ | AMP estimate at iteration $t$ | s01 |
| $r^t$ | AMP residual at iteration $t$ | s01 |
| $\eta(\cdot\,; \theta)$ | Denoiser function with parameter $\theta$ (e.g., threshold, noise variance) | s01 |
| $\langle \eta' \rangle$ | Empirical average of the denoiser's derivative (Onsager coefficient) | s01 |
| $\tau_t^2$ | State-evolution variance at iteration $t$ | s02 |
| $\mathrm{mse}(\tau^2)$ | Scalar MSE of the denoiser in effective noise of variance $\tau^2$ | s02 |
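To tie the table together: one AMP iteration reads $v^t = x^t + A^\top r^t$, $x^{t+1} = \eta(v^t; \theta_t)$, $r^{t+1} = y - A x^{t+1} + \frac{1}{\delta}\, r^t \langle \eta' \rangle$, and state evolution predicts $\tau_{t+1}^2 = \sigma^2 + \frac{1}{\delta}\,\mathrm{mse}(\tau_t^2)$. Below is a minimal runnable sketch with a soft-threshold denoiser; the threshold policy $\theta_t = \alpha \tau_t$ and the empirical estimate $\hat\tau_t^2 = \lVert r^t \rVert^2 / m$ are common practical choices assumed here, not prescriptions from this chapter.

```python
import numpy as np

def soft_threshold(v, theta):
    # Same operator as in the first sketch above.
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp(y, A, alpha=1.5, n_iter=30):
    """AMP with soft-threshold denoiser eta(.; theta_t), theta_t = alpha * tau_t."""
    m, n = A.shape
    delta = m / n                       # undersampling ratio
    x = np.zeros(n)                     # x^0
    r = y.copy()                        # r^0 = y - A x^0
    tau2 = np.mean(y**2)                # empirical proxy for tau_0^2 (assumption)
    for _ in range(n_iter):
        theta = alpha * np.sqrt(tau2)
        v = x + A.T @ r                 # effective observation: ~ x_0 + N(0, tau_t^2 I)
        x = soft_threshold(v, theta)    # x^{t+1} = eta(v^t; theta_t)
        onsager = r * np.mean(np.abs(v) > theta) / delta  # (1/delta) * r^t * <eta'>
        r = y - A @ x + onsager         # residual with Onsager correction
        tau2 = np.mean(r**2)            # empirical proxy for tau_{t+1}^2 (assumption)
    return x

# Tiny synthetic demo matching the notation table.
rng = np.random.default_rng(0)
m, n, k, sigma = 250, 500, 25, 0.01
A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))   # i.i.d. N(0, 1/m) entries
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
y = A @ x0 + sigma * rng.normal(size=m)
x_hat = amp(y, A)
print("relative MSE:", np.sum((x_hat - x0)**2) / np.sum(x0**2))
```

Dropping the `onsager` term turns this into plain iterative soft thresholding, which is the quickest way to see the correction's effect on convergence.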