Part 5: Modern High-Dimensional Inference
Chapter 21: OAMP, VAMP, and Beyond
Research · ~240 min
Learning Objectives
- Understand why AMP fails for non-i.i.d. sensing matrices and how orthogonality repairs it
- Derive OAMP as a divergence-free combination of LMMSE and denoiser steps
- Formulate VAMP via expectation consistency with two message-passing factors
- Implement OAMP efficiently for Kronecker-structured sensing matrices
- Extend AMP to generalized linear models through GAMP and 1-bit compressed sensing
- Unroll message-passing iterations into learned architectures (LISTA, LAMP, LDVAMP)
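To preview the second objective, the OAMP iteration can be sketched in a few lines: a trace-normalized LMMSE linear step followed by a divergence-free nonlinear step. The sketch below is a minimal illustration, not the chapter's reference implementation; the soft-thresholding denoiser, the residual-based variance estimate, and all parameter choices are assumptions made here for concreteness.

```python
import numpy as np

def soft_threshold(z, tau):
    """Elementwise soft-thresholding denoiser (assumed sparse prior)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def oamp(y, A, sigma2, n_iter=20):
    """Sketch of OAMP for y = A x + n, noise variance sigma2.

    Each iteration: (1) LMMSE linear estimate, trace-normalized so the
    linear step is divergence-free; (2) a divergence-free denoiser built
    from soft thresholding by subtracting its average derivative.
    """
    m, n = A.shape
    x_hat = np.zeros(n)
    AAT = A @ A.T
    for _ in range(n_iter):
        # Estimate the per-entry error variance v of x_hat from the residual
        # (a common proxy; exact state evolution is covered in the chapter).
        resid = y - A @ x_hat
        v = max(resid @ resid - m * sigma2, 1e-9) / np.trace(AAT)
        # LMMSE linear estimator for the current variance level.
        W = v * A.T @ np.linalg.inv(v * AAT + sigma2 * np.eye(m))
        # Trace normalization: tr(W A) = n makes the linear step divergence-free.
        W *= n / np.trace(W @ A)
        r = x_hat + W @ resid
        # Variance of the effective noise in r (state-evolution-style proxy).
        B = np.eye(n) - W @ A
        tau2 = (np.trace(B @ B.T) * v + np.trace(W @ W.T) * sigma2) / n
        tau = np.sqrt(max(tau2, 1e-12))
        # Divergence-free denoiser: subtract the average derivative d,
        # then rescale so the estimator is unbiased on the kept support.
        eta = soft_threshold(r, tau)
        d = np.mean(np.abs(r) > tau)  # average derivative of soft threshold
        x_hat = (eta - d * r) / max(1.0 - d, 1e-9)
    return x_hat
```

On a well-conditioned Gaussian sensing matrix with a sparse signal, a few iterations of this sketch already reduce the error substantially; the chapter develops why the divergence-free construction is what makes this work beyond i.i.d. matrices.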
Sections