OAMP/VAMP for the RF Imaging Operator
The OAMP Idea: Orthogonalize, Then Denoise
AMP fails for structured matrices because the residual is correlated with the estimate. The fix is conceptually simple: replace the scalar Onsager correction with a linear operator that explicitly orthogonalizes the error.
OAMP (Orthogonal AMP) achieves this by alternating two steps:
- LMMSE step: A linear estimator that uses the full SVD of $A$ to produce an estimate whose error is uncorrelated with the input.
- Denoiser step: A (possibly nonlinear) denoiser that exploits the prior on $x$.
The orthogonality between the errors of the two steps is the key invariant that enables state evolution for arbitrary right-rotationally-invariant matrices, a much larger class than i.i.d. Gaussian.
Definition: Right-Rotationally-Invariant Matrices
A random matrix $A \in \mathbb{C}^{M \times N}$ is right-rotationally invariant if its distribution is unchanged under right multiplication by any deterministic unitary matrix:
$$A V_0 \overset{d}{=} A \quad \text{for every unitary } V_0 \in \mathbb{C}^{N \times N}.$$
Equivalently, the SVD $A = U \Sigma V^H$ has $V$ uniformly distributed on the unitary group (Haar-distributed), independent of $U$ and $\Sigma$.
I.i.d. Gaussian matrices are right-rotationally invariant, but the class is much larger. For instance, any matrix of the form $A = U \Sigma V^H$ with $V$ Haar-distributed and $U$, $\Sigma$ deterministic is right-rotationally invariant; this includes matrices with arbitrary prescribed singular values.
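The second construction can be checked numerically. A minimal sketch (dimensions and singular values are illustrative) that builds a right-rotationally invariant matrix with prescribed singular values by sampling a Haar-distributed unitary factor via QR:

```python
import numpy as np

def haar_unitary(n, rng):
    """Sample a Haar-distributed unitary matrix via QR of a complex Gaussian."""
    Z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    Q, R = np.linalg.qr(Z)
    d = np.diag(R)
    return Q * (d / np.abs(d))   # fix the phase ambiguity so Q is exactly Haar

rng = np.random.default_rng(0)
M, N = 32, 64
s = np.linspace(1.0, 10.0, M)                # arbitrary prescribed singular values
S = np.zeros((M, N)); S[:M, :M] = np.diag(s)
U = haar_unitary(M, rng)                     # U may even be deterministic;
V = haar_unitary(N, rng)                     # the Haar V is what makes A RRI
A = U @ S @ V.conj().T                       # A = U Sigma V^H
```

Verifying `np.linalg.svd(A, compute_uv=False)` recovers exactly the prescribed values `s`, regardless of how ill-conditioned they are.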
Historical Note: The Parallel Development of OAMP and VAMP
2017–2019. OAMP and VAMP were developed independently and nearly simultaneously. Ma and Ping (2017) derived OAMP by imposing a divergence-free condition on the linear estimator; they showed that the LMMSE estimator with a specific bias subtraction produces orthogonal errors. Rangan, Schniter, and Fletcher (2019) derived VAMP from the expectation-consistency framework, arriving at an equivalent algorithm with a more symmetric structure. The two formulations are now understood to be equivalent for right-rotationally invariant matrices; we use "OAMP" throughout this chapter but note the correspondence.
Definition: OAMP Algorithm
Let $A = U \Sigma V^H$ be the (compact or full) SVD. OAMP alternates:
LMMSE step (linear estimator):
$$r_t = x_t + \frac{N}{\operatorname{tr}(W_t A)}\, W_t \left( y - A x_t \right), \qquad W_t = v_t^2 A^H \left( v_t^2 A A^H + \sigma^2 I \right)^{-1},$$
where $W_t$ is the LMMSE filter with prior variance $v_t^2$ (the MSE from the previous denoiser step).
MSE of LMMSE step:
$$\tau_t^2 = v_t^2 \left( \frac{N}{\operatorname{tr}(W_t A)} - 1 \right).$$
Denoiser step (nonlinear estimator):
$$x_{t+1} = \eta_t^{\mathrm{df}}(r_t) = \frac{\eta_t(r_t) - \langle \eta_t' \rangle\, r_t}{1 - \langle \eta_t' \rangle},$$
where $\eta_t$ is a denoiser matched to the AWGN channel $r_t = x + \mathcal{N}(0, \tau_t^2 I)$ and $\langle \eta_t' \rangle$ is its average divergence, so that $\eta_t^{\mathrm{df}}$ is divergence-free.
OAMP for Linear Inverse Problems
Complexity: Dominated by the SVD (computed once): $O(MN \min(M, N))$. Per iteration: $O(MN)$ for the LMMSE matrix-vector products. The SVD is computed once as preprocessing; all subsequent LMMSE steps use the precomputed singular values and vectors to evaluate the Woodbury-form inverse efficiently.
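The loop above can be sketched in a few dozen lines. This is a minimal real-valued illustration, not a tuned implementation: it substitutes a soft-threshold denoiser for a prior-matched MMSE denoiser, and the variance update `v2 = tau2 * div / (1 - div)` is the MMSE-denoiser approximation; the test problem and all parameters are assumptions for the sketch.

```python
import numpy as np

def soft(r, tau2, lam=1.5):
    """Soft-threshold denoiser: returns (estimate, average divergence)."""
    t = lam * np.sqrt(tau2)
    return np.sign(r) * np.maximum(np.abs(r) - t, 0.0), np.mean(np.abs(r) > t)

def oamp(y, U, s, Vt, sigma2, n_iter=30):
    """OAMP sketch using a precomputed compact SVD A = U @ diag(s) @ Vt."""
    N = Vt.shape[1]
    x = np.zeros(N)                 # divergence-free denoiser output
    xhat = x
    v2 = np.mean(y**2)              # crude initial prior-variance guess
    for _ in range(n_iter):
        # LMMSE step, evaluated mode-by-mode in the SVD basis
        gain = v2 * s**2 / (v2 * s**2 + sigma2)      # eigen-gains of W A
        alpha = gain.sum() / N                       # tr(W A) / N
        resid = y - U @ (s * (Vt @ x))
        Wres = Vt.T @ (v2 * s / (v2 * s**2 + sigma2) * (U.T @ resid))
        r = x + Wres / alpha                         # de-biased LMMSE output
        tau2 = v2 * (1.0 / alpha - 1.0)              # extrinsic error variance
        # Divergence-free denoiser step
        xhat, div = soft(r, tau2)
        div = min(div, 0.95)                         # guard the correction
        x = (xhat - div * r) / (1.0 - div)
        v2 = max(tau2 * div / (1.0 - div), 1e-12)    # MMSE-style variance update
    return xhat

# Small synthetic test problem (illustrative sizes)
rng = np.random.default_rng(1)
M, N = 120, 240
A = rng.standard_normal((M, N)) / np.sqrt(M)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_true = rng.standard_normal(N) * (rng.random(N) < 0.1)   # sparse signal
sigma2 = 1e-4
y = A @ x_true + np.sqrt(sigma2) * rng.standard_normal(M)
x_rec = oamp(y, U, s, Vt, sigma2)
nmse = np.sum((x_rec - x_true)**2) / np.sum(x_true**2)
print(nmse)
```

Note that the per-iteration cost is three matrix-vector products with the SVD factors, matching the $O(MN)$ complexity above.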
Theorem: Orthogonality of OAMP Error Vectors
Let $A$ be right-rotationally invariant with singular values $\{\sigma_i\}$. Define the errors $h_t = r_t - x$ (LMMSE output) and $q_t = x_t - x$ (denoiser output). Then in the large-system limit ($M, N \to \infty$ with $M/N \to \delta$):
$$\lim_{N \to \infty} \frac{1}{N} \langle h_t, q_t \rangle = 0.$$
Moreover, the LMMSE output satisfies
$$r_t = x + h_t, \qquad h_t \sim \mathcal{N}(0, \tau_t^2 I) \ \text{(asymptotically)},$$
so the denoiser at each iteration faces a scalar AWGN channel with known noise variance $\tau_t^2$.
The LMMSE step is designed so that its output error is orthogonal to the input (the denoiser output from the previous iteration). This is the direct analogue of the Onsager decorrelation in AMP, but achieved through a matrix operation (the LMMSE filter) rather than a scalar correction. Because the orthogonalization uses the full SVD of $A$, it works for any singular value distribution, not just the Marchenko-Pastur distribution of i.i.d. Gaussian matrices.
Right-rotational invariance implies Gaussianity
When $A = U \Sigma V^H$ with $V$ Haar-distributed, the rotated signal $V^H x$ has approximately i.i.d. Gaussian entries (by the rotational symmetry of the Gaussian distribution, or by the universality results of Bayati-Montanari). In this rotated basis, the problem decouples into scalar channels.
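This Gaussianization is easy to see numerically. A small check (sizes are illustrative): rotate a highly non-Gaussian sparse vector by a Haar-distributed orthogonal matrix and compare the empirical moments of the result with those of an i.i.d. Gaussian vector.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2000
x = np.zeros(N); x[:20] = 5.0                  # highly non-Gaussian (sparse) signal
Q, R = np.linalg.qr(rng.standard_normal((N, N)))
Q = Q * np.sign(np.diag(R))                    # Haar-distributed orthogonal matrix
z = Q.T @ x                                    # signal in the rotated basis
# z behaves like i.i.d. N(0, ||x||^2 / N): variance and kurtosis match
print(np.var(z), np.sum(x**2) / N)             # nearly equal
print(np.mean(z**4) / np.var(z)**2)            # close to 3 (Gaussian kurtosis)
```

The original `x` has kurtosis far above 3; after the Haar rotation the entries are statistically indistinguishable (to leading order) from i.i.d. Gaussians, which is exactly what the decoupling argument uses.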
LMMSE produces uncorrelated output
The LMMSE estimator is the linear estimator that minimizes MSE. By the orthogonality principle of LMMSE estimation, the estimation error is orthogonal to the observation. The key insight is that $x_t$ acts as a pseudo-observation, and the LMMSE construction ensures $\frac{1}{N} \langle h_t, q_t \rangle \to 0$.
Scalar channel equivalence
The Gaussianity of the rotated signal and the LMMSE structure together imply that each component of $r_t$ is approximately $x_i + \tau_t z_i$ with $z_i \sim \mathcal{N}(0, 1)$, a scalar AWGN channel. This enables the denoiser to be designed for a known noise level.
Theorem: State Evolution for OAMP
For right-rotationally invariant $A$ with empirical singular value distribution converging to a deterministic limit, the MSE of OAMP is tracked by the two-step state evolution:
$$\tau_t^2 = \left( \frac{1}{\mathcal{E}(v_t^2)} - \frac{1}{v_t^2} \right)^{-1}, \qquad \mathcal{E}(v_t^2) = \frac{1}{N} \sum_{i=1}^{N} \frac{v_t^2\, \sigma^2}{\sigma^2 + v_t^2\, \sigma_i^2},$$
$$v_{t+1}^2 = \left( \frac{1}{\psi(\tau_t^2)} - \frac{1}{\tau_t^2} \right)^{-1},$$
where $\sigma_i = 0$ for $i > \min(M, N)$, and $\psi(\tau^2) = \mathbb{E}\big[ (\eta(X + \tau Z) - X)^2 \big]$ with $Z \sim \mathcal{N}(0, 1)$ is the MSE of the denoiser on a scalar AWGN channel with noise variance $\tau^2$.
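The two-step recursion (LMMSE MSE update, then denoiser MSE update) can be simulated directly from a list of singular values. A sketch, assuming a Bernoulli-Gaussian prior whose scalar MMSE denoiser is evaluated by Monte Carlo; the sparsity `rho`, sizes, and noise level are illustrative:

```python
import numpy as np

def eta_bg(r, tau2, rho):
    """Scalar MMSE denoiser for a Bernoulli-Gaussian prior BG(rho, 1)."""
    g1 = rho * np.exp(-r**2 / (2 * (1 + tau2))) / np.sqrt(1 + tau2)
    g0 = (1 - rho) * np.exp(-r**2 / (2 * tau2)) / np.sqrt(tau2)
    return g1 / (g1 + g0) * r / (1 + tau2)

def psi_bg(tau2, rho=0.1, n=200_000, seed=0):
    """Monte Carlo MSE of the BG-MMSE denoiser on an AWGN channel."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n) * (rng.random(n) < rho)
    r = x + np.sqrt(tau2) * rng.standard_normal(n)
    return np.mean((eta_bg(r, tau2, rho) - x)**2)

def se_oamp(svals, N, sigma2, psi, v2=1.0, n_iter=25):
    """Two-step OAMP state evolution: v2 -> tau2 (LMMSE) -> v2 (denoiser)."""
    s2 = np.zeros(N); s2[:len(svals)] = svals**2   # zero-pad unobserved modes
    traj = []
    for _ in range(n_iter):
        E = np.mean(v2 * sigma2 / (sigma2 + v2 * s2))   # LMMSE posterior MSE
        tau2 = 1.0 / (1.0 / E - 1.0 / v2)               # extrinsic variance
        p = psi(tau2)                                   # denoiser MSE
        v2 = 1.0 / (1.0 / p - 1.0 / tau2)               # extrinsic variance
        traj.append(v2)
    return np.array(traj)

rng = np.random.default_rng(1)
M, N = 250, 500
svals = np.linalg.svd(rng.standard_normal((M, N)) / np.sqrt(M), compute_uv=False)
traj = se_oamp(svals, N, sigma2=1e-4, psi=psi_bg)
print(traj[0], traj[-1])   # MSE decreases toward a noise-limited floor
```

Note that nothing in `se_oamp` requires the matrix itself; only its singular values enter, which is what makes offline design possible.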
State evolution for OAMP involves two scalar quantities ($\tau_t^2$ for the LMMSE step and $v_t^2$ for the denoiser step) rather than one. The first equation depends on the full singular value distribution of $A$ (not just the undersampling ratio $M/N$ as in AMP). This is why OAMP can handle structured matrices.
From OAMP orthogonality to scalar recursion
The orthogonality theorem (Orthogonality of OAMP Error Vectors) ensures that the denoiser input is approximately $r_t = x + \mathcal{N}(0, \tau_t^2 I)$. Therefore the denoiser's MSE is exactly the MMSE of the scalar denoising problem: $\psi(\tau_t^2)$.
LMMSE MSE from the singular values
The LMMSE estimator with prior variance $v^2$ and noise variance $\sigma^2$ has MSE given by the well-known formula:
$$\mathcal{E}(v^2) = \frac{1}{N} \operatorname{tr}\left[ \left( \frac{1}{v^2} I + \frac{1}{\sigma^2} A^H A \right)^{-1} \right].$$
Substituting the SVD $A = U \Sigma V^H$ and using the singular values directly gives the stated formula:
$$\mathcal{E}(v^2) = \frac{1}{N} \sum_{i=1}^{N} \frac{v^2 \sigma^2}{\sigma^2 + v^2 \sigma_i^2}.$$
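The equality of the trace form and the singular-value form is easy to check numerically (sizes and variances below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, v2, sig2 = 20, 30, 0.5, 0.1
A = rng.standard_normal((M, N)) / np.sqrt(M)

# Trace form of the LMMSE posterior MSE
C = np.linalg.inv(np.eye(N) / v2 + A.T @ A / sig2)
mse_trace = np.trace(C) / N

# Singular-value form (zero-padding the N - min(M, N) unobserved modes)
s = np.linalg.svd(A, compute_uv=False)
s2 = np.zeros(N); s2[:len(s)] = s**2
mse_sv = np.mean(v2 * sig2 / (sig2 + v2 * s2))

print(mse_trace, mse_sv)   # identical up to numerical precision
```

The singular-value form costs $O(N)$ per evaluation once the SVD is known, which is what the state-evolution recursion exploits.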
OAMP State Evolution vs Simulation
Compare the state-evolution prediction (dashed) with the empirical NMSE of OAMP (solid) for different matrix types and problem sizes. State evolution becomes exact as $N$ grows.
AMP vs OAMP/VAMP
| Property | AMP | OAMP/VAMP |
|---|---|---|
| Matrix requirement | I.i.d. Gaussian | Right-rotationally invariant |
| Decorrelation mechanism | Scalar Onsager correction | LMMSE orthogonalization (matrix-valued) |
| State evolution | One scalar: $v_t^2$ | Two scalars: $(v_t^2, \tau_t^2)$ |
| SE depends on | $M/N$ only | Full singular value distribution of $A$ |
| Preprocessing | None | SVD of $A$ (one-time) |
| Per-iteration cost | $O(MN)$ | $O(MN)$ after the one-time SVD |
| Denoiser requirement | Lipschitz, scalar divergence | Same, plus divergence estimation |
| Bayes-optimal? | Yes (for i.i.d. Gaussian $A$) | Yes (for RRI $A$) |
| RF imaging matrices | Diverges | Converges |
Example: OAMP on a 2D RF Imaging Problem
Consider a 2D RF imaging scene of $N$ voxels, observed by a set of Tx-Rx pairs and frequencies, for $M$ measurements in total. The sensing matrix $A$ is a Kronecker product of partial DFT factors. The reflectivity $x$ is Bernoulli-Gaussian with known sparsity and variance.
Run OAMP with the BG-MMSE denoiser and compare the reconstruction with the backpropagation image $\hat{x}_{\mathrm{BP}} = A^H y$.
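A sketch of how such a Kronecker partial-DFT sensing matrix can be assembled; the dimensions here are small and purely illustrative, not the chapter's actual sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
n1 = n2 = 8                  # hypothetical 8x8 scene (64 voxels)
m1, m2 = 6, 5                # hypothetical spatial / frequency sample counts
F1 = np.fft.fft(np.eye(n1)) / np.sqrt(n1)       # unitary DFT matrices
F2 = np.fft.fft(np.eye(n2)) / np.sqrt(n2)
A1 = F1[rng.choice(n1, m1, replace=False), :]   # partial DFT: random row subset
A2 = F2[rng.choice(n2, m2, replace=False), :]
A = np.kron(A1, A2)                             # Kronecker-structured sensing matrix
print(A.shape)                                  # (m1*m2, n1*n2) = (30, 64)
```

Because rows of a unitary DFT are orthonormal, every singular value of each factor (and hence of $A$) equals one here, and the SVD of $A$ follows directly from the factor SVDs, so OAMP's one-time preprocessing reduces to two small decompositions instead of one large one.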
Backpropagation baseline
The backpropagation image suffers from high NMSE and significant sidelobes, as expected from the limited aperture and non-uniform k-space coverage.
OAMP reconstruction
OAMP converges in 15 iterations, correctly identifying all scatterers and suppressing sidelobes. The achieved NMSE matches the state-evolution prediction, confirming the accuracy of the SE framework.
State evolution tracking
The SE prediction tracks the empirical NMSE within 0.3 dB at every iteration, demonstrating that the right-rotational invariance assumption is well-satisfied for the partial DFT Kronecker matrix.
Why This Matters: OAMP in ISAC Systems
In integrated sensing and communications (ISAC) systems, the base station uses the same OFDM waveform for both data transmission and scene reconstruction. The sensing matrix inherits the Kronecker structure from the array geometry and OFDM subcarrier spacing. OAMP is the natural reconstruction algorithm because:
- The Kronecker structure makes AMP inapplicable.
- The LMMSE step exploits the known array/frequency structure.
- State evolution enables offline optimization of the denoiser and system parameters.
Chapter 32 applies OAMP to a full ISAC scenario with simultaneous communication and imaging.
Quick Check
Which of the following matrices is right-rotationally invariant?
- $A$ with i.i.d. Gaussian entries
- A partial DFT matrix (random rows of the DFT)
- $A = U \Sigma V^H$ where $U$, $\Sigma$ are deterministic and $V$ is Haar-distributed
An i.i.d. Gaussian matrix has Haar-distributed right singular vectors and is right-rotationally invariant.
OAMP (Orthogonal Approximate Message Passing)
An iterative algorithm for linear inverse problems that alternates an LMMSE step (linear, exploiting the measurement matrix) with a denoiser step (nonlinear, exploiting the signal prior). The LMMSE step orthogonalizes the estimation errors, enabling state evolution for right-rotationally invariant matrices.
Related: OAMP Algorithm, Orthogonality of OAMP Error Vectors
VAMP (Vector Approximate Message Passing)
An equivalent formulation of OAMP derived from the expectation consistency framework (Rangan, Schniter, Fletcher, 2019). VAMP uses a symmetric two-denoiser structure and is provably Bayes-optimal for right-rotationally invariant matrices.
Denoiser divergence
The normalized trace of the Jacobian of the denoiser: $\langle \eta' \rangle = \frac{1}{N} \operatorname{tr}\big( \partial \eta(r) / \partial r \big)$. Required for computing the MSE update in OAMP. Can be estimated via the Hutchinson trace estimator for black-box denoisers.
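A sketch of the Hutchinson estimator for a black-box denoiser, using Rademacher probes and finite differences; the probe count, step size, and the soft-threshold test denoiser are illustrative choices:

```python
import numpy as np

def hutchinson_divergence(denoiser, r, tau2, eps=1e-4, n_probes=8, seed=0):
    """Estimate (1/N) tr(d eta / d r) for a black-box denoiser."""
    rng = np.random.default_rng(seed)
    N = r.size
    est = 0.0
    for _ in range(n_probes):
        p = rng.choice([-1.0, 1.0], size=N)      # Rademacher probe vector
        d = (denoiser(r + eps * p, tau2) - denoiser(r, tau2)) / eps
        est += p @ d / N                          # unbiased trace probe
    return est / n_probes

# Check against the exact divergence of soft-thresholding
def soft(r, tau2, lam=1.5):
    t = lam * np.sqrt(tau2)
    return np.sign(r) * np.maximum(np.abs(r) - t, 0.0)

rng = np.random.default_rng(1)
r = rng.standard_normal(10_000)
tau2 = 0.25
exact = np.mean(np.abs(r) > 1.5 * np.sqrt(tau2))   # fraction above threshold
approx = hutchinson_divergence(soft, r, tau2)
print(exact, approx)   # should agree closely
```

For deep-network denoisers the same estimator works with a single probe per iteration, since the OAMP recursion tolerates a noisy divergence estimate.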
Related: OAMP Algorithm