Prerequisites & Notation
Before You Begin
This chapter assumes familiarity with the following topics. If any item feels unfamiliar, revisit the linked material first.
- OAMP/VAMP algorithms, state evolution, and the orthogonality principle for non-i.i.d. sensing matrices (Chapter 17)
Self-check: Can you write the OAMP linear estimation step and explain the divergence-free condition?
- Sparse recovery algorithms: ISTA, FISTA, ADMM, and their convergence properties (Chapter 13)
Self-check: Can you write one full iteration of ISTA and identify the proximal operator?
- Kronecker structure of the RF sensing operator and efficient matrix-vector products (Chapter 8)
Self-check: Can you explain why the identity $(A \otimes B)\,\mathrm{vec}(X) = \mathrm{vec}(B X A^{\mathsf{T}})$ reduces the cost of applying the sensing operator from $O(m^2 n^2)$ to $O(mn(m+n))$?
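As a reference point for the first self-check, here is a sketch of the OAMP linear estimation (LE) step in the standard formulation; the symbols ($r^{(t)}$, $\widehat{W}_t$, $\sigma^2$) follow common usage and may differ from Chapter 17's exact conventions:

```latex
% OAMP LE step: LMMSE estimate followed by divergence-free (trace-zero) scaling
r^{(t)} = \hat{x}^{(t)} + W_t\bigl(y - A\hat{x}^{(t)}\bigr),
\qquad
W_t = \frac{n}{\operatorname{tr}\bigl(\widehat{W}_t A\bigr)}\,\widehat{W}_t,
\qquad
\widehat{W}_t = \sigma_t^2\, A^{\mathsf{T}}\bigl(\sigma_t^2\, A A^{\mathsf{T}} + \sigma^2 I\bigr)^{-1}.
```

The scaling of $\widehat{W}_t$ enforces the divergence-free condition $\frac{1}{n}\operatorname{tr}(I - W_t A) = 0$, which decorrelates the LE output error from the input error and is what state evolution relies on.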
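For the second self-check, a minimal numpy sketch of one ISTA iteration; the sizes, seed, and helper names (`soft_threshold`, `ista_step`) are illustrative, not the chapter's code:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista_step(x, A, y, lam, L):
    """One ISTA iteration: gradient step on 0.5*||Ax - y||^2 with step 1/L,
    followed by the proximal operator of (lam/L)*||.||_1."""
    grad = A.T @ (A @ x - y)
    return soft_threshold(x - grad / L, lam / L)

# Toy sparse-recovery instance (noiseless, 3-sparse signal).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50)) / np.sqrt(20)
x_true = np.zeros(50)
x_true[[3, 17, 41]] = [1.0, -2.0, 1.5]
y = A @ x_true
L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
x = np.zeros(50)
for _ in range(200):
    x = ista_step(x, A, y, lam=0.01, L=L)
```

The proximal operator here is exactly `soft_threshold`; LISTA (s02) replaces `A.T`, the step size `1/L`, and the threshold `lam/L` with learnable per-layer quantities.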
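For the third self-check, a numpy sketch of the Kronecker identity behind the speedup; the factor sizes and helper name `vec` are illustrative:

```python
import numpy as np

# Identity behind the speedup: (A kron B) vec(X) = vec(B X A^T),
# where vec() stacks columns (Fortran/column-major order).
rng = np.random.default_rng(1)
m, n = 6, 5
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))
X = rng.standard_normal((n, m))

def vec(M):
    """Column-stacking vectorization."""
    return M.reshape(-1, order="F")

dense = np.kron(A, B) @ vec(X)      # forms the (mn x mn) matrix: O(m^2 n^2)
fast = vec(B @ X @ A.T)             # two small products: O(mn(m+n))

assert np.allclose(dense, fast)
```

The dense path must build and apply an $mn \times mn$ matrix, while the right-hand side only ever touches the small factors, which is what makes unrolled layers affordable for the RF sensing operator.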
Notation for This Chapter
Symbols introduced in this chapter. See also the Global Notation Table in the front matter.
| Symbol | Meaning | Introduced |
|---|---|---|
| $\hat{\mathbf{x}}^{(t)}$ | Estimate at iteration/layer $t$ | s01 |
| $T$ | Number of unrolled layers (OAMP iterations) | s01 |
| $\Theta$ | Learnable parameters across all layers | s01 |
| $\mathcal{D}_{\theta_t}$ | Learnable ProxNet denoiser at layer $t$ | s01 |
| $\sigma_t^2$ | Effective noise variance at layer $t$ (state evolution) | s01 |
| $W_t$ | OAMP linear estimator (LMMSE matrix) | s01 |
| $K$ | Number of LISTA/ADMM unrolled layers | s02 |
| $W_k$ | Learnable linear transform at LISTA layer $k$ | s02 |
| $\lambda_k$ | Learnable threshold/step-size at layer $k$ | s02 |
| $\rho_k$ | Learnable ADMM penalty parameter at layer $k$ | s02 |
| $\boldsymbol{\Lambda}$ | Spatially-varying threshold map (hierarchical soft-thresholding) | s04 |
| $\otimes$ | Kronecker product | s01 |