Prerequisites & Notation

Before You Begin

This chapter assumes familiarity with the following topics. If any item feels unfamiliar, revisit the linked material first.

  • OAMP/VAMP algorithms, state evolution, and the orthogonality principle for non-i.i.d. sensing matrices (Chapter 17)

    Self-check: Can you write the OAMP linear estimation step and explain the divergence-free condition? (See the reference sketch after this list.)

  • Sparse recovery algorithms: ISTA, FISTA, ADMM, and their convergence properties (Chapter 13)

    Self-check: Can you write one full iteration of ISTA and identify the proximal operator? (See the code sketch after this list.)

  • Kronecker structure of the RF sensing operator and efficient matrix-vector products (Chapter 8)

    Self-check: Can you explain why $\mathbf{A} = \mathbf{A}_1 \otimes \mathbf{A}_2$ reduces the cost of $\mathbf{A}\mathbf{c}$ from $O(N^2)$ to $O(N^{3/2})$? (See the worked example after this list.)
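
As a reference for the first self-check: a minimal sketch of the OAMP linear-estimation (LE) step in its standard de-correlated form. The measurement model $\mathbf{y} = \mathbf{A}\mathbf{x} + \mathbf{n}$ with noise variance $\sigma^2$ is assumed here (neither symbol appears in the table below), and $\sigma_t^2$ is the layer-$t$ error variance from the notation table; $\mathbf{r}_t$ denotes the LE output.

$$
\mathbf{r}_t = \mathbf{x}^{(t)} + \mathbf{W}_{\text{LE}}\bigl(\mathbf{y} - \mathbf{A}\mathbf{x}^{(t)}\bigr),
\qquad
\mathbf{W}_{\text{LE}} = \frac{N}{\operatorname{tr}(\hat{\mathbf{W}}\mathbf{A})}\,\hat{\mathbf{W}},
\qquad
\hat{\mathbf{W}} = \sigma_t^2\,\mathbf{A}^{\mathsf{H}}\bigl(\sigma_t^2\,\mathbf{A}\mathbf{A}^{\mathsf{H}} + \sigma^2\mathbf{I}\bigr)^{-1}.
$$

The trace normalization is exactly the divergence-free (de-correlating) condition $\tfrac{1}{N}\operatorname{tr}(\mathbf{I} - \mathbf{W}_{\text{LE}}\mathbf{A}) = 0$: it keeps the LE error statistically orthogonal to the input error, which is what allows the subsequent nonlinear stage to use a divergence-free denoiser.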
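
For the second self-check: a minimal NumPy sketch of one full ISTA iteration for $\min_{\mathbf{x}} \tfrac{1}{2}\|\mathbf{y} - \mathbf{A}\mathbf{x}\|_2^2 + \lambda\|\mathbf{x}\|_1$. The function names are illustrative, and `L` is assumed to be a Lipschitz bound on the gradient of the data term (e.g., the largest eigenvalue of $\mathbf{A}^{\mathsf{T}}\mathbf{A}$).

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1: shrink each entry toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista_step(x, A, y, lam, L):
    # One full ISTA iteration: a gradient step on the smooth term
    # 0.5 * ||y - A x||^2 with step size 1/L ...
    grad = A.T @ (A @ x - y)
    # ... followed by the proximal operator of (lam / L) * ||.||_1.
    return soft_threshold(x - grad / L, lam / L)
```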

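For the third self-check: a sketch (with illustrative sizes) of the vec-trick behind the Kronecker speedup. Rather than materializing $\mathbf{A}_1 \otimes \mathbf{A}_2$, apply the identity $(\mathbf{A}_1 \otimes \mathbf{A}_2)\,\mathrm{vec}(\mathbf{C}) = \mathrm{vec}(\mathbf{A}_2 \mathbf{C} \mathbf{A}_1^{\mathsf{T}})$, which replaces one $O(N^2)$ matrix-vector product with two $O(N^{3/2})$ matrix-matrix products when $N = n^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64                                    # illustrative size; N = n**2 = 4096
A1 = rng.standard_normal((n, n))
A2 = rng.standard_normal((n, n))
c = rng.standard_normal(n * n)

# Naive: materialize the N x N Kronecker matrix, then multiply -> O(N^2).
slow = np.kron(A1, A2) @ c

# Fast: reshape c into the n x n matrix C using column-major (vec) order,
# then two n x n matrix products -> O(n^3) = O(N^{3/2}).
C = c.reshape(n, n, order="F")
fast = (A2 @ C @ A1.T).reshape(-1, order="F")

assert np.allclose(slow, fast)
```
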
Notation for This Chapter

Symbols introduced in this chapter. See also the master Global Notation Table in the front matter.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $\mathbf{x}^{(k)}$ | Estimate at iteration/layer $k$ | s01 |
| $T$ | Number of unrolled layers (OAMP iterations) | s01 |
| $\theta = \{\theta_t\}_{t=1}^T$ | Learnable parameters across all layers | s01 |
| $\mathcal{D}_{\theta_t}$ | Learnable ProxNet denoiser at layer $t$ | s01 |
| $\sigma_t^2$ | Effective noise variance at layer $t$ (state evolution) | s01 |
| $\mathbf{W}_{\text{LE}}$ | OAMP linear estimator (LMMSE matrix) | s01 |
| $K$ | Number of LISTA/ADMM unrolled layers | s02 |
| $\mathbf{W}_k$ | Learnable linear transform at LISTA layer $k$ | s02 |
| $\tau_k$ | Learnable threshold/step-size at layer $k$ | s02 |
| $\rho_k$ | Learnable ADMM penalty parameter at layer $k$ | s02 |
| $\boldsymbol{\tau}_k$ | Spatially-varying threshold map (hierarchical soft-thresholding) | s04 |
| $\otimes$ | Kronecker product | s01 |