Prerequisites & Notation

Before You Begin

This chapter assumes familiarity with the following topics. If any item feels unfamiliar, revisit the linked material first.

  • Proximal operators and ADMM (Chapter 4)

    Self-check: Can you state $\operatorname{prox}_f(\mathbf{v}) = \arg\min_{\mathbf{x}} \tfrac{1}{2}\|\mathbf{x} - \mathbf{v}\|^2 + f(\mathbf{x})$ and compute it for $f = \lambda\|\cdot\|_1$?

  • OAMP/VAMP and message-passing algorithms (Chapter 17)

    Self-check: Can you explain the denoiser's role in OAMP and how it connects to the MMSE denoiser under a separable prior?

  • Convolutional neural networks for imaging (Chapter 20)

    Self-check: Can you describe DnCNN's residual architecture and why the noise residual is easier to learn than the clean image?
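As a warm-up for the first self-check: the prox of $f = \lambda\|\cdot\|_1$ has a closed form, elementwise soft-thresholding. A minimal NumPy sketch (function name `prox_l1` is illustrative, not from the chapter):

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of f(x) = lam * ||x||_1.

    Solves argmin_x 0.5 * ||x - v||^2 + lam * ||x||_1, which
    separates across coordinates into soft-thresholding:
    shrink each entry of v toward zero by lam, clipping at zero.
    """
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

v = np.array([3.0, -0.5, 1.2])
print(prox_l1(v, 1.0))  # entries with |v_i| <= lam are zeroed out
```

Entries whose magnitude is below $\lambda$ are set exactly to zero, which is why this prox promotes sparsity inside ADMM-style splitting schemes.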

Notation for This Chapter

Symbols introduced or specialised in this chapter. See the global notation table for the full library conventions.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $\mathcal{D}_\sigma$ | Denoiser operating at noise standard deviation $\sigma$ | s01 |
| $\operatorname{prox}_f$ | Proximal operator of function $f$ | s01 |
| $L(\mathcal{D})$ | Lipschitz constant of denoiser $\mathcal{D}$ | s03 |
| $R_{\text{RED}}(\mathbf{x})$ | RED regulariser: $\tfrac{1}{2}\mathbf{x}^T(\mathbf{x} - \mathcal{D}_\sigma(\mathbf{x}))$ | s04 |
| $\mathbf{J}_\mathcal{D}(\mathbf{x})$ | Jacobian of the denoiser evaluated at $\mathbf{x}$ | s04 |
| $\lambda$ | Regularisation parameter | s01 |
| $\mathbf{A}$ | Sensing / measurement matrix | s01 |
| $\mathbf{c}$ | Discretised reflectivity vector | s01 |
| $\mathbf{w}$ | Additive noise vector | s01 |
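To make the RED regulariser concrete, here is a toy sketch (my own, not the chapter's code) using a linear "denoiser" $\mathcal{D}_\sigma(\mathbf{x}) = \mathbf{W}\mathbf{x}$ with symmetric $\mathbf{W}$, so the Jacobian-symmetry condition under which $\nabla R_{\text{RED}}(\mathbf{x}) = \mathbf{x} - \mathcal{D}_\sigma(\mathbf{x})$ holds by construction:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Toy linear denoiser D(x) = W x with symmetric W (so J_D = W = W^T),
# scaled to be a mild contraction. This is an illustrative stand-in,
# not a real image denoiser.
B = rng.standard_normal((n, n))
W = 0.5 * (B + B.T)
W = W / (1.1 * np.abs(np.linalg.eigvalsh(W)).max())

def D(x):
    return W @ x

def red_regulariser(x):
    # R_RED(x) = 0.5 * x^T (x - D(x))
    return 0.5 * x @ (x - D(x))

def red_gradient(x):
    # Valid when the denoiser satisfies the RED conditions
    # (symmetric Jacobian, local homogeneity); trivially true here.
    return x - D(x)

# Finite-difference check that the analytic gradient matches.
x = rng.standard_normal(n)
eps = 1e-6
num_grad = np.array([
    (red_regulariser(x + eps * e) - red_regulariser(x - eps * e)) / (2 * eps)
    for e in np.eye(n)
])
assert np.allclose(num_grad, red_gradient(x), atol=1e-5)
```

For a general learned denoiser the Jacobian is not exactly symmetric, so the simple gradient expression is only an approximation; the linear symmetric case above is the setting where it is exact.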