Prerequisites & Notation

Before You Begin

This chapter builds on the denoising and plug-and-play (PnP) frameworks of Chapter 21 and the iterative algorithms of Chapter 17. Familiarity with the following is assumed.

  • Score function and denoising score matching (Chapter 21)

    Self-check: Can you state the relationship between the score $\nabla_\mathbf{x}\log p_\sigma(\mathbf{x})$ and the MMSE denoiser?

  • OAMP/VAMP iterative reconstruction (Chapter 17)

    Self-check: Can you write the OAMP iteration and explain the role of the denoiser module?

  • Linear inverse problems: $\mathbf{y} = \mathbf{A}\mathbf{c} + \mathbf{w}$ (Chapter 12)

    Self-check: Can you define the forward model, the measurement residual, and the pseudoinverse?

  • Basic probability: Bayes rule, Gaussian conditioning

    Self-check: Can you state Bayes rule and compute the posterior for a linear Gaussian model?
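The first and last self-checks are two views of the same computation: for a linear Gaussian model, the posterior mean from Bayes rule coincides with the denoised estimate given by Tweedie's formula, $\mathbb{E}[\mathbf{x}_0 \mid \mathbf{x}] = \mathbf{x} + \sigma^2 \nabla_\mathbf{x}\log p_\sigma(\mathbf{x})$. A minimal 1-D numerical check, with illustrative parameter values chosen here (not from the text):

```python
import numpy as np

# 1-D check of the Tweedie identity  E[x0 | x] = x + sigma^2 * score(x)
# for a Gaussian prior x0 ~ N(mu, tau^2) and noisy observation x = x0 + sigma * eps.
mu, tau, sigma = 1.0, 2.0, 0.5  # illustrative values
x = 0.3                          # an arbitrary noisy observation

# The smoothed marginal p_sigma is N(mu, tau^2 + sigma^2), so its score is analytic.
var = tau**2 + sigma**2
score = (mu - x) / var

# Posterior mean from standard Gaussian conditioning (Bayes rule).
posterior_mean = (sigma**2 * mu + tau**2 * x) / var

# Denoised estimate from the score via Tweedie's formula.
tweedie = x + sigma**2 * score

assert np.isclose(posterior_mean, tweedie)
```

If either identity in the self-checks feels unfamiliar, working through this toy case by hand is a quick way to rebuild the connection before reading on.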

Notation for This Chapter

Symbols introduced or emphasised in this chapter. See also the master Global Notation Table in the front matter.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $\nabla_\mathbf{x}\log p(\mathbf{x})$ | Score function of the distribution $p(\mathbf{x})$ | s01 |
| $\mathbf{s}_\theta(\mathbf{x}_t, t)$ | Learned score network at diffusion time $t$ | s01 |
| $\bar{\alpha}_t$ | Cumulative product of the noise schedule: $\bar{\alpha}_t = \prod_{s=1}^t (1-\beta_s)$ | s01 |
| $\hat{\mathbf{x}}_0(\mathbf{x}_t)$ | Tweedie denoised estimate $\mathbb{E}[\mathbf{x}_0 \mid \mathbf{x}_t]$ | s01 |
| $\zeta$ | Guidance scale controlling the strength of measurement consistency in DPS | s02 |
| $T$ | Total number of diffusion steps | s01 |
| $\mathrm{NFE}$ | Number of neural-network function evaluations per reconstruction | s04 |
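To make the $\bar{\alpha}_t$ entry concrete, the cumulative product can be computed directly from a noise schedule $\{\beta_s\}$. The linear schedule below is a common illustrative choice, not one prescribed by this chapter:

```python
import numpy as np

# Illustrative computation of  alpha_bar_t = prod_{s=1}^t (1 - beta_s)
# for a linear beta schedule (the endpoint values here are assumptions).
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alpha_bars = np.cumprod(1.0 - betas)

# alpha_bar_t decreases monotonically from near 1 toward 0, so the forward
# marginal  x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps
# interpolates from clean data (t small) to nearly pure noise (t = T).
assert np.all(np.diff(alpha_bars) < 0)
assert alpha_bars[-1] < 1e-2
```

Reading off $\bar{\alpha}_t$ this way is all that is needed to form the noisy iterate $\mathbf{x}_t$ and, via Tweedie, the denoised estimate $\hat{\mathbf{x}}_0(\mathbf{x}_t)$ used throughout the chapter.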