Prerequisites & Notation
Before You Begin
This chapter assumes familiarity with the following topics.
- Neural network architectures (CNNs, U-Nets) and training via backpropagation (Chapter 20)
Self-check: Can you describe the U-Net architecture and explain the role of skip connections?
- Denoising as a building block for inverse problems; Tweedie's formula (Chapter 21)
Self-check: Can you write the Gaussian denoising MMSE estimator as a conditional expectation?
- Generative models for imaging: score-based diffusion and DDPM (Chapter 22)
Self-check: Can you explain why diffusion models require paired or at least distribution-level training data?
- Linear inverse problem formulation (Chapters 12--14)
Self-check: Can you describe the forward model for RF imaging and the role of the sensing matrix?
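As a refresher for the denoising self-check above, the Gaussian denoising MMSE estimator and Tweedie's formula (Chapter 21) can be written as:

```latex
\hat{x}_{\mathrm{MMSE}}(y) \;=\; \mathbb{E}[x \mid y]
\;=\; y + \sigma^2 \,\nabla_y \log p(y),
\qquad y = x + n,\quad n \sim \mathcal{N}(0, \sigma^2 I),
```

where $p(y)$ is the marginal density of the noisy observation; the second equality is Tweedie's formula for Gaussian noise.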
Notation for This Chapter
Symbols introduced or heavily used in this chapter.
| Symbol | Meaning | Introduced |
|---|---|---|
| $f_\theta(z)$ | Generator network with parameters $\theta$ and fixed random input $z$ (DIP) | s01 |
| $z$ | Fixed random input to the DIP network (not optimised) | s01 |
| $\text{SURE}(f_\theta)$ | Stein's Unbiased Risk Estimate for denoiser $f_\theta$ | s03 |
| $\operatorname{div}(f_\theta)$ | Divergence of the denoiser: $\sum_i \partial [f_\theta(y)]_i / \partial y_i$ | s03 |
| $T_g$ | Group transformation operator (rotation, shift, flip) for $g \in \mathcal{G}$ | s04 |
| $\mathcal{G}$ | Group of transformations for equivariant imaging | s04 |
| $\mathcal{L}_{\text{EQ}}$ | Equivariance loss | s04 |
| $\mathcal{L}_{\text{DC}}$ | Data-consistency loss | s04 |
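The divergence $\operatorname{div}(f_\theta)$ in the table is typically too expensive to compute exactly for a neural denoiser, so SURE-based methods estimate it with a Monte Carlo probe. A minimal sketch, assuming NumPy and a toy linear denoiser (the helper name `mc_divergence` is illustrative, not from this chapter):

```python
import numpy as np

def mc_divergence(f, y, n_samples=1000, eps=1e-3, rng=None):
    """Monte Carlo divergence estimate at y:
    div f(y) ~= E_b[ b^T (f(y + eps*b) - f(y)) / eps ],  b ~ N(0, I)."""
    rng = np.random.default_rng(0) if rng is None else rng
    fy = f(y)
    est = 0.0
    for _ in range(n_samples):
        b = rng.standard_normal(y.shape)
        est += b @ (f(y + eps * b) - fy) / eps
    return est / n_samples

# Toy denoiser f(y) = W y with W = 0.5 * I, whose exact divergence
# is trace(W) = 0.5 * n; the MC estimate should land close to it.
n = 64
W = 0.5 * np.eye(n)
f = lambda y: W @ y
y = np.zeros(n)
div_est = mc_divergence(f, y)  # close to trace(W) = 32
```

For a linear denoiser the finite difference is exact, so the estimator's only error is Monte Carlo variance; for a trained network the same probe plugs directly into the SURE loss of s03.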