Prerequisites & Notation
Before You Begin
This chapter builds on nn.Module and CNNs (Chapters 26-27); familiarity with probability distributions (Chapter 9) is also helpful.
- nn.Module, training loops, and CNNs (Chapters 26-27)
Self-check: Can you build a U-Net and train it?
- Probability distributions and sampling (Chapter 9)
Self-check: Can you sample from a Gaussian and compute KL divergence?
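If you want to verify the second self-check concretely, here is a minimal stdlib-only sketch (univariate Gaussians only, using the closed-form KL between two Gaussians; the function names are illustrative, and in practice you would use torch.distributions for this):

```python
import math
import random

def sample_gaussian(mu, sigma, n, seed=0):
    """Draw n samples from N(mu, sigma^2) using only the standard library."""
    rng = random.Random(seed)
    return [rng.gauss(mu, sigma) for _ in range(n)]

def kl_gaussians(mu1, sigma1, mu2, sigma2):
    """Closed-form KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) )."""
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

samples = sample_gaussian(0.0, 1.0, 10_000)
print(kl_gaussians(0.0, 1.0, 0.0, 1.0))  # identical distributions -> 0.0
print(kl_gaussians(0.0, 1.0, 1.0, 2.0))  # distinct distributions -> positive
```

If both calls make sense to you (and you could derive the closed form), you are ready for the VAE material.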
Notation for This Chapter
| Symbol | Meaning | Introduced |
|---|---|---|
| $p_\theta(x)$ | Model distribution parameterised by $\theta$ | s01 |
| $q_\phi(z \mid x)$ | Encoder (approximate posterior) in the VAE | s01 |
| $D_{\mathrm{KL}}$ | Kullback-Leibler divergence | s01 |
| $\epsilon_t$ | Noise added at diffusion step $t$ | s03 |
| $\beta_t$ | Noise schedule coefficient | s03 |
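The last two rows of the table can be made concrete with a short stdlib-only sketch. It assumes a linear schedule for $\beta_t$ (the endpoints 1e-4 and 0.02 and the step count are illustrative, not necessarily the chapter's values) and uses the cumulative product $\bar\alpha_t = \prod_{s \le t}(1 - \beta_s)$ to mix signal and noise:

```python
import math
import random

T = 1000  # number of diffusion steps (illustrative)

# Linear schedule for the beta_t coefficients (endpoints are illustrative).
betas = [1e-4 + (0.02 - 1e-4) * t / (T - 1) for t in range(T)]

# alpha_bar_t = prod_{s<=t} (1 - beta_s): fraction of signal variance kept
# after t noising steps.
alpha_bars = []
prod = 1.0
for beta in betas:
    prod *= 1.0 - beta
    alpha_bars.append(prod)

def add_noise(x0, t, rng=random.Random(0)):
    """Sample x_t given x_0: sqrt(abar_t) * x0 + sqrt(1 - abar_t) * eps_t."""
    eps = rng.gauss(0.0, 1.0)  # the noise epsilon_t from the table
    a = alpha_bars[t]
    return math.sqrt(a) * x0 + math.sqrt(1.0 - a) * eps

print(alpha_bars[0], alpha_bars[-1])  # near 1 at t=0, near 0 at t=T-1
```

Early steps keep the input almost intact ($\bar\alpha_t \approx 1$); by the final step the sample is essentially pure noise.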