Prerequisites & Notation

Before You Begin

This chapter assumes working knowledge of nn.Module and CNNs (Chapters 26-27). Familiarity with probability distributions (Chapter 9) is also helpful.

  • nn.Module, training loops, CNNs (Chapters 26-27)

    Self-check: Can you build a U-Net and train it?

  • Probability distributions and sampling (Chapter 9)

    Self-check: Can you sample from a Gaussian and compute KL divergence?
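If the second self-check gives you pause, the following sketch covers both halves of it: drawing a Gaussian sample via the reparameterisation trick and evaluating the closed-form KL divergence between a scalar Gaussian and the standard normal (the term that appears in the VAE objective). The function name and the example values of mu and sigma are illustrative, not from this chapter.

```python
import math
import random

def kl_gauss_std_normal(mu: float, sigma: float) -> float:
    """Closed-form KL( N(mu, sigma^2) || N(0, 1) ) for scalar Gaussians."""
    return 0.5 * (sigma ** 2 + mu ** 2 - 1.0 - math.log(sigma ** 2))

# Sample x ~ N(mu, sigma^2) via the reparameterisation x = mu + sigma * eps
mu, sigma = 0.5, 1.2
eps = random.gauss(0.0, 1.0)  # eps ~ N(0, 1)
x = mu + sigma * eps

print(kl_gauss_std_normal(0.0, 1.0))  # KL of N(0, 1) against itself -> 0.0
print(kl_gauss_std_normal(mu, sigma))  # strictly positive for any other Gaussian
```

If both expressions make sense, you are ready for the variational material in this chapter.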

Notation for This Chapter

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $p_{\theta}(\mathbf{x})$ | Model distribution parameterised by $\theta$ | §01 |
| $q_{\phi}(\mathbf{z}\mid\mathbf{x})$ | Encoder (approximate posterior) in VAEs | §01 |
| $D_{\text{KL}}$ | Kullback-Leibler divergence | §01 |
| $\varepsilon_t$ | Noise added at diffusion step $t$ | §03 |
| $\beta_t$ | Noise schedule coefficient at step $t$ | §03 |
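To make the diffusion notation concrete before §03, here is a minimal sketch of how a noise schedule $\beta_t$ is typically built and used. It assumes the common linear schedule with illustrative endpoints (1e-4 to 0.02 over T = 1000 steps); the chapter's own schedule may differ.

```python
import math

def linear_beta_schedule(T: int, beta_start: float = 1e-4, beta_end: float = 0.02):
    """Linearly spaced noise-schedule coefficients beta_1 .. beta_T (illustrative defaults)."""
    return [beta_start + (beta_end - beta_start) * t / (T - 1) for t in range(T)]

T = 1000
betas = linear_beta_schedule(T)          # beta_t for each diffusion step
alphas = [1.0 - b for b in betas]        # alpha_t = 1 - beta_t

# alpha_bar_t = product of alpha_1 .. alpha_t, used for one-shot noising:
#   x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps,  eps ~ N(0, I)
alpha_bar = []
prod = 1.0
for a in alphas:
    prod *= a
    alpha_bar.append(prod)

print(betas[0], betas[-1])      # schedule grows from beta_start to beta_end
print(alpha_bar[-1])            # near 0: by step T the signal is almost pure noise
```

The key takeaway is that $\varepsilon_t$ is fresh Gaussian noise at each step, while $\beta_t$ only controls how much of it is mixed in.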