Prerequisites & Notation
Before You Begin
This chapter builds directly on Chapter 26 (nn.Module, training loop). Familiarity with 2D convolutions from signal processing (Chapter 7) is helpful but not required.
- nn.Module, training loop, loss functions (Chapter 26)
Self-check: Can you write a training loop with forward/backward/step?
- Signal processing: convolution, filtering (Chapter 7)
Self-check: Do you know that convolution in the spatial domain corresponds to multiplication in the frequency domain?
- Image representation as tensors: (B, C, H, W)
Self-check: Do you know that a batch of RGB images has shape (B, 3, H, W)?
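If any of these self-checks feel shaky, the sketch below exercises all three at once: a (B, 3, H, W) tensor, a 2D convolution, and a forward/backward/step loop. It assumes the standard PyTorch API; the tiny model and random data are illustrative only, not an example from the chapter.

```python
import torch
import torch.nn as nn

# A minimal conv model: Conv2d expects input of shape (B, C_in, H, W).
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),  # (B, 3, H, W) -> (B, 8, H, W)
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),                    # -> (B, 8, 1, 1)
    nn.Flatten(),                               # -> (B, 8)
    nn.Linear(8, 10),                           # -> (B, 10) class scores
)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# A batch of 4 random RGB "images" with random labels.
x = torch.randn(4, 3, 32, 32)
y = torch.randint(0, 10, (4,))

# The forward/backward/step pattern from Chapter 26.
for _ in range(3):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```

If you can predict the intermediate shapes in the comments above without running the code, you are ready for this chapter.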
Notation for This Chapter
| Symbol | Meaning | Introduced |
|---|---|---|
| W | Convolution kernel (filter) tensor | s01 |
| C_in, C_out | Number of input and output channels | s01 |
| k | Spatial kernel size | s01 |
| s, p | Stride and padding | s01 |
| BN | Batch normalisation | s01 |
| ⊕ | Element-wise addition (skip connection) | s02 |
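The kernel size, stride, and padding entries together determine a convolution's output size. As a worked example, the standard formula for one spatial dimension is floor((H + 2p − k) / s) + 1; the helper below is an illustrative sketch using the symbols from the table, not code from the chapter.

```python
def conv_out_size(size: int, k: int, s: int = 1, p: int = 0) -> int:
    """Output length along one spatial dimension for kernel size k,
    stride s, and padding p (standard convolution, no dilation)."""
    return (size + 2 * p - k) // s + 1

# "Same" padding at stride 1 preserves the spatial size:
print(conv_out_size(32, k=3, s=1, p=1))  # -> 32
# Stride 2 roughly halves it:
print(conv_out_size(32, k=3, s=2, p=1))  # -> 16
```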