Prerequisites & Notation
Before You Begin
This chapter assumes familiarity with `nn.Module` and the training loop from Chapter 26. Familiarity with time-series data and basic sequence processing is also helpful.
- `nn.Module` and training loops (review Chapter 26)
Self-check: Can you train a PyTorch model on batched data?
- Backpropagation and gradient flow (review Chapter 26)
Self-check: Do you understand vanishing gradients in deep networks?
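If you want a concrete version of the first self-check, the sketch below trains a tiny model on one random batch. The model, data, and hyperparameters are illustrative placeholders, not taken from the chapter.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(4, 1)                     # placeholder model
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x = torch.randn(8, 4)                       # a batch of 8 samples, 4 features each
y = torch.randn(8, 1)                       # matching targets

losses = []
for _ in range(20):
    opt.zero_grad()                         # clear gradients from the previous step
    loss = loss_fn(model(x), y)             # forward pass on the whole batch
    loss.backward()                         # backpropagate
    opt.step()                              # update parameters
    losses.append(loss.item())
```

If you can explain what each line does, and confirm that `losses` decreases over the loop, you are ready for this chapter.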
Notation for This Chapter
| Symbol | Meaning | Introduced |
|---|---|---|
| $h_t$ | Hidden state at time step $t$ | s01 |
| $x_t$ | Input at time step $t$ | s01 |
| $T$ | Sequence length | s01 |
| $f_t, i_t, o_t$ | LSTM forget, input, and output gates | s01 |
| $c_t$ | LSTM cell state | s01 |
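As a preview of how these quantities fit together, the standard LSTM update is sketched below, with $\sigma$ the logistic sigmoid and $\odot$ elementwise multiplication; the chapter's own equations may differ in minor details such as weight factorization.

$$
\begin{aligned}
f_t &= \sigma\!\left(W_f\,[h_{t-1}, x_t] + b_f\right) \\
i_t &= \sigma\!\left(W_i\,[h_{t-1}, x_t] + b_i\right) \\
o_t &= \sigma\!\left(W_o\,[h_{t-1}, x_t] + b_o\right) \\
\tilde{c}_t &= \tanh\!\left(W_c\,[h_{t-1}, x_t] + b_c\right) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
$$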