Prerequisites & Notation

Prerequisites

This chapter requires understanding of:

  • LLM architecture (Chapter 35): GPT, attention, pre-training
  • PyTorch (Chapter 26): training loops, optimizers
  • Transfer learning (Chapter 33): fine-tuning concepts

Notation

| Symbol | Meaning |
|---|---|
| $r$ | LoRA rank |
| $\mathbf{W}_0$ | Pre-trained weight matrix |
| $\Delta\mathbf{W} = \mathbf{B}\mathbf{A}$ | Low-rank update |
| $\alpha$ | LoRA scaling factor |
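To make the notation concrete, here is a minimal NumPy sketch of how the symbols combine. The shapes, the zero initialization of $\mathbf{B}$, and the $\alpha/r$ scaling follow common LoRA practice and are assumptions for illustration, not code from this chapter:

```python
import numpy as np

# d x k: dimensions of the pre-trained weight; r: LoRA rank (assumed values).
d, k, r, alpha = 8, 8, 2, 16

rng = np.random.default_rng(0)
W0 = rng.standard_normal((d, k))        # W_0: pre-trained weight (frozen)
A = rng.standard_normal((r, k)) * 0.01  # A: trainable rank-r factor
B = np.zeros((d, r))                    # B: zero-initialized, so the update starts at 0

delta_W = B @ A                         # Delta_W = B A, rank at most r
W_eff = W0 + (alpha / r) * delta_W      # effective weight used at inference

assert np.linalg.matrix_rank(delta_W) <= r
assert np.allclose(W_eff, W0)           # B = 0 at init, so W_eff == W_0
```

Because $\mathbf{B}$ starts at zero, the model's behavior is unchanged at the start of fine-tuning; only the small factors $\mathbf{A}$ and $\mathbf{B}$ are trained while $\mathbf{W}_0$ stays frozen.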