# Prerequisites & Notation

## Before You Begin
This chapter assumes familiarity with multivariable calculus (gradients, Hessians), linear algebra (eigenvalues, positive definiteness), and basic NumPy/SciPy operations. If any of these feel unfamiliar, review the linked material first.
- NumPy array creation, slicing, and broadcasting (Chapter 5)
  - Self-check: Can you compute a matrix-vector product with `@` and reshape arrays?
- Linear algebra: `solve`, eigenvalues, SVD (Chapter 6)
  - Self-check: Can you solve $A\mathbf{x} = \mathbf{b}$ and compute eigenvalues in NumPy?
- Multivariable calculus: gradients and Hessians
  - Self-check: Can you compute $\nabla f(\mathbf{x})$ and $\nabla^2 f(\mathbf{x})$ for a smooth $f : \mathbb{R}^n \to \mathbb{R}$?
- Convexity basics: convex sets and convex functions
  - Self-check: Do you know that $f(\theta \mathbf{x} + (1-\theta)\mathbf{y}) \le \theta f(\mathbf{x}) + (1-\theta) f(\mathbf{y})$ for all $\theta \in [0, 1]$ when $f$ is convex?
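The NumPy self-checks above can be verified directly; here is a minimal sketch (the matrix $A$ and vector $\mathbf{b}$ are illustrative, chosen symmetric positive definite so the quadratic has a unique minimizer):

```python
import numpy as np

# Matrix-vector product and reshape (Chapter 5 material)
A = np.array([[4.0, 1.0], [1.0, 3.0]])  # symmetric positive definite
b = np.array([1.0, 2.0])
y = A @ b                        # matrix-vector product
col = b.reshape(2, 1)            # column vector via reshape

# Linear solve and eigenvalues (Chapter 6 material)
x = np.linalg.solve(A, b)        # solves A x = b
eigvals = np.linalg.eigvalsh(A)  # eigenvalues of a symmetric matrix

# Gradient and Hessian of the quadratic f(x) = 0.5 x^T A x - b^T x
def grad(v):
    return A @ v - b             # ∇f(v)

hessian = A                      # ∇²f is the constant matrix A for a quadratic

# At the minimizer of f, the gradient vanishes: A x - b = 0
print(np.allclose(grad(x), 0.0))  # True
```

If all of these steps feel routine, you are ready for the chapter; otherwise, review Chapters 5 and 6 first.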
## Notation for This Chapter

The table below lists the symbols and conventions used throughout this chapter. We write optimization problems in the standard minimization form $\min_{\mathbf{x}} f(\mathbf{x})$.
| Symbol | Meaning | Introduced |
|---|---|---|
| $f(\mathbf{x})$ | Objective function to be minimized | s01 |
| $\nabla f$, $\nabla f(\mathbf{x})$ | Gradient (vector of partial derivatives) | s01 |
| $\nabla^2 f$, $\nabla^2 f(\mathbf{x})$ | Hessian matrix (matrix of second partial derivatives) | s01 |
| $\mathbf{x}^\star$ | Optimal solution | s01 |
| $g_i(\mathbf{x}) \le 0$ | Inequality constraints | s02 |
| $h_j(\mathbf{x}) = 0$ | Equality constraints | s02 |
| $\operatorname{prox}_{\lambda f}(\mathbf{x})$ | Proximal operator of $f$ with step size $\lambda$ | s04 |
| $\mathcal{S}_\lambda(x)$ | Soft-thresholding operator with threshold $\lambda$ | s04 |
| $\lVert \mathbf{x} \rVert_1$ | $\ell_1$ norm: $\sum_i \lvert x_i \rvert$ (promotes sparsity) | s03 |
| $\succeq 0$ | Positive semidefinite (for matrices) | s03 |
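As a concrete illustration of the last few rows: the soft-thresholding operator $\mathcal{S}_\lambda$ is the proximal operator of $\lambda \lVert \cdot \rVert_1$ and has the closed form $\mathcal{S}_\lambda(x) = \operatorname{sign}(x)\,\max(\lvert x \rvert - \lambda, 0)$, applied elementwise. A minimal NumPy sketch (the function name is ours):

```python
import numpy as np

def soft_threshold(x, lam):
    """Elementwise soft-thresholding S_lam(x) = sign(x) * max(|x| - lam, 0),
    i.e. the proximal operator of lam * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

x = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
shrunk = soft_threshold(x, 1.0)  # zeros out entries with |x_i| <= 1,
                                 # shrinks the rest toward 0 by 1
```

Entries whose magnitude falls below the threshold are set exactly to zero, which is why the $\ell_1$ norm promotes sparsity.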