Constrained Optimization
Why Constraints Are Essential
Real engineering problems always have constraints: power budgets,
physical limits, resource allocations, interference thresholds.
A solution that ignores constraints is useless in practice.
SciPy's minimize supports equality and inequality constraints
through several solvers, each with different strengths.
Definition: General Constrained Optimization Problem
General Constrained Optimization Problem
The general problem is to minimize $f(\mathbf{x})$ subject to $g_i(\mathbf{x}) \le 0$ and $h_j(\mathbf{x}) = 0$ for all $i, j$. The feasible set is $\{\mathbf{x} : g_i(\mathbf{x}) \le 0, \; h_j(\mathbf{x}) = 0 \; \forall i,j\}$.
SciPy's dict-form constraints use the opposite sign convention, $g(\mathbf{x}) \ge 0$. Always check the docs.
Definition: Lagrangian and KKT Conditions
Lagrangian and KKT Conditions
The Lagrangian of a constrained problem is:
$$\mathcal{L}(\mathbf{x}, \boldsymbol{\lambda}, \boldsymbol{\mu}) = f(\mathbf{x}) + \sum_i \lambda_i g_i(\mathbf{x}) + \sum_j \mu_j h_j(\mathbf{x})$$
The Karush-Kuhn-Tucker (KKT) conditions are necessary for optimality (and sufficient for convex problems):
- Stationarity: $\nabla f(\mathbf{x}^*) + \sum_i \lambda_i \nabla g_i(\mathbf{x}^*) + \sum_j \mu_j \nabla h_j(\mathbf{x}^*) = \mathbf{0}$
- Primal feasibility: $g_i(\mathbf{x}^*) \le 0$, $h_j(\mathbf{x}^*) = 0$
- Dual feasibility: $\lambda_i \ge 0$
- Complementary slackness: $\lambda_i g_i(\mathbf{x}^*) = 0$
The multipliers $\lambda_i$ and $\mu_j$ represent the sensitivity of the optimal value to constraint perturbations.
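For a concrete instance of this sensitivity interpretation (a standard textbook example): minimize $f(\mathbf{x}) = x_1^2 + x_2^2$ subject to $h(\mathbf{x}) = x_1 + x_2 - 1 = 0$. Stationarity gives $2x_1 + \mu = 0$ and $2x_2 + \mu = 0$, so $\mathbf{x}^* = (1/2, 1/2)$ with $\mu = -1$ and $f^* = 1/2$. Perturbing the constraint to $x_1 + x_2 = 1 + \varepsilon$ yields $f^*(\varepsilon) = (1 + \varepsilon)^2/2$, whose derivative at $\varepsilon = 0$ is $1 = -\mu$: the multiplier is exactly the rate at which the optimum responds to relaxing the constraint.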
Definition: Bounds and LinearConstraint in SciPy
Bounds and LinearConstraint in SciPy
import numpy as np
from scipy.optimize import minimize, Bounds, LinearConstraint

# Box constraints: lb <= x <= ub
bounds = Bounds(lb=[0, -np.inf], ub=[np.inf, 1])

# Linear constraint: lb <= A @ x <= ub
linear_con = LinearConstraint(
    A=np.array([[1, 1], [2, -1]]),
    lb=[0, -np.inf],
    ub=[10, 5],
)

# Nonlinear constraint (dict form for SLSQP)
nonlinear_con = {
    'type': 'ineq',  # g(x) >= 0
    'fun': lambda x: x[0]**2 + x[1]**2 - 1,
    'jac': lambda x: np.array([2*x[0], 2*x[1]]),
}
Use Bounds and LinearConstraint objects with trust-constr;
use dict-form constraints with SLSQP.
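A minimal end-to-end sketch tying these objects together with trust-constr, repeating the definitions above so the snippet runs standalone; the quadratic objective is a made-up illustration, not from the text:

import numpy as np
from scipy.optimize import minimize, Bounds, LinearConstraint

bounds = Bounds(lb=[0, -np.inf], ub=[np.inf, 1])
linear_con = LinearConstraint(np.array([[1, 1], [2, -1]]),
                              lb=[0, -np.inf], ub=[10, 5])

# Hypothetical objective: its unconstrained minimum (3, 2) violates x[1] <= 1
def f(x):
    return (x[0] - 3)**2 + (x[1] - 2)**2

res = minimize(f, x0=np.array([1.0, 0.5]), method='trust-constr',
               bounds=bounds, constraints=[linear_con])
print(res.x)  # approximately [3.0, 1.0]; x[1] ends up pinned at its upper bound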
Definition: SLSQP (Sequential Least Squares Programming)
SLSQP (Sequential Least Squares Programming)
SLSQP solves nonlinear constrained problems by solving a sequence of quadratic programming (QP) subproblems. At each iteration:
- Approximate $f$ by a quadratic model around the current iterate $\mathbf{x}_k$
- Linearize the constraints
- Solve the resulting QP to get a search direction
- Perform a line search
result = minimize(f, x0, method='SLSQP',
                  jac=grad_f,
                  bounds=bounds,
                  constraints=[eq_con, ineq_con])
Pros: handles equality constraints, inequality constraints, and bounds; well-tested.
Cons: can be slow for large problems; may fail on highly nonlinear constraints.
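To make the template above concrete, here is a toy problem solved end to end; the function definitions are illustrative, and the minimizer $(1/2, 1/2)$ matches the KKT example worked earlier:

import numpy as np
from scipy.optimize import minimize

# Minimize x1^2 + x2^2 subject to x1 + x2 = 1
f = lambda x: x[0]**2 + x[1]**2
grad_f = lambda x: 2 * x
eq_con = {'type': 'eq', 'fun': lambda x: x[0] + x[1] - 1}
ineq_con = {'type': 'ineq', 'fun': lambda x: x[0]}  # x1 >= 0, inactive at the solution

res = minimize(f, np.array([2.0, 0.0]), method='SLSQP',
               jac=grad_f, constraints=[eq_con, ineq_con])
print(res.x)  # approximately [0.5, 0.5]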
Definition: trust-constr (Trust-Region Constrained)
trust-constr (Trust-Region Constrained)
The trust-constr method uses a trust-region approach with an
interior-point or SQP algorithm. It is SciPy's most modern
constrained solver and supports:
- Sparse Jacobians and Hessians
- Bounds, LinearConstraint, and NonlinearConstraint objects
- Verbose iteration output
from scipy.optimize import NonlinearConstraint

nlc = NonlinearConstraint(
    fun=lambda x: x[0]**2 + x[1]**2,
    lb=0, ub=1,
    jac=lambda x: np.array([2*x[0], 2*x[1]]),
)
result = minimize(f, x0, method='trust-constr',
                  jac=grad_f, hess=hess_f,
                  constraints=nlc, bounds=bounds)
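After a trust-constr run you can also inspect KKT-related diagnostics on the result object (attribute names as documented for the trust-constr OptimizeResult):

print(result.v)                 # Lagrange multiplier estimates, one array per constraint
print(result.optimality)        # norm of the Lagrangian gradient; near 0 at a KKT point
print(result.constr_violation)  # maximum constraint violation at the solution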
Definition: Linear Programming with scipy.optimize.linprog
Linear Programming with scipy.optimize.linprog
A linear program (LP) has a linear objective and linear constraints:
$$\min_{\mathbf{x}} \mathbf{c}^\top \mathbf{x} \quad \text{subject to} \quad A_{ub}\mathbf{x} \le \mathbf{b}_{ub}, \; A_{eq}\mathbf{x} = \mathbf{b}_{eq}, \; \mathbf{l} \le \mathbf{x} \le \mathbf{u}$$
from scipy.optimize import linprog
result = linprog(
    c,                      # cost vector
    A_ub=A_ub, b_ub=b_ub,   # inequality constraints
    A_eq=A_eq, b_eq=b_eq,   # equality constraints
    bounds=bounds,          # variable bounds
    method='highs',         # default: HiGHS solver
)
The HiGHS solver (default since SciPy 1.9) uses the revised simplex method or interior-point method and handles millions of variables.
Theorem: KKT Necessary Conditions
If $\mathbf{x}^*$ is a local minimizer of the constrained problem and a constraint qualification holds (e.g., LICQ: the active constraint gradients are linearly independent), then there exist multipliers $\boldsymbol{\lambda}^* \ge \mathbf{0}$ and $\boldsymbol{\mu}^*$ such that the KKT conditions hold at $\mathbf{x}^*$.
For convex problems (convex $f$, $g_i$; affine $h_j$), the KKT conditions are both necessary and sufficient for global optimality.
At the optimum, the negative gradient $-\nabla f(\mathbf{x}^*)$ must be expressible as a non-negative combination of active constraint gradients. If you could decrease $f$ without violating any active constraint, you haven't found the optimum yet.
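For instance (a standard illustration): minimize $f(\mathbf{x}) = x_1 + x_2$ subject to $g(\mathbf{x}) = x_1^2 + x_2^2 - 1 \le 0$. The minimizer is $\mathbf{x}^* = (-1/\sqrt{2}, -1/\sqrt{2})$, where $\nabla f = (1, 1)$ and $\nabla g = 2\mathbf{x}^* = (-\sqrt{2}, -\sqrt{2})$. Stationarity $\nabla f + \lambda \nabla g = \mathbf{0}$ holds with $\lambda = 1/\sqrt{2} \ge 0$, and complementary slackness holds because the constraint is active ($g(\mathbf{x}^*) = 0$): $-\nabla f$ is a non-negative multiple of the active constraint gradient, exactly the geometric picture described above.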
Example: Mean-Variance Portfolio Optimization
Find the minimum-variance portfolio of $n = 5$ assets with expected return at least $r_{\min} = 0.08$, weights summing to 1, and no short selling ($w_i \ge 0$).
Setup
import numpy as np
from scipy.optimize import minimize, Bounds
np.random.seed(42)
n = 5
# Expected returns and covariance
mu = np.array([0.05, 0.07, 0.10, 0.12, 0.15])
# Generate a realistic covariance matrix
A = np.random.randn(n, n) * 0.1
Sigma = A.T @ A + 0.01 * np.eye(n)
def portfolio_var(w):
    return w @ Sigma @ w

def portfolio_var_grad(w):
    return 2 * Sigma @ w
Solve with SLSQP
constraints = [
    {'type': 'eq', 'fun': lambda w: np.sum(w) - 1},    # weights sum to 1
    {'type': 'ineq', 'fun': lambda w: mu @ w - 0.08},  # expected return >= 8%
]
bounds = Bounds(lb=0, ub=1)
w0 = np.ones(n) / n

res = minimize(portfolio_var, w0, method='SLSQP',
               jac=portfolio_var_grad,
               constraints=constraints, bounds=bounds)
print(f"Optimal weights: {res.x.round(4)}")
print(f"Portfolio variance: {res.fun:.6f}")
print(f"Expected return: {mu @ res.x:.4f}")
Example: Resource Allocation with Linear Programming
A factory produces two products. Product A yields $40/unit, Product B yields $50/unit. Each unit of A requires 2 hours of labor and 1 unit of material; each unit of B requires 1 hour of labor and 3 units of material. Available: 100 labor hours, 90 material units. Maximize profit.
Formulate as LP
Maximize $40x_A + 50x_B$ subject to $2x_A + x_B \le 100$, $x_A + 3x_B \le 90$, $x_A, x_B \ge 0$.
Since linprog minimizes, negate the objective:
from scipy.optimize import linprog
c = [-40, -50] # negate for maximization
A_ub = [[2, 1], [1, 3]]
b_ub = [100, 90]
bounds = [(0, None), (0, None)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(f"x_A = {res.x[0]:.1f}, x_B = {res.x[1]:.1f}")
print(f"Max profit = ${-res.fun:.2f}")
Result
Optimal: $x_A = 42$, $x_B = 16$, profit = $2480.
The dual variables (shadow prices) in res.ineqlin.marginals
tell you the value of relaxing each constraint by one unit.
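A sketch of reading them for this example; note that linprog reports marginals for the minimized objective (here, negative profit), so negate them to price the original maximization. Solving the dual by hand gives shadow prices of about 14 and 12:

# Shadow prices: negate because linprog minimized -profit
labor_price, material_price = -res.ineqlin.marginals
print(f"Labor: ${labor_price:.2f}/hour")        # ~ $14 per extra labor hour
print(f"Material: ${material_price:.2f}/unit")  # ~ $12 per extra material unit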
Constrained Optimization in 2D
Visualize the feasible region, objective contours, and optimal point for a 2D constrained problem.
[Interactive figure: Geometry of KKT Conditions]
[Constrained Optimization Examples: code in ch08/python/constrained_optimization.py]
Quick Check
In SciPy's SLSQP, an inequality constraint {'type': 'ineq', 'fun': g} means:
- g(x) <= 0
- g(x) >= 0
- g(x) == 0
- g(x) != 0
Answer: SciPy's SLSQP requires inequality constraints to be non-negative: g(x) >= 0.
Common Mistake: Wrong Sign Convention for Constraints
Mistake:
Writing {'type': 'ineq', 'fun': lambda x: x[0]**2 + x[1]**2 - 1}
when you want $x_1^2 + x_2^2 \le 1$, then being surprised when
the solver enforces $x_1^2 + x_2^2 \ge 1$.
Correction:
SciPy's SLSQP enforces g(x) >= 0. For the constraint
$x_1^2 + x_2^2 \le 1$, write
{'type': 'ineq', 'fun': lambda x: 1 - x[0]**2 - x[1]**2}.
This is correct because $x_1^2 + x_2^2 \le 1 \iff 1 - x_1^2 - x_2^2 \ge 0$.
With trust-constr, use NonlinearConstraint(fun, lb, ub) instead,
which avoids sign confusion.
Feasible Set
The set of all points satisfying all constraints of an optimization problem.
Lagrange Multiplier
A scalar $\lambda_i$ or $\mu_j$ associated with a constraint that measures the rate of change of the optimal objective value with respect to the constraint bound.
Related: Feasible Set