This repository implements gradient-based optimization methods across three classic problem families:
- 2D quadratic objective (smooth, strongly convex)
- 1D nonconvex objective (multi-start behavior and local minima)
- Least-squares linear regression (Diabetes dataset)
Implemented methods:
- fixed-step Gradient Descent (GD)
- theorem-based stepsize choice via smoothness constant (1/L)
- SciPy baseline solver (BFGS)
- Exact line search for least squares
- Backtracking Armijo line search
```
python -m venv .venv
# Linux/Mac
source .venv/bin/activate
# Windows (PowerShell)
.venv\Scripts\Activate.ps1
pip install -r requirements.txt
```

To regenerate all figures:

```
python scripts/make_all.py
```

Figures are saved to `figures/` via `src/gdlab/config.py`:

```python
figures_DIR = PROJECT_ROOT / "figures"
```

The image links below assume figures live in `figures/`.
For a differentiable objective ( f:\mathbb{R}^d\to\mathbb{R} ), GD iterates as

( x_{k+1} = x_k - \alpha_k \nabla f(x_k), )

where ( \alpha_k > 0 ) is the step size. The optimality gap (used in many plots) is ( f(x_k) - f(x^\star) ), where ( x^\star ) denotes a minimizer.
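The iteration above can be sketched as a short loop. This is illustrative only (not the repository's actual API; `gradient_descent` and the example objective are hypothetical names):

```python
import numpy as np

def gradient_descent(grad, x0, alpha, n_iters=100):
    """Fixed-step gradient descent: x_{k+1} = x_k - alpha * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    history = [x.copy()]  # keep iterates for plotting optimality gaps
    for _ in range(n_iters):
        x = x - alpha * grad(x)
        history.append(x.copy())
    return x, history

# Example: minimize f(x) = x1^2 + 10*x2^2, whose minimizer is the origin.
grad = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
x_star, hist = gradient_descent(grad, x0=[3.0, 2.0], alpha=0.05, n_iters=500)
```

With ( \alpha = 0.05 = 1/L ) for this example (its Hessian eigenvalues are 2 and 20), the iterates contract toward the origin.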
The 2D quadratic objective is parameterized by ( \beta \in \mathbb{R} ).
The minimizer satisfies ( \nabla f(x^\star)=0 ); for a quadratic this is equivalent to a linear system in ( x^\star ), which can be solved directly.

For this quadratic, ( \nabla f ) is ( L )-Lipschitz, i.e. ( \|\nabla f(x)-\nabla f(y)\| \le L\,\|x-y\| ), with ( L ) equal to the largest eigenvalue of the (constant) Hessian.

A standard safe fixed step size is ( \alpha = 1/L ).
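As a sketch of the theorem-based stepsize choice (the matrix `Q` below is an assumed example, not the repository's actual quadratic):

```python
import numpy as np

# Hypothetical quadratic f(x) = 0.5 * x^T Q x with symmetric PSD Hessian Q.
Q = np.array([[2.0, 0.0],
              [0.0, 10.0]])

# Smoothness constant L = largest eigenvalue of the Hessian,
# giving the theorem-based safe step size alpha = 1/L.
L = np.linalg.eigvalsh(Q).max()
alpha = 1.0 / L
```

`eigvalsh` is used since the Hessian is symmetric; for this `Q`, ( L = 10 ) and ( \alpha = 0.1 ).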
This objective has multiple local minima, so local optimizers can converge to different solutions depending on initialization.
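Multi-start behavior can be illustrated on a simple double-well function (an assumed stand-in, not the repository's 1D objective):

```python
import numpy as np

def gd_1d(grad, x0, alpha=0.01, n_iters=2000):
    """Plain fixed-step gradient descent in one dimension."""
    x = float(x0)
    for _ in range(n_iters):
        x -= alpha * grad(x)
    return x

# Hypothetical nonconvex objective with local minima at x = -1 and x = +1:
# f(x) = (x^2 - 1)^2, so f'(x) = 4*x*(x^2 - 1).
grad = lambda x: 4.0 * x * (x * x - 1.0)

# Multi-start: different initializations land in different basins.
starts = [-2.0, -0.5, 0.5, 2.0]
minimizers = [gd_1d(grad, x0) for x0 in starts]
```

Starts left of zero converge to ( -1 ), starts right of zero to ( +1 ), which is exactly what the multi-start histogram summarizes.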
Given ( A \in \mathbb{R}^{m\times n} ) and ( y\in\mathbb{R}^m ), the least-squares objective is

( f(x) = \tfrac{1}{2}\,\|Ax - y\|^2, \qquad \nabla f(x) = A^\top (Ax - y). )
Along the steepest-descent direction ( d_k=-\nabla f(x_k) ), exact line search picks

( \alpha_k = \arg\min_{\alpha \ge 0} f(x_k + \alpha d_k). )

For least squares (quadratic), this has a closed form:

( \alpha_k = \dfrac{\|g_k\|^2}{\|A g_k\|^2}, \qquad g_k = \nabla f(x_k). )
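The closed form can be sketched as follows (function and variable names are illustrative, not the repository's API):

```python
import numpy as np

def exact_step_ls(A, x, y):
    """Exact line-search step along -grad for f(x) = 0.5*||Ax - y||^2.

    Closed form: alpha = ||g||^2 / ||A g||^2 with g = A^T (A x - y).
    """
    g = A.T @ (A @ x - y)
    Ag = A @ g
    return (g @ g) / (Ag @ Ag)

# Usage sketch on a small random least-squares problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
y = rng.standard_normal(20)
x = np.zeros(3)

alpha = exact_step_ls(A, x, y)
x_next = x - alpha * (A.T @ (A @ x - y))
```

A useful sanity check: after an exact step, the new gradient is orthogonal to the previous search direction.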
Starting from ( \alpha_0 ), shrink ( \alpha \leftarrow \beta\alpha ) with ( \beta\in(0,1) ) until the Armijo (sufficient decrease) condition holds:

( f(x_k + \alpha d_k) \le f(x_k) + c\,\alpha\,\nabla f(x_k)^\top d_k, \qquad c\in(0,1). )
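A minimal backtracking sketch, assuming conventional defaults ( \alpha_0=1 ), ( \beta=0.5 ), ( c=10^{-4} ) (these are illustrative choices, not necessarily the repository's):

```python
import numpy as np

def backtracking(f, grad_fx, x, d, alpha0=1.0, beta=0.5, c=1e-4):
    """Shrink alpha by factor beta until the Armijo condition holds:
    f(x + alpha*d) <= f(x) + c * alpha * <grad f(x), d>."""
    fx = f(x)
    slope = grad_fx @ d  # negative for a descent direction
    alpha = alpha0
    while f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= beta
    return alpha

# Example on f(x) = 0.5*||x||^2 from x = (3, 4) along d = -grad f(x).
f = lambda x: 0.5 * (x @ x)
x = np.array([3.0, 4.0])
g = x  # gradient of f at x
alpha = backtracking(f, g, x, -g)
```

Here the full step ( \alpha = 1 ) already satisfies the Armijo condition, so no shrinking occurs.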
- Objective surface
- Objective contours
- Contour plot with a single reference point
- Gradient Descent trajectory (α = 0.5)
- Gradient Descent trajectory (α = 1.0)
- Gradient Descent trajectory (α = 1/L)
- Optimality gap (α = 0.5 vs α = 1/L)
- SciPy solver iterates on contour plot
- Optimality gaps: SciPy vs GD
- Multi-start histogram summary (minimizers and objective values)
- Objective function with detected local minima
- Fixed-stepsize GD: optimality gaps
- Fixed steps vs Exact line search
- Fixed steps vs Exact line search vs Backtracking
- Stepsizes over iterations (Exact vs Backtracking)
```
pytest -q
```

The test suite includes:
- finite-difference gradient checks for the quadratic
- descent sanity checks
- Armijo condition verification for backtracking
- decrease check for exact line search on least squares
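A finite-difference gradient check of the kind listed above can be sketched like this (a generic central-difference check, not the repository's test code):

```python
import numpy as np

def fd_grad(f, x, eps=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

# Compare an analytic gradient against finite differences on a quadratic.
f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
analytic = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
x = np.array([1.5, -0.5])
assert np.allclose(fd_grad(f, x), analytic(x), atol=1e-5)
```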
If you use this repository as a reference, please cite via CITATION.cff.
MIT (see LICENSE).