ConnerFarrell/optimization-algorithms

Implementations of gradient descent, heavy ball, Nesterov acceleration, and Newton’s method on a convex test problem.


Optimization Algorithms

Overview

This project implements four classical optimization algorithms (Gradient Descent, Heavy Ball, Nesterov Accelerated Gradient, and Newton’s Method) and visualizes their trajectories on a convex test function.

Key Features

  • Implements multiple optimization methods from scratch in Python.
  • Plots convergence trajectories and compares iteration counts.
  • Educational example for convex optimization coursework.

Example Output

Gradient Descent iterations (tol=1e-5): 78
Heavy Ball iterations (tol=1e-5): 62
Nesterov iterations (tol=1e-5): 51
Newton iterations (tol=1e-5): 4
Newton iterations (tol=1e-10): 4

(Figures: a plot of all trajectories, and a separate Newton trajectory plot.)

Skills Demonstrated

  • Numerical optimization (GD, Heavy Ball, NAG, Newton)
  • Python (NumPy, Matplotlib)
  • Convex analysis

How to Run

pip install -r requirements.txt
python optimization_methods.py
