omardovesi/optimization_algorithms
Optimization Algorithms from Scratch

This project implements and compares optimization algorithms used in neural network training, including:

  • Gradient Descent
  • Momentum
  • Adam
  • Learning Rate Decay
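The update rules behind these four techniques can be sketched as follows. This is a minimal illustration using standard formulations, not code from the notebook; all function names and hyperparameter defaults here are illustrative.

```python
import numpy as np

def gd_step(w, grad, lr=0.1):
    # Vanilla gradient descent: move against the gradient.
    return w - lr * grad

def momentum_step(w, grad, v, lr=0.1, beta=0.9):
    # Momentum: accumulate a decaying running sum of past gradients,
    # which damps oscillations and speeds up consistent directions.
    v = beta * v + grad
    return w - lr * v, v

def adam_step(w, grad, m, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: bias-corrected first and second moment estimates give each
    # parameter its own effective step size. t is the step count, from 1.
    m = beta1 * m + (1 - beta1) * grad
    s = beta2 * s + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(s_hat) + eps), m, s

def decayed_lr(lr0, epoch, decay_rate=1.0):
    # Inverse-time learning rate decay: shrink the step size over epochs.
    return lr0 / (1 + decay_rate * epoch)
```

Each function returns the updated parameters (plus any optimizer state), so they can be dropped into the same training loop and swapped freely.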

The notebook visualizes loss curves and decision boundaries to compare optimizer behavior.
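To give a numerical feel for that comparison, here is a hedged sketch that records a loss history per optimizer on a toy quadratic bowl (a stand-in for the notebook's actual training objective; the names, starting point, and hyperparameters are illustrative):

```python
import numpy as np

def loss_and_grad(w):
    # Toy quadratic loss 0.5 * ||w||^2; its gradient is simply w.
    return 0.5 * float(w @ w), w

def run(update, steps=100):
    # Record the loss at every step for one optimizer.
    w, state, history = np.array([3.0, -4.0]), {}, []
    for t in range(1, steps + 1):
        loss, grad = loss_and_grad(w)
        history.append(loss)
        w = update(w, grad, t, state)
    return history

def gd(w, g, t, s, lr=0.1):
    return w - lr * g

def momentum(w, g, t, s, lr=0.1, beta=0.9):
    s["v"] = beta * s.get("v", 0.0) + g
    return w - lr * s["v"]

def adam(w, g, t, s, lr=0.3, b1=0.9, b2=0.999, eps=1e-8):
    s["m"] = b1 * s.get("m", 0.0) + (1 - b1) * g
    s["s"] = b2 * s.get("s", 0.0) + (1 - b2) * g ** 2
    m_hat = s["m"] / (1 - b1 ** t)
    s_hat = s["s"] / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(s_hat) + eps)

for name, upd in [("gd", gd), ("momentum", momentum), ("adam", adam)]:
    hist = run(upd)
    print(f"{name:10s} first loss {hist[0]:.3f}  final loss {hist[-1]:.2e}")
```

Passing each `history` to `matplotlib.pyplot.plot` yields the kind of loss-curve comparison the notebook produces.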

Files

  • optimization_algs.ipynb – main notebook implementation

Technologies

  • Python
  • NumPy
  • Matplotlib
  • Jupyter Notebook
