mahdis-repo/Optimization-Algorithms

Three optimization methods are implemented in this notebook; since optimizing a function is an iterative process, three step-size methods are implemented as options as well.

This notebook covers:

  • the Gradient Descent algorithm
  • Newton's method
  • a hybrid Newton-Gradient method, used when the Hessian is not invertible (see the sketch after this list)
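
As a minimal sketch of the hybrid fallback idea (illustrative only; the function and variable names here are assumptions, not the notebook's actual code), the search direction reverts to the negative gradient whenever the Newton system cannot be solved:

```python
import numpy as np

def hybrid_newton_gradient_step(grad, hess):
    """Return a descent direction: the Newton direction when the Hessian
    is usable, otherwise the plain gradient-descent direction."""
    try:
        # Newton direction: solve H d = -g rather than forming H^{-1}
        d = np.linalg.solve(hess, -grad)
        if grad @ d < 0:  # accept only if d is a descent direction
            return d
    except np.linalg.LinAlgError:
        pass  # singular Hessian: fall through to the gradient step
    return -grad  # gradient-descent fallback
```

Solving H d = -g with `np.linalg.solve` avoids explicitly inverting the Hessian, which is cheaper and numerically safer than forming the inverse.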

The implementation can use any of three step-size methods (a backtracking sketch follows this list):

  • constant
  • line search
  • backtracking
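
As an illustration of the backtracking option, here is a generic Armijo backtracking sketch (the parameter defaults are common textbook choices, not necessarily the notebook's):

```python
def backtracking_step(f, x, grad, direction, alpha=1.0, beta=0.5, c=1e-4):
    """Shrink alpha until the Armijo sufficient-decrease condition
    f(x + alpha*d) <= f(x) + c*alpha*<grad, d> holds.
    Assumes `direction` is a descent direction (grad @ direction < 0)."""
    fx = f(x)
    slope = grad @ direction  # directional derivative along the step
    while f(x + alpha * direction) > fx + c * alpha * slope:
        alpha *= beta  # step too long: shrink and retest
    return alpha
```

Whichever method picks the step size, the iterate is then updated as x ← x + alpha * direction, and the loop repeats until a stopping criterion such as a small gradient norm is met.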
