
Simple implementation of L-BFGS (low-memory)

Following the description on Wikipedia, reproduced here for completeness:

(figure: L-BFGS algorithm description, reproduced from Wikipedia)
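The core of the Wikipedia description is the two-loop recursion, which builds a search direction from the last m curvature pairs without ever forming the Hessian. A minimal sketch (function and variable names are illustrative, not the repository's):

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: approximate -H^{-1} grad using the stored
    curvature pairs s_i = x_{i+1} - x_i, y_i = g_{i+1} - g_i
    (oldest first). With no history, this reduces to steepest descent."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        alphas.append(alpha)
    # Scale by gamma_k = s^T y / y^T y (newest pair) as the initial Hessian guess
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    z = gamma * q
    # Second loop: oldest pair to newest
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, z)
        z += (alpha - beta) * s
    return -z  # descent direction
```

In a full optimizer this direction is combined with a line search, and the oldest (s, y) pair is discarded once more than m pairs are stored.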

Results

We try it on the six-hump camelback function:

(figure: surface plot of the six-hump camelback function)
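For reference, the standard form of the six-hump camelback function and its analytic gradient (the repository may parameterize it differently):

```python
import numpy as np

def camelback(x, y):
    """Six-hump camelback function. Its two global minima are at
    approximately (+-0.0898, -+0.7126) with value about -1.0316."""
    return ((4 - 2.1 * x**2 + x**4 / 3) * x**2
            + x * y
            + (-4 + 4 * y**2) * y**2)

def camelback_grad(x, y):
    """Gradient obtained by differentiating term by term."""
    dfdx = 8 * x - 8.4 * x**3 + 2 * x**5 + y
    dfdy = x - 8 * y + 16 * y**3
    return np.array([dfdx, dfdy])
```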

Some sample trajectories during optimization, starting from uniformly sampled points:

(figure: sample optimization trajectories from uniformly sampled starting points)

The endpoint distribution (counts indicate how many of the 100 trajectories end at each location):

(figures: endpoint locations and counts over the 100 trajectories)

Most points have converged to one of the local minima, but others are stuck at saddle points, in particular at (0,0).
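The endpoint tally above can be reproduced with a short script. This sketch uses SciPy's built-in L-BFGS-B optimizer rather than the tutorial's own implementation, and the sampling box [-2, 2] x [-1, 1] and seed are assumptions:

```python
from collections import Counter
import numpy as np
from scipy.optimize import minimize

def camelback(p):
    x, y = p
    return ((4 - 2.1 * x**2 + x**4 / 3) * x**2 + x * y
            + (-4 + 4 * y**2) * y**2)

rng = np.random.default_rng(0)
endpoints = []
for _ in range(100):
    # Uniformly sampled starting point (assumed domain)
    x0 = rng.uniform([-2.0, -1.0], [2.0, 1.0])
    res = minimize(camelback, x0, method="L-BFGS-B")
    endpoints.append(tuple(np.round(res.x, 2)))

# Count how many trajectories end at each (rounded) location
counts = Counter(endpoints)
```

Inspecting `counts` shows clusters at the local minima, with some runs terminating near stationary points such as the saddle at the origin.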

Noisy gradients to escape saddle points

We can add Gaussian noise to the gradients (as happens naturally in machine learning, where small batch sizes make gradient estimates noisy) to push the iterates away from saddle points:

(figures: trajectories and endpoint distribution with noisy gradients)
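The escape mechanism can be illustrated without the full optimizer. This sketch uses plain gradient descent with Gaussian noise injected into the gradient; the step size and noise scale are assumed values, not the repository's settings:

```python
import numpy as np

def camelback_grad(p):
    """Gradient of the six-hump camelback function."""
    x, y = p
    return np.array([8 * x - 8.4 * x**3 + 2 * x**5 + y,
                     x - 8 * y + 16 * y**3])

def noisy_step(p, lr=0.01, noise_std=0.1, rng=None):
    """One descent step with Gaussian noise added to the gradient.
    lr and noise_std are illustrative values."""
    if rng is None:
        rng = np.random.default_rng()
    g = camelback_grad(p) + rng.normal(0.0, noise_std, size=2)
    return p - lr * g

# Start exactly at the saddle point (0, 0), where the true gradient
# vanishes: noise perturbs the iterate onto the unstable direction,
# and the dynamics then carry it down toward a local minimum.
rng = np.random.default_rng(1)
p = np.array([0.0, 0.0])
for _ in range(500):
    p = noisy_step(p, rng=rng)
```

Without the noise term, the iterate would stay fixed at (0, 0) forever, since the update is exactly zero there.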
