python-lgbopt

Light Gradient Based Optimization.

Author: Alexis Mignon (c) Oct. 2012. E-mail: alexis.mignon@gmail.com

This module provides routines for gradient-based optimization. It focuses on the case where gradient computation is expensive and should be avoided as much as possible.

To achieve this, the inner line search procedure uses only one gradient evaluation. The step length is then found by quadratic and cubic interpolation, as described in:

J. Nocedal and S. Wright. Numerical Optimization. Chap. 3, p. 56.
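
To make the interpolation step concrete, here is a minimal, hedged sketch of such a line search (illustrative only, not the module's actual code; the name `backtrack_interp` and its defaults are assumptions). It uses only phi(0) and phi'(0), i.e. a single gradient evaluation, and refines the step with the quadratic and cubic interpolants from Nocedal & Wright:

```python
import numpy as np

def backtrack_interp(phi, phi0, dphi0, alpha0=1.0, c1=1e-4, max_iter=20):
    """Line search using one gradient evaluation (phi'(0)) plus
    quadratic/cubic interpolation (Nocedal & Wright, Chap. 3).

    phi   -- callable, phi(alpha) = f(x + alpha * direction)
    phi0  -- phi(0)
    dphi0 -- phi'(0) = <grad f(x), direction>, must be negative
    """
    phi_a0 = phi(alpha0)
    if phi_a0 <= phi0 + c1 * alpha0 * dphi0:        # Armijo condition
        return alpha0
    # Quadratic interpolant through phi(0), phi'(0), phi(alpha0)
    alpha_prev, phi_prev = alpha0, phi_a0
    alpha = -dphi0 * alpha0 ** 2 / (2.0 * (phi_a0 - phi0 - dphi0 * alpha0))
    for _ in range(max_iter):
        phi_a = phi(alpha)
        if phi_a <= phi0 + c1 * alpha * dphi0:
            return alpha
        # Cubic interpolant through phi(0), phi'(0) and the two most
        # recent trial points; safeguards (a ~ 0, step clamping) omitted.
        d_old = phi_prev - phi0 - dphi0 * alpha_prev
        d_new = phi_a - phi0 - dphi0 * alpha
        denom = alpha_prev ** 2 * alpha ** 2 * (alpha - alpha_prev)
        a = (alpha_prev ** 2 * d_new - alpha ** 2 * d_old) / denom
        b = (-alpha_prev ** 3 * d_new + alpha ** 3 * d_old) / denom
        alpha_prev, phi_prev = alpha, phi_a
        alpha = (-b + np.sqrt(b ** 2 - 3.0 * a * dphi0)) / (3.0 * a)
    return alpha
```

Note that after the initial Armijo test, every further trial step consumes only a function value, never another gradient.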

Three optimization schemes are provided:

  • steepest gradient descent (since sometimes it's still the most practical way to do it; a minimal sketch follows this list),
  • limited-memory BFGS quasi-Newton method,
  • conjugate gradient (Fletcher-Reeves method).
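
As a concrete illustration of the first scheme, here is a hedged steepest-descent driver built on the `backtrack_interp` sketch above (again illustrative, not the module's actual API, whose function names are not shown on this page):

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=200):
    """One gradient call per iteration; the line search reuses it as phi'(0)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)                              # the single, expensive call
        if np.sqrt(np.dot(g, g)) < tol:
            break
        d = -g                                   # steepest-descent direction
        alpha = backtrack_interp(lambda a: f(x + a * d), f(x), np.dot(g, d))
        x = x + alpha * d
    return x

# Example: minimize f(x) = ||x - 1||^2; the minimizer is the all-ones vector.
f = lambda x: np.sum((x - 1.0) ** 2)
grad = lambda x: 2.0 * (x - 1.0)
print(steepest_descent(f, grad, np.zeros(3)))    # ~ [1. 1. 1.]
```

Each outer iteration spends exactly one gradient evaluation; the line search consumes only additional function values, which is the point of the design.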

Why write optimization code when optimized packages already exist?

Because the problems I had to deal with had the following properties:

  • the gradient computation is expensive (don't even think of computing the Hessian),
  • the domains on which I have to optimize may not be Euclidean.

The only assumptions made about the optimization variables (and gradient values) are that they support the basic operations of a normed vector space:

  • addition,
  • multiplication by a scalar,
  • inner product (a custom inner product function can be provided; see the matrix-valued example after this list).
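
To illustrate, here is a hedged sketch where the optimization variable is a matrix rather than a flat vector; `frobenius_inner` is an illustrative custom inner product of the kind the module is said to accept:

```python
import numpy as np

def frobenius_inner(A, B):
    """Illustrative custom inner product: <A, B> = sum_ij A_ij * B_ij."""
    return np.sum(A * B)

# f(X) = ||X - T||_F^2 has gradient 2 (X - T); matrices support the
# required addition and scalar multiplication natively.
T = np.arange(6.0).reshape(2, 3)
grad = lambda X: 2.0 * (X - T)

X = np.zeros((2, 3))
for _ in range(60):
    G = grad(X)
    # the convergence test only needs the norm induced by the inner product
    if np.sqrt(frobenius_inner(G, G)) < 1e-10:
        break
    X = X + (-0.25) * G          # addition + scalar multiplication only
print(np.allclose(X, T))         # True
```

The optimizer never needs to know the shape of `X`; it only adds, scales, and takes inner products, which is exactly the contract described above.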
