antoinecollet5/lbfgsb

LBFGSB


A Python implementation of the famous L-BFGS-B quasi-Newton solver [1].

This code is a Python port of the well-known Fortran implementation of the limited-memory Broyden-Fletcher-Goldfarb-Shanno algorithm with bound constraints (L-BFGS-B), published as Algorithm 778 [2,3] (last updated in 2011). Note that this is not a wrapper like `minimize` in scipy but a complete reimplementation in pure Python. The original Fortran code can be found here: https://dl.acm.org/doi/10.1145/279232.279236

References

[1] R. H. Byrd, P. Lu and J. Nocedal. A Limited Memory Algorithm for Bound Constrained Optimization (1995), SIAM Journal on Scientific and Statistical Computing, 16, 5, pp. 1190-1208.
[2] C. Zhu, R. H. Byrd and J. Nocedal. Algorithm 778: L-BFGS-B, FORTRAN routines for large scale bound constrained optimization (1997), ACM Transactions on Mathematical Software, 23, 4, pp. 550-560.
[3] J. L. Morales and J. Nocedal. Remark on Algorithm 778: L-BFGS-B, FORTRAN routines for large scale bound constrained optimization (2011), ACM Transactions on Mathematical Software, 38, 1.

The aim of this reimplementation is threefold. First, to become familiar with the code, its logic, and its internal optimizations. Second, to gain access to parameters that are hard-coded in the Fortran version and cannot be modified (typically the Wolfe condition parameters of the line search). Third, to implement additional functionality that requires significant modification of the code core.

Quick start

Given an optimization problem defined by an objective function and a feasible space:

import numpy as np
from lbfgsb.types import NDArrayFloat  # for type hints, numpy array of floats

def rosenbrock(x: NDArrayFloat) -> float:
    """
    The Rosenbrock function.

    Parameters
    ----------
    x : array_like
        1-D array of points at which the Rosenbrock function is to be computed.

    Returns
    -------
    float
        The value of the Rosenbrock function.
    """
    x = np.asarray(x)
    sum1 = ((x[1:] - x[:-1] ** 2.0) ** 2.0).sum()
    sum2 = np.square(1.0 - x[:-1]).sum()
    return 100.0 * sum1 + sum2


def rosenbrock_grad(x: NDArrayFloat) -> NDArrayFloat:
    """
    The gradient of the Rosenbrock function.

    Parameters
    ----------
    x : array_like
        1-D array of points at which the gradient is to be computed.

    Returns
    -------
    NDArrayFloat
        The gradient of the Rosenbrock function.
    """
    x = np.asarray(x)
    g = np.zeros(x.size)
    # derivative of sum1
    g[1:] += 100.0 * (2.0 * x[1:] - 2.0 * x[:-1] ** 2.0)
    g[:-1] += 100.0 * (-4.0 * x[1:] * x[:-1] + 4.0 * x[:-1] ** 3.0)
    # derivative of sum2
    g[:-1] += 2.0 * (x[:-1] - 1.0)
    return g

lb = np.array([-2, -2])  # lower bounds
ub = np.array([2, 2])  # upper bounds
bounds = np.array((lb, ub)).T  # The number of variables to optimize is len(bounds)
x0 = np.array([-0.8, -1])  # The initial guess
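Before calling the solver, it is worth sanity-checking the analytical gradient against a finite-difference approximation. A minimal sketch using scipy's `check_grad` (scipy is assumed to be available, since the solver already returns scipy result objects); the functions are repeated here so the snippet is self-contained:

```python
import numpy as np
from scipy.optimize import check_grad


def rosenbrock(x):
    x = np.asarray(x)
    sum1 = ((x[1:] - x[:-1] ** 2.0) ** 2.0).sum()
    sum2 = np.square(1.0 - x[:-1]).sum()
    return 100.0 * sum1 + sum2


def rosenbrock_grad(x):
    x = np.asarray(x)
    g = np.zeros(x.size)
    g[1:] += 100.0 * (2.0 * x[1:] - 2.0 * x[:-1] ** 2.0)
    g[:-1] += 100.0 * (-4.0 * x[1:] * x[:-1] + 4.0 * x[:-1] ** 3.0)
    g[:-1] += 2.0 * (x[:-1] - 1.0)
    return g


# check_grad returns the 2-norm of the difference between the analytical
# gradient and a forward finite-difference approximation at the given point.
err = check_grad(rosenbrock, rosenbrock_grad, np.array([-0.8, -1.0]))
print(err)  # a small value if the gradient is correct
```

A large value here (relative to the gradient's magnitude) usually points to a sign or indexing error in the hand-written gradient.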

The optimal solution can be found following:

from lbfgsb import minimize_lbfgsb

res = minimize_lbfgsb(
    x0=x0, fun=rosenbrock, jac=rosenbrock_grad, bounds=bounds, ftol=1e-5, gtol=1e-5
)

minimize_lbfgsb returns an OptimizeResult instance (from scipy) that contains the results of the optimization:

 message: CONVERGENCE: REL_REDUCTION_OF_F_<=_FTOL
 success: True
  status: 0
     fun: 3.9912062309350614e-08
       x: [ 1.000e+00  1.000e+00]
     nit: 18
     jac: [-6.576e-02  3.220e-02]
    nfev: 23
    njev: 23
hess_inv: <2x2 LbfgsInvHessProduct with dtype=float64>

See all use cases in the tutorials section of the documentation.
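Since lbfgsb reimplements the same algorithm that scipy wraps, results can be cross-checked against scipy's Fortran-backed `minimize` with `method="L-BFGS-B"`. A minimal sketch of that comparison on the same problem (only scipy is used here; iteration and evaluation counts may differ between the two implementations):

```python
import numpy as np
from scipy.optimize import minimize


def rosenbrock(x):
    x = np.asarray(x)
    sum1 = ((x[1:] - x[:-1] ** 2.0) ** 2.0).sum()
    sum2 = np.square(1.0 - x[:-1]).sum()
    return 100.0 * sum1 + sum2


def rosenbrock_grad(x):
    x = np.asarray(x)
    g = np.zeros(x.size)
    g[1:] += 100.0 * (2.0 * x[1:] - 2.0 * x[:-1] ** 2.0)
    g[:-1] += 100.0 * (-4.0 * x[1:] * x[:-1] + 4.0 * x[:-1] ** 3.0)
    g[:-1] += 2.0 * (x[:-1] - 1.0)
    return g


# Same problem as above: 2 variables bounded in [-2, 2], starting at (-0.8, -1).
res = minimize(
    rosenbrock,
    np.array([-0.8, -1.0]),
    jac=rosenbrock_grad,
    method="L-BFGS-B",
    bounds=[(-2.0, 2.0), (-2.0, 2.0)],
)
print(res.x)  # close to the known minimum [1, 1]
```

Both solvers should converge to the same minimizer up to the requested tolerances.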
