scipy.optimize.fmin_bfgs not handling functions with boundaries that go infinite or very large correctly (Trac #1644) #2169

scipy-gitbot opened this Issue Apr 25, 2013 · 5 comments


Original ticket http://projects.scipy.org/scipy/ticket/1644 on 2012-04-17 by trac user jsalvatier, assigned to unknown.

Here's the toy problem I've set up:

from scipy.optimize import fmin_bfgs
import numpy as np

def f(x):
    if x < 0:
        return 1.79769313e308  # "nearly infinite" penalty left of zero
    return x + 1.0 / x

xs = fmin_bfgs(f, np.array([10.0]), retall=True)

The solver returns [nan] as the solution.

The problem is designed to be stiff: as x approaches 0 from the right, the function blows up toward infinity, while for x > 1 it climbs with slope approaching 1. Left of 0 the function takes a "nearly infinite" value. If BFGS encounters a value that's larger than the current value, shouldn't it try a different step size?
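That "try a different step size" safeguard can be sketched as a simple backtracking loop (an illustrative stand-in, not SciPy's actual Wolfe line search): if a trial step gives a larger or non-finite objective value, shrink the step and retry.

```python
import numpy as np

def f(x):
    # Same toy objective: huge penalty left of zero.
    if x < 0:
        return 1.79769313e308
    return x + 1.0 / x

def backtracking_step(f, x, direction, alpha0=1.0, shrink=0.5, max_iter=50):
    """Shrink the step until the objective actually decreases.

    A minimal Armijo-style safeguard (the sufficient-decrease constant is
    omitted for brevity): if f at the trial point is not finite, or not
    smaller than f(x), halve the step and try again.
    """
    fx = f(x)
    alpha = alpha0
    for _ in range(max_iter):
        trial = x + alpha * direction
        f_trial = f(trial)
        if np.isfinite(f_trial) and f_trial < fx:
            return trial
        alpha *= shrink
    return x  # no acceptable step found

# A full step of length 15 toward negative x would land at x = -5, where
# the objective is "nearly infinite"; backtracking halves the step until
# the objective decreases.
x_new = backtracking_step(f, 10.0, -15.0)
print(x_new, f(x_new))
```

The key point for this ticket is the `np.isfinite` check: a step that lands in the near-infinite region is rejected outright rather than fed back into the interpolation machinery.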

See this thread: http://mail.scipy.org/pipermail/scipy-user/2012-April/032088.html

It seems that what was actually happening is that line_search_wolfe2 calls scalar_search_wolfe2, which calls _zoom, which calls _cubicmin, and _cubicmin was returning a NaN. If you add a test at the end of _cubicmin for a NaN (and return None in that case), it finds the right minimum. It looks like _quadmin probably has the same problem.
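The guard can be illustrated with a self-contained version of that interpolation step (a simplified re-derivation of the cubic fit, not SciPy's verbatim `_cubicmin`): fit a cubic through (a, fa) with slope fpa at a plus two more points (b, fb) and (c, fc), return its local minimizer, and return None whenever the arithmetic produces a non-finite result, so the caller can fall back to bisection instead of propagating NaN.

```python
import numpy as np

def cubic_interp_min(a, fa, fpa, b, fb, c, fc):
    """Minimizer of the cubic interpolant, with a NaN/inf guard.

    Fits p(t) = A t^3 + B t^2 + fpa t + fa in the shifted variable
    t = x - a, matching p(b - a) = fb and p(c - a) = fc, then returns
    the local minimizer of p, or None if the result is not finite.
    """
    with np.errstate(divide="ignore", over="ignore", invalid="ignore"):
        db, dc = b - a, c - a
        denom = (db * dc) ** 2 * (db - dc)
        # Solve the 2x2 system for the leading coefficients A and B.
        d1 = np.array([[dc ** 2, -db ** 2],
                       [-dc ** 3, db ** 3]])
        A, B = np.dot(d1, np.array([fb - fa - fpa * db,
                                    fc - fa - fpa * dc])) / denom
        radical = B * B - 3.0 * A * fpa
        xmin = a + (-B + np.sqrt(radical)) / (3.0 * A)
    if not np.isfinite(xmin):
        return None  # the guard: let the caller fall back to bisection
    return xmin
```

With well-scaled inputs this recovers the interpolant's minimizer; with function values near 1.8e308 the intermediate subtractions overflow, the formula yields NaN, and the guard converts that into None instead of letting NaN leak into the line search.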

trac user jsalvatier wrote on 2012-04-17

The attached file has a very small modification to _cubicmin() which solves the particular problem I was experiencing. It looks like _quadmin right below _cubicmin could also use the same check. Perhaps there's a better way to do the check though. I also don't know what other places need similar checks.

Attachment added by trac user jsalvatier on 2012-04-17: linesearch.py

@WarrenWeckesser wrote on 2012-04-18

I wouldn't call this a bug. The BFGS algorithm is designed for differentiable functions (see http://en.wikipedia.org/wiki/BFGS_method). Having said that, if there is a change that makes the solver more robust (without breaking other cases), then I'm for it.

If you choose a better initial guess, it converges to the minimum with no problems:

In [23]: fmin_bfgs(func, 3.0)
Optimization terminated successfully.
         Current function value: 2.000000
         Iterations: 6
         Function evaluations: 27
         Gradient evaluations: 9
Out[23]: array([ 1.00000092])

fminbound works without problems, too:

In [30]: fminbound(func, 0, 100, xtol=1e-7)
Out[30]: 0.99999999935769768
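A bounded quasi-Newton run converges as well; a sketch using the later `scipy.optimize.minimize` interface with L-BFGS-B (the 1e-8 lower bound is an arbitrary small floor, chosen here just to keep iterates away from the singularity):

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    x = np.asarray(x).ravel()[0]
    if x < 0:
        return 1.79769313e308
    return x + 1.0 / x

# The lower bound keeps every trial point strictly positive, so the line
# search never evaluates the near-infinite region at all.
res = minimize(f, np.array([10.0]), method="L-BFGS-B", bounds=[(1e-8, None)])
print(res.x)
```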

@jsalvatier jsalvatier referenced this issue in pymc-devs/pymc3 Sep 12, 2013:
[pymc3] Gamma(1,1) MAP issue #334

pv commented Sep 12, 2013

I think this is a bug in the line search: it shouldn't suggest a step that doesn't satisfy the conditions.
A patch is provided, so applying it should fix it; returning None instead of NaN seems to me to be the correct thing here.


pv commented Nov 29, 2013

This was fixed by gh-2864

@pv pv closed this Nov 29, 2013
