scipy.minimize l-bfgs-b method ignores maxiter #3334

Closed
jerkern opened this Issue · 7 comments

4 participants

@jerkern

The call

scipy.optimize.minimize(fun=fval, x0=params_local, method='l-bfgs-b', jac=True,
options=dict({'maxiter':10}), bounds=bounds)

appears to be ignoring the maxiter option, since the function fval is called significantly more than 10 times.

Tested on scipy 0.13.3

@argriffing
Collaborator

There is a separate argument called maxfun that limits the number of function evaluations. Maybe the algorithm does not evaluate the function exactly once per iteration.
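For reference, a minimal sketch of limiting the number of function evaluations with the maxfun option rather than maxiter (the objective f below is a made-up stand-in for the reporter's fval, not code from the report):

from scipy.optimize import minimize

def f(x):
    # toy quadratic objective, used only for illustration
    return (x ** 2).sum()

# maxfun caps calls to f; maxiter caps the number of L-BFGS-B iterations
res = minimize(fun=f, x0=[100.0], method='l-bfgs-b', options={'maxfun': 10})
print(res.nfev, res.nit)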

@jerkern

maxfun does solve the issue I was having, thanks!

If some algorithms more or less ignore the maxiter parameter, the documentation could be clearer about it (if this really is the intent):

http://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html#scipy.optimize.minimize

@ev-br
Collaborator

I don't think it's ignoring maxiter really. It's just what Alex said above: the number of function evaluations per iteration does not have to be exactly one. If you can think of a way of improving the docs in this regard, PRs are always welcome!

Closing the issue since this does not seem to be a bug.
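To make the "more than one evaluation per iteration" point concrete, here is a small sketch (toy objective, not from the original report) that counts calls to the objective between iterations using the callback hook of minimize:

from scipy.optimize import minimize

calls = {'n': 0}
calls_per_iteration = []

def f(x):
    calls['n'] += 1
    return float((x * x - 4.0).sum())

def record(xk):
    # invoked once per completed iteration; snapshot the cumulative call count
    calls_per_iteration.append(calls['n'])

minimize(fun=f, x0=[100.0], method='l-bfgs-b', callback=record)
print(calls_per_iteration)   # counts typically grow by more than 1 per iteration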

@ev-br ev-br closed this
@jerkern

From my testing I could not see any effect at all when changing the maxiter argument, so I really think it is being ignored. It has been a few weeks since I tested, but as I remember, even with maxiter=1 it will make hundreds of function calls.

But if the consensus is that it is working as expected, the workaround using maxfun works for me, and if anyone else runs into this in the future, at least a quick search should yield this discussion.

@pv
Owner
pv commented

Numerical differentiation evaluates the function nvars times per iteration.
The maxiter argument is working correctly.
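As a rough illustration of that cost (hypothetical 5-variable objective, not from the original report): when no analytic jac is supplied, each gradient is approximated by finite differences, so nfev grows roughly like nit * (nvars + 1) or more:

import numpy as np
from scipy.optimize import minimize

def g(x):
    # smooth toy objective in 5 variables
    return np.sum((x - 1.0) ** 2) + np.sum((x[1:] - x[:-1]) ** 2)

res = minimize(g, x0=np.full(5, 100.0), method='l-bfgs-b',
               options={'maxiter': 3})
print(res.nit, res.nfev)   # nfev is much larger than nit because of finite differences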

@ev-br
Collaborator

Hmmm, let's see:

>>> from scipy.optimize import minimize
>>> def f(x):
...    return x*x -4.

Use a very bad initial guess and limit maxiter:

>>> minimize(fun=f, x0=[100.], method='l-bfgs-b', options={'maxiter': 1})
  status: 1
 success: False
    nfev: 18
     fun: array([-4.])
       x: array([ -2.20042311e-09])
 message: 'STOP: TOTAL NO. of ITERATIONS EXCEEDS LIMIT'
     jac: array([ 0.])
     nit: 2

Increase maxiter:

>>> minimize(fun=f, x0=[100.], method='l-bfgs-b', options={'maxiter': 5})
  status: 0
 success: True
    nfev: 18
     fun: array([-4.])
       x: array([ -2.20042311e-09])
 message: 'CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL'
     jac: array([ 0.])
     nit: 3
@pv
Owner
pv commented

Ideally, maxiter would have a similar meaning across the different solvers.
I don't know if this is the case currently (likely, but could be checked).
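One quick, non-authoritative way to spot-check this would be to run the same toy problem through several solvers with an identical maxiter and compare the reported nit (the objective h below is a made-up example):

import numpy as np
from scipy.optimize import minimize

def h(x):
    # toy quadratic, just to exercise the solvers
    return np.sum((x - 3.0) ** 2)

for method in ['l-bfgs-b', 'bfgs', 'cg', 'nelder-mead', 'powell']:
    res = minimize(h, x0=np.full(4, 100.0), method=method,
                   options={'maxiter': 2})
    # not every solver reports nit, hence the getattr fallback
    print(method, getattr(res, 'nit', None), res.nfev)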
