
ENH: Jacobian memoization for multivariate minimization #177

Merged
merged 4 commits into from Mar 12, 2012

Conversation


@dlax dlax commented Mar 5, 2012

  • The second commit fixes the current situation in which the objective function is evaluated twice when it also returns the Jacobian in TNC and L-BFGS-B. See this discussion.
  • The last one introduces support for this behavior (i.e. fun(x, *args) -> f, fprime) in minimize, thus making it available for all solvers.

I'm quite new to these memoization techniques, so any comments are very welcome.
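The caching idea behind the patch can be sketched as follows. This is an illustrative reimplementation, not SciPy's exact `MemoizeJac` code, and `rosen_with_grad` is a made-up example objective that returns both the value and the gradient in one call.

```python
import numpy as np

class MemoizeJac:
    """Wrap fun(x, *args) -> (f, jac) so the value and the Jacobian can be
    requested separately without evaluating the objective twice."""

    def __init__(self, fun):
        self.fun = fun
        self.x = None    # point at which the Jacobian was cached
        self.jac = None  # cached Jacobian

    def __call__(self, x, *args):
        # Evaluate once, return the value, and stash the Jacobian.
        f, jac = self.fun(np.asarray(x), *args)
        self.x = np.asarray(x).copy()
        self.jac = jac
        return f

    def derivative(self, x, *args):
        # Recompute only if asked at a different point than the cached one.
        if self.x is None or not np.all(x == self.x):
            self(x, *args)
        return self.jac

def rosen_with_grad(x):
    # 2-D Rosenbrock function and its analytic gradient, combined.
    f = 100.0 * (x[1] - x[0]**2)**2 + (1 - x[0])**2
    g = np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                  200.0 * (x[1] - x[0]**2)])
    return f, g

wrapped = MemoizeJac(rosen_with_grad)
x0 = np.array([0.5, 0.5])
print(wrapped(x0))              # objective value; Jacobian is cached
print(wrapped.derivative(x0))   # served from the cache, no re-evaluation
```

A solver can then be handed `wrapped` as the objective and `wrapped.derivative` as the gradient, and each iteration pays for only one evaluation of the combined function.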

Denis Laxalde added 3 commits March 5, 2012 11:50
… L-BFGS-B

The `MemoizeJac` decorator added in `optimize` is used for caching the
value of the gradient when the latter is calculated along with the
objective function.
Jacobian of objective function (if None, Jacobian will be
estimated numerically). Only for CG, BFGS, Newton-CG.
Jacobian of objective function. Only for CG, BFGS, Newton-CG.
If None, Jacobian will be estimated numerically. If the `jac==fun`,
Member
Changing the semantic role of a parameter based on object identity is IMHO a bit too magical here. I'd prefer to use jac=True or jac="combined" or some other non-callable special value to signal the changed behavior.

I wonder if one should be careful with the comparisons, since the jac object passed in may in principle implement an __eq__ method. That seems rare, but if so, isinstance(jac, str) and jac == "combined" for a string, and not callable(jac) and bool(jac) for a boolean, could do it.
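The type-guarded comparisons suggested above might look like the sketch below. The function name `interpret_jac` and the returned labels are hypothetical, chosen only to make the branching explicit; the point is that `callable` and `isinstance` checks run before any `==`, so a user object's custom `__eq__` is never triggered.

```python
def interpret_jac(jac):
    """Classify a `jac` argument without invoking a custom __eq__.
    Names and return values here are illustrative, not SciPy API."""
    if callable(jac):
        # A separately supplied gradient function.
        return "separate-callable"
    if isinstance(jac, str) and jac == "combined":
        # String sentinel: type is checked first, so __eq__ is str.__eq__.
        return "combined"
    if not callable(jac) and bool(jac):
        # Truthy non-callable (e.g. jac=True): fun returns (f, fprime).
        return "combined"
    # Falsy (None, False, ...): estimate the Jacobian numerically.
    return "numerical"

print(interpret_jac(lambda x: x))  # separate-callable
print(interpret_jac(True))         # combined
print(interpret_jac(False))        # numerical
```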

Member Author

I hadn't thought about that, thanks. I chose the Boolean option and set the default value of jac to False. 4ed75ba
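With the Boolean option that was adopted, calling code looks like the following minimal sketch, assuming a SciPy version that ships `scipy.optimize.minimize` with the `jac=True` behavior (0.11 and later); the quadratic objective and starting point are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def fun_and_grad(x):
    # Objective and its gradient returned from a single evaluation.
    f = (x[0] - 3.0)**2 + (x[1] + 1.0)**2
    g = np.array([2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)])
    return f, g

# jac=True signals that fun returns (f, fprime); the solver then wraps it
# so the gradient is cached rather than the function being called twice.
res = minimize(fun_and_grad, x0=[0.0, 0.0], method="L-BFGS-B", jac=True)
print(res.x)  # approximately [3, -1]
```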


pv commented Mar 9, 2012

Ok, this gets +1 from me. I don't see obvious problems in merging this.

@rgommers
Member

@dlaxalde: you'll have to rebase this on master and test again, because I just merged the L-BFGS-B 3.0 upgrade. If there aren't any problems, I'd say go ahead and push it.

dlax pushed a commit that referenced this pull request Mar 12, 2012
* enh/optimize/memoize-jac:
  ENH: jac is either a bool or a callable in minimize
  ENH: allow objective function to also return the jacobian in minimize
  ENH: memoize gradient when calculated along with obj. fun. in TNC and L-BFGS-B
  FIX: drop multiple imports of optimize in tnc

Conflicts:
	scipy/optimize/lbfgsb.py
	scipy/optimize/minimize.py
	scipy/optimize/tnc.py
@dlax dlax merged commit 4ed75ba into scipy:master Mar 12, 2012