Conversation

@bwengals (Contributor) commented Aug 1, 2017

For issue #2466.

  • float32 support
  • new argument: method="L-BFGS-B" instead of fmin=optimize.fmin_l_bfgs_b (see the sketch after this list)
  • allows keyboard interrupt and saves the current value
  • returns both untransformed and transformed variables
  • removes monitor, which fixes a jump in memory usage, and adds a tqdm progress bar

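As a rough sketch of the proposed interface (an editor's illustration, not code from this PR; the toy model and data are assumptions):

    import numpy as np
    import pymc3 as pm

    with pm.Model() as model:
        mu = pm.Normal("mu", mu=0.0, sd=10.0)
        pm.Normal("obs", mu=mu, sd=1.0, observed=np.random.randn(100))

        # old interface: pass a scipy fmin_* function directly
        # map_old = pm.find_MAP(fmin=scipy.optimize.fmin_l_bfgs_b)

        # proposed interface: name a scipy.optimize.minimize method
        map_new = pm.find_MAP(method="L-BFGS-B")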

    r = fmin(logp_o, bij.map(start), fprime=grad_logp_o, callback=callback,
             *args, **kwargs)

    if method in ["CG", "BFGS", "Newton-CG", "L-BFGS-B", "TNC",
                  "SLSQP", "dogleg", "trust-ncg"]:
Member commented:

This would mean that if I provide a custom method, I cannot use the gradient?

@bwengals (Contributor Author) commented:

Oh, good catch. It would be nice to be able to provide a custom method; I'll try to fix that.
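For reference, a standalone sketch (an editor's example, not PR code) showing that scipy.optimize.minimize does forward jac to a custom callable method, so callables could simply be treated as gradient-capable; the toy objective and the gradient-descent minimizer are assumptions:

    import numpy as np
    from scipy.optimize import minimize, OptimizeResult

    def neg_logp(x):
        return 0.5 * np.sum(x ** 2)

    def grad_neg_logp(x):
        return x

    # a toy custom minimizer; scipy passes `jac` through as a keyword argument
    def gradient_descent(fun, x0, args=(), jac=None, callback=None,
                         maxiter=200, lr=0.1, **options):
        x = np.asarray(x0, dtype=float)
        for _ in range(maxiter):
            x = x - lr * jac(x)
            if callback is not None:
                callback(x)
        return OptimizeResult(x=x, fun=fun(x), success=True, nit=maxiter)

    res = minimize(neg_logp, np.ones(3), jac=grad_neg_logp,
                   method=gradient_descent)
    print(res.x)  # converges toward the zero vector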

@aseyboldt (Member) commented:

@bwengals I think using model.logp_dlogp_function could be an option here. That is used in NUTS as well, and would avoid some code duplication.

@bwengals (Contributor Author) commented Aug 2, 2017

@aseyboldt I actually tried that first, but got a bit tripped up. I wasn't sure about using the bijection stuff to go from a point to an array, versus using dict_to_array and array_to_dict. I also wasn't sure whether I could use this function to return just logp, without the gradient. Did you write that one?

@aseyboldt (Member) commented Aug 2, 2017

Yes. You could use something like:

func = model.logp_dlogp_function()
if func._extra_vars:  # make extra_args public?
    raise ValueError('Can only use find_MAP with continuous models')
func.set_extra_args({})

Then move between arrays and points through func.dict_to_array and func.array_to_dict.

At the moment you can't disable gradient computations, but this would be easy to add.
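To make the suggestion concrete, here is a rough sketch (an editor's, using the names quoted above; `model`, `start`, and the use of scipy's minimize are assumptions) of how find_MAP could sit on top of logp_dlogp_function:

    from scipy.optimize import minimize

    func = model.logp_dlogp_function()
    func.set_extra_args({})
    x0 = func.dict_to_array(start)   # `start` is an assumed starting point dict

    def neg_logp_and_grad(x):
        logp, dlogp = func(x)        # the function returns value and gradient together
        return -logp, -dlogp

    # jac=True tells minimize that the objective returns (value, gradient)
    res = minimize(neg_logp_and_grad, x0, jac=True, method="L-BFGS-B")
    map_point = func.array_to_dict(res.x)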

From the docstring diff:

        List of variables to set to MAP point (Defaults to all continuous).
    fmin : function
        Optimization algorithm (Defaults to `scipy.optimize.fmin_bfgs` unless
    method : string or callable
Contributor commented:
Is it really that much more convenient to allow strings here? In particular, is this minor convenience more important than preserving the API?

Contributor Author (bwengals) commented:

I'm not sure. Some benefits I can think of: it matches up a bit better with sample and fit, which specify the method with strings. Also, minimize can take a homebrew optimizer, which may be nice, and it gives a consistent interface to all of scipy's optimizers; the different fmin_* functions sometimes differ a little from one another (see the example below). The drawback is breaking people's code...
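To illustrate that inconsistency (an editor's example, not from this PR): the fmin_* functions return differently shaped results, while minimize always returns a single OptimizeResult:

    import numpy as np
    from scipy.optimize import fmin_l_bfgs_b, fmin_powell, minimize

    def f(x):
        return np.sum((x - 1.0) ** 2)

    def g(x):
        return 2.0 * (x - 1.0)

    x0 = np.zeros(2)

    # the return values of the fmin_* functions differ from one another:
    xopt, fval, info = fmin_l_bfgs_b(f, x0, fprime=g)  # tuple of three
    xopt2 = fmin_powell(f, x0)                         # just the point

    # minimize returns one OptimizeResult regardless of method:
    res1 = minimize(f, x0, jac=g, method="L-BFGS-B")
    res2 = minimize(f, x0, method="Powell")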

Contributor commented:
Either way SGTM, just wanted to bring up the API stability concern.

Contributor Author (bwengals) commented:

Yeah, I'm not sure how folks feel about it.

@kyleabeauchamp (Contributor) commented:

Returning the transformed values is a pretty important feature, so that will be good to have.

@kyleabeauchamp (Contributor) commented:

FWIW, passing the transformed RVs is sufficiently useful that I've started using this branch in my own work.

@bwengals (Contributor Author) commented:

@kyleabeauchamp Ah, sorry for my slowness finishing this up! I've been buried in gp land... planning on getting back to it tomorrow.

@kyleabeauchamp mentioned this pull request Aug 12, 2017
@kyleabeauchamp (Contributor) commented:

See my PR into your fork: bwengals#1

@kyleabeauchamp (Contributor) commented:

Do people have further thoughts on what to do about this PR? Do we want to continue with the deprecation of fmin?

I ask because the other components of this PR (e.g., MAP estimates of transformed + untransformed variables) are features that we and others are very interested in.

@kyleabeauchamp (Contributor) commented:

In order to sidestep the discussion on the fmin() deprecation, I've implemented most of the fixes herein but without the new minimizer API and callback. See #2523

@bwengals mentioned this pull request Sep 4, 2017
@bwengals (Contributor Author) commented Sep 4, 2017

Closing in favor of #2539.

@bwengals closed this Sep 4, 2017