API change in find_MAP #2539
Conversation
Is there a way to pass a custom optimizer?
Yes, directly using scipy's minimize interface. Check here, in the section called "Custom minimizers".
I see, thanks!
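For reference, scipy's "Custom minimizers" hook accepts a callable as the method argument of scipy.optimize.minimize. The sketch below is purely illustrative (the toy coordinate search and objective are made up, not pymc3 code); in principle the same kind of callable could be forwarded wherever a method string is accepted:

```python
import numpy as np
from scipy.optimize import minimize, OptimizeResult

def coordinate_search(fun, x0, args=(), maxiter=100, callback=None, **options):
    # A naive derivative-free coordinate search, for illustration only.
    # scipy passes jac, bounds, etc. via keyword; **options swallows them.
    x = np.asarray(x0, dtype=float)
    step = 0.1
    for _ in range(maxiter):
        improved = False
        for i in range(x.size):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                if fun(trial, *args) < fun(x, *args):
                    x = trial
                    improved = True
        if not improved:
            step /= 2.0  # shrink the step once no move helps
    return OptimizeResult(x=x, fun=fun(x, *args), success=True, nit=maxiter)

# Quadratic bowl with minimum at (3, -1)
res = minimize(lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2,
               x0=[0.0, 0.0], method=coordinate_search)
```

The key point is that minimize treats the callable exactly like a built-in method name and hands back an OptimizeResult.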
Np! I'll see about the errors. I wasn't sure how to not compute the gradient with the
pymc3/tuning/starting.py
Outdated
else:
    norm_grad = np.linalg.norm(grad)
    self.progress.set_description(self.desc.format(neg_value, norm_grad))
Stray empty lines.
@bwengals Not computing the grad with logp_dlogp_function is not possible at the moment. It shouldn't be hard to add, however; I can do that if you like.
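As context for why skipping the gradient is attractive: scipy's derivative-free methods never call a gradient at all, so compiling dlogp would be wasted work for them. A scipy-only sketch (the counter and objective are illustrative, not pymc3 internals):

```python
import numpy as np
from scipy.optimize import minimize

grad_calls = {"n": 0}

def f(x):
    # simple quadratic with its minimum at x = 2
    return (x[0] - 2.0) ** 2

def jac(x):
    # counts how often the optimizer asks for the gradient
    grad_calls["n"] += 1
    return np.array([2.0 * (x[0] - 2.0)])

# Powell is derivative-free: it converges without any gradient
res_powell = minimize(f, x0=[0.0], method="Powell")

# BFGS evaluates the supplied gradient repeatedly
res_bfgs = minimize(f, x0=[0.0], jac=jac, method="BFGS")
```

Powell reaches the minimum with zero gradient evaluations, while BFGS calls jac on every line search.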
@aseyboldt Absolutely, if you don't mind. I can also give it a whack, but would like to ask you a couple of Q's first. We could work on that after this PR is merged, as a separate project, depending on how tricky it may be. What do folks think?
Hey Bill, do you think you can squeeze the changes mentioned below into your PR?
#2523 is the PR where I implemented the "output both" feature in master. I agree that a kwarg is even better.
Oh, so instead of outputting both, only output the human-readable version of the parameter values (with a kwarg option to include the
I'd also like the kwarg option.
LGTM. Is it ready?
If everyone's ok with it, I think so!
Thanks, Bill!
* small typo fix in ValueGradFunction docstring, extra_args -> extra_vars
* added different interface to scipy optimizers, removed errors
* change error if optimization result is bad to warning
* remove test for optimization error, remove transform=None, update find_MAP call args
* update find_MAP call arg
* small docstring change
* remove blank lines, remove extraneous callback arg
* remove unused imports
* test fail with precision a bit off, 9.996 vs. 10. Switching to previous default method of BFGS.
* removed optimization check (since 'return_raw' is an option), added 'include_transformed'
* remove unused import
* need include_transformed=True
Fix for issue #2466. Closing #2468 in favor of this one.
find_MAP now takes method="L-BFGS-B" instead of fmin=optimize.fmin_l_bfgs_b. If the optimization result is bad (logp = inf, etc.), the last good optimizer value is returned. If this happens, a warning is printed instead of an error.
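To make the API change concrete at the scipy level, here is the old fmin-style call next to the new method-string style. The toy negative log-probability and data below are made up for illustration; this is not find_MAP's actual internals:

```python
import numpy as np
from scipy import optimize

# Stand-in objective: negative log-likelihood of mu for Normal data with sd=1
data = np.array([9.9, 10.1, 10.0])

def neg_logp(mu):
    return 0.5 * np.sum((data - mu) ** 2)

def grad(mu):
    # derivative w.r.t. mu; keepdims gives the shape scipy expects
    return -np.sum(data - mu, keepdims=True)

# Old style: pick one specific fmin_* function
x_old, f_old, _ = optimize.fmin_l_bfgs_b(neg_logp, x0=np.array([0.0]), fprime=grad)

# New style: one entry point, method chosen by string,
# mirroring find_MAP(method="L-BFGS-B") instead of fmin=optimize.fmin_l_bfgs_b
res = optimize.minimize(neg_logp, x0=np.array([0.0]), jac=grad, method="L-BFGS-B")
```

Both calls land on the same optimum (the sample mean, 10.0); the minimize interface just lets the method be swapped by name rather than by importing a different function.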