Add keyword n_tries to maximum_likelihood and maximum_a_posteriori #2280

Closed

Conversation

@itsdfish (Contributor) commented Jul 8, 2024

This PR adds a keyword n_tries to maximum_likelihood and maximum_a_posteriori that runs the optimization from multiple starting points to mitigate problems with local maxima (see #2279). I added two unit tests and updated the docstrings. Please let me know if there are any edits you would like me to make.
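
For reference, a rough sketch of the multi-start idea (the toy model, the number of restarts, and the use of the result's lp field are illustrative only, not the PR's actual implementation):

```julia
using Turing

# Toy model with a single location parameter; purely illustrative.
@model function demo(x)
    μ ~ Normal(0, 5)
    x .~ Normal(μ, 1)
end

model = demo(randn(100) .+ 2)

# Naive multi-start: fit several times (assuming each call starts from a
# fresh initial point) and keep the result with the highest log-probability.
fits = [maximum_likelihood(model) for _ in 1:10]
best = argmax(fit -> fit.lp, fits)
```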

@devmotion (Member) commented:

Generally, if local minima/maxima are problematic, wouldn't it be better to adjust the optimization method (if possible) or turn to global optimization methods? Or use a less naive multi-start optimization method such as MultistartOptimization.jl (https://docs.sciml.ai/Optimization/stable/optimization_packages/multistartoptimization/)? I'm a bit worried that exposing the functionality in this PR as an official API might make it easier for users to miss these potentially better alternatives.
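
For concreteness, this is roughly what the TikTak multi-start method looks like through the Optimization.jl interface, adapted from the docs linked above (the Rosenbrock objective and bounds are placeholders, not a Turing model):

```julia
using Optimization, OptimizationMultistartOptimization, OptimizationNLopt

# Placeholder objective with a known minimum at (1, 1).
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

f = OptimizationFunction(rosenbrock)
prob = OptimizationProblem(f, x0, p; lb = [-1.0, -1.0], ub = [1.0, 1.0])

# TikTak scatters quasi-random starting points over the box and polishes the
# most promising ones with the given local, derivative-free optimizer.
sol = solve(prob, MultistartOptimization.TikTak(100), NLopt.LN_NELDERMEAD())
```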

@itsdfish (Contributor, Author) commented Jul 8, 2024

I was not aware of MultistartOptimization. In that case, it would be better to use that package and disregard this PR.

@itsdfish closed this Jul 9, 2024
@mhauru (Collaborator) commented Jul 9, 2024

Thanks @devmotion, that looks handy. Sorry @itsdfish, I wasn't aware of it when we talked about this.

We could still use the test case you wrote just to check that MultistartOptimization solves it, or alternatively use your example in the docs. I suspect this won't be the last time local maxima come up, and it would be handy to have some code to point at and say "you could try doing this".

@itsdfish (Contributor, Author) commented Jul 9, 2024

@mhauru, no worries! It looks like at least three people were unaware of MultistartOptimization. Let me know if you would like me to repurpose this PR as a test for MultistartOptimization; I can update it later this week.
