Question on uncertainty quantification #448

Closed

RobvanGastel opened this issue Mar 4, 2024 · 2 comments

@RobvanGastel

Hi authors!

I am exploring your package and like the simplicity of the implementation for adaptively sampling search spaces. Are there any ideas on adding uncertainty quantification to the sampling strategies, like you would get when using Gaussian processes for Bayesian optimization with a space-filling or expected-improvement acquisition function? Any response would be much appreciated. Thank you very much for maintaining this project!
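
For concreteness, this is roughly the kind of acquisition I have in mind; a minimal sketch using scikit-learn's GaussianProcessRegressor, nothing adaptive-specific:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(X_candidates, gp, y_best, xi=0.01):
    # Posterior mean and standard deviation at the candidate points.
    mu, sigma = gp.predict(X_candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-12)        # avoid division by zero
    improvement = y_best - mu - xi          # minimization convention
    z = improvement / sigma
    return improvement * norm.cdf(z) + sigma * norm.pdf(z)

# Toy 1D objective with a handful of observations.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(10, 1))
y = np.sin(5 * X[:, 0]) * (1 - np.tanh(X[:, 0] ** 2))

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
X_grid = np.linspace(-2, 2, 400).reshape(-1, 1)
ei = expected_improvement(X_grid, gp, y.min())
next_x = X_grid[np.argmax(ei)]              # most promising point to evaluate next
```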

@akhmerov
Contributor

akhmerov commented Mar 4, 2024

Hey @RobvanGastel!

In principle, we have a learner based on scikit-optimize, a GP-based optimization project. Scikit-optimize is, however, archived, and we intend to remove that learner from adaptive (see #404).
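
That learner is `adaptive.SKOptLearner`. A minimal sketch of using it, roughly along the lines of the adaptive tutorial (assuming scikit-optimize is installed and the learner has not yet been removed); the extra keyword arguments are forwarded to `skopt.Optimizer`:

```python
import numpy as np
import adaptive

def g(x, noise_level=0.1):
    # Noisy 1D objective, as in the adaptive tutorial.
    return np.sin(5 * x) * (1 - np.tanh(x ** 2)) + np.random.randn() * noise_level

learner = adaptive.SKOptLearner(
    g,
    dimensions=[(-2.0, 2.0)],
    base_estimator="GP",
    acq_func="gp_hedge",
    acq_optimizer="lbfgs",
)
# Blocking runner; stop once 40 points have been sampled.
adaptive.runner.simple(learner, goal=lambda l: l.npoints >= 40)
```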

Overall, a GP is costly: it needs $O(N^3)$ operations to build from scratch and a lower power of $N$ to update. Most of adaptive's built-in learners target a lower complexity, but the overall approach makes it sufficiently straightforward to implement Bayesian learners within the same API.
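
To illustrate what I mean by the same API, here is a rough, hypothetical standalone sketch (not an actual adaptive learner, and skipping the hooks that `BaseLearner` expects) of a GP-backed learner exposing the ask/tell interface and sampling where the posterior standard deviation is largest:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

class UncertaintyGPLearner1D:
    """Hypothetical GP-backed learner mimicking adaptive's ask/tell interface."""

    def __init__(self, function, bounds, n_candidates=512):
        self.function = function
        self.bounds = bounds
        self.data = {}                      # x -> y, as in adaptive's learners
        self.pending_points = set()
        self._grid = np.linspace(*bounds, n_candidates).reshape(-1, 1)
        self._gp = GaussianProcessRegressor(normalize_y=True)

    def tell(self, x, y):
        self.pending_points.discard(x)
        self.data[x] = y
        # Refitting from scratch is the O(N^3) cost mentioned above.
        X = np.array(list(self.data)).reshape(-1, 1)
        self._gp.fit(X, np.array(list(self.data.values())))

    def tell_pending(self, x):
        self.pending_points.add(x)

    def ask(self, n):
        if len(self.data) < 2:
            xs = list(np.linspace(*self.bounds, n + 2)[1:-1])
        else:
            _, sigma = self._gp.predict(self._grid, return_std=True)
            order = np.argsort(-sigma)      # most uncertain candidates first
            xs = list(self._grid[order[:n], 0])
        for x in xs:
            self.tell_pending(x)
        # adaptive's learners also return estimated loss improvements.
        return xs, [float("inf")] * len(xs)

    def loss(self, real=True):
        if len(self.data) < 2:
            return float("inf")
        _, sigma = self._gp.predict(self._grid, return_std=True)
        return float(sigma.max())           # remaining posterior uncertainty

    def remove_unfinished(self):
        self.pending_points.clear()
```

Plugging something like this into adaptive's runners would additionally require subclassing `BaseLearner` and implementing its remaining abstract methods.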

@RobvanGastel
Author

Hi Akhmerov, thank you for the fast answer. I wasn't aware that scikit-optimize is GP-based; this is really helpful! Skopt is a good starting point for me to test your library.
