Except method #39
Turns out the print_step method is hard-coded to show only five decimal places, and the points that the GP was probing were below 0.00001, so they printed as zero. The idea of excluding some points may still be worthwhile, so I won't close the issue just yet.
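The symptom is easy to reproduce with plain string formatting (the variable names below are illustrative, not taken from the package): a fixed five-decimal-place format collapses any value below 1e-5 to zero, while scientific notation preserves it.

```python
# A learning rate below the 0.00001 display threshold.
value = 3.2e-06

# Fixed-point formatting with five decimal places, as print_step is
# described as hard-coding, renders the tiny value as zero.
fixed = "{:.5f}".format(value)

# Scientific notation keeps the magnitude visible.
scientific = "{:.5e}".format(value)

print(fixed)       # -> 0.00000
print(scientific)  # -> 3.20000e-06
```

Switching the display format (or making the precision configurable) would make sub-1e-5 probes visible without changing the optimization itself.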
Looking into providing constraints for the scipy minimize function in acq_max from helpers.py. It should only be a one- or two-line fix with minimal impact on performance.
For exclusion of specific points, each point must be reformulated into an equality or inequality constraint. We want a non-equality constraint, i.e. x not equal to some y. This can be reformulated as an inequality that keeps x at least some small distance away from y.
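As a sketch of that reformulation (the point `y` and tolerance `eps` below are my own illustrative choices, not values from the package): "x not equal to y" can be written as the smooth inequality (x − y)² ≥ ε², which scipy's SLSQP accepts directly.

```python
import numpy as np
from scipy.optimize import minimize

y = 0.0     # the point we want to exclude (hypothetical)
eps = 1e-2  # exclusion radius (hypothetical tolerance)

# Objective whose unconstrained minimum sits exactly at the excluded point.
def objective(x):
    return (x[0] - y) ** 2

# x != y rewritten as the smooth inequality (x - y)^2 - eps^2 >= 0.
constraint = {"type": "ineq", "fun": lambda x: (x[0] - y) ** 2 - eps ** 2}

res = minimize(objective, x0=np.array([1.0]), method="SLSQP",
               constraints=[constraint])
print(res.x)  # the minimizer is pushed to the boundary, |x - y| close to eps
```

The quadratic form avoids the non-differentiable |x − y| at x = y, which matters for gradient-based methods like SLSQP.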
Oy, I just realized the …
Bayesian optimization (particularly with the UCB acquisition function) is known to have a "thing" for edges; there are even a few papers in the literature from people trying to tackle this. However, I didn't consider implementing it since I thought it was out of scope. I would rather have the object give bad but predictable results (exact edge probing) than perform magic under the hood without the user's knowledge. So in this situation I think it would be better to let the user pass …

Finally, one interesting idea would be to generalize the optimization manifold from a box, as is currently the case (only lower and upper bounds on the params are defined), to more interesting topologies. Similarly, the ability to put constraints on bounds (such as …) could be useful.
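One low-tech way to support such non-box regions, sketched here with made-up names (`acq_max_constrained`, `feasible`; none of this is in the package): sample candidate points uniformly in the bounding box, reject those that violate a user-supplied constraint predicate, and maximize the acquisition only over the survivors.

```python
import random

def acq_max_constrained(acq, bounds, feasible, n_samples=10_000, seed=0):
    """Maximize `acq` over the box `bounds`, keeping only points for
    which `feasible(x)` is True (rejection sampling; hypothetical API)."""
    rng = random.Random(seed)
    best_x, best_val = None, float("-inf")
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if not feasible(x):
            continue  # outside the user-defined region: skip
        val = acq(x)
        if val > best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Example: a disc-shaped region inside the box [-1, 1]^2, with a toy
# acquisition that peaks at (1, 1), outside the disc.
acq = lambda x: -(x[0] - 1) ** 2 - (x[1] - 1) ** 2
inside_disc = lambda x: x[0] ** 2 + x[1] ** 2 <= 0.25

x_best, _ = acq_max_constrained(acq, [(-1, 1), (-1, 1)], inside_disc)
print(x_best)  # a feasible point near the disc boundary closest to (1, 1)
```

Rejection sampling is crude compared to constrained gradient methods, but it works for any region the user can express as a predicate, which is the generality the comment above is asking about.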
Awesome, closing this issue. I'll follow up with a new issue for implementing constraints. I think it will be easier than it seems.
It would be helpful to have an except method for the BayesianOptimization class. The idea would be to pass parameter values that you don't want to search. This would come in handy when, for example, tuning learning rates close to zero. I'm tuning three separate learning rates close to zero, and it automatically searches each learning rate exactly equal to zero. I feel like this is buggy behavior; I'm not sure if it comes from sklearn or bayes-opt. The model I'm trying to train is particularly tricky, and often the random initializations are degenerately better than the trained models. Thus, the optimization process exploits these degenerate cases, thinking they are optimal points.
It should be an easy fix, simply a matter of checking the right condition over an input list/dict. I'll try to submit a pull request and fix it myself if I get some time this weekend.
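A minimal version of that check, with invented names (`is_excluded`, `excluded`) purely for illustration: compare each candidate parameter dict against a user-supplied list of points to avoid, within a tolerance.

```python
def is_excluded(params, excluded, tol=1e-12):
    """Return True if `params` matches any excluded point within `tol`.
    `params` and each entry of `excluded` are dicts of parameter values.
    Hypothetical helper, not part of the package."""
    for point in excluded:
        if all(abs(params[k] - v) <= tol for k, v in point.items()):
            return True
    return False

# Excluding the degenerate all-zero learning rates from the search:
excluded = [{"lr1": 0.0, "lr2": 0.0, "lr3": 0.0}]
print(is_excluded({"lr1": 0.0, "lr2": 0.0, "lr3": 0.0}, excluded))   # True
print(is_excluded({"lr1": 1e-4, "lr2": 0.0, "lr3": 0.0}, excluded))  # False
```

The search loop would call a check like this on each proposed point and re-sample (or perturb) when it returns True, which matches the "checking the right condition over an input list/dict" description above.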