Early stopping using max_score #6
Hello David, I am glad to see that you are still using Hyperactive. The current version, v2.x, cannot do an early stop. This feature will be implemented in v3.0, coming in a few weeks.

However, part of the current Hyperactive development is the separation of the optimization algorithms into a simpler API, called Gradient-Free-Optimizers. It is basically a Hyperactive-lite and will be the optimization back-end of Hyperactive in the future, but it can also be used by itself, without Hyperactive. Your code would convert into the following with Gradient-Free-Optimizers:

```python
import time
import numpy as np
from gradient_free_optimizers import BayesianOptimizer

def my_model(para):
    time.sleep(3)  # simulate an expensive model evaluation
    return 0.9

search_space = {"n_estimators": np.arange(10, 200, 10)}

c_time = time.time()
opt = BayesianOptimizer(search_space)
# max_score stops the search as soon as the score is reached,
# so the search should end after the first iteration here.
opt.search(my_model, n_iter=8, max_score=0.9)
diff_time = time.time() - c_time

print("\n Optimization time:", diff_time)
```

Gradient-Free-Optimizers is well tested and very easy to use, but it has a few restrictions compared to Hyperactive.

To make the relationship between Hyperactive and Gradient-Free-Optimizers clear, I will add a section to the readme about the differences and purposes of both packages. I hope I was able to help you. If you have additional questions or suggestions, please let me know.
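The `max_score` behavior above can be sketched without either package: a search loop simply compares the best score seen so far against the threshold and breaks once it is reached. This is a minimal, dependency-free illustration of the idea, not code from Hyperactive or Gradient-Free-Optimizers; `early_stopping_search` and its arguments are hypothetical names:

```python
import random

def early_stopping_search(objective, search_space, n_iter, max_score=None):
    """Random search that stops as soon as max_score is reached."""
    best_para, best_score = None, float("-inf")
    for _ in range(n_iter):
        # Sample one candidate value for each dimension of the search space.
        para = {name: random.choice(values) for name, values in search_space.items()}
        score = objective(para)
        if score > best_score:
            best_para, best_score = para, score
        # Early stop: the threshold is reached, so do not finish all n_iter.
        if max_score is not None and best_score >= max_score:
            break
    return best_para, best_score

# Toy objective, analogous to the example above: always scores 0.9,
# so the search should stop after a single evaluation.
def my_model(para):
    return 0.9

search_space = {"n_estimators": list(range(10, 200, 10))}
best_para, best_score = early_stopping_search(
    my_model, search_space, n_iter=8, max_score=0.9
)
```

Without the `max_score` check, the same loop would run all `n_iter` evaluations regardless of the scores, which is the behavior the issue describes.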
Hi Simon, thank you for your fast and in-depth response! I appreciate the workaround code you have provided; it is working great. Thanks,
I am trialling a use of Hyperactive in conjunction with early stopping.

From my research in the code and docs of this package, I can see that we can achieve early stopping by passing parameters along with the name of the chosen gradient-free optimizer, like below (?):

However, the code below does not stop early and continues until all `n_iter` are completed, which is not the behaviour I expected. Please could you let me know what I am doing wrong when trying to get early stopping to work.

Full example:

Thanks