[ENH] optuna based optimizer
#140
Conversation
Should we put optuna into the "all_extras" requirements?

The opt/una thing only works in this PR, otherwise it is confusing. Please change.

Add optuna optimizer to "opt"-level init file
```python
"""Grid search with sklearn style grid and backends."""
```

copy-paste error
I am not that experienced in optuna, but this just always uses the default sampler/optimizer, right? Is this behaviour intended?
```python
def _run(self, experiment, param_space, n_trials):
    import optuna
```

```python
study = optuna.create_study(direction="minimize")
```
According to this PR: #142
The direction must go into the experiment. So we just hard-code it here and then handle it separately in the experiment, right?
yes, that was my feeling - the optimizers are all minimizers, and the experiments carry the sign.
Which of course raises the sincere question: are all optimizers currently minimizers?
```python
# Evaluate experiment with suggested params
return self.experiment(**params)
```

```python
def _run(self, experiment, param_space, n_trials):
```
Arguments not used. I am still not sure about this part in our API. But I will take a look at this at a later point.
yes, but unfortunately they need to be there: the API design passes them as a convenience.
```python
if isinstance(low, int) and isinstance(high, int):
    params[key] = trial.suggest_int(key, low, high)
else:
    params[key] = trial.suggest_float(key, low, high, log=False)
```
I think we could already handle other things, like distributions or categorical here.
yes - I was still thinking about what the best interface and parameterization would be.
SimonBlanke left a comment
We pass very few arguments to optuna. General parameters like "random_state" should be passed.
We should also use the same structure as other optimization backend implementations, like moving some code into the _adapters module.

Agree - this was still a proof-of-concept draft.
Disagree - I think adapters make sense only if there are at least two classes using it. I do not think there will be a second, since

I see your point and agree with it.
This PR adds:

- `optuna`-based optimizer, inheriting `BaseOptimizer`
- `TestAll` classes now test only if dependencies of the object are satisfied

WIP - The `optuna` interface is still very basic and has few configurability options. Sampler and pruner selection should be added, and the possibility to provide more complex trials should be explored.