Conversation

@fkiraly fkiraly commented Jul 7, 2025

This PR adds:

  • an optuna-based optimizer, inheriting from BaseOptimizer
  • dependency handling for estimators in the test framework:
    • objects' dependency tags are now meaningfully used
    • the TestAll classes now test only if dependencies of the object are satisfied

WIP - the optuna interface is still very basic and offers few configuration options. Sampler and pruner selection should be added, and the possibility of providing more complex trials should be explored.

@fkiraly fkiraly added the enhancement New feature or request label Jul 7, 2025
@SimonBlanke
Collaborator

Should we put optuna into the "all_extras" requirements?

@SimonBlanke
Collaborator

The opt/una naming only works within this PR; otherwise it is confusing. Please change.

@SimonBlanke
Collaborator

Add optuna optimizer to "opt"-level init file

@@ -0,0 +1,6 @@
"""Grid search with sklearn style grid and backends."""
Collaborator


copy-paste error

@SimonBlanke
Collaborator

I am not that experienced with optuna, but this always uses the default sampler/optimizer, right? Is this behaviour intended?

def _run(self, experiment, param_space, n_trials):
import optuna

study = optuna.create_study(direction="minimize")
Collaborator


According to PR #142, the direction must go into the experiment. So we just hard-code it here and then handle it separately in the experiment, right?

Collaborator Author


Yes, that was my feeling - all optimizers are minimizers, and the experiments carry the sign.

Which of course raises the question: are all optimizers currently minimizers?
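As a minimal sketch of that sign convention (illustrative names, not the package's classes): the optimizer always minimizes, and an experiment that should be maximized returns its negated score.

```python
# Sketch of the "experiments carry the sign" convention: the optimizer
# always minimizes, so a maximizing experiment negates its own score.
class MaximizingExperiment:
    """Wrap a score function to be maximized as a minimization target."""

    def __init__(self, score_fn):
        self.score_fn = score_fn

    def __call__(self, **params):
        # minimizing -score is equivalent to maximizing score
        return -self.score_fn(**params)

# score peaks at x = 3; the wrapped objective bottoms out there
exp = MaximizingExperiment(lambda x: -((x - 3) ** 2))
```

With this wrapper, a pure minimizer driving `exp` downward drives the underlying score upward, so the optimizer never needs to know the direction.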

# Evaluate experiment with suggested params
return self.experiment(**params)

def _run(self, experiment, param_space, n_trials):
Collaborator


The arguments are not used. I am still not sure about this part of our API, but I will take a look at it at a later point.

Collaborator Author


Yes, but unfortunately they need to be there - the API design passes them through as a convenience.

if isinstance(low, int) and isinstance(high, int):
params[key] = trial.suggest_int(key, low, high)
else:
params[key] = trial.suggest_float(key, low, high, log=False)
Collaborator


I think we could already handle other cases here, like distributions or categorical parameters.

Collaborator Author


Yes - I was still thinking about what the best interface and parameterization would be.

Collaborator

@SimonBlanke SimonBlanke left a comment


We pass very few arguments to optuna. General parameters like "random_state" should be passed through.
We should also use the same structure as the other optimization backend implementations, e.g. moving some code into the _adapters module.

@fkiraly
Collaborator Author

fkiraly commented Jul 9, 2025

We pass very few arguments to optuna. General parameters like "random_state" should be passed through.

Agree - this was still a proof-of-concept draft.

We should also use the same structure as the other optimization backend implementations, e.g. moving some code into the _adapters module.

Disagree - I think adapters make sense only if at least two classes use them. I do not think there will be a second one, since optuna is a (configurable) all-in-one optimizer.

@SimonBlanke
Collaborator

Disagree - I think adapters make sense only if at least two classes use them. I do not think there will be a second one, since optuna is a (configurable) all-in-one optimizer.

I see your point and agree with it.
