
Demonstration of using fit along with hyperparameter optimization #166

Closed · dionman opened this issue Mar 31, 2023 · 9 comments
Labels: enhancement (New feature or request)

Comments

dionman commented Mar 31, 2023

Is there a notebook demonstrating how to use the optuna hyperparameter optimization capabilities of the models as part of fitting?

bcebere (Contributor) commented Apr 4, 2023

Hello @dionman

Thank you for your question. It would be an interesting tutorial.

We will try to put together a notebook with an Optuna example this week.

bcebere added the enhancement (New feature or request) label on Apr 4, 2023
tztsai (Contributor) commented Apr 6, 2023

Hello, is there anyone working on this issue? If not, I will work on it.

bcebere (Contributor) commented Apr 6, 2023

Hello @tztsai

Absolutely, thank you for working on this.
Let us know if you need help.

Just a few guidelines:

  • For a given dataset and task, you need to define an objective in Optuna.
  • To that end, you need to "translate" the hyperparam_space from a plugin into Optuna Trial suggestions.
  • Finally, given a set of arguments, you construct and evaluate the plugin.
  • For the evaluation, a good metric could be the detection.mlp metric, which tries to discriminate real data from synthetic data. In other words, you ask Optuna for the hyperparameters that produce synthetic data closest to the real data. Of course, many other metrics could be relevant here; it is up to you. A rough sketch of such an objective is shown below.
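A minimal sketch of these steps, assuming the synthcity plugin API roughly as described above. The plugin name "ctgan", the import path of the detection metric, the duck-typed distribution handling in suggest_params, and the "mean" score key are illustrative assumptions, not the final tutorial code:

```python
# Sketch only: translate a plugin's hyperparameter space into Optuna suggestions
# and score each trial with a detection metric (lower = harder to tell apart).
import optuna
from sklearn.datasets import load_diabetes
from synthcity.plugins import Plugins
from synthcity.plugins.core.dataloader import GenericDataLoader
from synthcity.metrics.eval_detection import SyntheticDetectionMLP  # import path assumed

X, _ = load_diabetes(return_X_y=True, as_frame=True)  # any tabular dataset

def suggest_params(trial: optuna.Trial, plugin_cls) -> dict:
    """Map each distribution in hyperparameter_space() onto a trial.suggest_* call."""
    params = {}
    for hp in plugin_cls.hyperparameter_space():
        if hasattr(hp, "choices"):  # categorical distribution (attribute names assumed)
            params[hp.name] = trial.suggest_categorical(hp.name, hp.choices)
        elif isinstance(hp.low, int):  # integer distribution
            params[hp.name] = trial.suggest_int(hp.name, hp.low, hp.high)
        else:  # float distribution
            params[hp.name] = trial.suggest_float(hp.name, hp.low, hp.high)
    return params

def objective(trial: optuna.Trial) -> float:
    plugin_cls = Plugins().get_type("ctgan")  # any generator plugin
    model = plugin_cls(**suggest_params(trial, plugin_cls))
    loader = GenericDataLoader(X)
    model.fit(loader)
    X_syn = model.generate(count=len(X))
    # detection.mlp trains a classifier to separate real from synthetic rows;
    # minimizing it asks Optuna for hyperparams whose output looks most like the real data.
    score = SyntheticDetectionMLP().evaluate(loader, X_syn)
    return score["mean"]  # key name assumed

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```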

tztsai (Contributor) commented Apr 7, 2023

Hello @bcebere

I have finished the tutorial. To let Optuna sample from the hyperparam_space, I added a script, optuna_sample.py, under utils/. Please review both in the new pull request.
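For reference, a usage sketch of such a sampling helper; the function name suggest_all and its signature are assumptions about what optuna_sample.py provides, not confirmed in this thread:

```python
# Hypothetical usage of the new helper; suggest_all's name/signature is assumed.
import optuna
from synthcity.plugins import Plugins
from synthcity.utils.optuna_sample import suggest_all  # assumed helper name

plugin_cls = Plugins().get_type("tvae")  # any plugin name works here

study = optuna.create_study(direction="minimize")
trial = study.ask()  # draw a trial so we can sample hyperparameters from it
params = suggest_all(trial, plugin_cls.hyperparameter_space())
model = plugin_cls(**params)  # ready to fit and evaluate as in the earlier sketch
print(params)
```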

dionman (Author) commented Apr 27, 2023

Does the current implementation of the plugins support HPO search algorithms that allow early pruning of unpromising trials, e.g. optuna's SuccessiveHalvingPruner?

tztsai (Contributor) commented Apr 27, 2023

I don't think so. To support early pruning, a plugin must support arbitrary callbacks in each iteration of training. A custom callback can then be provided that reports intermediate performance scores and raises optuna.TrialPruned() to prune the trial. However, the DDPM plugin does support a list of callbacks during fitting, and I have also implemented the Callback class in "synthcity/utils/callbacks.py". If needed, I can implement a PruneOptunaTrial callback class for early pruning, along the lines of the sketch below.
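A rough sketch of what such a PruneOptunaTrial callback could look like. The Callback base class is the one mentioned above in synthcity/utils/callbacks.py, but the hook name on_epoch_end and the eval_fn argument are assumptions:

```python
# Sketch only: report intermediate scores to Optuna and prune unpromising trials early.
import optuna
from synthcity.utils.callbacks import Callback  # module mentioned above; exact API assumed

class PruneOptunaTrial(Callback):
    def __init__(self, trial: optuna.Trial, eval_fn):
        self.trial = trial
        self.eval_fn = eval_fn  # callable: model -> float intermediate score
        self.step = 0

    def on_epoch_end(self, model) -> None:  # hook name is an assumption
        score = self.eval_fn(model)
        self.trial.report(score, self.step)  # standard Optuna pruning protocol
        self.step += 1
        if self.trial.should_prune():
            raise optuna.TrialPruned()
```

Inside the Optuna objective, such a callback would be passed to a plugin that accepts callbacks during fitting (e.g. DDPM, per the comment above), with the study created using a pruner such as optuna.pruners.SuccessiveHalvingPruner().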

robsdavis (Contributor) commented

The tutorial created by @tztsai is available here; this closes the issue.

tztsai (Contributor) commented Aug 15, 2023 via email

tztsai (Contributor) commented Aug 15, 2023 via email
