
Add AutoML via Optuna #243

Merged: 2 commits into neuraloperator:main on Dec 4, 2023
Conversation

@crwhite14 (Collaborator) commented on Oct 22, 2023

This pull request adds Optuna-based hyperparameter tuning for Darcy flow, meant as an example so that users can set up automatic hyperparameter tuning for their own models and use cases.

Changes

  • The evaluate method in Trainer now returns the errors dict
  • New file, hpo/tune_darcy.py, that launches Optuna. It is similar to train_darcy.py, but most of its contents are wrapped in a function, objective(trial). At the start of each trial, Optuna samples the most promising hyperparameters based on the results of the previous trials.

How to use Optuna
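
The sketch below shows, under stated assumptions, the general Optuna pattern the new script follows: hyperparameters are sampled inside objective(trial), the model is trained as in train_darcy.py, and the value returned to Optuna is read from the errors dict that Trainer.evaluate now returns. The hyperparameter names, ranges, and the toy score used here are illustrative placeholders, not the exact contents of hpo/tune_darcy.py.

```python
import optuna

def objective(trial):
    # Optuna proposes values for this trial based on the results of previous trials.
    lr = trial.suggest_float("lr", 1e-4, 1e-2, log=True)
    n_layers = trial.suggest_int("n_layers", 2, 6)
    hidden_channels = trial.suggest_categorical("hidden_channels", [16, 32, 64])

    # In hpo/tune_darcy.py this is where the model would be built and trained as in
    # train_darcy.py, and the returned value would be taken from the errors dict that
    # Trainer.evaluate now returns (e.g. a relative L2 test error).
    # A toy quadratic score stands in here so the sketch runs on its own.
    score = (lr - 1e-3) ** 2 + 0.01 * n_layers + 0.001 * hidden_channels
    return score

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print("best hyperparameters:", study.best_params)
```

study.optimize runs the requested number of trials, and study.best_params then holds the best hyperparameter configuration found so far.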

@JeanKossaifi (Member) commented:

Thanks @crwhite14, it looks great!

@JeanKossaifi merged commit 57aa511 into neuraloperator:main on Dec 4, 2023. 1 check passed.