Intro: a simple optimization problem:
- Define an objective function to be optimized. Let's minimize (x - 2)^2.
- Suggest hyperparameter values using a `trial` object. Here, a float value of `x` is suggested from -10 to 10.
- Create a `study` object and invoke the `optimize` method over 100 trials.
Example:

```python
import optuna

def objective(trial):
    x = trial.suggest_float('x', -10, 10)
    return (x - 2) ** 2

study = optuna.create_study()
study.optimize(objective, n_trials=100)

study.best_params  # E.g. {'x': 2.002108042}
```

Since the objective is (x - 2)**2, the best `x` should be close to 2.
Let's talk about the general steps:
- Define an objective function to be maximized or minimized.
- Suggest values for the hyperparameters using a `trial` object.
- Create a `study` object and optimize the objective function over a number of trials.
- Collect the pruned and complete trials into variables.
- Print the best trial.
In this repo I set up basic Optuna hyperparameter tuning for a simple PyTorch model applied to the MNIST dataset.
Amr-Abdellatif/HyperParameters-Tuning-using-Optuna---PyTorch
About: starter pack for hyperparameter tuning using Optuna with a PyTorch model.