Hyper-Parameter Optimization #293
Q: What is the method of hyper-parameter optimization, and are there any papers about it?
A: This project supports HyperBand, BOHB, and population-based training (PBT).

Q: When I try to train, I get the error "train_hpo.py: error: unrecognized arguments: --srch-algo". What should I do?
A: Pull the latest code and try again.

Q: What machine configuration is required to run hyper-parameter optimization?
A: Nothing special is required.
This guide explains how to train your model with hyperparameter optimization.
Before You Start
Follow Getting Started to set up the environment and install the requirements.txt dependencies.
We provide two search algorithms, `HyperOptSearch` and `TuneBOHB`. You can refer here to learn more about search algorithms in `ray[tune]`. In addition, we provide four trial schedulers: `ASHA`, `HyperBand`, `PBT`, and `BOHB`. More information about trial schedulers can be found here.

Design Hyperparameters Search Space
Many hyperparameters affect training, such as batch size, learning rate, and weight decay. A better initial guess produces better results, so it's important to set these value ranges properly before tuning.
Each search algorithm has a specific way of defining the search space. For example, you can define the `bohb` hyperparameter space as below:

fast-reid/projects/HPOReID/train_hpo.py
Lines 178 to 190 in a5dac37
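As an illustration of the idea, here is a stdlib-only sketch of a discrete search space over `batch_size` and `num_instance`. The actual file defines the space through the search algorithm's own API (e.g. ConfigSpace for BOHB), and the candidate value lists below are made-up placeholders:

```python
import random

# Hypothetical discrete search space for the two hyperparameters
# searched in train_hpo.py. The value lists are illustrative only.
SEARCH_SPACE = {
    "batch_size": [64, 96, 128],
    "num_instance": [2, 4, 8],
}

def sample_config(space, rng=random):
    """Draw one trial configuration from a discrete search space."""
    return {name: rng.choice(values) for name, values in space.items()}

cfg = sample_config(SEARCH_SPACE)
```

A real search algorithm replaces the uniform `rng.choice` with a model-guided sampler (TPE for HyperOpt, a kernel density model for BOHB), but the trial config it emits has the same shape.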
Here, we just want to search `batch_size` and `num_instance`. You can define your own hyperparameter search space. Then, you need to modify the function `update_config` to make the hyperparameters valid.

fast-reid/projects/HPOReID/train_hpo.py
Lines 96 to 118 in a5dac37
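To illustrate the pattern, here is a hypothetical, dict-based `update_config` sketch. The real function in `train_hpo.py` operates on a fastreid config node, and the key names and the divisibility check below are assumptions:

```python
def update_config(cfg, hp):
    """Copy sampled hyperparameters into a training config (sketch).

    The key names are hypothetical; the real update_config maps the
    sampled values onto a fastreid CfgNode instead of a plain dict.
    """
    cfg = dict(cfg)  # don't mutate the caller's config
    cfg["IMS_PER_BATCH"] = hp["batch_size"]
    cfg["NUM_INSTANCE"] = hp["num_instance"]
    # Keep dependent values consistent: with PK-style sampling, the
    # batch size should be divisible by the instances per identity.
    assert cfg["IMS_PER_BATCH"] % cfg["NUM_INSTANCE"] == 0
    return cfg

base = {"IMS_PER_BATCH": 128, "NUM_INSTANCE": 4}
tuned = update_config(base, {"batch_size": 96, "num_instance": 8})
```

The point of the function is validation: a raw sample from the search space is not guaranteed to produce a coherent config, so this is where you enforce constraints between hyperparameters.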
Finally, you can decide which hyperparameters are displayed in the console output.

fast-reid/projects/HPOReID/train_hpo.py
Line 206 in a5dac37
Define Fitness Score
Fitness is the value we seek to maximize. In fastreid, we define the default fitness score as a weighted combination of two metrics: Rank@1 and mAP.
fast-reid/projects/HPOReID/hporeid/tune_hooks.py
Line 56 in a5dac37
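As a sketch, a weighted combination of the two metrics might look like the following; the weight and the exact formula used in `tune_hooks.py` may differ:

```python
def fitness_score(rank1, mAP, weight=0.5):
    """Convex combination of Rank@1 and mAP (sketch).

    `weight` trades off the two metrics; 0.5 weighs them equally.
    The tuner maximizes the returned value.
    """
    return weight * rank1 + (1.0 - weight) * mAP
```

Any scalar that increases with model quality works as a fitness score; combining Rank@1 and mAP simply keeps the tuner from overfitting to one retrieval metric at the expense of the other.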
Then just pass the name `score` to `metric`, and `max` to `mode`.

fast-reid/projects/HPOReID/train_hpo.py
Line 138 in a5dac37
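In Ray Tune this pair of arguments tells the tuner which reported value to optimize and in which direction. A plain-dict sketch of the two arguments (Ray itself is deliberately not imported so the snippet stays self-contained):

```python
# Keyword arguments one would hand to ray.tune.run (sketch only).
tune_run_kwargs = {
    "metric": "score",  # name of the fitness value each trial reports
    "mode": "max",      # maximize it (use "min" for losses)
}
```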
HPO Train
You can then run hyperparameter optimization with the BOHB (Bayesian Optimization with HyperBand) search algorithm for 20 trials with the following command line:
python3 projects/HPOReID/train_hpo.py --config-file projects/HPOReID/configs/baseline.yml --srch-algo "bohb" --num-trials 20
If you don't have enough resources, you can allocate specific GPU and CPU counts via `resources_per_trial`.
fast-reid/projects/HPOReID/train_hpo.py
Line 213 in a5dac37
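For example, a `resources_per_trial` mapping giving each trial one GPU and four CPUs might look like this; the numbers are placeholders, so tune them to your machine:

```python
# Per-trial resource request for ray.tune (sketch). With one GPU per
# trial, Ray runs as many concurrent trials as there are GPUs; set
# "gpu" to a fraction (e.g. 0.5) to pack two trials onto one GPU.
resources_per_trial = {"cpu": 4, "gpu": 1}
```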
Frequently asked questions about `ray[tune]` can be found here.