# AutoRL Hydra Sweepers

Plug-and-play Hydra sweepers for the EA-based multi-fidelity method DEHB and several Population Based Training variants, all proven to efficiently tune RL hyperparameters.

This repository contains Hydra sweeper versions of proven AutoRL tuning tools for plug-and-play use. Currently included:

- DEHB, an evolution-based multi-fidelity optimizer
- Population Based Training variants, including PB2 and BGT (matching the install extras below)

We recommend starting in the examples directory to see how the sweepers work. Note that everything here is a minimizer! You can maximize instead by passing `maximize=true` as a sweeper kwarg, as in the sketch below. For more background information, see here.
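For example, to have a sweeper maximize your objective, set the kwarg in your config. A minimal sketch, assuming the kwarg sits alongside the other sweeper options under `hydra.sweeper`:

```yaml
hydra:
  sweeper:
    maximize: true  # treat larger objective values as better
```

The same value can also be set on the command line (`hydra.sweeper.maximize=true`).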

## Installation

We recommend creating a conda environment to install the sweepers in. Pick the sweepers you want to use and install their dependencies via the matching pip extras; use 'all' for every option, or, for example:

```bash
conda create -n autorl-sweepers python==3.9
conda activate autorl-sweepers
pip install -e .[dehb,pb2,bgt]
```

If you want to work on the code itself, you can also use:

```bash
make install-dev
```

## Examples

The 'examples' directory contains example configurations and setups for all sweepers on Stable-Baselines3 agents. To run an example with the sweeper, you need to set the `--multirun` flag (or its shorthand `-m`):

```bash
python examples/dehb_for_pendulum_ppo.py -m
```

## Usage in your own project

In your YAML configuration file, set `hydra/sweeper` to the sweeper name, e.g. DEHB:

```yaml
defaults:
  - override hydra/sweeper: DEHB
```

You can also add `hydra/sweeper=<sweeper_name>` to your command line. The sweepers will only be found if the `hydra_plugins` directory is on your PYTHONPATH. You can check whether they're loaded by running your script with `--info plugins`.
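A quick sanity check (a sketch; `train.py` stands in for your own entry point, and it assumes the `hydra_plugins` directory lives in your working directory):

```bash
export PYTHONPATH=$PWD:$PYTHONPATH  # make hydra_plugins discoverable
python train.py --info plugins      # the chosen sweeper should appear in the plugin list
```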

## Hyperparameter Search Space

Hyperparameter definitions are based on ConfigSpace, and their syntax follows ConfigSpace's JSON serialization. Please see the ConfigSpace user guide for more information on how to configure hyperparameters.

Your YAML configuration file must adhere to the following syntax:

```yaml
hydra:
  sweeper:
    ...
    search_space:
      hyperparameters:  # required
        hyperparameter_name_0:
          ...
        hyperparameter_name_1:
          ...
        ...
```

The ConfigSpace fields `conditions` and `forbiddens` are not implemented for the sweepers (DEHB may be a partial exception), so please don't use them; they will be ignored.

Defining a uniform integer parameter is easy:

```yaml
n_neurons:
  type: uniform_int  # or define a float parameter with 'uniform_float'
  lower: 8
  upper: 1024
  log: true  # optimize the hyperparameter in log space
  default_value: ${n_neurons}  # you can set the default to the value normally used in your config
```
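For the `${n_neurons}` interpolation to resolve, your main config needs a matching entry; a hypothetical example:

```yaml
# main config (sketch): the value referenced by ${n_neurons} above
n_neurons: 64
```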

Same goes for categorical parameters:

```yaml
activation:
  type: categorical
  choices: [logistic, tanh, relu]
  default_value: ${activation}
```
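Putting the pieces together, a complete search space section might look like this. This is a minimal sketch assembled from the snippets above; sweeper-specific kwargs (budgets, population sizes, etc.) are omitted:

```yaml
hydra:
  sweeper:
    search_space:
      hyperparameters:
        n_neurons:
          type: uniform_int
          lower: 8
          upper: 1024
          log: true
        activation:
          type: categorical
          choices: [logistic, tanh, relu]
```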

## Contribute

See the CONTRIBUTING file for how to help out.

## License

This package is Apache 2.0 licensed, as found in the LICENSE file.
