
Tune Search Algorithms
======================

Tune provides various hyperparameter search algorithms to efficiently optimize your model. Tune allows you to use different search algorithms in combination with different trial schedulers. By default, Tune implicitly uses the Variant Generation algorithm to create trials.

You can utilize these search algorithms as follows:

.. code-block:: python

    run_experiments(experiments, search_alg=SearchAlgorithm(...))

Currently, Tune offers the following search algorithms:

Variant Generation (Grid Search/Random Search)
----------------------------------------------

By default, Tune uses the default search space and variant generation process to create and queue trials. This supports random search and grid search as specified by the ``config`` parameter of the Experiment.
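For illustration, here is a hypothetical experiment configuration using the default search space format (the experiment name ``my_experiment`` and trainable ``train_fn`` are placeholders). A ``grid_search`` entry expands every listed value into its own trial, while a lambda is sampled per trial by the variant generator:

```python
import random

# Hypothetical experiment config sketching Tune's default search space
# format: "grid_search" enumerates values; a lambda samples randomly.
experiment_config = {
    "my_experiment": {
        "run": "train_fn",  # placeholder trainable name
        "config": {
            "lr": {"grid_search": [0.001, 0.01, 0.1]},          # grid search
            "momentum": lambda spec: random.uniform(0.1, 0.9),  # random search
        },
    },
}
```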

.. autoclass:: ray.tune.suggest.BasicVariantGenerator

Note that other search algorithms will not necessarily extend this class, and may require a different search space declaration than the default Tune format.

HyperOpt Search (Tree-structured Parzen Estimators)
---------------------------------------------------

``HyperOptSearch`` is a ``SearchAlgorithm`` backed by HyperOpt that performs sequential model-based hyperparameter optimization. Note that this class does not extend ``ray.tune.suggest.BasicVariantGenerator``, so you will not be able to use Tune's default variant generation/search space declaration when using ``HyperOptSearch``.

To use this search algorithm, you will need to install HyperOpt:

.. code-block:: bash

    $ pip install --upgrade git+git://

This algorithm requires using the HyperOpt search space specification. You can use ``HyperOptSearch`` as follows:

.. code-block:: python

    run_experiments(experiment_config, search_alg=HyperOptSearch(hyperopt_space, ... ))
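A minimal sketch of what ``hyperopt_space`` might look like, assuming illustrative parameter names (``lr``, ``momentum``). Unlike Tune's default format, distributions are declared with HyperOpt's ``hp`` module:

```python
from hyperopt import hp

# Illustrative HyperOpt search space: log-uniform learning rate and
# uniform momentum. Labels must be unique within the space.
hyperopt_space = {
    "lr": hp.loguniform("lr", -10, -1),
    "momentum": hp.uniform("momentum", 0.1, 0.9),
}
```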

An example of this can be found in the Ray Tune examples.

.. autoclass:: ray.tune.suggest.HyperOptSearch

Contributing a New Algorithm
----------------------------

If you are interested in implementing or contributing a new Search Algorithm, the API is straightforward:

.. autoclass:: ray.tune.suggest.SearchAlgorithm

Model-Based Suggestion Algorithms
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Often, hyperparameter search algorithms are model-based and may be quite simple to implement. For this, one can extend the following abstract class and implement ``on_trial_result``, ``on_trial_complete``, and ``_suggest``. The abstract class will take care of Tune-specific boilerplate such as creating and queuing trials:

.. autoclass:: ray.tune.suggest.SuggestionAlgorithm

    .. automethod:: ray.tune.suggest.SuggestionAlgorithm._suggest
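To make the contract above concrete, here is a standalone toy sketch of the three methods a suggestion algorithm implements. It deliberately does **not** import Ray: the method names mirror the abstract class, but the base-class boilerplate (trial creation and queuing) and the class name ``RandomSuggester`` are illustrative only:

```python
import random

class RandomSuggester:
    """Toy suggester: proposes random learning rates, tracks the best score."""

    def __init__(self, low=1e-4, high=1e-1):
        self.low = low
        self.high = high
        self.best_score = float("-inf")
        self.live_trials = {}  # trial_id -> config, for bookkeeping

    def _suggest(self, trial_id):
        # Return a configuration for a new trial (or None when exhausted).
        config = {"lr": random.uniform(self.low, self.high)}
        self.live_trials[trial_id] = config
        return config

    def on_trial_result(self, trial_id, result):
        # A model-based algorithm could update its internal model here
        # from intermediate results; this toy version ignores them.
        pass

    def on_trial_complete(self, trial_id, result=None, **kwargs):
        # Record the final score and release the trial's bookkeeping.
        if result is not None:
            self.best_score = max(self.best_score, result.get("score", float("-inf")))
        self.live_trials.pop(trial_id, None)
```

A real implementation would subclass ``ray.tune.suggest.SuggestionAlgorithm`` instead, letting the base class handle trial creation and queuing around these hooks.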