
Tune: Scalable Hyperparameter Search


Tune is a scalable framework for hyperparameter search with a focus on deep learning and deep reinforcement learning.

You can find the code for Tune here on GitHub. To get started with Tune, try working through our tutorial on using Tune with Keras.

(Experimental): You can try out the above tutorial on a free hosted server via Binder.

Features

Take a look at the User Guide for a comprehensive overview of how to use Tune's features.

Getting Started

Installation

You'll first need to install ray in order to import Tune.

pip install ray  # also recommended: ray[debug]

Quick Start

This example runs a small grid search over a neural network training function using Tune, reporting status on the command line until the stopping condition of mean_accuracy >= 99 is reached. Tune works with any deep learning framework.

Tune uses Ray as a backend, so we will first import and initialize Ray.

import ray
import ray.tune as tune

ray.init()

For the function you wish to tune, pass in a reporter object:
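The function body can contain any training code; the sketch below is a minimal stand-in (no real model or data) that just shows the shape Tune expects: the function receives a config dict and a reporter, and calls the reporter with the metrics to track (mean_accuracy, matching the stopping condition below):

def train_func(config, reporter):
    # Stand-in "training" loop: accuracy simply accumulates the sampled
    # momentum value so the mean_accuracy >= 99 stopping condition can trigger.
    accuracy = 0.0
    for step in range(1000):
        accuracy += config["momentum"]     # placeholder for a real training update
        reporter(mean_accuracy=accuracy)   # report intermediate results to Tune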

Finally, configure your search and execute it on your Ray cluster:

all_trials = tune.run_experiments({
    "my_experiment": {
        "run": train_func,                                     # the trainable function defined above
        "stop": {"mean_accuracy": 99},                         # stop a trial once this metric is reached
        "config": {"momentum": tune.grid_search([0.1, 0.2])}   # grid search over momentum values
    }
})

Tune can be used anywhere Ray can, e.g. on your laptop with ray.init() embedded in a Python script, or in an auto-scaling cluster for massive parallelism.
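For example, assuming a Ray release from the same era as this API (where ray.init accepts a redis_address argument) and a cluster whose head-node address you already know, attaching the same script to that cluster is a one-line change (the address below is a placeholder):

import ray
import ray.tune as tune

# Instead of starting Ray locally, connect to an existing cluster.
# Replace the placeholder with your cluster head node's Redis address.
ray.init(redis_address="<head-node-ip>:6379")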

Citing Tune

If Tune helps you in your academic research, you are encouraged to cite our paper. Here is an example BibTeX entry:

@article{liaw2018tune,
    title={Tune: A Research Platform for Distributed Model Selection and Training},
    author={Liaw, Richard and Liang, Eric and Nishihara, Robert
            and Moritz, Philipp and Gonzalez, Joseph E and Stoica, Ion},
    journal={arXiv preprint arXiv:1807.05118},
    year={2018}
}