[tune] shim instantiation of search algorithms #10451

Closed
sumanthratna opened this issue Aug 31, 2020 · 8 comments · Fixed by #10456
Labels
  • enhancement: Request for new feature and/or capability
  • triage: Needs triage (eg: priority, bug/not-bug, and owning component)

Comments

sumanthratna (Member) commented Aug 31, 2020

Describe your feature request

Allow creating a search algorithm for tune.run, given:

  • (maybe; see option 1 below) a string defining the search algorithm to use
  • (maybe; see option 2 below) the search space configuration

API

There will be two ways to use shim instantiation (option 3 from the design doc):

Option 1

(Notice how the search algorithm definition doesn't require redefining the search configuration when initializing HyperOptSearch.)

from ray import tune
from ray.tune.param import Float, Integer, Categorical
from ray.tune.param import Grid, Uniform, LogUniform, Normal
from ray.tune.suggest.hyperopt import HyperOptSearch

space = {
    "lr": Float(1e-4, 1e-1).LogUniform(),
    "num_epochs": Integer(5, 10).Uniform(),
    "temperature": Float().Normal(1, 0.1),
    "batch_size": Categorical([32, 64]).Grid()
}

tune.run(
    trainable,
    space,
    search_alg=HyperOptSearch(metric="mean_loss"))

Option 2

from ray import tune
from ray.tune.param import Float, Integer, Categorical
from ray.tune.param import Grid, Uniform, LogUniform, Normal

space = {
    "lr": Float(1e-4, 1e-1).LogUniform(),
    "num_epochs": Integer(5, 10).Uniform(),
    "temperature": Float().Normal(1, 0.1),
    "batch_size": Categorical([32, 64]).Grid()
}

search_alg = tune.create_searcher("HyperOpt", space,
    metric="mean_loss")

tune.run(
    trainable,
    search_alg=search_alg)

Related: #9969, #10401, #10444

CC @richardliaw @krfricke

sumanthratna added the enhancement and triage labels on Aug 31, 2020
richardliaw (Contributor) commented Aug 31, 2020

from ray import tune
from ray.tune.param import Float, Integer, Categorical
from ray.tune.param import Grid, Uniform, LogUniform, Normal

space = {
    "lr": Float(1e-4, 1e-1).LogUniform(),
    "num_epochs": Integer(5, 10).Uniform(),
    "temperature": Float().Normal(1, 0.1),
    "batch_size": Categorical([32, 64]).Grid()
}

###############################
# Shim API - Option 3
###############################
tune.run(
    trainable,
    space,
    search_alg=tune.create_searcher("HyperOpt", metric="mean_loss"))

# This should also work.
# tune.run(
#    trainable,
#    space,
#    search_alg=HyperOptSearch(metric="mean_loss"))

sumanthratna (Member, Author) commented Aug 31, 2020

Untested; we should use an if-statement instead of getattr:

def create_searcher(search_alg: str, **kwargs):
    import importlib
    # getattr can't resolve a dotted path, so import the submodule first
    # (e.g. ray.tune.suggest.hyperopt) and look the class up on it.
    module = importlib.import_module(f"ray.tune.suggest.{search_alg.lower()}")
    return getattr(module, f"{search_alg}Search")(None, **kwargs)
    # In HyperOptSearch.__init__ we check if space is None, and if so, set HyperOptSearch.domain to None.
    # In tune.run, we check if HyperOptSearch.domain is None, and if so, we assign
    # the proper config based on the config kwarg using PR #10444.
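
To make the intended flow concrete, here is a minimal sketch of that late binding; domain and _set_space are hypothetical names, not a settled API, and the config-to-space conversion is assumed to come from PR #10444:

# Sketch only: tune.run assigns a space to a searcher created without one.
# `domain` and `_set_space` are hypothetical names.
def run(trainable, config=None, search_alg=None, **kwargs):
    if search_alg is not None and search_alg.domain is None:
        # Convert the tune.run config into the searcher's native space.
        search_alg._set_space(config)
    # ... continue with the normal tune.run flow ...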

richardliaw (Contributor) commented

Instead of the getattr, can you just create a dictionary?

def import_hyperopt_search():
    from ray.tune.suggest.hyperopt import HyperOptSearch
    return HyperOptSearch

SEARCH_ALG_IMPORT = {
    "hyperopt": import_hyperopt_search,
    "dragonfly": import_dragonfly_search,  # defined analogously
}

This avoids the dependency on naming conventions.
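
For illustration, a minimal sketch of the lookup, assuming each importer returns the searcher class; the explicit check fails loudly on unknown names:

def create_searcher(search_alg, **kwargs):
    # Sketch: validate the name before dispatching to the lazy importer.
    if search_alg not in SEARCH_ALG_IMPORT:
        raise ValueError(
            f"Unknown search algorithm: {search_alg}. "
            f"Choose from: {sorted(SEARCH_ALG_IMPORT)}")
    return SEARCH_ALG_IMPORT[search_alg]()(**kwargs)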

sumanthratna (Member, Author) commented

def create_searcher(search_alg, **kwargs):
    # TODO: docstring
    def _import_ax_search():
        from ray.tune.suggest.ax import AxSearch
        return AxSearch

    def _import_dragonfly_search():
        from ray.tune.suggest.dragonfly import DragonflySearch
        return DragonflySearch

    def _import_skopt_search():
        from ray.tune.suggest.skopt import SkOptSearch
        return SkOptSearch

    def _import_hyperopt_search():
        from ray.tune.suggest.hyperopt import HyperOptSearch
        return HyperOptSearch

    def _import_bayesopt_search():
        from ray.tune.suggest.bayesopt import BayesOptSearch
        return BayesOptSearch

    def _import_bohb_search():
        from ray.tune.suggest.bohb import TuneBOHB
        return TuneBOHB

    def _import_nevergrad_search():
        from ray.tune.suggest.nevergrad import NevergradSearch
        return NevergradSearch

    def _import_optuna_search():
        from ray.tune.suggest.optuna import OptunaSearch
        return OptunaSearch

    def _import_zoopt_search():
        from ray.tune.suggest.zoopt import ZOOptSearch
        return ZOOptSearch

    def _import_sigopt_search():
        from ray.tune.suggest.sigopt import SigOptSearch
        return SigOptSearch

    SEARCH_ALG_IMPORT = {
        "ax": _import_ax_search,
        "dragonfly": _import_dragonfly_search,
        "skopt": _import_skopt_search,
        "hyperopt": _import_hyperopt_search,
        "bayesopt": _import_bayesopt_search,
        "bohb": _import_bohb_search,
        "nevergrad": _import_nevergrad_search,
        "optuna": _import_optuna_search,
        "zoopt": _import_zoopt_search,
        "sigopt": _import_sigopt_search,
    }
    # Each importer returns the searcher class; call it, then instantiate with kwargs.
    return SEARCH_ALG_IMPORT[search_alg]()(**kwargs)
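
Usage would then mirror Option 2 above; a sketch, assuming the trainable and space from the original post:

search_alg = create_searcher("hyperopt", metric="mean_loss")
tune.run(trainable, space, search_alg=search_alg)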

richardliaw (Contributor) commented

Looks good to me! Consider moving some of the default arguments (i.e., metric, mode) into the wrapper as well, so that they're explicit and captured in the docstring.
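
For example, a sketch with metric and mode lifted into the signature (assuming the SEARCH_ALG_IMPORT dictionary above):

def create_searcher(search_alg, metric=None, mode=None, **kwargs):
    """Instantiate a search algorithm by name (sketch).

    Args:
        search_alg: Name of the search algorithm, e.g. "hyperopt".
        metric: Metric to optimize; forwarded to the searcher.
        mode: One of "min" or "max"; forwarded to the searcher.
        **kwargs: Additional searcher-specific keyword arguments.
    """
    return SEARCH_ALG_IMPORT[search_alg]()(metric=metric, mode=mode, **kwargs)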

richardliaw (Contributor) commented

Could you push a PR @sumanthratna and tag me when you do? Thanks!

sumanthratna (Member, Author) commented

@richardliaw should I:

  • write a new PR for schedulers,
  • add scheduler support to the PR above (a possible shape is sketched below), or
  • not add scheduler support?
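
If scheduler support is added, one possible shape is a create_scheduler that mirrors create_searcher. A hypothetical sketch, with a single entry shown:

def create_scheduler(scheduler, **kwargs):
    # Hypothetical analog of create_searcher for trial schedulers.
    def _import_async_hyperband():
        from ray.tune.schedulers import AsyncHyperBandScheduler
        return AsyncHyperBandScheduler

    SCHEDULER_IMPORT = {
        "async_hyperband": _import_async_hyperband,
        # ... other schedulers would be registered the same way ...
    }
    return SCHEDULER_IMPORT[scheduler]()(**kwargs)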

richardliaw (Contributor) commented

Feel free to add support to the PR above!
