[tune] shim instantiation of search algorithms #10451
Comments
from ray import tune
from ray.tune.param import Float, Integer, Categorical
from ray.tune.param import Grid, Uniform, LogUniform, Normal

space = {
    "lr": Float(1e-4, 1e-1).LogUniform(),
    "num_epochs": Integer(5, 10).Uniform(),
    "temperature": Float().Normal(1, 0.1),
    "batch_size": Categorical([32, 64]).Grid(),
}
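The fluent `Float(...).LogUniform()` API above is only a proposal in this issue; as a reference point, here is a minimal hypothetical sketch of how such a builder class could behave (the class and method names mirror the proposal, but the implementation is invented for illustration and is not Ray code):

```python
import math
import random


class Float:
    """Hypothetical builder for a float hyperparameter (sketch only)."""

    def __init__(self, low=None, high=None):
        self.low = low
        self.high = high
        self._sample = None

    def Uniform(self):
        self._sample = lambda: random.uniform(self.low, self.high)
        return self

    def LogUniform(self):
        # Sample uniformly in log space, then map back with exp().
        self._sample = lambda: math.exp(
            random.uniform(math.log(self.low), math.log(self.high)))
        return self

    def Normal(self, mean, sd):
        # Unbounded float: low/high are unused for a normal distribution.
        self._sample = lambda: random.gauss(mean, sd)
        return self

    def sample(self):
        return self._sample()
```

Under this sketch, `Float(1e-4, 1e-1).LogUniform().sample()` returns a value in `[1e-4, 1e-1]`, sampled log-uniformly.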
###############################
# Shim API - Option 3
###############################
tune.run(
    trainable,
    space,
    search_alg=tune.create_searcher("HyperOpt", metric="mean_loss"))

# This should also work.
# tune.run(
#     trainable,
#     space,
#     search_alg=HyperOptSearch(metric="mean_loss"))
Untested; we should use an if-statement instead:

def create_searcher(search_alg: str, **kwargs):
    from ray.tune import suggest
    return getattr(suggest, f'{search_alg.lower()}.{search_alg}Search')(None, **kwargs)

# In HyperOptSearch.__init__ we check if space is None, and if so, set HyperOptSearch.domain to None.
# In tune.run, we check if `HyperOptSearch.domain` is None, and if so, we assign the proper config based on the config kwarg using PR #10444.
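One concrete problem with the getattr snippet above: `getattr` does not traverse dotted names, so `getattr(suggest, "hyperopt.HyperOptSearch")` would raise `AttributeError` rather than resolve the submodule. A hedged sketch of a working variant uses `importlib` for the module part (the demonstration below uses `os.path` so it runs standalone; the `create_searcher` variant assumes the `ray.tune.suggest.<name>` module layout from the comment above and is untested against Ray):

```python
import importlib
import os

# getattr cannot resolve a dotted path in one call:
dotted_lookup_failed = False
try:
    getattr(os, "path.join")
except AttributeError:
    dotted_lookup_failed = True  # raises, as expected

# importlib resolves the module part; getattr then fetches the attribute:
mod = importlib.import_module("os.path")
join = getattr(mod, "join")
assert join is os.path.join


def create_searcher(search_alg, **kwargs):
    # Hypothetical variant of the snippet above (module path assumed,
    # untested): import the submodule, then look up the class by name.
    module = importlib.import_module(f"ray.tune.suggest.{search_alg.lower()}")
    return getattr(module, f"{search_alg}Search")(**kwargs)
```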
Instead of this getattr, can you just create a dictionary?

def import_hyperopt_search():
    from ray.tune.suggest.hyperopt import HyperOptSearch
    return HyperOptSearch

SEARCH_ALG_IMPORT = {
    "hyperopt": import_hyperopt_search,
    "dragonfly": import_dragonfly_search,
}

This avoids the dependency on the naming convention.
def create_searcher(search_alg, **kwargs):
    # TODO: docstring

    def _import_ax_search():
        from ray.tune.suggest.ax import AxSearch
        return AxSearch

    def _import_dragonfly_search():
        from ray.tune.suggest.dragonfly import DragonflySearch
        return DragonflySearch

    def _import_skopt_search():
        from ray.tune.suggest.skopt import SkOptSearch
        return SkOptSearch

    def _import_hyperopt_search():
        from ray.tune.suggest.hyperopt import HyperOptSearch
        return HyperOptSearch

    def _import_bayesopt_search():
        from ray.tune.suggest.bayesopt import BayesOptSearch
        return BayesOptSearch

    def _import_bohb_search():
        from ray.tune.suggest.bohb import TuneBOHB
        return TuneBOHB

    def _import_nevergrad_search():
        from ray.tune.suggest.nevergrad import NevergradSearch
        return NevergradSearch

    def _import_optuna_search():
        from ray.tune.suggest.optuna import OptunaSearch
        return OptunaSearch

    def _import_zoopt_search():
        from ray.tune.suggest.zoopt import ZOOptSearch
        return ZOOptSearch

    def _import_sigopt_search():
        from ray.tune.suggest.sigopt import SigOptSearch
        return SigOptSearch

    SEARCH_ALG_IMPORT = {
        "ax": _import_ax_search,
        "dragonfly": _import_dragonfly_search,
        "skopt": _import_skopt_search,
        "hyperopt": _import_hyperopt_search,
        "bayesopt": _import_bayesopt_search,
        "bohb": _import_bohb_search,
        "nevergrad": _import_nevergrad_search,
        "optuna": _import_optuna_search,
        "zoopt": _import_zoopt_search,
        "sigopt": _import_sigopt_search,
    }
    return SEARCH_ALG_IMPORT[search_alg](**kwargs)
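One refinement worth considering (not part of the snippet above): `SEARCH_ALG_IMPORT[search_alg]` raises a bare `KeyError` for an unknown name. A sketch of the same lazy-import dispatch with a friendlier error follows; `DummySearch` is a stand-in factory so the example runs without Ray installed:

```python
def _import_dummy_search():
    # Stand-in for a real lazy import such as
    # "from ray.tune.suggest.hyperopt import HyperOptSearch".
    class DummySearch:
        def __init__(self, metric=None, mode=None):
            self.metric = metric
            self.mode = mode
    return DummySearch


SEARCH_ALG_IMPORT = {"dummy": _import_dummy_search}


def create_searcher(search_alg, **kwargs):
    try:
        factory = SEARCH_ALG_IMPORT[search_alg]
    except KeyError:
        raise ValueError(
            f"Unknown search algorithm {search_alg!r}; "
            f"valid choices are {sorted(SEARCH_ALG_IMPORT)}") from None
    # Call the factory to perform the lazy import, then instantiate.
    return factory()(**kwargs)


searcher = create_searcher("dummy", metric="mean_loss", mode="min")
```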
Looks good to me! Consider moving some of the default arguments (i.e., metric, mode) also into the wrapper so that they're explicit and captured in the docstring.
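The reviewer's suggestion could look like the sketch below. The signature, docstring, and the `mode` validation are hypothetical illustrations, not the merged Ray API; the body is a placeholder that only validates its inputs:

```python
def create_searcher(search_alg, metric=None, mode=None, **kwargs):
    """Instantiate a search algorithm by name (hypothetical sketch).

    Args:
        search_alg: Key identifying the searcher, e.g. "hyperopt".
        metric: Name of the metric to optimize, e.g. "mean_loss".
        mode: "min" or "max"; direction of optimization.
        **kwargs: Extra arguments forwarded to the searcher.
    """
    if mode not in (None, "min", "max"):
        raise ValueError("mode must be 'min' or 'max'")
    # Placeholder: a real implementation would look up the factory in
    # SEARCH_ALG_IMPORT as above, then forward metric/mode explicitly:
    #   return SEARCH_ALG_IMPORT[search_alg]()(metric=metric, mode=mode, **kwargs)
```

Making `metric` and `mode` explicit parameters means they appear in `help(create_searcher)` and in generated docs, instead of disappearing into `**kwargs`.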
Could you push a PR @sumanthratna and tag me when you do? Thanks!
@richardliaw should I:
Feel free to add support to the PR above!
Describe your feature request

Allow creating a search algorithm for tune.run given:

API

There will be two ways to use shim instantiation (option 3 from the design doc):

Option 1

(Notice how the search algorithm definition doesn't require redefining the search configuration when initializing HyperOptSearch.)

Option 2

Related: #9969, #10401, #10444

CC @richardliaw @krfricke