Refactoring and renaming of BoTorchSearcher and BoTorch (#507)
mseeger committed Jan 23, 2023
1 parent 1220769 commit 7be30b4
Showing 14 changed files with 176 additions and 96 deletions.
4 changes: 2 additions & 2 deletions docs/source/faq.rst
@@ -68,8 +68,8 @@ Ray Tune or Bore optimizer, you can run ``pip install 'syne-tune[X]'`` where
   or :class:`~syne_tune.optimizer.schedulers.FIFOScheduler` or
   :class:`~syne_tune.optimizer.schedulers.HyperbandScheduler` with
   ``searcher="kde"``)
-* ``botorch``: Bayesian optimization from BOTorch (see
-  :class:`~syne_tune.optimizer.schedulers.searchers.botorch.BotorchSearcher`)
+* ``botorch``: Bayesian optimization from BoTorch (see
+  :class:`~syne_tune.optimizer.schedulers.searchers.botorch.BoTorchSearcher`)
 * ``dev``: For developers who would like to extend Syne Tune
 * ``extra``: For installing all the above
 * ``bore``: For Bore optimizer (see :class:`~syne_tune.optimizer.baselines.BORE`)
2 changes: 1 addition & 1 deletion docs/source/getting_started.rst
@@ -155,7 +155,7 @@ The following hyperparameter optimization (HPO) methods are available in Syne Tune
 The searchers fall into four broad categories: **deterministic**, **random**, **evolutionary** and **model-based**. The random searchers sample candidate hyperparameter configurations uniformly at random, while the model-based searchers sample them non-uniformly at random, according to a model (e.g., Gaussian process, density ratio estimator) and an acquisition function. The evolutionary searchers make use of an evolutionary algorithm.

 Syne Tune also supports `BoTorch <https://github.com/pytorch/botorch>`_ searchers,
-see :class:`~syne_tune.optimizer.baselines.BOTorch`.
+see :class:`~syne_tune.optimizer.baselines.BoTorch`.

 Supported multi-objective optimization methods
 ----------------------------------------------
12 changes: 6 additions & 6 deletions syne_tune/optimizer/baselines.py
@@ -519,10 +519,10 @@ def __init__(
         )


-class BOTorch(FIFOScheduler):
-    """Bayesian Optimization using BOTorch
+class BoTorch(FIFOScheduler):
+    """Bayesian Optimization using BoTorch

-    See :class:`~syne_tune.optimizer.schedulers.searchers.botorch.BotorchSearcher`
+    See :class:`~syne_tune.optimizer.schedulers.searchers.botorch.BoTorchSearcher`
     for ``kwargs["search_options"]`` parameters.

     :param config_space: Configuration space for evaluation function
@@ -540,18 +540,18 @@ def __init__(
         **kwargs,
     ):
         try:
-            from syne_tune.optimizer.schedulers.searchers.botorch import BotorchSearcher
+            from syne_tune.optimizer.schedulers.searchers.botorch import BoTorchSearcher
         except ImportError:
             logging.info(try_import_botorch_message())
             raise

         searcher_kwargs = _create_searcher_kwargs(
             config_space, metric, random_seed, kwargs
         )
-        super(BOTorch, self).__init__(
+        super(BoTorch, self).__init__(
             config_space=config_space,
             metric=metric,
-            searcher=BotorchSearcher(**searcher_kwargs),
+            searcher=BoTorchSearcher(**searcher_kwargs),
             random_seed=random_seed,
             **kwargs,
         )
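
For context, a minimal sketch of how the renamed baseline can be used after this commit. The config space, the "train.py" entry point, and the "loss" metric name are hypothetical; Tuner, LocalBackend, and StoppingCriterion are Syne Tune's standard tuning loop:

from syne_tune import StoppingCriterion, Tuner
from syne_tune.backend import LocalBackend
from syne_tune.config_space import loguniform
from syne_tune.optimizer.baselines import BoTorch

# Hypothetical search space; the training script is assumed to report "loss"
config_space = {"lr": loguniform(1e-4, 1e-1)}
scheduler = BoTorch(
    config_space=config_space,
    metric="loss",
    random_seed=31,
)
tuner = Tuner(
    trial_backend=LocalBackend(entry_point="train.py"),  # hypothetical script
    scheduler=scheduler,
    stop_criterion=StoppingCriterion(max_wallclock_time=600),
    n_workers=4,
)
tuner.run()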
7 changes: 5 additions & 2 deletions syne_tune/optimizer/schedulers/fifo.py
@@ -70,8 +70,11 @@ class FIFOScheduler(TrialSchedulerWithSearcher):
     :param config_space: Configuration space for evaluation function
     :type config_space: dict
     :param searcher: Searcher for ``get_config`` decisions. String values
-        are passed to ``searcher_factory`` along with ``search_options`` and
-        extra information. Defaults to "random" (i.e., random search)
+        are passed to
+        :func:`~syne_tune.optimizer.schedulers.searchers.searcher_factory` along
+        with ``search_options`` and extra information. Supported values:
+        :const:`~syne_tune.optimizer.schedulers.searchers.searcher_factory.SUPPORTED_SEARCHERS_FIFO`.
+        Defaults to "random" (i.e., random search)
     :type searcher: str or
         :class:`~syne_tune.optimizer.schedulers.searchers.BaseSearcher`
     :param search_options: If searcher is ``str``, these arguments are
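
A short sketch of the behavior this docstring describes: a string ``searcher`` is resolved through ``searcher_factory``, and ``search_options`` is forwarded to the constructed searcher. The config space and metric name are made up; "kde" is one of the string values mentioned in the FAQ diff above:

from syne_tune.config_space import uniform
from syne_tune.optimizer.schedulers import FIFOScheduler

scheduler = FIFOScheduler(
    config_space={"x": uniform(0.0, 1.0)},  # hypothetical search space
    searcher="kde",  # string value, resolved via searcher_factory
    search_options={"debug_log": False},  # forwarded to the constructed searcher
    metric="loss",  # hypothetical metric reported by the training script
    mode="min",
)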
8 changes: 8 additions & 0 deletions syne_tune/optimizer/schedulers/hyperband.py
@@ -226,6 +226,14 @@ class HyperbandScheduler(FIFOScheduler, MultiFidelitySchedulerMixin):
     Additional arguments on top of parent class
     :class:`~syne_tune.optimizer.schedulers.FIFOScheduler`:

+    :param searcher: Searcher for ``get_config`` decisions. String values
+        are passed to
+        :func:`~syne_tune.optimizer.schedulers.searchers.searcher_factory` along
+        with ``search_options`` and extra information. Supported values:
+        :const:`~syne_tune.optimizer.schedulers.searchers.searcher_factory.SUPPORTED_SEARCHERS_HYPERBAND`.
+        Defaults to "random" (i.e., random search)
+    :type searcher: str or
+        :class:`~syne_tune.optimizer.schedulers.searchers.BaseSearcher`
     :param resource_attr: Name of resource attribute in results obtained
         via ``on_trial_result``, defaults to "epoch"
     :type resource_attr: str, optional
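
The same pattern for the multi-fidelity scheduler documented here, as a sketch assuming the training script reports "loss" once per "epoch"; the config space and ``max_t`` value are made up:

from syne_tune.config_space import loguniform
from syne_tune.optimizer.schedulers import HyperbandScheduler

scheduler = HyperbandScheduler(
    config_space={"lr": loguniform(1e-4, 1e-1)},  # hypothetical search space
    searcher="random",  # resolved via searcher_factory (SUPPORTED_SEARCHERS_HYPERBAND)
    metric="loss",
    mode="min",
    resource_attr="epoch",  # the default named in the docstring
    max_t=27,  # hypothetical maximum number of epochs per trial
)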
6 changes: 4 additions & 2 deletions syne_tune/optimizer/schedulers/searchers/botorch/__init__.py
@@ -16,9 +16,11 @@

 try:
     from syne_tune.optimizer.schedulers.searchers.botorch.botorch_searcher import (  # noqa: F401
-        BotorchSearcher,
+        BoTorchSearcher,
+        BotorchSearcher,  # deprecated
     )

-    __all__.append("BotorchSearcher")
+    __all__.append("BoTorchSearcher")
+    __all__.append("BotorchSearcher")  # deprecated
 except ImportError:
     print(try_import_botorch_message())
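
The two ``__all__`` entries keep both spellings importable, so downstream code does not break; whether the old name emits a deprecation warning is not visible in this diff:

# New, preferred spelling after this commit
from syne_tune.optimizer.schedulers.searchers.botorch import BoTorchSearcher

# Old spelling still resolves, but is marked deprecated in this commit
from syne_tune.optimizer.schedulers.searchers.botorch import BotorchSearcher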
