[tune] add back NevergradSearch (ray-project#42305)
Signed-off-by: Matthew Deng <matt@anyscale.com>
Signed-off-by: tterrysun <terry@anyscale.com>
matthewdeng authored and tterrysun committed Feb 14, 2024
1 parent 3d2691e commit 33b8dec
Showing 22 changed files with 2,896 additions and 4 deletions.
2 changes: 2 additions & 0 deletions doc/source/conf.py
@@ -476,6 +476,7 @@ def setup(app):
"joblib",
"lightgbm",
"lightgbm_ray",
"nevergrad",
"numpy",
"pandas",
"pyarrow",
@@ -540,6 +541,7 @@ def add_line(self, line: str, source: str, *lineno: int) -> None:
"lightgbm": ("https://lightgbm.readthedocs.io/en/latest/", None),
"mars": ("https://mars-project.readthedocs.io/en/latest/", None),
"modin": ("https://modin.readthedocs.io/en/stable/", None),
"nevergrad": ("https://facebookresearch.github.io/nevergrad/", None),
"numpy": ("https://numpy.org/doc/stable/", None),
"pandas": ("https://pandas.pydata.org/pandas-docs/stable/", None),
"pyarrow": ("https://arrow.apache.org/docs", None),
7 changes: 7 additions & 0 deletions doc/source/ray-overview/examples.rst
@@ -336,6 +336,13 @@ Ray Examples

How To Use Tune With TuneBOHB

.. grid-item-card:: :bdg-secondary:`Code example`
:class-item: gallery-item tuning
:link: /tune/examples/nevergrad_example
:link-type: doc

How To Use Tune With Nevergrad

.. grid-item-card:: :bdg-secondary:`Code example`
:class-item: gallery-item tuning
:link: /tune/examples/optuna_example
11 changes: 11 additions & 0 deletions doc/source/tune/api/suggestion.rst
@@ -184,6 +184,17 @@ HyperOpt (tune.search.hyperopt.HyperOptSearch)

hyperopt.HyperOptSearch

.. _nevergrad:

Nevergrad (tune.search.nevergrad.NevergradSearch)
-------------------------------------------------

.. autosummary::
:nosignatures:
:toctree: doc/

nevergrad.NevergradSearch

.. _tune-optuna:

Optuna (tune.search.optuna.OptunaSearch)
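For orientation, here is a minimal usage sketch of the searcher documented above. This is an editor's illustration rather than code from this commit; the objective function and sample count are hypothetical, while the NevergradSearch constructor call mirrors the full example later in this diff:

import nevergrad as ng

from ray import tune
from ray.tune.search.nevergrad import NevergradSearch


def objective(config):
    # Hypothetical objective: report one final score per trial.
    return {"score": (config["x"] - 3) ** 2}


# NevergradSearch wraps a Nevergrad optimizer class; OnePlusOne is one of
# the gradient-free optimizers the library ships.
algo = NevergradSearch(optimizer=ng.optimizers.OnePlusOne)

tuner = tune.Tuner(
    objective,
    tune_config=tune.TuneConfig(
        metric="score",
        mode="min",
        search_alg=algo,
        num_samples=8,
    ),
    param_space={"x": tune.uniform(-10.0, 10.0)},
)
tuner.fit()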
9 changes: 9 additions & 0 deletions doc/source/tune/examples/hpo-frameworks.rst
@@ -8,6 +8,7 @@ Tune Hyperparameter Optimization Framework Examples
HyperOpt Example <hyperopt_example>
Bayesopt Example <bayesopt_example>
BOHB Example <bohb_example>
Nevergrad Example <nevergrad_example>
Optuna Example <optuna_example>


@@ -51,6 +52,14 @@ on each of our integrations:

How To Use Tune With TuneBOHB

.. grid-item-card::
:img-top: ../images/nevergrad.png
:class-img-top: pt-2 w-75 d-block mx-auto fixed-height-img

.. button-ref:: nevergrad_example

How To Use Tune With Nevergrad

.. grid-item-card::
:img-top: ../images/optuna.png
:class-img-top: pt-2 w-75 d-block mx-auto fixed-height-img
6 changes: 6 additions & 0 deletions doc/source/tune/examples/includes/nevergrad_example.rst
@@ -0,0 +1,6 @@
:orphan:

Nevergrad Example
~~~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/nevergrad_example.py
2,184 changes: 2,184 additions & 0 deletions doc/source/tune/examples/nevergrad_example.ipynb

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion doc/source/tune/index.rst
@@ -19,7 +19,7 @@ Ray Tune: Hyperparameter Tuning

Tune is a Python library for experiment execution and hyperparameter tuning at any scale.
You can tune your favorite machine learning framework (:ref:`PyTorch <tune-pytorch-cifar-ref>`, :ref:`XGBoost <tune-xgboost-ref>`, :doc:`TensorFlow and Keras <examples/tune_mnist_keras>`, and :doc:`more <examples/index>`) by running state-of-the-art algorithms such as :ref:`Population Based Training (PBT) <tune-scheduler-pbt>` and :ref:`HyperBand/ASHA <tune-scheduler-hyperband>`.
Tune further integrates with a wide range of additional hyperparameter optimization tools, including :doc:`Ax <examples/ax_example>`, :doc:`BayesOpt <examples/bayesopt_example>`, :doc:`BOHB <examples/bohb_example>`, and :doc:`Optuna <examples/optuna_example>`.
Tune further integrates with a wide range of additional hyperparameter optimization tools, including :doc:`Ax <examples/ax_example>`, :doc:`BayesOpt <examples/bayesopt_example>`, :doc:`BOHB <examples/bohb_example>`, :doc:`Nevergrad <examples/nevergrad_example>`, and :doc:`Optuna <examples/optuna_example>`.

**Click on the following tabs to see code examples for various machine learning frameworks**:

4 changes: 4 additions & 0 deletions doc/source/tune/key-concepts.rst
@@ -216,6 +216,10 @@ Here's an overview of all available search algorithms in Tune:
- Bayesian Opt/HyperBand
- [`BOHB <https://github.com/automl/HpBandSter>`__]
- :doc:`/tune/examples/includes/bohb_example`
* - :ref:`NevergradSearch <nevergrad>`
- Gradient-free Optimization
- [`Nevergrad <https://github.com/facebookresearch/nevergrad>`__]
- :doc:`/tune/examples/includes/nevergrad_example`
* - :ref:`OptunaSearch <tune-optuna>`
- Optuna search algorithms
- [`Optuna <https://optuna.org/>`__]
1 change: 1 addition & 0 deletions docker/examples/Dockerfile
@@ -18,6 +18,7 @@ RUN pip install --no-cache-dir -U pip \
bayesian-optimization \
hyperopt \
ConfigSpace==0.4.10 \
nevergrad \
scikit-optimize \
hpbandster \
lightgbm \
1 change: 1 addition & 0 deletions python/ray/air/_internal/usage.py
@@ -32,6 +32,7 @@
"TuneBOHB",
"HEBOSearch",
"HyperOptSearch",
"NevergradSearch",
"OptunaSearch",
"SkOptSearch",
"ZOOptSearch",
9 changes: 9 additions & 0 deletions python/ray/tune/BUILD
@@ -632,6 +632,15 @@ py_test(
args = ["--smoke-test"]
)

py_test(
name = "nevergrad_example",
size = "small",
srcs = ["examples/nevergrad_example.py"],
deps = [":tune_lib"],
tags = ["team:ml", "exclusive", "example"],
args = ["--smoke-test"]
)

py_test(
name = "optuna_define_by_run_example",
size = "small",
1 change: 1 addition & 0 deletions python/ray/tune/examples/README.rst
@@ -23,6 +23,7 @@ Search Algorithm Examples
-------------------------

- `Ax example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/ax_example.py>`__: Optimize a Hartmann function with `Ax <https://ax.dev>`_ with 4 parallel workers.
- `Nevergrad example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/nevergrad_example.py>`__: Optimize a simple toy function with the gradient-free optimization package `Nevergrad <https://github.com/facebookresearch/nevergrad>`_ with 4 parallel workers.
- `Bayesian Optimization example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/bayesopt_example.py>`__: Optimize a simple toy function using `Bayesian Optimization <https://github.com/fmfn/BayesianOptimization>`_ with 4 parallel workers.

Tensorflow/Keras Examples
75 changes: 75 additions & 0 deletions python/ray/tune/examples/nevergrad_example.py
@@ -0,0 +1,75 @@
"""This example demonstrates the usage of Nevergrad with Ray Tune.
It also checks that it is usable with a separate scheduler.
Requires the Nevergrad library to be installed (`pip install nevergrad`).
"""
import time

from ray import train, tune
from ray.tune.search import ConcurrencyLimiter
from ray.tune.schedulers import AsyncHyperBandScheduler
from ray.tune.search.nevergrad import NevergradSearch


def evaluation_fn(step, width, height):
return (0.1 + width * step / 100) ** (-1) + height * 0.1


def easy_objective(config):
# Hyperparameters
width, height = config["width"], config["height"]

for step in range(config["steps"]):
# Iterative training function - can be any arbitrary training procedure
intermediate_score = evaluation_fn(step, width, height)
        # Feed the score back to Tune.
train.report({"iterations": step, "mean_loss": intermediate_score})
time.sleep(0.1)


if __name__ == "__main__":
import argparse
import nevergrad as ng

parser = argparse.ArgumentParser()
parser.add_argument(
"--smoke-test", action="store_true", help="Finish quickly for testing"
)
args, _ = parser.parse_known_args()

# Optional: Pass the parameter space yourself
# space = ng.p.Dict(
# width=ng.p.Scalar(lower=0, upper=20),
# height=ng.p.Scalar(lower=-100, upper=100),
# activation=ng.p.Choice(choices=["relu", "tanh"])
# )

algo = NevergradSearch(
optimizer=ng.optimizers.OnePlusOne,
# space=space, # If you want to set the space manually
)
algo = ConcurrencyLimiter(algo, max_concurrent=4)

scheduler = AsyncHyperBandScheduler()

tuner = tune.Tuner(
easy_objective,
tune_config=tune.TuneConfig(
metric="mean_loss",
mode="min",
search_alg=algo,
scheduler=scheduler,
num_samples=10 if args.smoke_test else 50,
),
run_config=train.RunConfig(name="nevergrad"),
param_space={
"steps": 100,
"width": tune.uniform(0, 20),
"height": tune.uniform(-100, 100),
"activation": tune.choice(["relu", "tanh"]),
},
)
results = tuner.fit()

print("Best hyperparameters found were: ", results.get_best_result().config)
7 changes: 7 additions & 0 deletions python/ray/tune/search/__init__.py
@@ -45,6 +45,12 @@ def _import_bohb_search():
return TuneBOHB


def _import_nevergrad_search():
from ray.tune.search.nevergrad.nevergrad_search import NevergradSearch

return NevergradSearch


def _import_optuna_search():
from ray.tune.search.optuna.optuna_search import OptunaSearch

@@ -71,6 +77,7 @@ def _import_hebo_search():
"hyperopt": _import_hyperopt_search,
"bayesopt": _import_bayesopt_search,
"bohb": _import_bohb_search,
"nevergrad": _import_nevergrad_search,
"optuna": _import_optuna_search,
"zoopt": _import_zoopt_search,
"hebo": _import_hebo_search,
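The registry entry above is what lets Tune resolve a searcher from a plain string without importing every backend eagerly. A sketch of the effect, assuming create_searcher in ray.tune.search dispatches through the SEARCH_ALG_IMPORT mapping (that helper is not part of this diff, so treat it as an assumption):

import nevergrad as ng

from ray.tune.search import create_searcher

# "nevergrad" resolves through SEARCH_ALG_IMPORT to _import_nevergrad_search,
# so the nevergrad package is only imported when this searcher is requested.
searcher = create_searcher(
    "nevergrad",
    optimizer=ng.optimizers.OnePlusOne,
    metric="mean_loss",
    mode="min",
)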
3 changes: 3 additions & 0 deletions python/ray/tune/search/nevergrad/__init__.py
@@ -0,0 +1,3 @@
from ray.tune.search.nevergrad.nevergrad_search import NevergradSearch

__all__ = ["NevergradSearch"]
