Ray is a flexible, high-performance distributed execution framework.
Ray is easy to install:

```
pip install ray
```
The same function, run serially in basic Python and in parallel with Ray:

Basic Python:

```python
import time

# Execute f serially.
def f():
    time.sleep(1)
    return 1

results = [f() for i in range(4)]
```

Distributed with Ray:

```python
import time
import ray

ray.init()

# Execute f in parallel.
@ray.remote
def f():
    time.sleep(1)
    return 1

results = ray.get([f.remote() for i in range(4)])
```
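For intuition about what the parallel version does, each `f.remote()` call immediately returns a future-like handle, and `ray.get` blocks until all results are ready. A rough standard-library analogy (this is not how Ray is implemented, just a sketch of the execution pattern using a thread pool):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def f():
    time.sleep(1)
    return 1

# Submit four calls concurrently, then gather the results,
# analogous to ray.get([f.remote() for i in range(4)]).
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(f) for i in range(4)]
    results = [fut.result() for fut in futures]

print(results)  # [1, 1, 1, 1]
```

Because the four sleeps overlap, the whole batch finishes in roughly one second instead of four.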
Instructions for launching a Ray cluster, whether privately, on AWS, or on GCP, are in the cluster setup documentation.
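A cluster can also be started manually with the `ray start` CLI; a minimal sketch, assuming the head node's IP is reachable from the workers (the address shown is a placeholder):

```shell
# On the head node:
ray start --head

# On each worker node, pointing at the head node's Redis address:
ray start --redis-address=<head-node-ip>:6379
```

Scripts run on any node then connect to the cluster by passing the same address to `ray.init`.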
View the codebase on GitHub.
Ray comes with libraries that accelerate deep learning and reinforcement learning development:
- Tune: Scalable Hyperparameter Search
- RLlib: Scalable Reinforcement Learning
- Distributed Training