
User Guides
===========

AIR User Guides
---------------

.. grid:: 3

    .. grid-item-card::

        +++

        .. button-ref:: /ray-air/preprocessors
            :color: primary
            :outline:
            :expand:

            Using Preprocessors

    .. grid-item-card::

        +++

        .. button-ref:: trainers
            :color: primary
            :outline:
            :expand:

            Using Trainers

    .. grid-item-card::

        +++

        .. button-ref:: air-ingest
            :color: primary
            :outline:
            :expand:

            Configuring Training Datasets

    .. grid-item-card::

        +++

        .. button-ref:: /ray-air/tuner
            :color: primary
            :outline:
            :expand:

            Configuring Hyperparameter Tuning

    .. grid-item-card::

        +++

        .. button-ref:: predictors
            :color: primary
            :outline:
            :expand:

            Using Predictors for Inference

    .. grid-item-card::

        +++

        .. button-ref:: /ray-air/examples/serving_guide
            :color: primary
            :outline:
            :expand:

            Deploying Predictors with Serve

    .. grid-item-card::

        +++

        .. button-ref:: air-deployment
            :color: primary
            :outline:
            :expand:

            How to Deploy AIR

Environment variables
---------------------

Some behavior of Ray AIR can be controlled using environment variables.

Please also see the :ref:`Ray Tune environment variables <tune-env-vars>`.

* ``RAY_AIR_FULL_TRACEBACKS``: If set to 1, full tracebacks are printed for training functions, including internal code paths. Otherwise, abbreviated tracebacks that only show user code are printed. Defaults to 0 (disabled).
* ``RAY_AIR_NEW_OUTPUT``: If set to 0, the experimental new console output is disabled.
* ``RAY_AIR_RICH_LAYOUT``: If set to 1, the sticky table layout is enabled (only available for Ray Tune).
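
As a minimal sketch (not taken from this guide), these variables can be set in the driver process before the training or tuning run starts, for example:

.. code-block:: python

    import os

    # Illustrative values only: print full tracebacks and keep the legacy
    # console output. Set these before the AIR training or tuning run starts.
    os.environ["RAY_AIR_FULL_TRACEBACKS"] = "1"
    os.environ["RAY_AIR_NEW_OUTPUT"] = "0"

They can equally be exported in the shell environment of the job.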

Running multiple AIR jobs concurrently on a single cluster
-----------------------------------------------------------

Running multiple AIR training or tuning jobs at the same time on a single cluster is not officially supported. We don't test this workflow, and we recommend using multiple smaller clusters instead.

If you still want to do this, refer to the :ref:`Ray Tune multi-tenancy docs <tune-multi-tenancy>` for potential pitfalls.
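
A rough, hypothetical sketch of the recommended setup: each job runs as its own process and connects to its own cluster through a Ray Client address. The ``CLUSTER_ADDRESS`` environment variable and the head-node hostnames below are assumptions for illustration, not part of this guide.

.. code-block:: python

    # Assumed: CLUSTER_ADDRESS="ray://head-node-a:10001" for one job and
    # "ray://head-node-b:10001" for the other, each launched separately.
    import os

    import ray
    from ray import tune


    def objective(config):
        # Toy objective so the sketch is runnable.
        return {"score": config["x"] ** 2}


    ray.init(address=os.environ["CLUSTER_ADDRESS"])
    tune.Tuner(objective, param_space={"x": tune.uniform(0.0, 1.0)}).fit()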