Ray tune & friends on SLURM

Hyperparameter optimization tryout


📝 Description

This repository demonstrates/tests hyperparameter optimization with the following frameworks:

  • Ray Tune to run and schedule the trials
  • Optuna for the search algorithm
  • Weights & Biases (wandb) for experiment tracking, synced from offline nodes with wandb-osh
  • SLURM as the batch system

Note: If you want to see this tech stack in an actual use case, see the GNN tracking Hyperparameter Optimization repository.

📦 Installation

Create and activate the conda environment, then pip install the package.
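For example (a minimal sketch, assuming the environment file is called environment.yml and an editable install is wanted; adapt names as needed):

```bash
# Create and activate the conda environment (file and environment names assumed)
conda env create -f environment.yml
conda activate ray-tune-slurm-demo

# Install this package into the environment
pip install -e .
```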

🔥 Running it!

First test without batch system

  • First run src/rtstest/dothetune.py (no batch submission). This also downloads the data file, since the compute nodes have no internet connection.
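For example (assuming the script is simply run with Python from the repository root):

```bash
# Run on a node with internet access (e.g., the login node) so the data file can be downloaded
python src/rtstest/dothetune.py
```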

Option 1: All-in-one

For a single batch job that uses multiple nodes to start both the head node and the workers, see slurm/all-in-one. While this is the example used in the Ray documentation, it might not be the best fit for most use cases, because it relies on enough nodes being available at once, and for long enough to complete all requested trials.
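The general pattern follows the Ray SLURM documentation: a single multi-node job starts the ray head on the first allocated node, a worker on each remaining node, and finally the tuning script. The following is only a rough sketch (node count, port, and the final command are assumptions; the actual scripts live in slurm/all-in-one):

```bash
#!/bin/bash
#SBATCH --job-name=ray-all-in-one
#SBATCH --nodes=3
#SBATCH --ntasks-per-node=1

# Figure out which nodes SLURM gave us and pick the first one as the ray head
nodes_array=($(scontrol show hostnames "$SLURM_JOB_NODELIST"))
head_node=${nodes_array[0]}
head_node_ip=$(srun --nodes=1 --ntasks=1 -w "$head_node" hostname --ip-address)
port=6379

# Start the ray head process on the first allocated node
srun --nodes=1 --ntasks=1 -w "$head_node" \
    ray start --head --node-ip-address="$head_node_ip" --port=$port --block &
sleep 10

# Start a ray worker on each of the remaining nodes
for ((i = 1; i < SLURM_JOB_NUM_NODES; i++)); do
    srun --nodes=1 --ntasks=1 -w "${nodes_array[$i]}" \
        ray start --address="$head_node_ip:$port" --block &
    sleep 5
done

# Run the tuning script; it connects to the ray cluster started above
python src/rtstest/dothetune.py
```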

Live syncing to wandb

Because the compute nodes usually do not have internet access, we need a separate tool to sync results to wandb. See the documentation of wandb-osh for how to start the syncer on the head node.
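In the simplest case (assuming the default wandb-osh command-line entry point), this boils down to:

```bash
# On the head/login node (which has internet access): start the syncer
# and leave it running while the trials execute on the compute nodes
wandb-osh
```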

Option 2: Head node and worker nodes

Here, we start the ray head on the head (login) node and then use batch submission to start worker nodes asynchronously. Follow these steps (summarized as shell commands below):

  1. Run slurm/head_workers/start-on-headnode.sh and note down the IP and redis password that are printed out
  2. Submit several batch jobs: sbatch slurm/head_workers/start-on-worker.slurm <IP> <REDIS PWD>
  3. Start your tuning script on the head node: slurm/head_workers/start-program.sh <IP> <REDIS PWD>
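As a sketch, the full sequence looks like this (submit the worker job as many times as you want parallel workers):

```bash
# 1. On the head (login) node: start the ray head; note the printed IP and redis password
slurm/head_workers/start-on-headnode.sh

# 2. Submit one or more worker batch jobs, passing along the connection info
sbatch slurm/head_workers/start-on-worker.slurm <IP> <REDIS PWD>
sbatch slurm/head_workers/start-on-worker.slurm <IP> <REDIS PWD>

# 3. On the head node: start the tuning script against the running cluster
slurm/head_workers/start-program.sh <IP> <REDIS PWD>
```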

Note: In the HPO scripts of my main ML project, I instead write the IP and password to files in my home directory and have dependent scripts read them from there, rather than passing them around on the command line.
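A hypothetical sketch of that pattern (the file names are made up for illustration):

```bash
# In the head-node script: persist the connection info to the home directory
echo "$head_node_ip" > ~/.ray_head_ip
echo "$redis_password" > ~/.ray_redis_password

# In the dependent worker/program scripts: read it back instead of taking arguments
ip=$(cat ~/.ray_head_ip)
redis_pwd=$(cat ~/.ray_redis_password)
```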

Once the batch jobs for the workers start running, you should see activity in the tuning script output.
