This project provides tooling to orchestrate large-scale WRF (Weather Research and Forecasting) simulations using Slurm. The main goal is to make it easy to define, stage, submit, and monitor many WRF runs, including pre- and post-processing steps. The framework is modular, so you can test the entire pipeline on a laptop or workstation and scale to HPC clusters with minimal changes.
Key points:
- Orchestration: the code manages job creation, staging of inputs and outputs, and submission to Slurm.
- Extensible: general-purpose code lives in the `wrf_massive` package, while project-specific configuration and templates live in separate workspaces.
More specifically:

- The orchestration code and core CLI are implemented in `wrf_massive`. `wrf_massive` is designed to be as project/workspace-agnostic as possible: it contains the main logic to configure runs, produce Slurm job scripts, submit them, and collect outputs.
- Concrete projects/simulation runs should live in `workspaces`. An example workspace is provided in `workspace/example` with sample configuration, job templates, and instructions.
- Optionally, WRF and WPS are included as submodules in `submodules`. You can pull them using `git submodule update --init --recursive` and build them according to their own instructions.
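The job-script generation step mentioned above can be sketched as follows. This is an illustrative assumption, not the actual `wrf_massive` implementation: the template fields, function name, and defaults are invented for the example.

```python
from string import Template

# Minimal sketch of turning per-run parameters into a Slurm batch script.
# Field names and defaults are assumptions for illustration only.
SLURM_TEMPLATE = Template("""\
#!/bin/bash
#SBATCH --job-name=$job_name
#SBATCH --nodes=$nodes
#SBATCH --ntasks-per-node=$ntasks_per_node
#SBATCH --time=$walltime

srun $wrf_exe
""")

def render_job_script(job_name, nodes=1, ntasks_per_node=16,
                      walltime="02:00:00", wrf_exe="./wrf.exe"):
    """Fill the Slurm template with the parameters of one run."""
    return SLURM_TEMPLATE.substitute(
        job_name=job_name,
        nodes=nodes,
        ntasks_per_node=ntasks_per_node,
        walltime=walltime,
        wrf_exe=wrf_exe,
    )

script = render_job_script("wrf-d01", nodes=4)
```

A template-based approach like this is what lets workspaces override job scripts without touching the core package.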
- Using the included submodules (optional) to build WRF/WPS inside this repository:
  - Clone the NCAR repos using `git submodule update --init --recursive`.
  - Compile and set up WRF as per NCAR's instructions.
- Using an external WRF/WPS installation (recommended for reproducible or managed HPC installs):
  - If you already have a system-wide or separately maintained WRF/WPS installation, you do not need the submodules. Point this orchestration tool to your existing installation using the configuration mechanism in your workspace (see `workspace/example/README.md` for an example configuration). This repository only orchestrates runs and does not require WRF/WPS to be compiled inside it.
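As an illustration, a workspace configuration pointing at an external installation might look like the sketch below. The key names and paths are assumptions for the example; consult `workspace/example/README.md` for the actual schema.

```yaml
# Hypothetical workspace config -- key names and paths are illustrative only
wrf:
  root: /apps/wrf/4.5.2      # existing WRF build on the cluster
wps:
  root: /apps/wps/4.5
slurm:
  partition: normal
  account: my_project
```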
The repository contains a `workspace/example` folder with a self-contained example workspace and an example `README.md`. Use that as your primary reference for how to structure a workspace that uses this tool.
A conda environment file is provided as `environment.yml` at the project root to create a reproducible Python environment. It contains the Python packages typically needed to run the orchestration tools, utility scripts, and tests.

Create the environment (conda/miniconda/anaconda required):

```bash
conda env create -f environment.yml
# or to update an existing env
conda env update -f environment.yml
```

Activate it:

```bash
conda activate wrf-massive  # or the name defined in environment.yml
```

If you prefer pip or another environment manager, use the package list in `environment.yml` as a reference to recreate the environment. The project also contains `pyproject.toml` files that provide metadata for the Python packages in the workspace.
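For orientation, an `environment.yml` for a project like this typically has the shape sketched below. The name, channels, and package list are assumptions for illustration; the real file at the project root is authoritative.

```yaml
# Illustrative sketch only -- see the actual environment.yml for the real list
name: wrf-massive
channels:
  - conda-forge
dependencies:
  - python=3.11
  - pyyaml    # hypothetical: workspace config parsing
  - jinja2    # hypothetical: job-script templating
  - pytest    # hypothetical: running the test suite
  - pip
```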
- Create and activate the Python environment using `environment.yml`.
- Prepare WRF/WPS (via the included submodules or an external installation, as described above).
- Inspect `workspace/example` and copy the example configuration and job templates into your workspace.
- Use the CLI in `wrf_massive` to create and submit runs (see `wrf_massive/cli.py` for the available commands). Typical invocation is via the Python CLI in your activated environment.
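The submission step ultimately shells out to Slurm's `sbatch`. The sketch below shows one plausible way to assemble and run that command; the helper names and options are assumptions for illustration, not part of the real `wrf_massive` API.

```python
import subprocess

def build_sbatch_command(script_path, partition=None, dependency=None):
    """Assemble an sbatch invocation for a rendered job script.

    partition/dependency map to standard Slurm options; the helper itself
    is an illustrative assumption, not the actual wrf_massive internals.
    """
    cmd = ["sbatch"]
    if partition:
        cmd += ["--partition", partition]
    if dependency:
        cmd += ["--dependency", dependency]
    cmd.append(script_path)
    return cmd

def submit(script_path, **opts):
    """Run sbatch and return its stdout, e.g. 'Submitted batch job 12346'."""
    result = subprocess.run(
        build_sbatch_command(script_path, **opts),
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# Chaining runs with --dependency is how multi-step WRF pipelines
# (e.g. WPS -> real.exe -> wrf.exe) are typically ordered under Slurm.
cmd = build_sbatch_command("run_001.sh", partition="normal",
                           dependency="afterok:12345")
```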