Getting Started on Stampede

1. Check out the code:

   git clone -b release/public-v1 https://github.com/ufs-community/ufs-srweather-app.git
   cd ufs-srweather-app

Then check out the external components (submodules) for the SRW Application:

  ./manage_externals/checkout_externals

2. Prepare to Build

2.1 Prerequisite Libraries

Visit the NCEPLIBS-external wiki page for instructions on installing the prerequisite libraries.

2.2 Fix files

Download the static (fix) files required for running the SRW Application from the FTP data repository or from Amazon Web Services (AWS) cloud storage.
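For example, downloading and unpacking the fix files might look like the following sketch (the URL is a placeholder, not a real address; use the actual location of the FTP or AWS repository):

   $> cd $WORK
   $> wget https://<data-repository>/fix_files.tar.gz   # hypothetical URL
   $> tar -xzf fix_files.tar.gz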

2.3 Pre-staged raw initial conditions

Download the raw initial condition data for the case you plan to run from the FTP data repository or from Amazon Web Services (AWS) cloud storage. The path to this data must be set in config.sh as described in Section 5.
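As a sketch (the URL is a placeholder; the file names match those used in Section 5, and the target directory is what you will later set as EXTRN_MDL_SOURCE_BASEDIR_ICS/LBCS in config.sh):

   $> mkdir -p $WORK/model_data/FV3GFS
   $> cd $WORK/model_data/FV3GFS
   $> wget https://<data-repository>/gfs.pgrb2.0p25.f000   # hypothetical URL; repeat for each forecast hour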

2.4 Python modules

Install Miniconda3 and the required Python modules in your own user space (preferably $WORK/miniconda3, to avoid disk-usage issues in $HOME). When the installer asks whether to initialize conda in your .bash_profile, answer no (unless you want that behavior). Instead, run the conda initialization command by hand or put it in a script you can source.

   $> wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
   $> sh Miniconda3-latest-Linux-x86_64.sh
   $> # init conda or create script to source that contains the following line
   $> eval "$(/path/to/your/conda/installation/bin/conda shell.bash hook)"
   $> conda create --name regional_workflow
   $> conda activate regional_workflow
   $> conda install -c conda-forge f90nml
   $> conda install jinja2
   $> conda install pyyaml
   $> conda install cartopy
   $> conda install matplotlib
   $> conda install scipy
   $> conda install -c conda-forge pygrib

The next time you log in to a new terminal, you only need to execute:

  $> eval "$(/path/to/your/conda/installation/bin/conda shell.bash hook)"
  $> conda activate regional_workflow

3. Set up the build environment

You should load the following modules on Stampede:

module purge
module load libfabric/1.7.0
module load git/2.24.1
module load autotools/1.1
module load xalt/2.8
module load TACC

module load intel/18.0.2
module load cmake/3.16.1
module load impi/18.0.2
module load pnetcdf/1.11.0
module load netcdf/4.6.2

Then make the NCEPLIBS and ESMF modules available and set the following environment variables:

module use -a /path/to/NCEPLIBS/library/modules
module load NCEPLIBS/2.0.0
module load esmf/8.0.0

export CC=icc
export CXX=icpc
export FC=ifort
export NETCDF=${TACC_NETCDF_DIR}

export CMAKE_C_COMPILER=mpiicc
export CMAKE_CXX_COMPILER=mpiicpc
export CMAKE_Fortran_COMPILER=mpiifort

export CMAKE_Platform=stampede.intel

4. Build the executables:

   mkdir build
   cd build

Run cmake to set up the Makefile, then run make:

   cmake .. -DCMAKE_INSTALL_PREFIX=..
   make -j4 |& tee build.out

If this step is successful, there should be twelve executables in ufs-srweather-app/bin, including the model executable NEMS.exe.
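As a quick sanity check (run from the build directory; the install prefix was set to the application root above):

   $> ls ../bin
   $> ls ../bin | wc -l   # expect 12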

5. Generate the workflow experiment:

If you have not done so yet, activate the regional_workflow conda environment:

  $> eval "$(/path/to/your/conda/installation/bin/conda shell.bash hook)"
  $> conda activate regional_workflow

Also, do not forget to set WORKDIR as described in the UFS Short-Range Weather App Users Guide.
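For example (the path below is illustrative, not prescribed; choose any directory with sufficient quota):

   $> export WORKDIR=$WORK/ufs-srw-work   # hypothetical path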

   cd ufs-srweather-app/regional_workflow/ush
   cp config.community.sh config.sh

Edit config.sh to set ACCOUNT to an account you can charge and EXPT_SUBDIR to the name of your experiment:

MACHINE="STAMPEDE"
ACCOUNT="your account"
EXPT_SUBDIR="my_expt_name"

...

USE_USER_STAGED_EXTRN_FILES="TRUE"
EXTRN_MDL_SOURCE_BASEDIR_ICS="/path-to/model_data/FV3GFS"
EXTRN_MDL_FILES_ICS=( "gfs.pgrb2.0p25.f000" )
EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path-to/model_data/FV3GFS"
EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" "gfs.pgrb2.0p25.f018" "gfs.pgrb2.0p25.f024" "gfs.pgrb2.0p25.f030" "gfs.pgrb2.0p25.f036" "gfs.pgrb2.0p25.f042" "gfs.pgrb2.0p25.f048" )

Replace /path-to/model_data/FV3GFS with the actual path to the GFS data you downloaded in Section 2.3 above.

If you downloaded the fix files to your own working directory, add the following section to config.sh:

FIXgsm=/path-to/fix/fix_am
TOPO_DIR=/path-to/fix/fix_orog
SFC_CLIMO_INPUT_DIR=/path-to/fix/fix_sfc_climo

Then generate the workflow:

   ./generate_FV3SAR_wflow.sh

Set the environment variable EXPTDIR to the experiment directory that generate_FV3SAR_wflow.sh reports at the end of its output.
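For example (the path is illustrative; use the value the script actually prints):

   $> export EXPTDIR=/path-to/expt_dirs/my_expt_name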

6. Run the workflow:

Since Rocoto is not installed on Stampede2, run the SRW Application tasks one at a time using the wrapper scripts in ufs-srweather-app/regional_workflow/ush/wrappers.

A set of helper scripts, fv3lam-slurm, may be useful for getting started. You can clone the repository with:

   $> cd $EXPTDIR
   $> git clone -b ufs-v1.0.0 https://github.com/dtcenter/fv3lam-slurm
   $> cd fv3lam-slurm

where $EXPTDIR is the experiment directory created by the script generate_FV3SAR_wflow.sh in Section 5 above.

Read the README.md file in that directory first, or get brief usage instructions by running:

   $> run_fv3lam.sh -h

Essentially, the script inserts SLURM directives (based on the settings in $EXPTDIR/FV3LAM_wflow.xml, generated in Section 5 above) into the wrapper scripts and then submits them one by one. The generated job scripts for all tasks are placed in $EXPTDIR/log.

First, run

   $> get_files.sh $EXPTDIR/var_defns.sh

to stage the GFS initial datasets.

Then, you can run

   $> run_fv3lam.sh -m stampede $EXPTDIR/var_defns.sh job_name

where job_name is one of [grid, orog, sfc, ics, lbcs, fcst, post]; the tasks must be run in that order.

After each job, check the standard output (out.make_xxxx_%j) and standard error (err.make_xxxx_%j) in $EXPTDIR/log, where xxxx is the job_name from above and %j is the job ID assigned by the SLURM scheduler.
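While a task is queued or running, standard SLURM commands can be used to monitor it, for example (<jobid> is the ID SLURM assigned to your job):

   $> squeue -u $USER                              # list your queued and running jobs
   $> tail -f $EXPTDIR/log/out.make_grid_<jobid>   # follow the output of, e.g., the grid task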

7. Plotting the output:

If you have installed all the Python modules described in Section 2.4, you can plot the SRW Application forecast output by following Step 7 of the Getting Started guide.

The Natural Earth shapefiles required for this application can be downloaded from the FTP data repository or from Amazon Web Services (AWS) cloud storage.

For more detailed information on the application, see the UFS Short-Range Weather App Users Guide.