Commit dbdd150
Merge remote-tracking branch 'origin/develop' into feature/epic-stack
* origin/develop:
  Code Updates for New Wave Fix Files  (NOAA-EMC#1605)
  Make JEDI cycling toggle switches YAML configurable and their names more explicit  (NOAA-EMC#1607)
  Update gfs-utils hash to 8965258 (NOAA-EMC#1586)
  Allow YAML input to override config.base and make HPSS_PROJECT configurable (NOAA-EMC#1603)
  Fix RTD python requirements path
  Clean up RTD python install settings
  Use RTD python system packages
  Fix python RTD parameter name
  Update RTD python install
  Restore RTD python version and add requirements
  Revert python version for RTD (NOAA-EMC#1598)
  Move RTD config to root of repo (NOAA-EMC#1597)
  Add RTD config (NOAA-EMC#1596)
  Add `schema` library to manage schema for variety of input configurations. (NOAA-EMC#1567)
  Add S2SA to the allowed app list in setup expt (NOAA-EMC#1591)
  Avoid parsing group name when checking RadMon diagnostic files (NOAA-EMC#1559)
  Remove gldas from global-workflow (NOAA-EMC#1590)
  Add GEFS capability to setup_expt (NOAA-EMC#1582)
KateFriedman-NOAA committed May 16, 2023
2 parents 3c9be33 + c005fbc commit dbdd150
Showing 151 changed files with 3,097 additions and 1,248 deletions.
21 changes: 21 additions & 0 deletions .readthedocs.yaml
@@ -0,0 +1,21 @@
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the version of Python and other tools you might need
build:
  os: ubuntu-22.04
  tools:
    python: "3.11"

# Build documentation in the docs/ directory with Sphinx
sphinx:
  configuration: docs/source/conf.py

python:
  install:
    - requirements: docs/requirements.txt
  system_packages: true
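
To sanity-check this configuration outside Read the Docs, the documentation can be built locally with the same Sphinx entry point. This is a hedged sketch: it assumes a Python 3.11 environment and that docs/requirements.txt carries the Sphinx dependencies, as the file above implies:

::

   # Local docs build mirroring the RTD settings above (sketch, not an official recipe)
   python3 -m pip install -r docs/requirements.txt     # requirements file referenced in .readthedocs.yaml
   sphinx-build -b html docs/source docs/_build/html   # uses the same docs/source/conf.py
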
9 changes: 1 addition & 8 deletions Externals.cfg
@@ -8,7 +8,7 @@ protocol = git
required = True

[gfs-utils]
- hash = 0b8ff56
+ hash = 8965258
local_path = sorc/gfs_utils.fd
repo_url = https://github.com/NOAA-EMC/gfs-utils
protocol = git
@@ -56,13 +56,6 @@ repo_url = https://github.com/NOAA-EMC/GDASApp.git
protocol = git
required = False

- [GLDAS]
- tag = fd8ba62
- local_path = sorc/gldas.fd
- repo_url = https://github.com/NOAA-EMC/GLDAS.git
- protocol = git
- required = False

[EMC-gfs_wafs]
hash = 014a0b8
local_path = sorc/gfs_wafs.fd
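
With the GLDAS entry dropped and the gfs-utils hash bumped, the new pin can be checked against the upstream repository. A hedged sketch, assuming sorc/gfs_utils.fd has already been cloned from the repo_url above:

::

   cd sorc/gfs_utils.fd
   git fetch origin          # repo_url: https://github.com/NOAA-EMC/gfs-utils
   git checkout 8965258      # hash now pinned in Externals.cfg
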
15 changes: 7 additions & 8 deletions docs/source/components.rst
@@ -1,5 +1,5 @@
###########################
Global Workflow Components
###########################

The global-workflow is a combination of several components working together to prepare, analyze, produce, and post-process forecast data.
@@ -13,20 +13,19 @@ The major components of the system are:
* Post-processing
* Verification

The Global Workflow repository contains the workflow and script layers. After running the checkout script, the code and additional offline scripts for the analysis, forecast, and post-processing components will be present. Any non-workflow component is known as a sub-module. All of the sub-modules of the system reside in their respective repositories on GitHub. The global-workflow sub-modules are obtained by running the checkout script found under the /sorc folder.

======================
Component repositories
======================

Components checked out via sorc/checkout.sh:

* **GFS UTILS** (https://github.com/ufs-community/gfs_utils): Utility codes needed by Global Workflow to run the GFS configuration
* **UFS-Weather-Model** (https://github.com/ufs-community/ufs-weather-model): This is the core model used by the Global-Workflow to provide forecasts. The UFS-weather-model repository is an umbrella repository consisting of coupled Earth system components that are all checked out when we check out the code at the top level of the repository
* **GSI** (https://github.com/NOAA-EMC/GSI): This is the core code base for atmospheric Data Assimilation
* **GSI UTILS** (https://github.com/NOAA-EMC/GSI-Utils): Utility codes needed by GSI to create analysis
* **GSI Monitor** (https://github.com/NOAA-EMC/GSI-Monitor): These tools monitor the GSI package's data assimilation, detecting and reporting missing data sources, low observation counts, and high penalty values
- * **GLDAS** (https://github.com/NOAA-EMC/GLDAS): Code base for Land Data Assimilation
* **GDAS** (https://github.com/NOAA-EMC/GDASApp): JEDI-based Data Assimilation system. This system is currently being developed for marine Data Assimilation and in time will replace GSI for atmospheric data assimilation as well
* **UFS UTILS** (https://github.com/ufs-community/UFS_UTILS): Utility codes needed for UFS-weather-model
* **Verif global** (https://github.com/NOAA-EMC/EMC_verif-global): Verification package to evaluate GFS parallels. It uses MET and METplus. At this moment the verification package is limited to providing atmospheric metrics only
@@ -43,7 +42,7 @@ External dependencies
Libraries
^^^^^^^^^

All the libraries that are needed to run the end to end Global Workflow are built using a package manager. Currently these are served via HPC-STACK but will soon be available via SPACK-STACK. These libraries are already available on supported NOAA HPC platforms

Find information on official installations of HPC-STACK here:

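
The components listed above are obtained with the checkout script the text refers to, then compiled and linked. A hedged sketch of that sequence (the script names follow the usual global-workflow sorc/ layout and are assumptions here; verify them against your checkout):

::

   cd sorc
   ./checkout.sh        # fetch the component repositories described above
   ./build_all.sh       # build the checked-out components (name assumed)
   ./link_workflow.sh   # link fix files and executables (name assumed; older tags use a different script)
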
5 changes: 1 addition & 4 deletions docs/source/configure.rst
@@ -10,7 +10,7 @@ The global-workflow configs contain switches that change how the system runs. Ma
| APP | Model application | ATM | YES | See case block in config.base for options |
+----------------+------------------------------+---------------+-------------+---------------------------------------------------+
| DOIAU | Enable 4DIAU for control | YES | NO | Turned off for cold-start first half cycle |
| | with 3 increments | | | |
+----------------+------------------------------+---------------+-------------+---------------------------------------------------+
| DOHYBVAR | Run EnKF | YES | YES | Don't recommend turning off |
+----------------+------------------------------+---------------+-------------+---------------------------------------------------+
@@ -26,9 +26,6 @@ The global-workflow configs contain switches that change how the system runs. Ma
| DO_GEMPAK | Run job to produce GEMPAK | NO | YES | downstream processing, ops only |
| | products | | | |
+----------------+------------------------------+---------------+-------------+---------------------------------------------------+
- | DO_GLDAS | Run GLDAS to spin up land | YES | YES | Spins up for 84hrs if sflux files not available |
- | | ICs | | | |
- +----------------+------------------------------+---------------+-------------+---------------------------------------------------+
| DO_VRFY | Run vrfy job | NO | YES | Whether to include vrfy job (GSI monitoring, |
| | | | | tracker, VSDB, fit2obs) |
+----------------+------------------------------+---------------+-------------+---------------------------------------------------+
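
These switches are exported from config.base in the experiment directory. A hedged sketch of how the toggles documented above might be set there (variable names as in the table; values purely illustrative):

::

   # $EXPDIR/$PSLOT/config.base -- illustrative overrides only
   export DOIAU="YES"      # 4DIAU for the control with 3 increments
   export DO_GEMPAK="NO"   # GEMPAK products are ops-only downstream processing
   export DO_VRFY="YES"    # include the vrfy job (GSI monitoring, tracker, fit2obs)
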
8 changes: 3 additions & 5 deletions docs/source/jobs.rst
@@ -3,7 +3,7 @@ GFS Configuration
#################

.. figure:: _static/GFS_v16_flowchart.png

Schematic flow chart for GFS v16 in operations

The sequence of jobs that are run for an end-to-end (analysis+forecast+post processing+verification) GFS configuration using the Global Workflow is shown above. The system utilizes a collection of scripts that perform the tasks for each step.
@@ -12,7 +12,7 @@ For any cycle the system consists of two suites -- the "gdas" suite which provid

An experimental run is different from operations in the following ways:

* Workflow manager: operations utilizes `ecFlow <https://www.ecmwf.int/en/learning/training/introduction-ecmwf-job-scheduler-ecflow>`__, while development currently utilizes `ROCOTO <https://github.com/christopherwharrop/rocoto/wiki/documentation>`__. Note, experiments can also be run using ecFlow on platforms with ecFlow servers established.

* Dump step is not run as it has already been completed during the real-time production runs and dump data is available in the global dump archive on supported machines.

@@ -25,7 +25,7 @@ An experimental run is different from operations in the following ways:
Downstream jobs (e.g. awips, gempak, etc.) are not included in the diagram. Those jobs are not normally run in developmental tests.

=============================
Jobs in the GFS Configuration
=============================
+-------------------+-----------------------------------------------------------------------------------------------------------------------+
| JOB NAME | PURPOSE |
Expand Down Expand Up @@ -65,8 +65,6 @@ Jobs in the GFS Configuration
+-------------------+-----------------------------------------------------------------------------------------------------------------------+
| fcst | Runs the forecast (with or without one-way waves). |
+-------------------+-----------------------------------------------------------------------------------------------------------------------+
- | gldas | Runs the Global Land Data Assimilation System (GLDAS). |
- +-------------------+-----------------------------------------------------------------------------------------------------------------------+
| metpN | Runs MET/METplus verification via EMC_verif-global. |
+-------------------+-----------------------------------------------------------------------------------------------------------------------+
| prep | Runs the data preprocessing prior to the analysis (storm relocation if needed and generation of prepbufr file). |
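
For development runs driven by ROCOTO (as noted above), the jobs in this table are submitted by repeatedly invoking the workflow manager against the experiment's database and XML. A hedged sketch; the $PSLOT-based file names are the usual convention and are assumed here:

::

   cd $EXPDIR/$PSLOT
   rocotorun  -d $PSLOT.db -w $PSLOT.xml   # submit any tasks whose dependencies are satisfied
   rocotostat -d $PSLOT.db -w $PSLOT.xml   # report the state of each task in the current cycles
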
23 changes: 13 additions & 10 deletions docs/source/setup.rst
@@ -84,18 +84,20 @@ The following command examples include variables for reference but users should
::

cd workflow
- ./setup_expt.py forecast-only --idate $IDATE --edate $EDATE [--app $APP] [--start $START] [--gfs_cyc $GFS_CYC] [--resdet $RESDET]
+ ./setup_expt.py gfs forecast-only --idate $IDATE --edate $EDATE [--app $APP] [--start $START] [--gfs_cyc $GFS_CYC] [--resdet $RESDET]
[--pslot $PSLOT] [--configdir $CONFIGDIR] [--comrot $COMROT] [--expdir $EXPDIR]

where:

- * ``forecast-only`` is the first positional argument that instructs the setup script to produce an experiment directory for forecast only experiments.
+ * ``gfs`` is the first positional argument that instructs the setup script to produce a GFS experiment directory
+ * ``forecast-only`` is the second positional argument that instructs the setup script to produce an experiment directory for forecast only experiments.
* ``$APP`` is the target application, one of:

- ATM: atmosphere-only [default]
- ATMW: atm-wave
- ATMA: atm-aerosols
- S2S: atm-ocean-ice
+ - S2SA: atm-ocean-ice-aerosols
- S2SW: atm-ocean-ice-wave
- S2SWA: atm-ocean-ice-wave-aerosols

Expand All @@ -116,21 +118,21 @@ Atm-only:
::

cd workflow
- ./setup_expt.py forecast-only --pslot test --idate 2020010100 --edate 2020010118 --resdet 384 --gfs_cyc 4 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir
+ ./setup_expt.py gfs forecast-only --pslot test --idate 2020010100 --edate 2020010118 --resdet 384 --gfs_cyc 4 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir

Coupled:

::

cd workflow
- ./setup_expt.py forecast-only --app S2SW --pslot coupled_test --idate 2013040100 --edate 2013040100 --resdet 384 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir
+ ./setup_expt.py gfs forecast-only --app S2SW --pslot coupled_test --idate 2013040100 --edate 2013040100 --resdet 384 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir

Coupled with aerosols:

::

cd workflow
- ./setup_expt.py forecast-only --app S2SWA --pslot coupled_test --idate 2013040100 --edate 2013040100 --resdet 384 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir
+ ./setup_expt.py gfs forecast-only --app S2SWA --pslot coupled_test --idate 2013040100 --edate 2013040100 --resdet 384 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir

****************************************
Step 2: Set user and experiment settings
@@ -193,13 +195,14 @@ The following command examples include variables for reference but users should
::

cd workflow
- ./setup_expt.py cycled --idate $IDATE --edate $EDATE [--app $APP] [--start $START] [--gfs_cyc $GFS_CYC]
+ ./setup_expt.py gfs cycled --idate $IDATE --edate $EDATE [--app $APP] [--start $START] [--gfs_cyc $GFS_CYC]
[--resdet $RESDET] [--resens $RESENS] [--nens $NENS] [--cdump $CDUMP]
[--pslot $PSLOT] [--configdir $CONFIGDIR] [--comrot $COMROT] [--expdir $EXPDIR] [--icsdir $ICSDIR]

where:

- * ``cycled`` is the first positional argument that instructs the setup script to produce an experiment directory for cycled experiments.
+ * ``gfs`` is the first positional argument that instructs the setup script to produce a GFS experiment directory
+ * ``cycled`` is the second positional argument that instructs the setup script to produce an experiment directory for cycled experiments.
* ``$APP`` is the target application, one of:

- ATM: atmosphere-only [default]
@@ -226,13 +229,13 @@ Example:
::

cd workflow
- ./setup_expt.py cycled --pslot test --configdir /home/Joe.Schmo/git/global-workflow/parm/config --idate 2020010100 --edate 2020010118 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir --resdet 384 --resens 192 --nens 80 --gfs_cyc 4
+ ./setup_expt.py gfs cycled --pslot test --configdir /home/Joe.Schmo/git/global-workflow/parm/config --idate 2020010100 --edate 2020010118 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir --resdet 384 --resens 192 --nens 80 --gfs_cyc 4

Example ``setup_expt.py`` on Orion:

::

- Orion-login-3$ ./setup_expt.py cycled --pslot test --idate 2022010118 --edate 2022010200 --resdet 192 --resens 96 --nens 80 --comrot /work/noaa/stmp/jschmo/comrot --expdir /work/noaa/global/jschmo/expdir
+ Orion-login-3$ ./setup_expt.py gfs cycled --pslot test --idate 2022010118 --edate 2022010200 --resdet 192 --resens 96 --nens 80 --comrot /work/noaa/stmp/jschmo/comrot --expdir /work/noaa/global/jschmo/expdir
EDITED: /work/noaa/global/jschmo/expdir/test/config.base as per user input.
EDITED: /work/noaa/global/jschmo/expdir/test/config.aeroanl as per user input.
EDITED: /work/noaa/global/jschmo/expdir/test/config.ocnanal as per user input.
Expand All @@ -243,7 +246,7 @@ What happens if I run ``setup_expt.py`` again for an experiment that already exi

::

- Orion-login-3$ ./setup_expt.py cycled --pslot test --idate 2022010118 --edate 2022010200 --resdet 192 --resens 96 --nens 80 --comrot /work/noaa/stmp/jschmo/comrot --expdir /work/noaa/global/jschmo/expdir
+ Orion-login-3$ ./setup_expt.py gfs cycled --pslot test --idate 2022010118 --edate 2022010200 --resdet 192 --resens 96 --nens 80 --comrot /work/noaa/stmp/jschmo/comrot --expdir /work/noaa/global/jschmo/expdir

directory already exists in /work/noaa/stmp/jschmo/comrot/test

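
Once setup_expt.py has created (or updated) the experiment directory, the ROCOTO XML still needs to be generated before the experiment can run. A hedged sketch of that follow-on step; the script name and single $EXPDIR/$PSLOT argument reflect the usual workflow/ layout and should be verified against your checkout:

::

   cd workflow
   ./setup_xml.py $EXPDIR/$PSLOT   # writes the ROCOTO workflow XML for the experiment created above
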
6 changes: 1 addition & 5 deletions ecf/defs/gfs_00.def
@@ -2224,10 +2224,6 @@
trigger /prod/primary/00/obsproc/v1.0/gdas/atmos/dump/jobsproc_gdas_atmos_dump:release_sfcprep
endfamily
endfamily
- family init
- task jgdas_atmos_gldas
- trigger ../analysis/jgdas_atmos_analysis == complete
- endfamily
family analysis
task jgdas_atmos_analysis
trigger /prod/primary/00/obsproc/v1.0/gdas/atmos/prep/jobsproc_gdas_atmos_prep == complete and ../obsproc/prep/jgdas_atmos_emcsfc_sfc_prep == complete
@@ -2354,7 +2350,7 @@
endfamily
endfamily
task jgdas_forecast
- trigger ./atmos/analysis/jgdas_atmos_analysis:release_fcst and ./wave/prep/jgdas_wave_prep == complete and ./atmos/init/jgdas_atmos_gldas == complete
+ trigger ./atmos/analysis/jgdas_atmos_analysis:release_fcst and ./wave/prep/jgdas_wave_prep == complete
endfamily
family enkfgdas
edit RUN 'gdas'
6 changes: 1 addition & 5 deletions ecf/defs/gfs_06.def
@@ -2224,10 +2224,6 @@
trigger /prod/primary/06/obsproc/v1.0/gdas/atmos/dump/jobsproc_gdas_atmos_dump:release_sfcprep
endfamily
endfamily
- family init
- task jgdas_atmos_gldas
- trigger ../analysis/jgdas_atmos_analysis == complete
- endfamily
family analysis
task jgdas_atmos_analysis
trigger /prod/primary/06/obsproc/v1.0/gdas/atmos/prep/jobsproc_gdas_atmos_prep == complete and ../obsproc/prep/jgdas_atmos_emcsfc_sfc_prep == complete
@@ -2354,7 +2350,7 @@
endfamily
endfamily
task jgdas_forecast
- trigger ./atmos/analysis/jgdas_atmos_analysis:release_fcst and ./wave/prep/jgdas_wave_prep == complete and ./atmos/init/jgdas_atmos_gldas == complete
+ trigger ./atmos/analysis/jgdas_atmos_analysis:release_fcst and ./wave/prep/jgdas_wave_prep == complete
endfamily
family enkfgdas
edit RUN 'gdas'
6 changes: 1 addition & 5 deletions ecf/defs/gfs_12.def
@@ -2225,10 +2225,6 @@
trigger /prod/primary/12/obsproc/v1.0/gdas/atmos/dump/jobsproc_gdas_atmos_dump:release_sfcprep
endfamily
endfamily
- family init
- task jgdas_atmos_gldas
- trigger ../analysis/jgdas_atmos_analysis == complete
- endfamily
family analysis
task jgdas_atmos_analysis
trigger /prod/primary/12/obsproc/v1.0/gdas/atmos/prep/jobsproc_gdas_atmos_prep == complete and ../obsproc/prep/jgdas_atmos_emcsfc_sfc_prep == complete
@@ -2355,7 +2351,7 @@
endfamily
endfamily
task jgdas_forecast
- trigger ./atmos/analysis/jgdas_atmos_analysis:release_fcst and ./wave/prep/jgdas_wave_prep == complete and ./atmos/init/jgdas_atmos_gldas == complete
+ trigger ./atmos/analysis/jgdas_atmos_analysis:release_fcst and ./wave/prep/jgdas_wave_prep == complete
endfamily
family enkfgdas
edit RUN 'gdas'
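
With the jgdas_atmos_gldas init family removed, the trimmed suite definitions above can be reloaded on an ecFlow server. A hedged sketch; the server is assumed to be identified by ECF_HOST/ECF_PORT in the environment, and the suite name shown is illustrative:

::

   # Load one of the updated definitions and begin the suite -- sketch only
   ecflow_client --load=ecf/defs/gfs_00.def
   ecflow_client --begin=prod   # suite name illustrative; use the name defined in the .def file
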
