Updates to RTD documentation based on full review #1287

Merged
33 commits
e406b71
Correct and update documentation
KateFriedman-NOAA Jan 23, 2023
5d39ba6
Add note blocks
KateFriedman-NOAA Jan 23, 2023
f1a048f
Convert tables in setup.rst
KateFriedman-NOAA Jan 23, 2023
b92f984
Update table syntax in setup.rst
KateFriedman-NOAA Jan 23, 2023
996ff68
Update setup table syntax
KateFriedman-NOAA Jan 24, 2023
57fb3cf
Adjust setup table syntax again
KateFriedman-NOAA Jan 24, 2023
4443217
Adjust setup table syntax yet again
KateFriedman-NOAA Jan 24, 2023
6ab35db
Adjust setup table syntax once again
KateFriedman-NOAA Jan 24, 2023
8dfb565
Trying different syntax for setup table
KateFriedman-NOAA Jan 24, 2023
744c7f8
Adjust setup table syntax yet again again
KateFriedman-NOAA Jan 24, 2023
1f6a536
Adjust setup table syntax once again again
KateFriedman-NOAA Jan 24, 2023
e4b8454
Adjust setup table syntax
KateFriedman-NOAA Jan 24, 2023
b7fb93a
Several RTD documentation updates
KateFriedman-NOAA Jan 24, 2023
af4361f
Update setup script examples
KateFriedman-NOAA Jan 24, 2023
8b73765
Few updates to RTD documentation
KateFriedman-NOAA Jan 24, 2023
5ec526a
Add development tools section
KateFriedman-NOAA Jan 24, 2023
df57a43
Fix case in new development section
KateFriedman-NOAA Jan 24, 2023
215173a
Fix syntax in new development section
KateFriedman-NOAA Jan 24, 2023
4ca448c
Retrying list table format with code in setup
KateFriedman-NOAA Jan 24, 2023
184ad0f
Update code block in setup table
KateFriedman-NOAA Jan 24, 2023
f793d22
Update setup tables to list format
KateFriedman-NOAA Jan 24, 2023
fa0d200
Correct rocoto table in setup
KateFriedman-NOAA Jan 24, 2023
a554027
Merge branch 'NOAA-EMC:develop' into feature/update-rtd
KateFriedman-NOAA Jan 25, 2023
21e5aca
Merge branch 'NOAA-EMC:develop' into feature/update-rtd
KateFriedman-NOAA Jan 27, 2023
80ca5e6
Update initial condition documentation
KateFriedman-NOAA Jan 27, 2023
3a1f56c
Small updates to init document
KateFriedman-NOAA Jan 27, 2023
24ed3df
Small corrections to init document
KateFriedman-NOAA Jan 27, 2023
dea2866
Update table header in components
KateFriedman-NOAA Jan 27, 2023
0e08eb8
Merge branch 'NOAA-EMC:develop' into feature/update-rtd
KateFriedman-NOAA Jan 31, 2023
a21c26e
Merge branch 'NOAA-EMC:develop' into feature/update-rtd
KateFriedman-NOAA Jan 31, 2023
75f1942
Update clone method section in clone.rst
KateFriedman-NOAA Jan 31, 2023
3f9b701
Remove DO_WAVE from configure table
KateFriedman-NOAA Jan 31, 2023
6be2182
Add note 3 to rocoto viewer section
KateFriedman-NOAA Jan 31, 2023
76 changes: 55 additions & 21 deletions docs/source/clone.rst
===============================
Clone and build Global Workflow
===============================
Quick Instructions
^^^^^^^^^^^^^^^^^^

Quick clone/build/link instructions (more detailed instructions below).

.. note::
   Here we assume that you are using the workflow to run an experiment and are therefore working from the authoritative repository. If you are using a development branch then follow the instructions in :doc:`development.rst`. Once you do that you can follow the instructions here, with the only difference being the repository/fork you are cloning from.

For forecast-only (coupled or uncoupled):

::

   git clone https://github.com/NOAA-EMC/global-workflow.git
   cd global-workflow/sorc
   ./checkout.sh
   ./build_all.sh
   ./link_workflow.sh

For cycled (w/ data assimilation):

::

   git clone https://github.com/NOAA-EMC/global-workflow.git
   cd global-workflow/sorc
Clone workflow and component repositories
Workflow
********

There are several ways to clone repositories from GitHub. Below we describe how to clone the global-workflow using either the ssh or https methods. **The ssh method is highly preferred and recommended.**
ssh method (using a password protected SSH key):

::

   git clone git@github.com:NOAA-EMC/global-workflow.git

.. note::
   When using the ssh method you need to make sure that your GitHub account is configured for the computer from which you are accessing the repository (see `this link <https://docs.github.com/en/authentication/connecting-to-github-with-ssh/adding-a-new-ssh-key-to-your-github-account>`_).
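A quick way to confirm that your SSH setup works before cloning is GitHub's authentication test. This is a sketch, not part of the workflow itself; the ``-o`` options just keep the command from hanging on hosts without an interactive prompt:

```shell
# Ask GitHub to authenticate you over ssh without opening a shell.
# Note: ssh -T to GitHub exits non-zero even on success, so check
# the reply text ("successfully authenticated") instead of the
# exit status.
ssh -T -o BatchMode=yes -o ConnectTimeout=10 git@github.com 2>&1 \
    | grep -q "successfully authenticated" \
    && echo "GitHub ssh access OK" \
    || echo "GitHub ssh access not configured on this machine"
```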

https method:

::

   git clone https://github.com/NOAA-EMC/global-workflow.git

Check what you just cloned (by default you will have only the develop branch):

::

   cd global-workflow
   git branch
   * develop

You now have a cloned copy of the global-workflow git repository. To checkout a branch or tag in your clone:

::

   git checkout BRANCH_NAME

.. note::
   Branch must already exist. If it does not you need to make a new branch using the ``-b`` flag:

   ::

      git checkout -b BRANCH_NAME

The ``checkout`` command will checkout BRANCH_NAME and switch your clone to that branch. Example:

::

   git checkout my_branch
   git branch
Components
**********

Once you have cloned the workflow repository it's time to checkout/clone its components. The components will be checked out under the ``/sorc`` folder via a script called ``checkout.sh``. Run the script with no arguments for forecast-only:

::

   cd sorc
   ./checkout.sh

Or with the ``-g`` switch to include data assimilation (GSI) for cycling:

::

   cd sorc
   ./checkout.sh -g

If you wish to run with the operational GTG UPP and WAFS (only for select users), provide the ``-o`` flag to ``checkout.sh``:

::

   ./checkout.sh -o

Each component cloned via ``checkout.sh`` will have a log (``/sorc/logs/checkout-COMPONENT.log``). Check the screen output and logs for clone errors.
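A quick way to scan those logs, run from the ``/sorc`` folder, might look like the sketch below (``fatal:`` is what git prints when a clone fails; the glob assumes the log naming above):

```shell
# Scan each component checkout log for clone failures.
for log in logs/checkout-*.log; do
    [ -e "$log" ] || continue            # no logs found yet
    if grep -qiE "error|fatal" "$log"; then
        echo "FAILED: $log"
    fi
done
```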

^^^^^^^^^^^^^^^^
Build components
^^^^^^^^^^^^^^^^

Under the ``/sorc`` folder is a script to build all components called ``build_all.sh``. After running ``checkout.sh``, run this script to build all component codes:

::

   ./build_all.sh [-a UFS_app][-c build_config][-h][-v]
    -a UFS_app:
A partial build option is also available via two methods:

a) modify the ``gfs_build.cfg`` config file to disable/enable particular builds and then rerun ``build_all.sh``

b) run the individual build scripts, also available in the ``/sorc`` folder, for each component or group of codes
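As an illustration of method (a), the config file contains one yes/no line per build that ``build_all.sh`` reads. The entries below are hypothetical; check the actual ``gfs_build.cfg`` under ``/sorc`` for the real names:

```text
# gfs_build.cfg fragment (entry names hypothetical)
 Building ufs (ufs_model) ............. yes
 Building gsi_enkf (gsi_enkf) ......... no    <- skip this build
 Building upp (upp) ................... yes
```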

^^^^^^^^^^^^^^^
Link components
^^^^^^^^^^^^^^^

At runtime the global-workflow needs all pieces in place within the main superstructure. To establish this, a link script is run to create symlinks from the top-level folders down to component files checked out in ``/sorc`` folders.

After running the checkout and build scripts, run the link script:

::

   ./link_workflow.sh [-o]

Where:
   ``-o``: Run in operations (NCO) mode. This creates copies instead of using symlinks and is generally only used by NCO during installation into production.

31 changes: 17 additions & 14 deletions docs/source/components.rst
The major components of the system are:
* Post-processing
* Verification

The Global Workflow repository contains the workflow and script layers. After running the checkout script, the code and additional offline scripts for the analysis, forecast, and post-processing components will be present. Any non-workflow component is known as a sub-module. All of the sub-modules of the system reside in their respective repositories on GitHub. The global-workflow sub-modules are obtained by running the checkout script found under the ``/sorc`` folder.

======================
Component repositories
======================

Components checked out via sorc/checkout.sh:

* **GFS UTILS** (https://github.com/ufs-community/gfs_utils): Utility codes needed by Global Workflow to run the GFS configuration
* **UFS-Weather-Model** (https://github.com/ufs-community/ufs-weather-model): This is the core model used by the Global-Workflow to provide forecasts. The UFS-weather-model repository is an umbrella repository consisting of coupled earth system components that are all checked out when we check out the code at the top level of the repository
* **GSI** (https://github.com/NOAA-EMC/GSI): This is the core code base for atmospheric Data Assimilation
* **GSI UTILS** (https://github.com/NOAA-EMC/GSI-Utils): Utility codes needed by GSI to create analysis
* **GSI Monitor** (https://github.com/NOAA-EMC/GSI-Monitor): These tools monitor the GSI package's data assimilation, detecting and reporting missing data sources, low observation counts, and high penalty values
* **GLDAS** (https://github.com/NOAA-EMC/GLDAS): Code base for Land Data Assimilation
* **GDAS** (https://github.com/NOAA-EMC/GDASApp): JEDI-based Data Assimilation system. This system is currently being developed for marine Data Assimilation and in time will replace GSI for atmospheric data assimilation as well
* **UFS UTILS** (https://github.com/ufs-community/UFS_UTILS): Utility codes needed for UFS-weather-model
* **Verif global** (https://github.com/NOAA-EMC/EMC_verif-global): Verification package to evaluate GFS parallels. It uses MET and METplus. At this moment the verification package is limited to providing atmospheric metrics only
* **GFS WAFS** (https://github.com/NOAA-EMC/EMC_gfs_wafs): Additional post-processing products for aircraft

.. note::
   When running the system in forecast-only mode the Data Assimilation components are not needed and are hence not checked out.

=====================
External dependencies
Observation data (OBSPROC/prep)
Data
****

Observation data, also known as dump data, is prepared in production and then archived in a global dump archive (GDA) for use by users when running cycled experiments. The GDA (identified as ``$DMPDIR`` in the workflow) is available on supported platforms and the workflow system knows where to find the data.

* Hera: /scratch1/NCEPDEV/global/glopara/dump
* Orion: /work/noaa/rstprod/dump
Global Dump Archive Structure

The global dump archive (GDA) mimics the structure of its production source: ``DMPDIR/CDUMP.PDY/[CC/atmos/]FILES``

The ``CDUMP`` is either gdas, gfs, or rtofs. All three contain production output for each day (``PDY``). The gdas and gfs folders are further broken into cycle (``CC``) and component (``atmos``).
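As a sketch, the layout above can be expressed as a small shell snippet. The ``DMPDIR`` value is the Hera location listed earlier; the date and cycle are arbitrary examples:

```shell
DMPDIR=/scratch1/NCEPDEV/global/glopara/dump   # Hera GDA location
CDUMP=gdas     # gdas | gfs | rtofs
PDY=20230125   # day of interest (YYYYMMDD)
CC=00          # cycle, used for gdas/gfs only

# rtofs is not split into cycle/component subfolders
if [ "$CDUMP" = "rtofs" ]; then
    dump_dir="${DMPDIR}/${CDUMP}.${PDY}"
else
    dump_dir="${DMPDIR}/${CDUMP}.${PDY}/${CC}/atmos"
fi
echo "$dump_dir"   # /scratch1/NCEPDEV/global/glopara/dump/gdas.20230125/00/atmos
```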

The GDA also contains special versions of some datasets and experimental data that is being evaluated ahead of implementation into production. The following subfolder suffixes exist:

.. list-table::
   :header-rows: 1

   * - SUFFIX
     - WHAT
   * - nr
     - Non-restricted versions of restricted files in production. Produced in production. Restricted data is fully stripped from files. These files remain as is.
   * - ur
     - Un-restricted versions of restricted files in production. Produced and archived on a 48hrs delay. Some restricted datasets are unrestricted. Data amounts: restricted > un-restricted > non-restricted
   * - x
     - Experimental global datasets being evaluated for production. Dates and types vary depending on upcoming global upgrades.
   * - y
     - Similar to "x" but only used when there is a duplicate experimental file in the x subfolder with the same name. These files will be different from both the production versions (if that exists already) and the x versions. This suffix is rarely used.
   * - p
     - Pre-production copy of full dump dataset, as produced by NCO during final 30-day parallel ahead of implementation. Not always archived.
Data processing
***************

Upstream of the global-workflow is the collection, quality control, and packaging of observed weather. The handling of that data is done by the OBSPROC group codes and scripts. The global-workflow uses two packages from OBSPROC to run its prep step to prepare observation (dump) data for use by the analysis system:

1. https://github.com/NOAA-EMC/obsproc
2. https://github.com/NOAA-EMC/prepobs

Package versions and locations on supported platforms are set in the global-workflow system configs, modulefiles, and version files.