Add initial Sphinx documentation (#1258)
Adds Sphinx documentation in the docs directory. The documentation reflects the current information in the wiki. Future commits will automate the rendering of the docs to readthedocs and add Sphinx parsing of script documentation.

Refs: #9
arunchawla-NOAA committed Jan 20, 2023
1 parent 78509d6 commit 9fc8275
Showing 23 changed files with 1,794 additions and 0 deletions.
20 changes: 20 additions & 0 deletions docs/Makefile
@@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = source
BUILDDIR = build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
35 changes: 35 additions & 0 deletions docs/make.bat
@@ -0,0 +1,35 @@
@ECHO OFF

pushd %~dp0

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
	set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=source
set BUILDDIR=build

if "%1" == "" goto help

%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
	echo.
	echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
	echo.installed, then set the SPHINXBUILD environment variable to point
	echo.to the full path of the 'sphinx-build' executable. Alternatively you
	echo.may add the Sphinx directory to PATH.
	echo.
	echo.If you don't have Sphinx installed, grab it from
	echo.http://sphinx-doc.org/
	exit /b 1
)

%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end

:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%

:end
popd
2 changes: 2 additions & 0 deletions docs/requirements.txt
@@ -0,0 +1,2 @@
sphinxcontrib-bibtex
sphinx_rtd_theme
Binary file added docs/source/_static/GFS_v16_flowchart.png
19 changes: 19 additions & 0 deletions docs/source/_static/custom.css
@@ -0,0 +1,19 @@
@import "default.css";

div.admonition-todo {
    border-top: 2px solid red;
    border-bottom: 2px solid red;
    border-left: 2px solid red;
    border-right: 2px solid red;
    background-color: #ff6347;
}

p.admonition-title {
    display: inline; /* "offline" is not a valid display value */
}

/*p.first.admonition-title {
background-color: #aa6347;
width: 100%;
}
*/
Binary file added docs/source/_static/fv3_rocoto_view.png
Loading
Sorry, something went wrong. Reload?
Sorry, we cannot display this file.
Sorry, this file is invalid so it cannot be displayed.
9 changes: 9 additions & 0 deletions docs/source/_static/theme_overrides.css
@@ -0,0 +1,9 @@
/* !important prevents the common CSS stylesheets from overriding this CSS since on RTD they are loaded after this stylesheet */

.wy-nav-content {
    max-width: 100% !important;
}

.wy-table-responsive table td {
    white-space: normal !important;
}
119 changes: 119 additions & 0 deletions docs/source/clone.rst
@@ -0,0 +1,119 @@
===============================
Clone and build Global Workflow
===============================

^^^^^^^^^^^^^^^^^^
Quick Instructions
^^^^^^^^^^^^^^^^^^

Quick clone/build/link instructions (more detailed instructions below). Note: here we assume that you are using the workflow to run an experiment and are therefore working from the authoritative repository. If you are going to be a developer, follow the instructions in :doc:`development` instead; the steps below are the same except for the repository you clone from.

For forecast-only (coupled or uncoupled)::

   git clone https://github.com/NOAA-EMC/global-workflow.git
   cd global-workflow/sorc
   ./checkout.sh
   ./build_all.sh
   ./link_workflow.sh

For cycled (GSI)::

   git clone https://github.com/NOAA-EMC/global-workflow.git
   cd global-workflow/sorc
   ./checkout.sh -g
   ./build_all.sh
   ./link_workflow.sh

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Clone workflow and component repositories
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

********
Workflow
********

https method::

   git clone https://github.com/NOAA-EMC/global-workflow.git

ssh method (using a password protected SSH key)::

   git clone git@github.com:NOAA-EMC/global-workflow.git

Note: when using the SSH method you need to make sure that your GitHub account is configured for the computer from which you are accessing the repository (see `this link <https://docs.github.com/en/authentication/connecting-to-github-with-ssh/adding-a-new-ssh-key-to-your-github-account>`_).

Check what you just cloned (by default you will have only the develop branch)::

   cd global-workflow
   git branch
   * develop

You now have a cloned copy of the global-workflow git repository. To checkout a branch or tag in your clone::

   git checkout BRANCH_NAME

Note: the branch must already exist. If it does not, create a new branch using the ``-b`` flag::

   git checkout -b BRANCH_NAME

The ``checkout`` command will check out BRANCH_NAME and switch your clone to that branch. Example::

   git checkout my_branch
   git branch
   * my_branch
     develop

**********
Components
**********

Once you have cloned the workflow repository it is time to check out its components. The components will be checked out under the /sorc folder via a script called checkout.sh. Run the script with no arguments for forecast-only::

   cd sorc
   ./checkout.sh

Or with the ``-g`` switch to include GSI for cycling::

   cd sorc
   ./checkout.sh -g

If you wish to run with the operational GTG UPP and WAFS (only for select users), provide the ``-o`` flag to checkout.sh::

   ./checkout.sh -o

Each component cloned via checkout.sh will have a log (checkout-COMPONENT.log). Check the screen output and logs for clone errors.
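A quick way to act on that advice is to scan every checkout log in one pass. A minimal sketch, assuming you are in the sorc directory where checkout.sh writes its logs (the error keywords are illustrative, not an exhaustive list):

```shell
# Scan each component checkout log for common failure keywords.
# Run from global-workflow/sorc after ./checkout.sh completes.
for log in checkout-*.log; do
  [ -e "$log" ] || continue              # no logs present; nothing to scan
  if grep -qiE 'error|fatal' "$log"; then
    echo "possible clone problem in $log"
  fi
done
```

Any log it flags should then be inspected by hand; a clean pass prints nothing.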

^^^^^^^^^^^^^^^^
Build components
^^^^^^^^^^^^^^^^

Under the /sorc folder is a script called ``build_all.sh`` that builds all components. After running checkout.sh, run this script to build all component codes::

   ./build_all.sh [-a UFS_app][-c build_config][-h][-v]
    -a UFS_app:
      Build a specific UFS app instead of the default
    -c build_config:
      Selectively build based on the provided config instead of the default config
    -h:
      Print usage message and exit
    -v:
      Run all scripts in verbose mode

A partial build option is also available via two methods:

a) modify gfs_build.cfg config file to disable/enable particular builds and then rerun build_all.sh

b) run individual build scripts also available in /sorc folder for each component or group of codes

^^^^^^^^^^^^^^^
Link components
^^^^^^^^^^^^^^^

At runtime the global-workflow needs all pieces in place within the main superstructure. To establish this, a link script is run to create symlinks from the top-level folders down to the component files checked out under the /sorc folder.

After running the checkout and build scripts, run the link script::

   ./link_workflow.sh [-o]
   where:
    -o: Run in operations (NCO) mode. This creates copies instead of using symlinks and is generally only used by NCO during installation into production.

102 changes: 102 additions & 0 deletions docs/source/components.rst
@@ -0,0 +1,102 @@
###########################
Global Workflow Components
###########################

The global-workflow is a combination of several components working together to prepare, analyze, produce, and post-process forecast data.

The major components of the system are:

* Workflow
* Pre-processing
* Analysis
* Forecast
* Post-processing
* Verification

The Global Workflow repository contains the workflow layer and, after running the checkout script, the code and scripts for the analysis, forecast, and post-processing components. Any non-workflow component is known as a sub-module. All of the sub-modules of the system reside in their respective repositories on GitHub. The global-workflow sub-modules are obtained by running the checkout script found under the /sorc folder.

======================
Component repositories
======================

Components checked out via sorc/checkout.sh:

* **UFS-Weather-Model** (https://github.com/ufs-community/ufs-weather-model): This is the core model used by the Global-Workflow to provide forecasts. The UFS-weather-model repository is an umbrella repository consisting of coupled earth system components that are all checked out when we check out the code at the top level of the repository
* **GSI** (https://github.com/NOAA-EMC/GSI): This is the core code base for atmospheric Data Assimilation
* **GSI UTILS** (https://github.com/NOAA-EMC/GSI-UTILS): Utility codes needed by GSI to create analyses
* **GSI Monitor** (https://github.com/NOAA-EMC/GSI-Monitor): These tools monitor the GSI package's data assimilation, detecting and reporting missing data sources, low observation counts, and high penalty values
* **GLDAS** (https://github.com/NOAA-EMC/GLDAS): Code base for Land Data Assimilation
* **GDAS** (https://github.com/NOAA-EMC/GDASApp): JEDI-based Data Assimilation system. This system is currently being developed for marine data assimilation and in time will replace GSI for atmospheric data assimilation as well
* **UFS UTILS** (https://github.com/ufs-community/UFS_UTILS): Utility codes needed for UFS-weather-model
* **GFS UTILS** (https://github.com/ufs-community/gfs_utils): Utility codes needed by Global Workflow to run the GFS configuration
* **Verif global** (https://github.com/NOAA-EMC/EMC_verif-global): Verification package to evaluate GFS parallels. It uses MET and METplus. At this moment the verification package is limited to providing atmospheric metrics only
* **GFS WAFS** (https://github.com/NOAA-EMC/EMC_gfs_wafs): Additional post-processing products for aircraft

Note: when running the system in forecast-only mode, the data assimilation components are not needed and are therefore not checked out.

=====================
External dependencies
=====================

^^^^^^^^^
Libraries
^^^^^^^^^

All the libraries needed to run the end-to-end Global Workflow are built using a package manager. Currently these are served via HPC-STACK but will soon be available via SPACK-STACK. These libraries are already available on supported NOAA HPC platforms.

Find information on official installations of HPC-STACK here:

https://github.com/NOAA-EMC/hpc-stack/wiki/Official-Installations

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Observation data (OBSPROC/prep)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
****
Data
****

Observation data, also known as dump data, is prepared in production and then archived in a global dump archive (GDA) for use when running cycled experiments. The GDA (identified as ``$DMPDIR`` in the workflow) is available on supported platforms and the workflow system knows where to find the data.

* Hera: /scratch1/NCEPDEV/global/glopara/dump
* Orion: /work/noaa/rstprod/dump
* WCOSS2: /lfs/h2/emc/global/noscrub/emc.global/dump
* S4: /data/prod/glopara/dump

-----------------------------
Global Dump Archive Structure
-----------------------------

The global dump archive (GDA) mimics the structure of its production source: ``DMPDIR/CDUMP.PDY/[CC/atmos/]FILES``

The ``CDUMP`` is either gdas, gfs, or rtofs. All three contain production output for each day (``PDY``). The gdas and gfs folders are further broken into cycle (``CC``) and component (atmos).
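The path pattern can be made concrete with a short sketch. The ``DMPDIR`` value below is the Hera location listed above; the date is an illustrative ``PDY``:

```shell
# Illustrative expansion of the GDA pattern DMPDIR/CDUMP.PDY/[CC/atmos/]FILES.
DMPDIR=/scratch1/NCEPDEV/global/glopara/dump   # Hera GDA root (from the list above)
PDY=20230120                                   # example date, YYYYMMDD

echo "${DMPDIR}/gdas.${PDY}/00/atmos/"   # gdas dump, 00z cycle, atmos component
echo "${DMPDIR}/gfs.${PDY}/12/atmos/"    # gfs dump, 12z cycle, atmos component
echo "${DMPDIR}/rtofs.${PDY}/"           # rtofs is not split by cycle/component
```

The gdas and gfs trees carry the extra ``CC/atmos`` levels; rtofs does not.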

The GDA also contains special versions of some datasets and experimental data that is being evaluated ahead of implementation into production. The following subfolder suffixes exist:

+--------+------------------------------------------------------------------------------------------------------+
| Suffix | What |
+--------+------------------------------------------------------------------------------------------------------+
| nr | Non-restricted versions of restricted files in production. |
+--------+------------------------------------------------------------------------------------------------------+
| ur | Un-restricted versions of restricted files in production. Produced and archived on a 48hrs delay. |
+--------+------------------------------------------------------------------------------------------------------+
| x | Experimental global datasets being evaluated for production. Dates and types vary depending on |
| | upcoming global upgrades. |
+--------+------------------------------------------------------------------------------------------------------+
| y | Similar to "x" but only used when there is a duplicate experimental file that is in the x subfolder |
| | with the same name. These files will be different from both the production versions |
| | (if that exists already) and the x versions. This suffix is rarely used. |
+--------+------------------------------------------------------------------------------------------------------+
| p | Pre-production copy of full dump dataset, as produced by NCO during final 30-day parallel ahead of |
| | implementation. Not always archived. |
+--------+------------------------------------------------------------------------------------------------------+

***************
Data processing
***************

Upstream of the global-workflow is the collection, quality control, and packaging of observed weather data. The handling of that data is done by the OBSPROC group's codes and scripts. The global-workflow uses two packages from OBSPROC in its prep step to prepare observation data for use by the analysis system:

1. https://github.com/NOAA-EMC/obsproc
2. https://github.com/NOAA-EMC/prepobs

Both package versions and locations on supported platforms are set in the global-workflow system configs.
