From f73fc3a8e6afc0415f34e371f8122e78663819d7 Mon Sep 17 00:00:00 2001 From: Michael Kavulich Date: Wed, 29 Jun 2022 11:58:26 -0600 Subject: [PATCH] Final updates for v6 (#62) --- CCPPtechnical/source/Acronyms.rst | 4 +- CCPPtechnical/source/AddingNewSchemes.rst | 29 +--- CCPPtechnical/source/AutoGenPhysCaps.rst | 7 +- CCPPtechnical/source/CCPPDebug.rst | 6 +- CCPPtechnical/source/CCPPPreBuild.rst | 156 +++++++++++++++--- .../source/CompliantPhysicsParams.rst | 6 - CCPPtechnical/source/ConfigBuildOptions.rst | 8 +- CCPPtechnical/source/ConstructingSuite.rst | 22 +-- CCPPtechnical/source/Glossary.rst | 60 ++++--- CCPPtechnical/source/Overview.rst | 37 ++--- CCPPtechnical/source/ParamSpecificOutput.rst | 7 +- CCPPtechnical/source/ScientificDocRules.inc | 6 +- CCPPtechnical/source/conf.py | 2 +- 13 files changed, 222 insertions(+), 128 deletions(-) diff --git a/CCPPtechnical/source/Acronyms.rst b/CCPPtechnical/source/Acronyms.rst index 01e6fa9..8653932 100644 --- a/CCPPtechnical/source/Acronyms.rst +++ b/CCPPtechnical/source/Acronyms.rst @@ -92,6 +92,8 @@ Acronyms | NUOPC | National Unified Operational Prediction | | | Capability | +----------------+---------------------------------------------------+ + | NWP | Numerical Weather Prediction | + +----------------+---------------------------------------------------+ | OpenMP | Open Multi-Processing | +----------------+---------------------------------------------------+ | PBL | Planetary Boundary Layer | @@ -138,7 +140,5 @@ Acronyms +----------------+---------------------------------------------------+ | UFS | Unified Forecast System | +----------------+---------------------------------------------------+ - | VLab | Virtual Laboratory | - +----------------+---------------------------------------------------+ | WRF | Weather Research and Forecasting | +----------------+---------------------------------------------------+ diff --git a/CCPPtechnical/source/AddingNewSchemes.rst b/CCPPtechnical/source/AddingNewSchemes.rst index 9ce4247..2a249ed 100644 --- a/CCPPtechnical/source/AddingNewSchemes.rst +++ b/CCPPtechnical/source/AddingNewSchemes.rst @@ -4,17 +4,17 @@ Tips for Adding a New Scheme **************************************** -This chapter contains a brief description on how to add a new scheme to the *CCPP Physics* pool. +This chapter contains a brief description on how to add a new :term:`scheme` to the :term:`CCPP Physics` pool. * Identify the variables required for the new scheme and check if they are already available for use in the CCPP by checking the metadata information in ``GFS_typedefs.meta`` or by perusing file ``ccpp-framework/doc/DevelopersGuide/CCPP_VARIABLES_{FV3,SCM}.pdf`` generated by ``ccpp_prebuild.py``. - * If the variables are already available, they can be invoked in the scheme’s metadata file and one can skip the rest of this subsection. If the variable required is not available, consider if it can be calculated from the existing variables in the CCPP. If so, an interstitial scheme (such as ``scheme_pre``; see more in :numref:`Chapter %s `) can be created to calculate the variable. However, the variable must be defined but not initialized in the host model as the memory for this variable must be allocated on the host model side. Instructions for how to add variables to the host model side is described in :numref:`Chapter %s `. + * If the variables are already available, they can be invoked in the scheme’s metadata file and one can skip the rest of this subsection. 
If the variable required is not available, consider if it can be calculated from the existing variables in the CCPP. If so, an :term:`interstitial scheme` (such as ``scheme_pre``; see more in :numref:`Chapter %s `) can be created to calculate the variable. However, the variable must be defined but not initialized in the :term:`host model` as the memory for this variable must be allocated on the host model side. Instructions for how to add variables to the host model side is described in :numref:`Chapter %s `. - .. note:: The CCPP framework is capable of performing automatic unit conversions between variables provided by the host model and variables required by the new scheme. See :numref:`Section %s ` for details. + .. note:: The :term:`CCPP framework` is capable of performing automatic unit conversions between variables provided by the host model and variables required by the new scheme. See :numref:`Section %s ` for details. * If new namelist variables need to be added, the ``GFS_control_type`` DDT should be used. In this case, it is also important to modify the namelist file ``input.nml`` to include the new variable. - * It is important to note that not all data types are persistent in memory. Most variables in the interstitial data type are reset (to zero or other initial values) at the beginning of a physics group and do not persist from one set to another or from one group to another. The diagnostic data type is periodically reset because it is used to accumulate variables for given time intervals. However, there is a small subset of interstitial variables that are set at creation time and are not reset; these are typically dimensions used in other interstitial variables. + * It is important to note that not all data types are persistent in memory. Most variables in the interstitial data type are reset (to zero or other initial values) at the beginning of a physics :term:`group` and do not persist from one :term:`set` to another or from one group to another. The diagnostic data type is periodically reset because it is used to accumulate variables for given time intervals. However, there is a small subset of interstitial variables that are set at creation time and are not reset; these are typically dimensions used in other interstitial variables. .. note:: If the value of a variable must be remembered from one call to the next, it should not be in the interstitial or diagnostic data types. @@ -22,16 +22,16 @@ This chapter contains a brief description on how to add a new scheme to the *CCP * Consider allocating the new variable only when needed (i.e. when the new scheme is used and/or when a certain control flag is set). If this is a viable option, following the existing examples in ``GFS_typedefs.F90`` and ``GFS_typedefs.meta`` for allocating the variable and setting the ``active`` attribute in the metadata correctly. -* If an entirely new variable needs to be added, consult the CCPP standard names dictionary and the rules for creating new standard names at https://github.com/escomp/CCPPStandardNames. If in doubt, use the GitHub discussions page in the CCPP Framework repository (https://github.com/ncar/ccpp-framework) to discuss the suggested new standard name(s) with the CCPP developers. +* If an entirely new variable needs to be added, consult the CCPP :term:`standard names` dictionary and the rules for creating new standard names at https://github.com/escomp/CCPPStandardNames. 
If in doubt, use the GitHub discussions page in the CCPP Framework repository (https://github.com/ncar/ccpp-framework) to discuss the suggested new standard name(s) with the CCPP developers. -* Examine scheme-specific and suite interstitials to see what needs to be replaced/changed; then check existing scheme interstitial and determine what needs to replicated. Identify if your new scheme requires additional interstitial code that must be run before or after the scheme and that cannot be part of the scheme itself, for example because of dependencies on other schemes and/or the order the scheme is run in the SDF. +* Examine scheme-specific and suite interstitials to see what needs to be replaced/changed; then check existing scheme interstitial and determine what needs to replicated. Identify if your new scheme requires additional interstitial code that must be run before or after the scheme and that cannot be part of the scheme itself, for example because of dependencies on other schemes and/or the order the scheme is run in the :term:`SDF`. * Follow the guidelines outlined in :numref:`Chapter %s ` to make your scheme CCPP-compliant. Make sure to use an uppercase suffix ``.F90`` to enable C preprocessing. * Locate the CCPP *prebuild* configuration files for the target host model, for example: - * ``ufs-weather-model/FV3/ccpp/config/ccpp_prebuild_config.py`` for the UFS Atmosphere - * ``ccpp-scm/ccpp/config/ccpp_prebuild_config.py`` for the SCM + * ``ufs-weather-model/FV3/ccpp/config/ccpp_prebuild_config.py`` for the :term:`UFS Atmosphere` + * ``ccpp-scm/ccpp/config/ccpp_prebuild_config.py`` for the :term:`SCM` * Add the new scheme to the Python dictionary in ``ccpp_prebuild_config.py`` using the same path as the existing schemes: @@ -43,19 +43,6 @@ This chapter contains a brief description on how to add a new scheme to the *CCP ’../some_relative_path/new_scheme.F90’, ...] -* If the new scheme uses optional arguments, add information on which ones to use further down in the configuration file. See existing entries and documentation in the configuration file for the possible options: - - .. code-block:: console - - OPTIONAL_ARGUMENTS = { - ’SCHEME_NAME’ : { - ’SCHEME_NAME_run’ : [ - # list of all optional arguments in use for this - # model, by standard_name ], - # instead of list [...], can also say ’all’ or ’none’ - }, - } - * Place new scheme in the same location as existing schemes in the CCPP directory structure, e.g., ``../some_relative_path/new_scheme.F90``. * Edit the SDF and add the new scheme at the place it should be run. SDFs are located in diff --git a/CCPPtechnical/source/AutoGenPhysCaps.rst b/CCPPtechnical/source/AutoGenPhysCaps.rst index c918412..3b6aaf5 100644 --- a/CCPPtechnical/source/AutoGenPhysCaps.rst +++ b/CCPPtechnical/source/AutoGenPhysCaps.rst @@ -4,11 +4,11 @@ Suite and Group *Caps* **************************************** -The connection between the host model and the physics schemes through the CCPP Framework -is realized with *caps* on both sides as illustrated in :numref:`Figure %s `. +The connection between the :term:`host model` and the physics :term:`schemes` through the :term:`CCPP Framework` +is realized with :term:`caps` on both sides as illustrated in :numref:`Figure %s `. The CCPP *prebuild* script discussed in :numref:`Chapter %s ` generates the *caps* that connect the physics schemes to the CCPP Framework. 
-This chapter describes the suite and group *caps*, +This chapter describes the :term:`suite` and :term:`group caps`, while the host model *caps* are described in :numref:`Chapter %s `. These *caps* autogenerated by ``ccpp_prebuild.py`` reside in the directory defined by the ``CAPS_DIR`` variable (see example in :ref:`Listing 8.1 `). @@ -33,6 +33,7 @@ The CCPP *prebuild* step performs the tasks below. * Populate makefiles with schemes and *caps*. The *prebuild* step will produce the following files for any host model. Note that the location of these files varies between the host models and whether an in-source or out-of-source build is used. + * List of variables provided by host model and required by physics: .. code-block:: console diff --git a/CCPPtechnical/source/CCPPDebug.rst b/CCPPtechnical/source/CCPPDebug.rst index bd14b3b..b4899bf 100644 --- a/CCPPtechnical/source/CCPPDebug.rst +++ b/CCPPtechnical/source/CCPPDebug.rst @@ -22,7 +22,7 @@ Two categories of debugging with CCPP Debugging the actual physical parameterizations is identical in CCPP and in physics-driver based models. The parameterizations have access to the same data and debug print statements can be added in exactly the same way. * Debugging on a suite level - Debugging on a suite level, i.e. outside physical parameterizations, corresponds to debugging on the physics-driver level in traditional, physics-driver based models. In the CCPP, this can be achieved by using dedicated CCPP-compliant debugging schemes, which have access to all the data by requesting them via the metadata files. These schemes can then be called in any place in a SDF, except the ``fast_physics`` group, to produce the desired debugging output. The advantage of this approach is that debugging schemes can be moved from one place to another or duplicated by simply moving/copying a single line in the SDF before recompiling the code. The disadvantage is that different debugging schemes may be needed, depending on the host model and their data structures. For example, the UFS models use blocked data structures. The blocked data structures are commonly known as “GFS types”, are defined in ``GFS_typedefs.F90`` and exposed to the CCPP in ``GFS_typedefs.meta``. The rationale for this storage model is a better cache reuse by breaking up contiguous horizontal grid columns into N blocks with a predefined block size, and allocating each of the GFS types N times. For example, the 3-dimensional air temperature is stored as + Debugging on a suite level, i.e. outside physical parameterizations, corresponds to debugging on the physics-driver level in traditional, physics-driver based models. In the CCPP, this can be achieved by using dedicated CCPP-compliant debugging schemes, which have access to all the data by requesting them via the metadata files. These schemes can then be called in any place in an SDF, except the ``fast_physics`` group, to produce the desired debugging output. The advantage of this approach is that debugging schemes can be moved from one place to another or duplicated by simply moving/copying a single line in the SDF before recompiling the code. The disadvantage is that different debugging schemes may be needed, depending on the host model and their data structures. For example, the UFS models use blocked data structures. The blocked data structures are commonly known as “GFS types”, are defined in ``GFS_typedefs.F90`` and exposed to the CCPP in ``GFS_typedefs.meta``. 
The rationale for this storage model is a better cache reuse by breaking up contiguous horizontal grid columns into N blocks with a predefined block size, and allocating each of the GFS types N times. For example, the 3-dimensional air temperature is stored as .. code-block:: console @@ -30,7 +30,7 @@ Two categories of debugging with CCPP .. _codeblockends: - Further, the UFS models run a subset of physics inside the dynamical core (“fast physics”), for which the host model data is stored inside the dynamical core and cannot be shared with the traditional (“slow”) physics. As such, different debugging schemes are required for the ``fast_physics`` group. + Further, the UFS models run a subset of physics inside the dynamical core (“:term:`fast physics`”), for which the host model data is stored inside the dynamical core and cannot be shared with the traditional (“:term:`slow`”) physics. As such, different debugging schemes are required for the ``fast_physics`` group. ============================================ @@ -200,7 +200,7 @@ Below is an example for an SDF that prints debugging output from the standard/pe How to customize the debugging schemes and the output for arrays in the UFS --------------------------------------------------------------------------- -At the top of ``GFS_debug.F90``, there are customization options in the form of preprocessor directives (CPP ``#ifdef`` etc statements) and a brief documentation. Users not familiar with preprocessor directives are referred to the available documentation such as `Using fpp Preprocessor Directives `_ +At the top of ``GFS_debug.F90``, there are customization options in the form of preprocessor directives (CPP ``#ifdef`` etc statements) and a brief documentation. Users not familiar with preprocessor directives are referred to the available documentation such as `Using fpp Preprocessor Directives `_ At this point, three options exist: (1) full output of every element of each array if none of the #define preprocessor statements is used, (2) minimum, maximum, and mean value of arrays (default for GNU compiler), and (3) minimum, maximum, and 32-bit Adler checksum of arrays (default for Intel compiler). Note that Option (3), the Adler checksum calculation, cannot be used with gfortran (segmentation fault, bug in malloc?). .. code-block:: console diff --git a/CCPPtechnical/source/CCPPPreBuild.rst b/CCPPtechnical/source/CCPPPreBuild.rst index abd04d8..3bafe6c 100644 --- a/CCPPtechnical/source/CCPPPreBuild.rst +++ b/CCPPtechnical/source/CCPPPreBuild.rst @@ -10,15 +10,15 @@ Technical Aspects of the CCPP *Prebuild* The :term:`CCPP` *prebuild* script ``ccpp-framework/scripts/ccpp_prebuild.py`` is the central piece of code that connects the host model with the :term:`CCPP Physics` schemes (see :numref:`%s `). This script must be run -before compiling the :term:`CCPP Physics` library and the host model cap. This may be done manually or as part -of a host model build-time script. Both the UFS and SCM have incorporated the calls to ``ccpp_prebuild.py`` in their build systems. +before compiling the CCPP Physics library and the host model cap. This may be done manually or as part +of a host model build-time script. Both the :term:`UFS` and :term:`SCM` have incorporated the calls to ``ccpp_prebuild.py`` in their build systems. 
-The :term:`CCPP` *prebuild* script automates several tasks based on the information collected from the metadata +The CCPP *prebuild* script automates several tasks based on the information collected from the metadata on the host model side and from the individual physics schemes (``.meta`` files; see :numref:`Figure %s `): * Compiles a list of variables provided by the host model. - * Compiles a list of variables required to run all schemes in the :term:`CCPP Physics` pool. + * Compiles a list of variables required to run all schemes in the CCPP Physics pool. * Matches these variables by their ``standard_name``, checks for missing variables and mismatches of their attributes (e.g., units, rank, type, kind). Performs automatic unit conversions if a mismatch of units @@ -29,9 +29,9 @@ on the host model side and from the individual physics schemes (``.meta`` files; * Autogenerates software caps as appropriate: * The script generates caps for the suite as a whole and physics groups as defined in the input - :term:`SDF`\s; in addition, the :term:`CCPP` API for the build is generated. + SDFs; in addition, the CCPP API for the build is generated. - * Populates makefiles with kind/type definitions, schemes, caps. Statements to compile the :term:`CCPP` API are included as well. + * Populates makefiles with kind/type definitions, schemes, caps. Statements to compile the CCPP API are included as well. .. _ccpp_prebuild: @@ -46,7 +46,7 @@ on the host model side and from the individual physics schemes (``.meta`` files; Script Configuration ============================= -To connect the :term:`CCPP` with a host model ``XYZ``, a Python-based configuration file for this model must be created in the host model’s repository. The easiest way is to copy an existing configuration file for the SCM in sub-directory ``ccpp/config`` of the ccpp-scm repository. The configuration in ``ccpp_prebuild_config.py`` depends largely on (a) the directory structure of the host model itself, (b) where the ``ccpp-framework`` and the ``ccpp-physics`` directories are located relative to the directory structure of the host model, and (c) from which directory the ``ccpp_prebuild.py`` script is executed before/during the build process (this is referred to as basedir in ``ccpp_prebuild_config_XYZ.py``). +To connect the CCPP with a host model ``XYZ``, a Python-based configuration file for this model must be created in the host model’s repository. The easiest way is to copy an existing configuration file for the SCM in sub-directory ``ccpp/config`` of the ccpp-scm repository. The configuration in ``ccpp_prebuild_config.py`` depends largely on (a) the directory structure of the host model itself, (b) where the ``ccpp-framework`` and the ``ccpp-physics`` directories are located relative to the directory structure of the host model, and (c) from which directory the ``ccpp_prebuild.py`` script is executed before/during the build process (this is referred to as basedir in ``ccpp_prebuild_config_XYZ.py``). :ref:`Listing 8.1 ` contains an example for the CCPP-SCM prebuild config. Here, both ``ccpp-framework`` and ``ccpp-physics`` are located in directories ``ccpp/framework`` and ``ccpp/physics`` of the top-level directory of the host model, and ``ccpp_prebuild.py`` is executed from the same top-level directory. 
@@ -108,17 +108,6 @@ To connect the :term:`CCPP` with a host model ``XYZ``, a Python-based configurat # Directory where the suite definition files are stored SUITES_DIR = 'ccpp/suites' - # Optional arguments - only required for schemes that use - # optional arguments. ccpp_prebuild.py will throw an exception - # if it encounters a scheme subroutine with optional arguments - # if no entry is made here. Possible values are: 'all', 'none', - # or a list of standard_names: [ 'var1', 'var3' ]. - OPTIONAL_ARGUMENTS = { - #’subroutine_name_1’ : ’all’, - #’subroutine_name_2’ : ’none’, - #’subroutine_name_3’ : [ ’var1’, ’var2’], - } - # Directory where to write static API to STATIC_API_DIR = 'scm/src/' STATIC_API_SRCFILE = 'scm/src/CCPP_STATIC_API.sh' @@ -140,7 +129,7 @@ Although most of the variables in the ``ccpp_prebuild_config.py`` script are des Running ccpp_prebuild.py ============================= -Once the configuration in ``ccpp_prebuild_config.py`` is complete, the ``ccpp_prebuild.py`` script can be run from a specific directory, dependent on the host model. For the SCM, this is the top level directory, i.e. the correct call to the script is ``./ccpp/framework/scripts/ccpp_prebuild.py``. For the :term:`UFS` Atmosphere host model, the script needs to be called from subdirectory ``FV3/ccpp``, relative to the top-level ``ufs-weather-model`` directory. In the following, we use the SCM directory structure. Note that for both SCM and :term:`UFS`, the ``ccpp_prebuild.py`` script is called automatically by the build system. +Once the configuration in ``ccpp_prebuild_config.py`` is complete, the ``ccpp_prebuild.py`` script can be run from a specific directory, dependent on the host model. For the SCM, this is the top level directory, i.e. the correct call to the script is ``./ccpp/framework/scripts/ccpp_prebuild.py``. For the :term:`UFS Atmosphere` host model, the script needs to be called from subdirectory ``FV3/ccpp``, relative to the top-level ``ufs-weather-model`` directory. In the following, we use the SCM directory structure. Note that for both SCM and :term:`UFS`, the ``ccpp_prebuild.py`` script is called automatically by the build system. For developers adding a CCPP-compliant physics scheme, running ``ccpp_prebuild.py`` periodically is recommended to check that the metadata provided with the physics schemes matches what the host model provided. As alluded to above, the ``ccpp_prebuild.py`` script has six command line options, with the path to a host-model specific configuration file (``--config``) being the only required option: @@ -162,7 +151,7 @@ An example invocation of running the script (called from the SCM’s top level d which uses a configuration script located at the specified path. The ``--verbose`` option can be used for more verbose output from the script. -The :term:`SDF`\(s) to compile into the executable can be specified using the ``--suites`` command-line argument. Such files are included with the SCM and ufs-weather-model repositories, and must be included with the code of any host model to use the :term:`CCPP`\. An example of a build using two :term:`SDF`\s is: +The :term:`SDF`\(s) to compile into the executable can be specified using the ``--suites`` command-line argument. Such files are included with the SCM and ufs-weather-model repositories, and must be included with the code of any host model to use the CCPP. An example of a build using two SDFs is: .. 
code-block:: console @@ -176,7 +165,7 @@ The :term:`SDF`\(s) to compile into the executable can be specified using the `` The ``--debug`` command-line argument enables additional checks on array sizes inside the auto-generated software caps, prior to entering any of the schemes. -If the :term:`CCPP` *prebuild* step is successful, the last output line will be: +If the CCPP *prebuild* step is successful, the last output line will be: ``INFO: CCPP prebuild step completed successfully.`` @@ -197,11 +186,11 @@ If invoking the ``ccpp_prebuild.py`` script fails, some message other than the s #. ``ERROR: Configuration file`` erroneous/path/to/config/file ``not found`` * Check that the path entered for the ``--config`` command line option points to a readable configuration file. #. ``KeyError``: 'erroneous_scheme_name' when using the ``--suites`` option - * This error indicates that a scheme within the supplied :term:`SDF`\s does not match any scheme names found in the SCHEME_FILES variable of the supplied configuration file that lists scheme source files. Double check that the scheme’s source file is included in the SCHEME_FILES list and that the scheme name that causes the error is spelled correctly in the supplied :term:`SDF`\s and matches what is in the source file (minus any ``*_timestep_init``, ``*_init``, ``*_run``, ``*_finalize``, ``*_timestep_finalize`` suffixes). + * This error indicates that a scheme within the supplied :term:`SDF`\s does not match any scheme names found in the SCHEME_FILES variable of the supplied configuration file that lists scheme source files. Double check that the scheme’s source file is included in the SCHEME_FILES list and that the scheme name that causes the error is spelled correctly in the supplied SDFs and matches what is in the source file (minus any ``*_timestep_init``, ``*_init``, ``*_run``, ``*_finalize``, ``*_timestep_finalize`` suffixes). #. ``CRITICAL: Suite definition file`` erroneous/path/to/SDF.xml ``not found``. ``Exception: Parsing suite definition file`` erroneous/path/to/SDF.xml ``failed``. - * Check that the path ``SUITES_DIR`` in the :term:`CCPP` prebuild config and the names entered for the ``--suites`` command line option are correct. + * Check that the path ``SUITES_DIR`` in the CCPP prebuild config and the names entered for the ``--suites`` command line option are correct. #. ``INFO: Parsing metadata tables for variables provided by host model`` … ``IOError: [Errno 2] No such file or directory``: 'erroneous_file.f90' @@ -267,7 +256,7 @@ If invoking the ``ccpp_prebuild.py`` script fails, some message other than the s #. ``ERROR: Variable`` X ``requested by MODULE_``\Y ``SCHEME_``\Z ``SUBROUTINE_``\A ``not provided by the model`` ``Exception: Call to compare_metadata failed.`` - * A variable requested by one or more physics schemes is not being provided by the host model. If the variable exists in the host model but is not being made available for the :term:`CCPP`, an entry must be added to one of the host model variable metadata sections. + * A variable requested by one or more physics schemes is not being provided by the host model. If the variable exists in the host model but is not being made available for the CCPP, an entry must be added to one of the host model variable metadata sections. #. ``ERROR: error, variable`` X ``requested by MODULE_``\Y ``SCHEME_``\Z ``SUBROUTINE_``\A ``cannot be identified unambiguously. 
Multiple definitions in MODULE_``\Y ``TYPE_``\B * A variable is defined in the host model variable metadata more than once (with the same standard name). Remove the offending entry or provide a different standard name for one of the duplicates. #. ``ERROR: incompatible entries in metadata for variable`` var_name: @@ -306,7 +295,7 @@ Note: One error that the ``ccpp_prebuild.py`` script will not catch is if a phys CCPP Stub Build ======================================================== -New in version 6.0, CCPP includes a *stub* capability, which will build the appropriate basic software caps needed for the compilation of the host model, but not include any of the physics itself. This can be useful for host model debugging, testing "dry" dynamics with no parameterizations, and other use cases where building the whole CCPP physics library would be unnecessary. Currently this capability is only supported for the UFS Atmosphere. +New in version 6.0, CCPP includes a *stub* capability, which will build the appropriate basic software caps needed for the compilation of the :term:`host model`, but not include any of the physics itself. This can be useful for host model debugging, testing "dry" dynamics with no parameterizations, and other use cases where building the whole CCPP physics library would be unnecessary. Currently this capability is only supported for the :term:`UFS Atmosphere`. To create the stub software caps, rather than using the host configuration file as described above, users can use the provided stub config file ``ccpp/framework/stub/ccpp_prebuild_config.py``. From the ``ccpp/framework/stub`` directory, @@ -320,3 +309,120 @@ the prebuild script can be called in this manner to use the CCPP stub build: The rest of the UFS Atmosphere build can proceed as normal. +======================================================== +CCPP Physics Variable Tracker +======================================================== + +New in version 6.0, CCPP includes a tool that allows users to track a given variable's journey +through a specified physics suite. This tool, ``ccpp-framework/scripts/ccpp_track_variables.py``, +given a :term:`suite definition file` and the :term:`standard name` of a variable, +will output the list of subroutines that use this variable -- in the order that they are called -- +as well as the variable's Fortran *intent* +(``in``, ``out``, or ``inout``) within that subroutine. This can allow the user to more easily +determine where specific errors, biases, or other influences on a specific variable or variables +might originate from within the physics suite. The ``--help`` option will give a basic rundown of +how to use the script: + +.. code-block:: console + + ./ccpp_track_variables.py --help + usage: ccpp_track_variables.py [-h] -s SDF -m METADATA_PATH -c CONFIG -v VARIABLE [--debug] + + optional arguments: + -h, --help show this help message and exit + -s SDF, --sdf SDF suite definition file to parse + -m METADATA_PATH, --metadata_path METADATA_PATH + path to CCPP scheme metadata files + -c CONFIG, --config CONFIG + path to CCPP prebuild configuration file + -v VARIABLE, --variable VARIABLE + variable to track through CCPP suite + --debug enable debugging output + +For this initial implementation, this script must be executed from within a :term:`host model`, and must be +called from the same directory that the ``ccpp_prebuild.py`` script is called from. 
This first example is called using the :term:`UFS Atmosphere` as a host model, from the directory ``ufs-weather-model/FV3/ccpp``:

.. code-block:: console

    framework/scripts/ccpp_track_variables.py -c=config/ccpp_prebuild_config.py \
    -s=suites/suite_FV3_RRFS_v1beta.xml -v air_temperature_of_new_state -m ./physics/physics/
    For suite suites/suite_FV3_RRFS_v1beta.xml, the following schemes (in order for each group) use the variable air_temperature_of_new_state:
    In group physics
      GFS_suite_stateout_reset_run (intent out)
      dcyc2t3_run (intent in)
      GFS_suite_stateout_update_run (intent out)
      ozphys_2015_run (intent in)
      get_phi_fv3_run (intent in)
      GFS_suite_interstitial_3_run (intent in)
      GFS_MP_generic_pre_run (intent in)
      mp_thompson_pre_run (intent in)
      mp_thompson_run (intent inout)
      mp_thompson_post_run (intent inout)
      GFS_MP_generic_post_run (intent in)
      maximum_hourly_diagnostics_run (intent in)
    In group stochastics
      GFS_stochastics_run (intent inout)

In the example above, we can see that the variable ``air_temperature_of_new_state`` is used in
the FV3_RRFS_v1beta suite by a number of schemes in the ``physics`` group, including the Thompson
microphysics scheme and its interstitials, as well as by a stochastics parameterization.

To learn more about a given subroutine, you can search the physics source code within the ``ccpp-physics`` repository,
or you can consult the `CCPP Scientific Documentation `_: typing the subroutine name into the search
bar should lead you to further information about the subroutine and how it ties into its associated physics scheme.
In addition, because of the naming conventions for subroutines in CCPP-compliant physics schemes,
we can typically see which scheme, as well as which phase within that scheme, is associated with the listed subroutine,
without having to consult any further documentation or source code. For example, the ``mp_thompson_run``
subroutine is part of the Thompson microphysics scheme, specifically the *run* phase of that scheme.

This second example is called using the :term:`SCM` as a host model:

.. code-block:: console

    ccpp/framework/scripts/ccpp_track_variables.py --config=ccpp/config/ccpp_prebuild_config.py \
    -s=ccpp/suites/suite_SCM_GFS_v17_p8.xml -v surface_friction_velocity_over_land -m ./ccpp/physics/physics/
    For suite ccpp/suites/suite_SCM_GFS_v17_p8.xml, the following schemes (in order for each group) use the variable surface_friction_velocity_over_land:
    In group physics
      GFS_surface_composites_pre_run (intent inout)
      sfc_diff_run (intent inout)
      noahmpdrv_run (intent inout)
      sfc_diff_run (intent inout)
      noahmpdrv_run (intent inout)
      GFS_surface_composites_post_run (intent in)

In the example above, we can see that the variable ``surface_friction_velocity_over_land`` is used in a few subroutines,
two of which (``sfc_diff_run`` and ``noahmpdrv_run``) are listed twice. This is not an error! The
two repeated subroutines are part of a scheme called in a *subcycle* (see :numref:`Section %s `), and so they are called twice in this cycle as designated in the SDF.
The ``ccpp_track_variables.py`` script lists the subroutines in the exact order they are called (within each *group*), including subcycles.

Some standard names can be exceedingly long and hard to remember, and it is not always convenient to search the full list of standard names for the exact variable you want. Therefore, this script will also return matches for partial variable names.
In this example, we will look for the variable "velocity", which is not a standard name of any variable, and see what it returns: + +.. code-block:: console + + framework/scripts/ccpp_track_variables.py --config=config/ccpp_prebuild_config.py \ + -s=suites/suite_FV3_GFS_v16.xml -v velocity -m ./physics/physics/ + Variable velocity not found in any suites for sdf suites/suite_FV3_GFS_v16.xml + + ERROR:ccpp_track_variables:Variable velocity not found in any suites for sdf suites/suite_FV3_GFS_v16.xml + + Did find partial matches that may be of interest: + + In GFS_surface_composites_pre_run found variable(s) ['surface_friction_velocity', 'surface_friction_velocity_over_water', 'surface_friction_velocity_over_land', 'surface_friction_velocity_over_ice'] + In sfc_diff_run found variable(s) ['surface_friction_velocity_over_water', 'surface_friction_velocity_over_land', 'surface_friction_velocity_over_ice'] + In GFS_surface_composites_post_run found variable(s) ['surface_friction_velocity', 'surface_friction_velocity_over_water', 'surface_friction_velocity_over_land', 'surface_friction_velocity_over_ice'] + In cires_ugwp_run found variable(s) ['angular_velocity_of_earth'] + In samfdeepcnv_run found variable(s) ['vertical_velocity_for_updraft', 'cellular_automata_vertical_velocity_perturbation_threshold_for_deep_convection'] + +While the script did not find the variable specified, it did find several partial matches -- ``surface_friction_velocity``, ``surface_friction_velocity_over_water``, ``surface_friction_velocity_over_land``, etc. -- as well as the subroutines they were found in. You can then use this more specific information to refine your next query: + +.. code-block:: console + + framework/scripts/ccpp_track_variables.py --config=config/ccpp_prebuild_config.py \ + -s=suites/suite_FV3_GFS_v16.xml -v surface_friction_velocity -m ./physics/physics/ + For suite suites/suite_FV3_GFS_v16.xml, the following schemes (in order for each group) use the variable surface_friction_velocity: + In group physics + GFS_surface_composites_pre_run (intent in) + GFS_surface_composites_post_run (intent inout) + + diff --git a/CCPPtechnical/source/CompliantPhysicsParams.rst b/CCPPtechnical/source/CompliantPhysicsParams.rst index de76ac8..dfaf622 100644 --- a/CCPPtechnical/source/CompliantPhysicsParams.rst +++ b/CCPPtechnical/source/CompliantPhysicsParams.rst @@ -33,11 +33,6 @@ The implementation of a driver is reasonable under the following circumstances: intact so that it can be synchronized between the WRF model and the CCPP distributions. See more in ``mp_thompson.F90`` in the ``ccpp-physics/physics`` directory. -* To deal with optional arguments. A driver can check whether optional arguments have been - provided by the host model to either write out a message and return an error code or call a - subroutine with or without optional arguments. For example, see ``mp_thompson.F90``, - ``radsw_main.F90``, or ``radlw_main.F90`` in the ``ccpp-physics/physics`` directory. - * To perform unit conversions or array transformations, such as flipping the vertical direction and rearranging the index order, for example, ``cu_gf_driver.F90`` or ``gfdl_cloud_microphys.F90`` in the ``ccpp-physics/physics`` directory. @@ -569,7 +564,6 @@ Where the following has been added to the ``my_physics.meta`` file: dimensions = () type = real intent = in - optional = F This allows the von Karman constant to be defined by the host model and be passed in through the CCPP scheme subroutine interface. 
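For reference, the sketch below shows what the Fortran side of this example might look like. It is
illustrative only: the entry-point and argument names (``my_physics_run``, ``con_karman``) simply follow the
``my_physics`` example above, and the use of the ``kind_phys`` working precision from the ``machine`` module is
an assumption rather than something prescribed by the metadata snippet. The essential point is that the constant
appears as an ``intent(in)`` dummy argument of the scheme entry point, alongside the mandatory CCPP
error-handling arguments.

.. code-block:: fortran

   subroutine my_physics_run (con_karman, errmsg, errflg)

      use machine, only: kind_phys   ! working-precision kind (assumed)

      implicit none

      ! von Karman constant, provided by the host model
      ! (corresponds to the my_physics.meta entry shown above)
      real(kind_phys),  intent(in)  :: con_karman

      ! mandatory CCPP error-handling variables
      character(len=*), intent(out) :: errmsg
      integer,          intent(out) :: errflg

      errmsg = ''
      errflg = 0

      ! ... scheme computations that use con_karman ...

   end subroutine my_physics_run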
diff --git a/CCPPtechnical/source/ConfigBuildOptions.rst b/CCPPtechnical/source/ConfigBuildOptions.rst index 0ca7c5b..b639108 100644 --- a/CCPPtechnical/source/ConfigBuildOptions.rst +++ b/CCPPtechnical/source/ConfigBuildOptions.rst @@ -3,11 +3,13 @@ ***************************************** CCPP Configuration and Build Options ***************************************** -While the *CCPP Framework* code, consisting of a single Fortran source file and associated metadata file, can be compiled and tested independently, the *CCPP Physics* code can only be used within a host modeling system that provides the variables required to execute the physics. As such, it is advisable to integrate the CCPP configuration and build process with the host model build system. Part of the build process, known as the *prebuild* step since it precedes compilation, involves running a Python script that performs multiple functions. These functions include configuring the *CCPP Physics* for use with the host model and autogenerating FORTRAN code to communicate variables between the physics and the dynamical core. The *prebuild* step will be discussed in detail in :numref:`Chapter %s `. +While the :term:`CCPP Framework` code, consisting of a single Fortran source file and associated metadata file, can be compiled and tested independently, the :term:`CCPP Physics` code can only be used within a host modeling system that provides the variables required to execute the physics. As such, it is advisable to integrate the CCPP configuration and build process with the host model build system. Part of the build process, known as the *prebuild* step since it precedes compilation, involves running a Python script that performs multiple functions. These functions include configuring the *CCPP Physics* for use with the host model and autogenerating FORTRAN code to communicate variables between the physics and the dynamical core. The *prebuild* step will be discussed in detail in :numref:`Chapter %s `. The SCM and the UFS Atmosphere are supported for use with the CCPP. In the case of the UFS Atmosphere as the host model, build configuration options can be specified as cmake options to the ``build.sh`` script for manual compilation or through a regression test (RT) configuration file. Detailed instructions for building the UFS Atmosphere and the SCM are discussed in the -`UFS Weather Model User Guide `_ and the -`SCM User Guide `_. For both SCM and UFS the ``ccpp_prebuild.py`` script is run automatically as a step in the build system, although it can be run manually for debugging purposes. +`UFS Weather Model User Guide `_ +and the `SCM User Guide `_. +For both SCM and UFS the ``ccpp_prebuild.py`` script is run automatically as a step in the build system, +although it can be run manually for debugging purposes. The path to a host-model specific configuration file is the only required argument to ``ccpp_prebuild.py``. 
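For the SCM, for example, the script and its configuration file live in the ``ccpp/framework/scripts`` and
``ccpp/config`` subdirectories, so a minimal manual call (run from the SCM's top-level directory, as described
in the CCPP *prebuild* chapter of this guide) reduces to a single command pointing at that configuration file:

.. code-block:: console

   ./ccpp/framework/scripts/ccpp_prebuild.py --config=ccpp/config/ccpp_prebuild_config.py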
Such files are included with the ccpp-scm and ufs-weather-model repositories, and must be included with the code of diff --git a/CCPPtechnical/source/ConstructingSuite.rst b/CCPPtechnical/source/ConstructingSuite.rst index e0c7b60..7c145e2 100644 --- a/CCPPtechnical/source/ConstructingSuite.rst +++ b/CCPPtechnical/source/ConstructingSuite.rst @@ -8,24 +8,26 @@ Constructing Suites Suite Definition File ============================== -The :term:`SDF` is a file in XML format used to specify the name of the suite, the physics schemes to run, groups of physics that run together, the order in which to run the physics, and whether subcycling will be used to run any of the parameterizations with shorter timesteps. The :term:`SDF` files are part of the host model code. +The :term:`SDF` is a file in XML format used to specify the name of the suite, the physics schemes to run, groups of physics that run together, the order in which to run the physics, and whether subcycling will be used to run any of the parameterizations with shorter timesteps. The SDF files are part of the host model code. -In addition to the primary parameterization categories (such as radiation, boundary layer, deep convection, resolved moist physics, etc.), the :term:`SDF` can have an arbitrary number of interstitial schemes in between the parameterizations to preprocess or postprocess data. In many models, this interstitial code is not obvious to the model user but, with the :term:`SDF`, both the primary parameterizations and the interstitial schemes are listed explicitly. +In addition to the primary parameterization categories (such as radiation, boundary layer, deep convection, resolved moist physics, etc.), the SDF can have an arbitrary number of interstitial schemes in between the parameterizations to preprocess or postprocess data. In many models, this interstitial code is not obvious to the model user but, with the SDF, both the primary parameterizations and the interstitial schemes are listed explicitly. -The format of the :term:`SDF` is specified by a schema and all host models that use CCPP include file ``suite.xsd`` to describe the schema. +The format of the SDF is specified by a schema and all host models that use CCPP include file ``suite.xsd`` to describe the schema. -The name of the suite is listed at the top of the :term:`SDF`, right after the XML declaration, and must be consistent with the name of the :term:`SDF`: file ``suite_ABC.xml`` contains ``suite name=’ABC’``, as in the example below. The suite name is followed by the version of the XML schema used. +The name of the suite is listed at the top of the SDF, right after the XML declaration, and must be consistent with the name of the SDF: file ``suite_ABC.xml`` contains ``suite name=’ABC’``, as in the example below. The suite name is followed by the version of the XML schema used. -------------- Groups -------------- -The concept of grouping physics in the :term:`SDF` (reflected in the ```` elements) enables “groups” of parameterizations to be called with other computation (such as related to the dycore, I/O, etc.) in between. One can edit the groups to suit the needs of the host application. For example, if a subset of physics schemes needs to be more tightly connected with the dynamics and called more frequently, one could create a group consisting of that subset and place a ``ccpp_physics_run`` call in the appropriate place in the host application. 
The remainder of the parameterization groups could be called using ``ccpp_physics_run`` calls in a different part of the host application code. +The concept of grouping physics in the SDF (reflected in the ```` elements) enables “groups” of parameterizations to be called with other computation (such as related to the dycore, I/O, etc.) in between. One can edit the groups to suit the needs of the host application. For example, if a subset of physics schemes needs to be more tightly connected with the dynamics and called more frequently, one could create a group consisting of that subset and place a ``ccpp_physics_run`` call in the appropriate place in the host application. The remainder of the parameterization groups could be called using ``ccpp_physics_run`` calls in a different part of the host application code. + +.. _Subcycling: ----------------- Subcycling ----------------- -The :term:`SDF` allows subcycling of schemes, or calling a subset of schemes at a smaller time step than others. The ```` element in the :term:`SDF` controls this function. All schemes within such an element are called ``n`` times during one ``ccpp_physics_run`` call. An example of this is found in the ``suite_FV3_GFS_v16.xml`` :term:`SDF`, where the surface schemes are executed twice for each timestep (implementing a predictor/corrector paradigm): +The :term:`SDF` allows subcycling of schemes, or calling a subset of schemes at a smaller time step than others. The ```` element in the SDF controls this function. All schemes within such an element are called ``n`` times during one ``ccpp_physics_run`` call. An example of this is found in the ``suite_FV3_GFS_v16.xml`` SDF, where the surface schemes are executed twice for each timestep (implementing a predictor/corrector paradigm): .. code-block:: xml @@ -41,13 +43,13 @@ The :term:`SDF` allows subcycling of schemes, or calling a subset of schemes at GFS_surface_loop_control_part2 -Note that currently no time step information is included in the :term:`SDF` and that the subcycling of schemes resembles more an iteration over schemes with the loop counter being available as integer variable with standard name ``ccpp_loop_counter``. If subcycling is used for a set of parameterizations, the smaller time step must be an input argument for those schemes, or computed in the scheme from the default physics time step (``timestep_for_physics``) and the number of subcycles (``ccpp_loop_extent``). +Note that currently no time step information is included in the SDF and that the subcycling of schemes resembles more an iteration over schemes with the loop counter being available as integer variable with standard name ``ccpp_loop_counter``. If subcycling is used for a set of parameterizations, the smaller time step must be an input argument for those schemes, or computed in the scheme from the default physics time step (``timestep_for_physics``) and the number of subcycles (``ccpp_loop_extent``). ---------------------- Order of Schemes ---------------------- -Schemes may be interdependent and the order in which the schemes are run may make a difference in the model output. Reading the :term:`SDF`\(s) and defining the order of schemes for each suite happens at compile time. Some schemes require additional interstitial code that must be run before or after the scheme and cannot be part of the scheme itself. This can be due to dependencies on other schemes and/or the order of the schemes as determined in the :term:`SDF`. 
Note that more than one SDF can be supplied at compile time, but only one can be used at runtime. +Schemes may be interdependent and the order in which the schemes are run may make a difference in the model output. Reading the SDF(s) and defining the order of schemes for each suite happens at compile time. Some schemes require additional interstitial code that must be run before or after the scheme and cannot be part of the scheme itself. This can be due to dependencies on other schemes and/or the order of the schemes as determined in the SDF. Note that more than one SDF can be supplied at compile time, but only one can be used at runtime. ========================= Interstitial Schemes @@ -62,7 +64,7 @@ SDF Examples Simplest Case: Single Group and no Subcycling ---------------------------------------------------- -Consider the simplest case, in which all physics schemes are to be called together in a single group with no subcycling (i.e. ``subcycle loop="1"``). The subcycle loop must be set in each group. The :term:`SDF` ``suite_Suite_A.xml`` could contain the following: +Consider the simplest case, in which all physics schemes are to be called together in a single group with no subcycling (i.e. ``subcycle loop="1"``). The subcycle loop must be set in each group. The SDF ``suite_Suite_A.xml`` could contain the following: .. code-block:: console @@ -88,7 +90,7 @@ Consider the simplest case, in which all physics schemes are to be called togeth -Note the syntax of the :term:`SDF` file. The root (the first element to appear in the xml file) is the ``suite`` with the ``name`` of the suite given as an attribute. In this example, the suite name is ``Suite_A``. Within each suite are groups, which specify a physics group to call (i.e. ``physics``, ``fast_physics``, ``time_vary``, ``radiation``, ``stochastics``). Each group has an option to subcycle. The value given for loop determines the number of times all of the schemes within the ``subcycle`` element are called. Finally, the ``scheme`` elements are children of the ``subcycle`` elements and are listed in the order they will be executed. In this example, ``scheme_1_pre`` and ``scheme_1_post`` are scheme-specific preprocessing and postprocessing interstitial schemes, respectively. The suite-level preprocessing and postprocessing interstitial ``schemes scheme_2_generic_pre`` and ``scheme_2_generic_post`` are also called in this example. ``Suite_A_interstitial_2`` is a scheme for ``suite_A`` and connects various schemes within this suite. +Note the syntax of the SDF. The root (the first element to appear in the xml file) is the ``suite`` with the ``name`` of the suite given as an attribute. In this example, the suite name is ``Suite_A``. Within each suite are groups, which specify a physics group to call (i.e. ``physics``, ``fast_physics``, ``time_vary``, ``radiation``, ``stochastics``). Each group has an option to subcycle. The value given for loop determines the number of times all of the schemes within the ``subcycle`` element are called. Finally, the ``scheme`` elements are children of the ``subcycle`` elements and are listed in the order they will be executed. In this example, ``scheme_1_pre`` and ``scheme_1_post`` are scheme-specific preprocessing and postprocessing interstitial schemes, respectively. The suite-level preprocessing and postprocessing interstitial ``schemes scheme_2_generic_pre`` and ``scheme_2_generic_post`` are also called in this example. 
``Suite_A_interstitial_2`` is a scheme for ``suite_A`` and connects various schemes within this suite. ------------------------------- Case with Multiple Groups diff --git a/CCPPtechnical/source/Glossary.rst b/CCPPtechnical/source/Glossary.rst index c56d1f7..7a488cc 100644 --- a/CCPPtechnical/source/Glossary.rst +++ b/CCPPtechnical/source/Glossary.rst @@ -4,17 +4,17 @@ Glossary .. glossary:: CCPP - Model agnostic, vetted, collection of codes containing atmospheric physical parameterizations + A model-agnostic, well-vetted collection of codes containing atmospheric physical parameterizations and suites for use in NWP along with a framework that connects the physics to host models *CCPP Framework* - The infrastructure that connects physics schemes with a host model; also refers to a software - repository of the same name + The infrastructure that connects physics schemes with a host model; also refers to a software + repository of the same name *CCPP Physics* The pool of CCPP-compliant physics schemes; also refers to a software repository of the same name - "Fast" physics + Fast physics Physical parameterizations that require tighter coupling with the dynamical core than “slow” physics (due to the approximated processes within the parameterization acting on a shorter timescale) and that benefit from a smaller time step. The distinction is useful for greater @@ -26,7 +26,7 @@ Glossary without intervening computations from the host application Group *cap* - Autogenerated interface between a group of physics schemes and the host model. + Autogenerated interface between a :term:`group` of physics schemes and the host model. Host model/application An atmospheric model that allocates memory, provides metadata for the variables passed into @@ -64,13 +64,17 @@ Glossary Parameterization The representation, in a dynamic model, of physical effects in terms of admittedly oversimplified parameters, rather than realistically requiring such effects to be - consequences of the dynamics of the system (AMS Glossary) + consequences of the dynamics of the system (from the `AMS Glossary `_) + + Phase + A CCPP phase is one of five steps that each physics scheme can be broken down into. Phases + are described in more detail in :numref:`Chapter %c `. Physics *cap* Generic name to refer to suite and group physics caps. Physics Suite *cap* - Autogenerated interface between an entire suite of physics schemes and the host model. + Autogenerated interface between an entire :term:`suite` of physics schemes and the host model. It consists of calls to autogenerated physics group caps. 
It may be used to call an entire suite at once or to call a specific group within a physics suite @@ -79,25 +83,31 @@ Glossary traditionally-accepted definition, as opposed to an interstitial scheme PROD - Compiler flags used by NCEP for operational runs of the UFS Atmosphere and by EMC for - regression tests of the code + Compiler flags used by NCEP for operational (**prod**\ uction) runs of the UFS Atmosphere and by EMC for + regression tests of the code REPRO - Compiler flags used by EMC to guarantee reproducibility of the UFS Atmosphere code + Compiler flags used by EMC to guarantee **repro**\ ducibility of the UFS Atmosphere code Scheme - A CCPP-compliant parameterization (primary scheme) or auxiliary code (interstitial scheme) + A CCPP-compliant parameterization (primary scheme) or auxiliary code (interstitial scheme) SDF - Suite Definition File (SDF) is an external file containing information about the - construction of a physics suite. It describes the schemes that are called, in which - order they are called, whether they are subcycled, and whether they are assembled - into groups to be called together + Suite Definition File (SDF) is an external file containing information about the + construction of a physics suite. It describes the schemes that are called, in which + order they are called, whether they are subcycled, and whether they are assembled + into groups to be called together Set A collection of physics schemes that do not share memory (e.g. fast and slow physics) - "Slow" physics + SCM + The CCPP Single Column Model (SCM) is a simple 1D host model designed to be used with the CCPP + Physics and Framework as a lightweight alternative to full 3D dynamical models for testing + and development of physics schemes and suites. See the `SCM User Guide `_ + for more information. + + Slow physics Physical parameterizations that can tolerate looser coupling with the dynamical core than “fast” physics (due to the approximated processes within the parameterization acting on a longer timescale) and that often use a longer time step. Such parameterizations @@ -105,13 +115,14 @@ Glossary time-splitting) in a section of an atmospheric model that is distinct from the dynamical core in the code organization - Standard_name + Standard name Variable names based on CF conventions (http://cfconventions.org) that are uniquely - identified by the *CCPP-compliant* schemes and provided by a host model + identified by the *CCPP-compliant* schemes and provided by a host model. See + :numref:`Section %s ` for more details. Subcycling Executing a physics scheme more frequently (with a shorter timestep) than the rest of - the model physics or dynamics + the model physics or dynamics. See :numref:`Section %s ` for more details. Suite A collection of primary physics schemes and interstitial schemes that are known to work @@ -125,17 +136,12 @@ Glossary operational numerical weather prediction applications UFS Atmosphere - The atmospheric model component of the UFS. Its fundamental parts are the dynamical + The atmospheric model component of the :term:`UFS`. Its fundamental parts are the dynamical core and the physics UFS Weather Model - Global meduim-range, weather-prediction model previously known as NEMSfv3gfs or FV3GFS - used to create forecasts. 
diff --git a/CCPPtechnical/source/Overview.rst b/CCPPtechnical/source/Overview.rst
index ca516b7..aa72fe4 100644
--- a/CCPPtechnical/source/Overview.rst
+++ b/CCPPtechnical/source/Overview.rst
@@ -107,10 +107,10 @@ undertaken by NOAA and NCAR (see more information at https://github.com/NCAR/ccp
 and https://dtcenter.org/community-code/common-community-physics-package-ccpp).
 The table below lists all parameterizations supported in CCPP public releases and the
-`CCPP Scientific Documentation `_
+`CCPP Scientific Documentation `_
 describes the parameterizations in detail. The parameterizations
 are grouped in suites, which can be classified primarily as *operational* or *developmental*.
-*Operational* suites are those used by operational, real-time weather prediction models. For this release, the only operational suite is GFS_v16, which is used for `version 16 `_ of the GFS model.
+*Operational* suites are those used by operational, real-time weather prediction models. For this release, the only operational suite is GFS_v16, which is used for `version 16 `_ of the GFS model.
 *Developmental* suites are those that are officially supported for this CCPP release with one or more host models, but are not currently used in any operational models. These may include schemes needed exclusively for research, or "release candidate" schemes proposed for use with future operational models.
 .. _scheme_suite_table:
@@ -122,7 +122,7 @@ are grouped in suites, which can be classified primarily as *operational* or *de
 +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+
 | Physics Suite       | GFS_v16          | :g:`GFS_v17_p8`  | :g:`RAP`       |:g:`RRFS_v1beta`| :g:`WoFS`      | :g:`HRRR`      |
 +=====================+==================+==================+================+================+================+================+
- | **Supported hosts** | **SCM/SRW**     | :gb:`SCM/MRW`    | :gb:`SCM`      |:gb:`SCM/SRW`   | :gb:`SCM/SRW`  | :gb:`SCM/SRW`  |
+ | **Supported hosts** | **SCM/SRW**     | :gb:`SCM`        | :gb:`SCM`      |:gb:`SCM/SRW`   | :gb:`SCM/SRW`  | :gb:`SCM/SRW`  |
 +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+
 | Microphysics        | GFDL             | :g:`Thompson`    | :g:`Thompson`  | :g:`Thompson`  | :g:`NSSL`      | :g:`Thompson`  |
 +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+
@@ -147,27 +147,27 @@ are grouped in suites, which can be classified primarily as *operational* or *de
 | Ocean               | NSST             | :g:`NSST`        | :g:`NSST`      | :g:`NSST`      | :g:`NSST`      | :g:`NSST`      |
 +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+
-Only the suites that are currently supported in the CCPP are listed in the table. Currently all supported suites use the 2015 Navy Research Laboratory (NRL) `ozone `_ and `stratospheric water vapor `_ schemes,
+Only the suites that are currently supported in the CCPP are listed in the table. Currently all supported suites use the 2015 Navy Research Laboratory (NRL) `ozone `_ and `stratospheric water vapor `_ schemes,
 and the `NSST `_ ocean scheme.
-The operational GFS_v16 suite includes `GFDL microphysics `_,
-the `Turbulent Kinetic Energy (TKE)-based Eddy Diffusivity Mass-Flux (EDMF) `_ planetary boundary layer (PBL) scheme,
-`scale-aware (sa) Simplified Arakawa-Schubert (SAS) `_ deep convection,
-`scale-aware mass-flux (saMF) `_ shallow convection,
-`Rapid Radiation Transfer Model for General Circulation Models (RRTMG) `_ radiation,
-`GFS surface layer `_ scheme,
-the `Cooperative Institute for Research in the Environmental Sciences (CIRES) unified gravity wave drag (uGWD) `_ scheme,
-and the `Noah Land Surface Model (LSM) `_.
+The operational GFS_v16 suite includes `GFDL microphysics `_,
+the `Turbulent Kinetic Energy (TKE)-based Eddy Diffusivity Mass-Flux (EDMF) `_ planetary boundary layer (PBL) scheme,
+`scale-aware (sa) Simplified Arakawa-Schubert (SAS) `_ deep convection,
+`scale-aware mass-flux (saMF) `_ shallow convection,
+`Rapid Radiation Transfer Model for General Circulation Models (RRTMG) `_ radiation,
+`GFS surface layer `_ scheme,
+the `Cooperative Institute for Research in the Environmental Sciences (CIRES) unified gravity wave drag (uGWD) `_ scheme,
+and the `Noah Land Surface Model (LSM) `_.
 The five developmental suites are either analogues for current operational physics schemes, or candidates for future operational implementations.
-* The GFS_v17_p8 suite is the current (as of June 2022) proposed suite for the next operational GFS implementation (version 17), and features several differences from the GFS_v16 suite, using `Thompson `_ microphysics, `saSAS plus Cellular Automata (CA) `_ deep convection, `Unified uGWP `_ gravity wave drag, and `Noah Multiparameterization (Noah-MP) `_ land surface parameterization.
+* The GFS_v17_p8 suite is the current (as of June 2022) proposed suite for the next operational GFS implementation (version 17), and features several differences from the GFS_v16 suite, using `Thompson `_ microphysics, `saSAS plus Cellular Automata (CA) `_ deep convection, `Unified uGWP `_ gravity wave drag, and `Noah Multiparameterization (Noah-MP) `_ land surface parameterization.
-* The RAP scheme is similar to the operational Rapid Refresh (RAP) model physics package, and features Thompson microphysics, `Mellor-Yamada-Nakanishi-Niino (MYNN) EDMF `_ PBL, `Grell-Freitas (GF) `_ deep convection and shallow convection schemes, RRTMG radiation, `MYNN surface layer (SFL) `_ scheme, `Global Systems Laboratory (GSL) `_ gravity wave drag scheme, and the `Rapid Update Cycle (RUC) Land Surface Model `_.
+* The RAP suite is similar to the operational Rapid Refresh (RAP) model physics package, and features Thompson microphysics, `Mellor-Yamada-Nakanishi-Niino (MYNN) EDMF `_ PBL, `Grell-Freitas (GF) `_ deep convection and shallow convection schemes, RRTMG radiation, `MYNN surface layer (SFL) `_ scheme, `Global Systems Laboratory (GSL) `_ gravity wave drag scheme, and the `Rapid Update Cycle (RUC) Land Surface Model `_.
 * The RRFS_v1beta suite is being used for development of the future `Rapid Refresh Forecast System (RRFS) `_, which is scheduled for implementation in late 2023. This scheme features Thompson microphysics, MYNN EDMF PBL, RRTMG radiation, MYNN SFL, CIRES uGWD, and Noah-MP land surface (it does not feature convective parameterization).
-* The `Warn-on-Forecast System (WoFS) `_ suite is being used by the WoFS project at the National Severe Storms Laboratory (NSSL) for real-time and potential future operational high-resolution modeling products. The WoFS suite is identical to the RRFS_v1beta suite, except using `NSSL 2-moment `_ microphysics.
+* The `Warn-on-Forecast System (WoFS) `_ suite is being used by the WoFS project at the National Severe Storms Laboratory (NSSL) for real-time and potential future operational high-resolution modeling products. The WoFS suite is identical to the RRFS_v1beta suite, except using `NSSL 2-moment `_ microphysics.
 * Finally, the HRRR scheme is similar to the operational High-Resolution Rapid Refresh (HRRR) model physics package, and is identical to the RAP scheme except it does not have convective parameterization due to its intended use at higher convective-permitting resolutions.
@@ -191,15 +191,14 @@ upgrade with the capability to build the code using Python 3 (previously only Py
 was supported). The CCPP v5.0 release, issued in February 2021, was a major upgrade
 to enable use with the UFS Short-Range Weather (SRW) Application and the RRFS_v1alpha suite.
-The CCPP v6.0 release, issued in June 2022, was a major upgrade in conjunction with the release of the UFS MRW and SRW v2.0 releases.
+The CCPP v6.0 release, issued in June 2022, was a major upgrade in conjunction with the UFS SRW v2.0 release.
 .. [#] As of this writing, the CCPP has been validated with two host models: the CCPP SCM and
    the atmospheric component of NOAA’s Unified Forecast System (UFS) (hereafter the UFS Atmosphere) that utilizes
-   the Finite-Volume Cubed Sphere (FV3) dynamical core. The CCPP can be utilized both with the
+   the Finite-Volume Cubed Sphere (FV3) dynamical core. The CCPP can be utilized both with the
    global and limited-area configurations of the UFS Atmosphere. CCPP v6.0.0 is the latest
-   release compatible with the global UFS MRW Application and the
-   limited-area UFS SRW Application. The CCPP
+   release compatible with the limited-area UFS SRW Application. The CCPP
    has also been run experimentally with a Navy model. Work is under way to connect and
    validate the use of the CCPP Framework with NCAR models.
diff --git a/CCPPtechnical/source/ParamSpecificOutput.rst b/CCPPtechnical/source/ParamSpecificOutput.rst
index 5e5d456..fd4b8df 100644
--- a/CCPPtechnical/source/ParamSpecificOutput.rst
+++ b/CCPPtechnical/source/ParamSpecificOutput.rst
@@ -27,7 +27,6 @@ considered in future implementations.
 These capabilities have been tested and are expected to work with the following suites:
 * SCM: GFS_v16, GFS_v17_p8, RAP, RRFS_v1beta, WoFS, HRRR
-* ufs-weather-model (global): GFS_v17_p8
 * ufs-weather-model (regional): GFS_v16, RRFS_v1beta, WoFS, HRRR
 ==========
@@ -57,7 +56,7 @@ photochemistry. The total tendency produced by the ozone photochemistry scheme (
 subdivided by subprocesses: production and loss (combined as a single subprocess), quantity of
 ozone present in the column above a grid cell, influences from temperature, and influences from
 mixing ratio. For more information about the NRL 2015 ozone photochemistry scheme, consult the `CCPP Scientific Documentation
-`_.
+`_.
 There are numerous tendencies in CCPP, and you need to know which ones exist for your configuration to enable them.
 The model will output a list of available tendencies for your configuration if you run with
@@ -316,8 +315,7 @@ Note that some host models, such as the UFS, have a limit of how many fields can
 When outputting all tendencies, this limit may have to be increased. In the UFS, this limit is
 determined by variable ``max_output_fields`` in namelist section ``&diag_manager_nml`` in file ``input.nml``.
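As an illustration, a sketch of the relevant ``input.nml`` fragment is shown below; the value used here is arbitrary and simply needs to be at least as large as the number of fields requested in the ``diag_table``.

.. code-block:: fortran

   &diag_manager_nml
      max_output_fields = 450   ! raise this limit when requesting many tendency fields
   /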
-Further documentation of the ``diag_table`` file can be found in the UFS Weather Model User’s Guide
-`here `_.
+Further documentation of the ``diag_table`` file can be found in the `UFS Weather Model User’s Guide `_.
 When the model completes, the fv3_history will contain these new variables.
@@ -562,7 +560,6 @@ The ``cu_gf_driver.meta`` file was modified accordingly:
 @@ -476,3 +476,29 @@
       type = integer
       intent = out
-      optional = F
 +[naux2d]
 +  standard_name = number_of_2d_auxiliary_arrays
 +  long_name = number of 2d auxiliary arrays to output (for debugging)
diff --git a/CCPPtechnical/source/ScientificDocRules.inc b/CCPPtechnical/source/ScientificDocRules.inc
index 9ed0bff..527607e 100644
--- a/CCPPtechnical/source/ScientificDocRules.inc
+++ b/CCPPtechnical/source/ScientificDocRules.inc
@@ -23,7 +23,7 @@ so that doxygen will parse them correctly, where to put various comments within
 the code, how to include information from the ``.meta`` files, and how to configure and run
 doxygen to generate HTML output. For an example of the HTML rendering of the CCPP Scientific
 Documentation, see
-https://dtcenter.ucar.edu/GMTB/v6.0.0p/sci_doc/html/index.html
+https://dtcenter.ucar.edu/GMTB/v6.0.0/sci_doc/index.html
 Part of this documentation, namely metadata about subroutine arguments, has functional
 significance as part of the CCPP infrastructure. The metadata must be in a particular format to
 be parsed by Python scripts that “automatically” generate
@@ -153,7 +153,7 @@ or ``@ref``. Example from ``suite_FV3_GFS_v16.txt``:
    ...
    */
-The HTML result of this Doxygen code `can be viewed here `_.
+The HTML result of this Doxygen code `can be viewed here `_.
 You can see that the ``-`` symbols at the start of a line generate a list with bullets, and the
 ``\ref`` commands generate links to the appropriately labeled pages. The ``\section`` comands
 indicate section breaks, and the ``\include`` commands will include the contents of another file.
@@ -251,7 +251,7 @@ is used to aggregate all code related to that scheme, even when it is in separat
 files. Since doxygen cannot know which files or subroutines belong to each physics scheme, each
 relevant subroutine must be tagged with the module name. This allows doxygen to understand your
 modularized design and generate the documentation accordingly.
-`Here is a list of modules `_
+`Here is a list of modules `_
 defined in CCPP. A module is defined using:
diff --git a/CCPPtechnical/source/conf.py b/CCPPtechnical/source/conf.py
index c1e82a4..9e18434 100644
--- a/CCPPtechnical/source/conf.py
+++ b/CCPPtechnical/source/conf.py
@@ -21,7 +21,7 @@
 project = 'CCPP Technical'
 copyright = '2022 '
-author = 'Bernardet, L., G. Firl, D. Heinzeller, L. Pan, \\\ M. Zhang, M. Kavulich, L Carson, and J. Schramm'
+author = 'Bernardet, L., G. Firl, D. Heinzeller, L. Pan, M. Zhang, \\\ M. Kavulich, L. Carson, and J. Schramm'
 # The short X.Y version
 version = '6.0'