Merge remote-tracking branch 'origin/develop' into feature/epic-stack
* origin/develop:
  Redo v16.3 GSI script updates to scripts/exglobal_atmos_analysis.sh (NOAA-EMC#1535)
  Fix bugs in the COM refactor of marine DA (NOAA-EMC#1566)
  Retire `getic.sh` and `init.sh` jobs from global-workflow (NOAA-EMC#1578)
  New UFS_UTILS hash for gdas_init COM reorg updates (NOAA-EMC#1581)
  Update documentation for PR standards (NOAA-EMC#1573)
KateFriedman-NOAA committed May 9, 2023
2 parents 1266c36 + 2fd43d1 commit a9acca0
Showing 30 changed files with 95 additions and 571 deletions.
2 changes: 1 addition & 1 deletion Externals.cfg
@@ -15,7 +15,7 @@ protocol = git
required = True

[UFS-Utils]
hash = 5b67e4d
hash = 72a0471
local_path = sorc/ufs_utils.fd
repo_url = https://github.com/ufs-community/UFS_UTILS.git
protocol = git
3 changes: 0 additions & 3 deletions docs/source/configure.rst
@@ -48,9 +48,6 @@ The global-workflow configs contain switches that change how the system runs. Ma
| QUILTING | Use I/O quilting | .true. | NO | If .true. choose OUTPUT_GRID as cubed_sphere_grid |
| | | | | in netcdf or gaussian_grid |
+----------------+------------------------------+---------------+-------------+---------------------------------------------------+
| RETRO | Use retrospective parallel | NO | NO | Default of NO will tell getic job to pull from |
| | for ICs | | | production tapes. |
+----------------+------------------------------+---------------+-------------+---------------------------------------------------+
| WAFSF | Run jobs to produce WAFS | NO | YES | downstream processing, ops only |
| | products | | | |
+----------------+------------------------------+---------------+-------------+---------------------------------------------------+
14 changes: 7 additions & 7 deletions docs/source/development.rst
@@ -92,21 +92,21 @@ All new code after 2022 Sep 1 will be required to meet these standards. We will

.. _commit-standards:

========================
Commit message standards
========================
======================
Pull request standards
======================

**ALL** commits must follow best practices for commit messages: https://chris.beams.io/posts/git-commit/
Pull requests should follow the pre-filled template provided when you open the PR. PR titles and descriptions become the commit message when the PR is squashed and merged, so we ask that they follow best practices for commit messages:

* Separate subject from body with a blank line
* Limit the subject line to 50 characters
* Limit the subject line (PR title) to 50 characters
* Capitalize the subject line
* Do not end the subject line with a period
* Use the `imperative mood <https://en.wikipedia.org/wiki/Imperative_mood>`_ in the subject line
* Wrap the body at 72 characters
* Use the body to explain what and why vs. how
* The final line of the commit message should include tags to relevant issues (e.g. ``Refs: #217, #300``)
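As an illustration, a message for one of the changes merged here might look like the following. This is hypothetical wording for NOAA-EMC#1578 written only to demonstrate the rules above; it is not the actual squashed commit message:

```shell
# Hypothetical commit message for NOAA-EMC#1578, shaped to the rules
# above: imperative subject under 50 characters, blank line, body
# wrapped at 72 columns, issue tag on the final line.
msg='Retire getic and init jobs from global-workflow

The getic and init jobs staged initial conditions on HPSS-connected
machines only. The UFS_UTILS gdas_init utility now covers this step
on all machines, so the jobs and their env blocks are removed.

Refs: #1578'
echo "${msg}"
```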

This list is a modified version of the one provided at https://chris.beams.io/posts/git-commit/ with a couple removed that are not relevant to GitHub PRs. That source also provides the motivation for making sure we have good commit messages.

Here is the example commit message from the article linked above; it includes descriptions of what would be in each part of the commit message for guidance:

::
4 changes: 1 addition & 3 deletions docs/source/init.rst
@@ -261,9 +261,7 @@ Coupled initial conditions are currently only generated offline and copied prior
Forecast-only mode (atm-only)
-----------------------------

Forecast-only mode in global workflow includes ``getic`` and ``init`` jobs for the gfs suite. The ``getic`` job pulls inputs for ``chgres_cube`` (init job) or warm start ICs into your ``ROTDIR/COMROT``. The ``init`` job then ingests those files to produce initial conditions for your experiment.

Users on machines without HPSS access (e.g. Orion) need to perform the ``getic`` step manually and stage inputs for the ``init`` job. The table below lists the needed files for ``init`` and where to place them in your ``ROTDIR``.
The table below lists the needed initial condition files from past GFS versions to be used by the UFS_UTILS gdas_init utility. The utility will pull these files for you. See the next section (Manual Generation) for how to run the UFS_UTILS gdas_init utility and create initial conditions for your experiment.

Note for table: yyyy=year; mm=month; dd=day; hh=cycle

4 changes: 0 additions & 4 deletions env/HERA.env
@@ -261,10 +261,6 @@ elif [[ "${step}" = "epos" ]]; then
[[ ${NTHREADS_EPOS} -gt ${nth_max} ]] && export NTHREADS_EPOS=${nth_max}
export APRUN_EPOS="${launcher} -n ${npe_epos}"

elif [[ "${step}" = "init" ]]; then

export APRUN="${launcher} -n ${npe_init}"

elif [[ "${step}" = "postsnd" ]]; then

export CFP_MP="YES"
4 changes: 0 additions & 4 deletions env/JET.env
@@ -231,10 +231,6 @@ elif [[ "${step}" = "epos" ]]; then
[[ ${NTHREADS_EPOS} -gt ${nth_max} ]] && export NTHREADS_EPOS=${nth_max}
export APRUN_EPOS="${launcher} -n ${npe_epos}"

elif [[ "${step}" = "init" ]]; then

export APRUN="${launcher} -n ${npe_init}"

elif [[ "${step}" = "postsnd" ]]; then

export CFP_MP="YES"
4 changes: 0 additions & 4 deletions env/ORION.env
@@ -260,10 +260,6 @@ elif [[ "${step}" = "epos" ]]; then
[[ ${NTHREADS_EPOS} -gt ${nth_max} ]] && export NTHREADS_EPOS=${nth_max}
export APRUN_EPOS="${launcher} -n ${npe_epos}"

elif [[ "${step}" = "init" ]]; then

export APRUN="${launcher} -n ${npe_init}"

elif [[ "${step}" = "postsnd" ]]; then

export CFP_MP="YES"
4 changes: 0 additions & 4 deletions env/S4.env
@@ -234,10 +234,6 @@ elif [[ "${step}" = "epos" ]]; then
[[ ${NTHREADS_EPOS} -gt ${nth_max} ]] && export NTHREADS_EPOS=${nth_max}
export APRUN_EPOS="${launcher} -n ${npe_epos}"

elif [[ "${step}" = "init" ]]; then

export APRUN="${launcher} -n ${npe_init}"

elif [[ "${step}" = "postsnd" ]]; then

export CFP_MP="YES"
4 changes: 0 additions & 4 deletions env/WCOSS2.env
@@ -249,10 +249,6 @@ elif [[ "${step}" = "epos" ]]; then
[[ ${NTHREADS_EPOS} -gt ${nth_max} ]] && export NTHREADS_EPOS=${nth_max}
export APRUN_EPOS="${launcher} -n ${npe_epos} -ppn ${npe_node_epos} --cpu-bind depth --depth ${NTHREADS_EPOS}"

elif [[ "${step}" = "init" ]]; then

export APRUN="${launcher}"

elif [[ "${step}" = "postsnd" ]]; then

export MPICH_MPIIO_HINTS_DISPLAY=1
5 changes: 3 additions & 2 deletions jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_CHKPT
@@ -23,8 +23,9 @@ export GPREFIX="${GDUMP}.t${gcyc}z."
export APREFIX="${CDUMP}.t${cyc}z."

# Generate COM variables from templates
RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_ATMOS_HISTORY:COM_ATMOS_HISTORY_TMPL
RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_ATMOS_ANALYSIS:COM_ATMOS_ANALYSIS_TMPL
YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS

RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL


##############################################
5 changes: 2 additions & 3 deletions jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_POST
@@ -2,7 +2,7 @@
export STRICT="NO"
source "${HOMEgfs}/ush/preamble.sh"
export WIPE_DATA="NO"
DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"
export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"
source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalpost" -c "base ocnanalpost"


@@ -14,8 +14,7 @@ export CDATE=${CDATE:-${PDY}${cyc}}
export GDUMP=${GDUMP:-"gdas"}

# Generate COM variables from templates
RUN=${GDUMP} YMD=${PDY} HH=${cyc} generate_com -rx COM_OCEAN_ANALYSIS:COM_OCEAN_ANALYSIS_TMPL
RUN=${GDUMP} YMD=${PDY} HH=${cyc} generate_com -rx COM_ICE_RESTART
YMD=${PDY} HH=${cyc} generate_com -rx COM_OCEAN_ANALYSIS COM_ICE_RESTART

mkdir -p "${COM_OCEAN_ANALYSIS}"
mkdir -p "${COM_ICE_RESTART}"
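The hunks above consolidate several `generate_com` calls into single invocations. `generate_com` is a global-workflow ush helper that expands `*_TMPL` path templates into COM directory variables using `RUN`, `YMD`, and `HH` from the environment. A minimal stand-in sketching just the template-expansion idea is below; the `ROTDIR` value and template string are assumptions for illustration, it relies on bash indirection (`${!name}`), and the real helper's `-r`/`-x` behavior (declare read-only, export) is not reproduced:

```shell
# Minimal sketch of COM template expansion; NOT the real generate_com
# from global-workflow's ush directory.
ROTDIR=/tmp/rotdir-example          # assumption for illustration
COM_OCEAN_ANALYSIS_TMPL='${ROTDIR}/${RUN}.${YMD}/${HH}/analysis/ocean'

expand_tmpl() {
  # Look up <NAME>_TMPL indirectly and let the shell expand the
  # ${ROTDIR}/${RUN}/... placeholders it contains.
  local tmpl_name="${1}_TMPL"
  eval echo "${!tmpl_name}"
}

RUN=gdas YMD=20230509 HH=06
COM_OCEAN_ANALYSIS=$(expand_tmpl COM_OCEAN_ANALYSIS)
echo "${COM_OCEAN_ANALYSIS}"
# /tmp/rotdir-example/gdas.20230509/06/analysis/ocean
```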
9 changes: 5 additions & 4 deletions jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP
@@ -22,11 +22,12 @@ export GPREFIX="${GDUMP}.t${gcyc}z."
export APREFIX="${CDUMP}.t${cyc}z."

# Generate COM variables from templates
RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_OCEAN_HISTORY:COM_OCEAN_HISTORY_TMPL
RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_ICE_HISTORY:COM_ICE_HISTORY_TMPL
RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_ICE_RESTART:COM_ICE_RESTART_TMPL

YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS

RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \
COM_OCEAN_HISTORY_PREV:COM_OCEAN_HISTORY_TMPL \
COM_ICE_HISTORY_PREV:COM_ICE_HISTORY_TMPL \
COM_ICE_RESTART_PREV:COM_ICE_RESTART_TMPL

##############################################
# Begin JOB SPECIFIC work
5 changes: 4 additions & 1 deletion jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_RUN
@@ -6,11 +6,14 @@ export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"
source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalrun" -c "base ocnanal ocnanalrun"


##############################################
# Set variables used in the script
##############################################

##############################################
# Begin JOB SPECIFIC work
##############################################

export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/ocean}

###############################################################
# Run relevant script
95 changes: 48 additions & 47 deletions jobs/rocoto/coupled_ic.sh
@@ -44,25 +44,22 @@ error_message(){
echo "FATAL ERROR: Unable to copy ${1} to ${2} (Error code ${3})"
}

YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_INPUT COM_ICE_RESTART COM_WAVE_RESTART
YMD=${gPDY} HH=${gcyc} generate_com -rx COM_OCEAN_RESTART

###############################################################
# Start staging

# Stage the FV3 initial conditions to ROTDIR (cold start)
ATMdir="${COM_ATMOS_INPUT}"
[[ ! -d "${ATMdir}" ]] && mkdir -p "${ATMdir}"
YMD=${PDY} HH=${cyc} generate_com -r COM_ATMOS_INPUT
[[ ! -d "${COM_ATMOS_INPUT}" ]] && mkdir -p "${COM_ATMOS_INPUT}"
source="${BASE_CPLIC}/${CPL_ATMIC}/${PDY}${cyc}/${CDUMP}/${CASE}/INPUT/gfs_ctrl.nc"
target="${ATMdir}/gfs_ctrl.nc"
target="${COM_ATMOS_INPUT}/gfs_ctrl.nc"
${NCP} "${source}" "${target}"
rc=$?
[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}"
err=$((err + rc))
for ftype in gfs_data sfc_data; do
for tt in $(seq 1 6); do
source="${BASE_CPLIC}/${CPL_ATMIC}/${PDY}${cyc}/${CDUMP}/${CASE}/INPUT/${ftype}.tile${tt}.nc"
target="${ATMdir}/${ftype}.tile${tt}.nc"
target="${COM_ATMOS_INPUT}/${ftype}.tile${tt}.nc"
${NCP} "${source}" "${target}"
rc=$?
[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}"
@@ -71,52 +68,56 @@ for ftype in gfs_data sfc_data; do
done

# Stage ocean initial conditions to ROTDIR (warm start)
OCNdir="${COM_OCEAN_RESTART}"
[[ ! -d "${OCNdir}" ]] && mkdir -p "${OCNdir}"
source="${BASE_CPLIC}/${CPL_OCNIC}/${PDY}${cyc}/ocn/${OCNRES}/MOM.res.nc"
target="${OCNdir}/${PDY}.${cyc}0000.MOM.res.nc"
${NCP} "${source}" "${target}"
rc=$?
[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}"
err=$((err + rc))
case $OCNRES in
"025")
for nn in $(seq 1 4); do
source="${BASE_CPLIC}/${CPL_OCNIC}/${PDY}${cyc}/ocn/${OCNRES}/MOM.res_${nn}.nc"
if [[ -f "${source}" ]]; then
target="${OCNdir}/${PDY}.${cyc}0000.MOM.res_${nn}.nc"
${NCP} "${source}" "${target}"
rc=$?
[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}"
err=$((err + rc))
fi
done
;;
*)
echo "FATAL ERROR: Unsupported ocean resolution ${OCNRES}"
rc=1
err=$((err + rc))
;;
esac
if [[ "${DO_OCN:-}" = "YES" ]]; then
YMD=${gPDY} HH=${gcyc} generate_com -r COM_OCEAN_RESTART
[[ ! -d "${COM_OCEAN_RESTART}" ]] && mkdir -p "${COM_OCEAN_RESTART}"
source="${BASE_CPLIC}/${CPL_OCNIC}/${PDY}${cyc}/ocn/${OCNRES}/MOM.res.nc"
target="${COM_OCEAN_RESTART}/${PDY}.${cyc}0000.MOM.res.nc"
${NCP} "${source}" "${target}"
rc=$?
[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}"
err=$((err + rc))
case "${OCNRES}" in
"025")
for nn in $(seq 1 4); do
source="${BASE_CPLIC}/${CPL_OCNIC}/${PDY}${cyc}/ocn/${OCNRES}/MOM.res_${nn}.nc"
if [[ -f "${source}" ]]; then
target="${COM_OCEAN_RESTART}/${PDY}.${cyc}0000.MOM.res_${nn}.nc"
${NCP} "${source}" "${target}"
rc=$?
[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}"
err=$((err + rc))
fi
done
;;
*)
echo "FATAL ERROR: Unsupported ocean resolution ${OCNRES}"
rc=1
err=$((err + rc))
;;
esac
fi

# Stage ice initial conditions to ROTDIR (cold start as these are SIS2 generated)
ICEdir="${COM_ICE_RESTART}"
[[ ! -d "${ICEdir}" ]] && mkdir -p "${ICEdir}"
ICERESdec=$(echo "${ICERES}" | awk '{printf "%0.2f", $1/100}')
source="${BASE_CPLIC}/${CPL_ICEIC}/${PDY}${cyc}/ice/${ICERES}/cice5_model_${ICERESdec}.res_${PDY}${cyc}.nc"
target="${ICEdir}/${PDY}.${cyc}0000.cice_model.res.nc"
${NCP} "${source}" "${target}"
rc=$?
[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}"
err=$((err + rc))
if [[ "${DO_ICE:-}" = "YES" ]]; then
YMD=${PDY} HH=${cyc} generate_com -r COM_ICE_RESTART
[[ ! -d "${COM_ICE_RESTART}" ]] && mkdir -p "${COM_ICE_RESTART}"
ICERESdec=$(echo "${ICERES}" | awk '{printf "%0.2f", $1/100}')
source="${BASE_CPLIC}/${CPL_ICEIC}/${PDY}${cyc}/ice/${ICERES}/cice5_model_${ICERESdec}.res_${PDY}${cyc}.nc"
target="${COM_ICE_RESTART}/${PDY}.${cyc}0000.cice_model.res.nc"
${NCP} "${source}" "${target}"
rc=$?
[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}"
err=$((err + rc))
fi

# Stage the WW3 initial conditions to ROTDIR (warm start; TODO: these should be placed in $RUN.$gPDY/$gcyc)
if [[ "${DO_WAVE}" = "YES" ]]; then
WAVdir="${COM_WAVE_RESTART}"
[[ ! -d "${WAVdir}" ]] && mkdir -p "${WAVdir}"
if [[ "${DO_WAVE:-}" = "YES" ]]; then
YMD=${PDY} HH=${cyc} generate_com -r COM_WAVE_RESTART
[[ ! -d "${COM_WAVE_RESTART}" ]] && mkdir -p "${COM_WAVE_RESTART}"
for grdID in ${waveGRD}; do # TODO: check if this is a bash array; if so adjust
source="${BASE_CPLIC}/${CPL_WAVIC}/${PDY}${cyc}/wav/${grdID}/${PDY}.${cyc}0000.restart.${grdID}"
target="${WAVdir}/${PDY}.${cyc}0000.restart.${grdID}"
target="${COM_WAVE_RESTART}/${PDY}.${cyc}0000.restart.${grdID}"
${NCP} "${source}" "${target}"
rc=$?
[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}"
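The restructured coupled_ic.sh above repeats one pattern per component: generate the COM directory, copy each file, report failures, and accumulate a nonzero `err` so the job fails once at the end. A condensed sketch of that pattern follows; the `stage` helper is hypothetical, and plain `cp` stands in for the `NCP` copy command global-workflow normally configures:

```shell
#!/usr/bin/env bash
# Condensed sketch of the coupled_ic.sh staging pattern: copy files,
# report each failure, and accumulate return codes in err so a single
# check at the end can fail the job.
NCP=${NCP:-cp}   # global-workflow sets NCP; cp is a stand-in here
err=0

error_message() {
  echo "FATAL ERROR: Unable to copy ${1} to ${2} (Error code ${3})"
}

stage() {
  local source="${1}" target="${2}"
  ${NCP} "${source}" "${target}"
  local rc=$?
  [[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}"
  err=$((err + rc))
}

# Example usage (hypothetical paths):
#   stage "${BASE_CPLIC}/.../MOM.res.nc" "${COM_OCEAN_RESTART}/MOM.res.nc"
#   (( err == 0 )) || { echo "FATAL ERROR: staging failed"; exit "${err}"; }
```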
(diff for the remaining changed files not loaded)
