
GFSv16.3.0 updates for GSI tag, resources, versions, and WAFS CFP #983

Conversation

KateFriedman-NOAA
Member

Description

This PR is a collection of small end-stage changes for the GFSv16.3.0 package. Changes are:

  1. Update GSI tag to gfsda.v16.3.0 (see the sketch below)
  2. Rocoto-side resource updates to match the ecflow-side changes made in prior PR "gfsv16.3 ecflow merge from parallel" (#976)
  3. Update CRTM version to 2.4.0 in workflow version files
  4. Additional changes to version files for gfs_ver=v16.3, obsproc_ver=1.1.0, and HOMEobsproc
  5. Add USE_CFP=YES support for the wafsgrib2 and wafsgrib20p25 jobs in rocoto

Refs #744
FYI @emilyhcliu @lgannoaa
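
As an illustration of item 1 (the GSI tag update), here is a minimal sketch of a checkout-style tag bump. The script name, repository URL, and directory name are assumptions for illustration only; the tag gfsda.v16.3.0 is the only value taken from this PR.

```sh
#!/bin/bash
# Hypothetical sorc/checkout.sh-style excerpt; the clone URL and the gsi.fd
# directory name are assumptions -- only the tag gfsda.v16.3.0 is from this PR.
if [[ ! -d gsi.fd ]]; then
  git clone https://github.com/NOAA-EMC/GSI.git gsi.fd
fi
cd gsi.fd
git checkout gfsda.v16.3.0   # GSI tag for the GFSv16.3.0 package
cd ..
```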

Type of change

  • New feature (non-breaking change which adds functionality)

How Has This Been Tested?

Resources used in GFSv16.3.0 parallel testing.

Checklist

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • My changes need updates to the documentation. I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • New and existing tests pass with my changes
  • Any dependent changes have been merged and published

- Update obsproc_ver in ecf/versions/obsproc.ver to "1.1.0"
- Update obsproc_ver in versions/run.ver to "v1.1"

Refs: NOAA-EMC#744
- Update crtm_ver to 2.4.0 in version files.

Refs: NOAA-EMC#744
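
For illustration, a minimal sketch of what these version bumps might look like, assuming the version files use plain shell export lines. Only the values come from the commits above; the export form and surrounding file layout are assumptions.

```sh
# Illustrative sketch only -- the surrounding contents of these files are not
# shown in this PR, and the export style is an assumption.

# ecf/versions/obsproc.ver
export obsproc_ver=1.1.0

# versions/run.ver
export obsproc_ver=v1.1
export crtm_ver=2.4.0
```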
…ase/gfs.v16.3.0

* upstream/release/gfs.v16.3.0:
  Clean up ecflow script action 2
  Clean up ecflow script
  This branch contains: a new ecflow definition file, ecflow job scripts with memory-requirement changes from the implementation parallel, and removal of the ecflow workflow manager package
- Fold in resource updates from the ecf script resource changes for the
ediag, waveprep, and waveawipsgridded jobs.

Refs: NOAA-EMC#744
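
As an illustration of the kind of rocoto-side resource change being folded in, here is a sketch assuming a config.resources-style shell block keyed on a step name. The job names match the commit; the variable names and all values are placeholders, not the settings from this PR.

```sh
# Hypothetical config.resources-style sketch; $step is assumed to be set by
# the caller, and every value below is a placeholder for illustration.
case "${step}" in
  ediag)
    export wtime_ediag="00:15:00"
    export npe_ediag=48
    export memory_ediag="30GB"
    ;;
  waveprep)
    export wtime_waveprep="00:10:00"
    export npe_waveprep=65
    export memory_waveprep="150GB"
    ;;
  waveawipsgridded)
    export wtime_waveawipsgridded="02:00:00"
    export npe_waveawipsgridded=1
    export memory_waveawipsgridded="1GB"
    ;;
esac
```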
- The wafsgrib2 and wafsgrib20p25 jobs need to set "USE_CFP=YES", so a
section has been added in WCOSS2.env for those jobs to set it.
- Updated the wafsgrib2.sh and wafsgrib20p25.sh rocoto job scripts to
source the env file to get that setting at runtime.

Refs: NOAA-EMC#744
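
A minimal sketch of the kind of block this adds to WCOSS2.env, assuming the env file dispatches on a step name passed in when it is sourced. Only USE_CFP=YES comes from the commit; the step-dispatch structure is an assumption.

```sh
# Hypothetical WCOSS2.env-style sketch; the step argument handling is an
# assumption -- only USE_CFP=YES is the setting described in the commit.
step=${1:-${step:-}}

if [[ "${step}" = "wafsgrib2" || "${step}" = "wafsgrib20p25" ]]; then
    export USE_CFP="YES"   # run the WAFS GRIB2 processing under CFP (MPMD)
fi
```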
- Similar to the change to WCOSS2.env, add wafsgrib2 and wafsgrib20p25 blocks
to the Hera and Orion env files to set the USE_CFP variable.

Refs: NOAA-EMC#744
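
And a sketch of the corresponding rocoto job-script change, assuming a wafsgrib2.sh-style wrapper that sources the machine env file before handing off to the job. The HOMEgfs and machine variables, and the path layout, are assumptions for illustration.

```sh
#!/bin/bash
set -x
# Hypothetical jobs/rocoto/wafsgrib2.sh-style excerpt; HOMEgfs, machine, and
# the env-file path are assumptions used only for illustration.
export job="wafsgrib2"

# Source the machine environment file so USE_CFP=YES is picked up at runtime.
source "${HOMEgfs}/env/${machine}.env" wafsgrib2

# ...the wrapper then invokes the corresponding J-job (not shown here).
```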
- Change obsproc_run_ver to "1.1.0" in WCOSS2, Hera, and Orion version
files.

Refs: NOAA-EMC#744
@KateFriedman-NOAA KateFriedman-NOAA added the maintenance Regular updates and maintenance work label Aug 18, 2022
@KateFriedman-NOAA KateFriedman-NOAA added this to the GFSv16.3.0 - ops milestone Aug 18, 2022
@KateFriedman-NOAA KateFriedman-NOAA self-assigned this Aug 18, 2022
Contributor

@aerorahul aerorahul left a comment

looks good.

@KateFriedman-NOAA KateFriedman-NOAA merged commit 16774f5 into NOAA-EMC:release/gfs.v16.3.0 Aug 18, 2022
DavidNew-NOAA added a commit to DavidNew-NOAA/global-workflow that referenced this pull request Mar 28, 2024
aerorahul pushed a commit that referenced this pull request Apr 23, 2024
This PR, a companion to GDASApp PR
[#983](NOAA-EMC/GDASApp#983), creates a new
Rocoto job called "atmanlfv3inc" that computes the FV3 atmosphere
increment from the JEDI variational increment using a JEDI OOPS app in
GDASApp, called fv3jedi_fv3inc.x, which replaces the GDASApp Python
script jediinc2fv3.py for the variational analysis. The "atmanlrun"
job is renamed "atmanlvar" to better reflect its role now that it runs
one of two JEDI executables for the atmospheric analysis jobs.

Previously, the JEDI variational executable would interpolate and write
its increment, during the atmanlrun job, to the Gaussian grid, and then
the python script, jediinc2fv3.py, would read it and then write the FV3
increment on the Gaussian grid during the atmanlfinal job. Following the
new changes, the JEDI increment will be written directly to the cubed
sphere. Then during the atmanlfv3inc job, the OOPS app will read it and
compute the FV3 increment directly on the cubed sphere and write it out
onto the Gaussian grid.

The reason for writing first to the cubed sphere grid is that otherwise
the OOPS app would have to interpolate twice, once from Gaussian to
cubed sphere before computing the increment and then back to the
Gaussian, since all the underlying computations in JEDI are done on the
native grid.

The motivation for this new app and job is that eventually we wish to
transition all intermediate data to the native cubed sphere grid, and
the OOPS framework allows us the flexibility to read and write to/from
any grid format we wish by just changing the YAML configuration file
rather than hardcoding. When we do switch to the cubed sphere, it will
be an easy transition. Moreover, the computations in the OOPS app are
done by a compiled executable rather than an interpreted Python
script, providing some performance increase.

It has been tested with a cycling experiment with JEDI on both Hera and
Orion to show that it runs without issues, and I have compared the FV3
increments computed by the original and new codes. The delp and
hydrostatic delz increments, the key increments produced during this
step, differ by relative errors of 10^-7 and 10^-2 respectively. This
difference is most likely due to the original Python script doing its
internal computation on the interpolated Gaussian grid, while the new
OOPS app does its computations on the native cubed sphere before
interpolating to the Gaussian grid.