
New Use Case: StatAnalysis Python Embedding to read native grid (u-grid) #1561

Closed
7 of 24 tasks
JohnHalleyGotway opened this issue Jan 23, 2020 · 8 comments · Fixed by #1927
Labels: alert: NEED MORE DEFINITION; component: external dependency; MET: Library Code; priority: high; requestor: NCAR/RAL; requestor: NOAA/EMC; requestor: UK Met Office; required: FOR OFFICIAL RELEASE

Comments

JohnHalleyGotway (Collaborator) commented Jan 23, 2020

Describe the New Use Case

Note that this was originally an issue in MET with the plan of adding support directly to the C++ tools. However, as of 4/6/2022, the plan has changed to leveraging existing python functionality for an initial implementation. Along with that, we'll need a new use case to demonstrate that functionality. In the long run, enhancements could be added directly to MET to support unstructured grids, but those would be described in a different issue.
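As a rough illustration of what the Python-embedding route involves, a MET embedding script for a 2-D field generally exposes a numpy array plus a dictionary of metadata describing the field and its grid. The exact key names required should be checked against the MET User's Guide for the release in use; everything below is placeholder data:

```python
# Hedged sketch of a MET python-embedding script for a 2-D gridded field.
# All values are placeholders; the required attrs keys should be verified
# against the MET User's Guide for the MET version in use.
import numpy as np

# e.g. a u-grid field already regridded to a 1-degree lat-lon grid
met_data = np.zeros((180, 360), dtype=np.float64)

attrs = {
    "valid": "20200123_000000",
    "init":  "20200123_000000",
    "lead":  "000000",
    "accum": "000000",
    "name":  "TMP",
    "long_name": "temperature",
    "level": "Z2",
    "units": "K",
    "grid": {"type": "LatLon", "name": "global_1deg",
             "lat_ll": -89.5, "lon_ll": -179.5,
             "delta_lat": 1.0, "delta_lon": 1.0,
             "Nlat": 180, "Nlon": 360},
}
```

The grid dictionary dimensions must agree with the shape of `met_data`, or the MET tools will reject the field.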

Here's the original text of this issue for background:

This issue is the result of a meeting with Mark Miesch about JEDI on 1/23/2020.

Consider enhancing MET to leverage the JEDI C++ interface for reading data from LFRic, FV3, MPAS, Neptune, WRF, and SOCA. The geometry object in JEDI can instantiate a grid.

JEDI Source Code (not public): https://github.com/JCSDA/oops

The State.h header file defines the State class. Each model instantiates a State differently, including grid information and parallel distribution information. The State object retrieves a model value at a given location via getValues(), which returns a GeoVaLs object containing the model values at the requested location(s). The "State::read(const eckit::Configuration &)" member function reads data from a model output file and populates the State object. The "State::geometry()" member function contains the grid information.

Dependencies: ecmwf/eckit, oops, and FV3 JEDI; oops depends on Boost headers (not actually compiled).
JEDI is built using ecbuild.
ecmwf/eckit does all the MPI handling.

Some functional steps...

  • Query the model to get values from a specific location.
  • Pull observations from an observation database (either ODB2 or NetCDF, dependent on WCOSS).
  • Investigate the use of BUMP and/or oops for the handling of unstructured grids.
    • Currently cube sphere grids and MPAS variable mesh grids are processed as unstructured grids by BUMP.

Here's a specific idea:
(1) Work on Hera.
(2) Enhance MET to support a new configurable option to interface with JEDI from FV3.
(3) Specifically, enhance Point-Stat to call Shape::getValues() for each observation location and generate matched pairs.
(4) Compute the resulting statistics.

So really "JEDI" is a new input "file type"... from which you extract forecast values.

This is needed by September 30, 2020 (end of Q4).

Use Case Name and Category

Provide use case name, following Contributor's Guide naming template, and list which category the use case will reside in.
If a new category is needed for this use case, provide its name and brief justification

Input Data

List input data types and sources.
Provide a total input file size, keeping necessary data to a minimum.

Acceptance Testing

Describe tests required for new functionality.
As use case develops, provide a run time here

Time Estimate

Estimate the amount of work required here.
Issues should represent approximately 1 to 3 days of work.

Sub-Issues

Consider breaking the new feature down into sub-issues.

  • Add a checkbox for each sub-issue here.

Relevant Deadlines

List relevant project deadlines here or state NONE.

Funding Source

Define the source of funding and account keys here or state NONE.

Define the Metadata

Assignee

  • Select engineer(s) or no engineer required
  • Select scientist(s) or no scientist required

Labels

  • Select component(s)
  • Select priority
  • Select requestor(s)
  • Select privacy

Projects and Milestone

  • Select Repository and/or Organization level Project(s) or add alert: NEED PROJECT ASSIGNMENT label
  • Select Milestone as the next official version or Future Versions

Define Related Issue(s)

Consider the impact to the other METplus components.

New Use Case Checklist

See the METplus Workflow for details.

  • Complete the issue definition above, including the Time Estimate and Funding source.
  • Fork this repository or create a branch of develop.
    Branch name: feature_<Issue Number>_<Description>
  • Complete the development and test your changes.
  • Add/update log messages for easier debugging.
  • Add/update unit tests.
  • Add/update documentation.
  • Add any new Python packages to the METplus Components Python Requirements table.
  • Push local changes to GitHub.
  • Submit a pull request to merge into develop.
    Pull request: feature <Issue Number> <Description>
  • Define the pull request metadata, as permissions allow.
    Select: Reviewer(s) and Linked issues
    Select: Repository level development cycle Project for the next official release
    Select: Milestone as the next official version
  • Iterate until the reviewer(s) accept your changes. Merge branch into develop.
  • Create a second pull request to merge develop into develop-ref, following the same steps for the first pull request.
  • Delete your fork or branch.
  • Close this issue.
@JohnHalleyGotway JohnHalleyGotway added requestor: NOAA/EMC NOAA Environmental Modeling Center priority: high High Priority requestor: UK Met Office United Kingdom Met Office requestor: NCAR/RAL NCAR Research Applications Laboratory alert: NEED MORE DEFINITION Not yet actionable, additional definition required labels Jan 23, 2020
dwfncar commented Jan 23, 2020

See dakota:/d3/projects/JEDI

@TaraJensen TaraJensen added the required: FOR OFFICIAL RELEASE Required to be completed in the official release for the assigned milestone label May 26, 2021
JohnHalleyGotway (Collaborator, Author) commented Jun 2, 2021

On 6/2/2021, @j-opatz @TaraJensen and @JohnHalleyGotway met with Mark Miesch to discuss MET interfacing with JEDI.

Could be worthwhile to attend a JEDI training academy. JEDI documentation:
https://jointcenterforsatellitedataassimilation-jedi-docs.readthedocs-hosted.com/en/latest/

bump and saber would be good libraries with which to interface.
Deadlines: we need the work done by the end of September.
Links to recorded video lectures from prior academies:
https://www.jcsda.org/jedi-academies

JEDI is really spread over many repositories... 8 or 9 of them. A new release of JEDI-FV3 version 1.0 and IODA 2.0 is coming mid-June. IODA 2.0 does not yet fully support Met Office ODB, but there's been a lot of progress. Getting closer. The internal IODA-Converters repository can read ODB files, but they'll eventually be readable directly.

Questions:

  • Are there compilation requirements? Should this be optional?
  • Need to find an existing running instance of JEDI into which we can interface.
  • JEDI 1.0 is running on Hera, Orion, and Cheyenne. They provide module files.
  • FV3, LFRic, and MPAS are our 3 targets.

Recommend that we move development up to Cheyenne, run JEDI there, and demonstrate that we can interface with it there. The h-of-x application, part of oops, writes its output to a file; MET could be enhanced to read matched pairs from those files. Eventually, MET could potentially read this data from memory rather than via a temporary file.
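If the h-of-x application writes one model equivalent per observation, building MET-style matched pairs reduces to joining observations and h-of-x values on a shared record key. A minimal sketch of that join follows; the record layouts are entirely invented for illustration, and the actual oops output format would need to be inspected:

```python
# Hypothetical sketch: join observations with h-of-x model equivalents
# to form matched pairs. Record layouts are invented for illustration.

def build_matched_pairs(obs, hofx):
    """Pair each observation with the model equivalent sharing its
    record id, yielding (lat, lon, obs_value, model_value) tuples."""
    hofx_by_id = {rec["id"]: rec["value"] for rec in hofx}
    pairs = []
    for rec in obs:
        if rec["id"] in hofx_by_id:
            pairs.append((rec["lat"], rec["lon"],
                          rec["value"], hofx_by_id[rec["id"]]))
    return pairs

obs = [{"id": 1, "lat": 40.0, "lon": -105.0, "value": 271.3},
       {"id": 2, "lat": 41.0, "lon": -104.0, "value": 269.8}]
hofx = [{"id": 2, "value": 270.1}, {"id": 3, "value": 268.0}]
pairs = build_matched_pairs(obs, hofx)  # only id 2 appears in both lists
```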

The Atlas package from ECMWF is being incorporated into JEDI. It can be used to create grids and grid meshes for subsetting tasks for mpi:
https://github.com/ecmwf/atlas
They are working to get the cubed-sphere grid incorporated into Atlas, but it is not yet fully integrated: it does not yet fully support interpolating between all grid types, masking, and adjoints.

mpm-meto commented Aug 5, 2021

Some sample LFRic NetCDF files are attached for testing. This is the data originally sent in March 2021. I have asked whether there is something more up to date. I do not know whether this has been tested with the JEDI libraries. Awaiting further info.
sample_data.zip

mpm-meto commented Aug 5, 2021

Further info/update regarding the use of JEDI-BUMP from someone working on JEDI at the Met Office.

The good news.
There currently is a LFRic model interface that uses BUMP for interpolation to observation locations.

The bad news.

  1. The API to BUMP keeps changing! The owner of BUMP keeps applying large changes that can involve code changes on the order of 200+ files, which makes it very hard to check that it is doing what it is supposed to do and to maintain the interface.
  2. As far as I know, unless it has changed, it assumes that all data is collocated horizontally and vertically. It will not deal with the vertical stagger.
  3. There is no NetCDF reader of LFRic files within SABER/BUMP. Some of the other models do have this.

@TaraJensen TaraJensen changed the title Enhance MET to interface with JEDI. Enhance MET to read native grid for verification Oct 11, 2021
TaraJensen (Contributor) commented Oct 11, 2021

Here's a new sample of LFRic data, obtained in September. Philip Gill also says:
On the LFRic grid side I’m attaching some updated canned metadata. Stuart Whitehouse is the best Met Office contact for this and again is happy to talk to your developers.

canned_metadata_lfric_r30711.zip

@TaraJensen TaraJensen changed the title Enhance MET to read native grid for verification Enhance MET to read native grid (u-grid) for verification Jan 6, 2022
mpm-meto commented Jan 7, 2022

Some clarification: the term "native" can refer to any kind of grid, unstructured (u-grid) or structured mesh (lat-lon); it simply refers to the mesh on which the model is integrated. Often we do not verify our forecasts on the native grid, e.g. WMO CBS statistics are calculated on a 1.5-degree regular lat-lon grid, which is not the native grid for any of the global NWP models that submit/exchange statistics.

As far as this issue goes, I think it will soon be time to break it into sub-issues and treat this issue as the overarching one. I see at least two sub-issues (for now) addressing distinct tasks, which can be completed in isolation from each other:

  1. The u-grid-to-u-grid comparison: having both the model analyses and the forecasts on the same native grid (u-grid) and comparing the two. This does NOT require ANY interpolation, but it does require the ability to read the u-grid NetCDF. These files are effectively vectors of (lat, lon, value) and are more like a station list for Point-Stat than anything we associate with a regular grid. Work is needed to see how far we can get treating these u-grids like a very dense observing network, to understand the computational performance of following such a path, and to see what optimisations may be possible.
  2. Internal interpolation of u-grid to regular grid. Internalising the ability to regrid a u-grid to another (any other?) regular grid is probably desirable. There are Python libraries (e.g. ESMF) which can do the regridding for you before you feed your fields into MET, so if fields are needed on a regular grid it is possible to do this before leveraging the stats in MET; i.e. it won't preclude the use of MET if an internal conversion is not possible in the short term. It may also be worth exploring whether this could work in a python-embedding sense, though the computational costs of e.g. standalone vs. embedding would need to be explored alongside the value of keeping regridded fields on disk for other uses/users (most downstream applications will be using u-grid data in a regridded format). Either way, 1) above would appear to address the issue of reading in a u-grid file, which is the first hurdle to overcome, before figuring out what to do with it within MET.
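Sub-issue 1 above can be sketched outside MET: treat the u-grid's (lat, lon, value) vectors as a very dense station list. The record shape and field names below are illustrative only; real LFRic files would be read with netCDF4 or xarray rather than built in memory:

```python
# Hedged sketch of sub-issue 1: flatten u-grid cell-centre vectors into
# point records, as if the cells were a very dense observing network.
# Names and layout are illustrative, not a MET format.
import numpy as np

def ugrid_to_point_records(lat, lon, values, varname="TMP"):
    """Turn parallel 1-D u-grid vectors into one point record per cell,
    each carrying a synthetic station id plus lat/lon/value."""
    records = []
    for i, (la, lo, v) in enumerate(zip(lat, lon, values)):
        records.append({"sid": f"CELL{i:06d}", "lat": float(la),
                        "lon": float(lo), varname: float(v)})
    return records

# In practice these vectors would come from the u-grid NetCDF file.
lat = np.array([51.5, 52.0, 52.5])
lon = np.array([-0.1, 0.4, 0.9])
vals = np.array([280.2, 279.8, 279.1])
points = ugrid_to_point_records(lat, lon, vals)
```

For a full-resolution global mesh this loop would run over millions of cells, which is exactly the computational-performance question raised above.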

TaraJensen (Contributor) commented Feb 3, 2022

ESMpy might be a possible solution for reading the u-grid.
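The regridding path can be prototyped without MET at all. ESMpy/ESMF would be the natural production choice; purely to illustrate the idea, here is a sketch using scipy's triangulation-based interpolator to map scattered u-grid cell values onto a regular lat-lon grid (the function name and the synthetic field are invented for the example):

```python
# Illustrative sketch only: regrid scattered u-grid cell values onto a
# regular lat-lon grid. ESMF/ESMpy would be the production route; scipy's
# griddata stands in here to show the shape of the problem.
import numpy as np
from scipy.interpolate import griddata

def regrid_to_latlon(lon, lat, values, out_lon, out_lat):
    """Linearly interpolate scattered (lon, lat, value) triples onto the
    regular grid defined by the 1-D out_lon/out_lat axes."""
    lon2d, lat2d = np.meshgrid(out_lon, out_lat)
    points = np.column_stack([lon, lat])
    return griddata(points, values, (lon2d, lat2d), method="linear")

# Synthetic u-grid: scattered cell centres sampling the plane 2*lon + 3*lat
rng = np.random.default_rng(0)
lon = rng.uniform(-10, 10, 500)
lat = rng.uniform(-10, 10, 500)
vals = 2.0 * lon + 3.0 * lat

out_lon = np.linspace(-5, 5, 11)   # output grid kept inside the data's hull
out_lat = np.linspace(-5, 5, 11)
field = regrid_to_latlon(lon, lat, vals, out_lon, out_lat)
```

Linear interpolation reproduces a linear field exactly, which makes this easy to sanity-check; note that griddata returns NaN outside the convex hull of the input points, so a real global u-grid would need edge handling.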

TaraJensen (Contributor) commented:

Marion sent scripts for vinterp and hinterp to Will and Tara. She's uncertain if they should be attached to the open GitHub issue. Will attach here if it is found worthwhile and cleared for doing so.

@JohnHalleyGotway JohnHalleyGotway transferred this issue from dtcenter/MET Apr 7, 2022
@JohnHalleyGotway JohnHalleyGotway added this to the METplus-5.0.0 milestone Apr 7, 2022
@JohnHalleyGotway JohnHalleyGotway changed the title Enhance MET to read native grid (u-grid) for verification New Use Case: To demonstrate python embedding with MET to read native grid (u-grid) for verification. Apr 7, 2022
@georgemccabe georgemccabe linked a pull request Nov 14, 2022 that will close this issue
@georgemccabe georgemccabe changed the title New Use Case: To demonstrate python embedding with MET to read native grid (u-grid) for verification. New Use Case: StatAnalysis Python Embedding to read native grid (u-grid) Nov 17, 2022