Point2Grid: Add a new MET tool to process point observations into a gridded field. #1078
hsoh-u added a commit that referenced this issue on Dec 11, 2019
hsoh-u added a commit that referenced this issue on Dec 12, 2019
hsoh-u added a commit that referenced this issue on Dec 12, 2019
hsoh-u added a commit that referenced this issue on Dec 12, 2019
Done for the initial release.
Here's an email exchange describing the functionality requested by the DTC Regional Ensembles group, the NOAA HWT group, and the NOAA HMT group.
We need to think of a good name for this tool. Maybe "grid_point_obs"... but that's confusing. Could be "process_point_obs"?
FROM JOHN:
Do you agree that the practically perfect algorithm would require a new tool which...
(1) Reads point observations... specifically one or more NetCDF output files from ascii2nc, madis2nc, pb2nc, or lidar2nc.
(2) Reads a configuration file which defines the group(s) of point obs to be processed (e.g. ADPSFC surface temperatures or 6-hour rain gauge obs) and defines the desired output grid (e.g. the 80km grid you referenced).
(3) For each "field" (i.e. group of point observations), applies the Gaussian filter to define a practically perfect forecast for the requested output domain.
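The three steps above can be sketched roughly as follows. This is an illustrative example only, not MET's implementation: the grid dimensions, sigma, and observation locations are made up, and the obs are assumed to already be mapped to grid indices.

```python
# Hypothetical sketch of the "practically perfect" step: place point
# observations on the output grid, then apply a Gaussian filter.
# Values here are illustrative; MET's actual algorithm may differ.
import numpy as np
from scipy.ndimage import gaussian_filter

ny, nx = 93, 65                      # e.g. a coarse CONUS-like grid
obs_grid = np.zeros((ny, nx))

# Illustrative point obs already mapped to grid (j, i) indices.
for j, i in [(40, 30), (41, 31), (55, 20)]:
    obs_grid[j, i] = 1.0

sigma = 1.5                          # smoothing length in grid units
pperfect = gaussian_filter(obs_grid, sigma=sigma)
```

The result is a smooth gridded field that is largest near clusters of reports, which is the essence of the practically perfect forecast.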
Some questions to consider are:
(1) Are there any additional configuration options required for the practically perfect algorithm?
(2) How do you name the output variables?
(3) Should the tool process multiple output times in a single run or just one? How do you define those time windows, and therefore the timestamp of the output? Since these are observations, from the perspective of the timing metadata this should be considered an analysis.
Supporting the practically perfect algorithm would be the first step, but we could potentially add support for additional "gridding" algorithms in the future.
Do all those details sound right? And if so, what would we name such a tool? We currently have "plot_point_obs"... would this be "grid_point_obs"?
FROM BURKLEY TWIEST:
(1) For the configuration of the practically perfect algorithm, having the ability to vary the sigma used in the Gaussian smoother is critical, particularly so that this tool can be applicable across a variety of space and time scales. Also, (and I think you already have this in the description of the tool) the ability to have the practically perfect fields generated on and output to grids other than the 80-km grid is necessary. Finally, being able to indicate a neighborhood around the report that would indicate a "hit" would be useful. I think this would need to go into the new tool being outlined here. For example, I've recently done some regridding of practically perfect fields, and needed to light up every point within a 40-km radius of the observation prior to applying the smoother in order to get the practically perfect values we needed on a fine-scale grid.
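The neighborhood step described above (lighting up every grid point within a fixed radius of each report before smoothing) can be sketched as a binary dilation with a circular footprint. This is a hypothetical illustration: the function name, the 10-point radius (standing in for 40 km on a roughly 4-km grid), and the sigma are all assumptions, not MET parameters.

```python
# Hypothetical sketch: set to 1 every grid point within a fixed radius
# of a report, then apply the Gaussian smoother. Radius is in grid
# points; converting 40 km to grid points depends on the grid spacing.
import numpy as np
from scipy.ndimage import binary_dilation, gaussian_filter

def neighborhood_hits(obs_grid, radius_pts):
    """Mark every point within radius_pts of a nonzero point as a hit."""
    r = int(radius_pts)
    y, x = np.ogrid[-r:r + 1, -r:r + 1]
    disk = (x * x + y * y) <= r * r          # circular footprint
    return binary_dilation(obs_grid > 0, structure=disk).astype(float)

obs_grid = np.zeros((50, 50))
obs_grid[25, 25] = 1.0                       # a single report
hits = neighborhood_hits(obs_grid, radius_pts=10)
smoothed = gaussian_filter(hits, sigma=2.0)
```

Applying the dilation first spreads each report over its neighborhood, so the subsequent smoothing yields useful values even on a fine-scale grid.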
(2) Good question! We use "practically perfect" when dealing with the observations (and that's pretty widespread in the literature) and "surrogate severe" when dealing with the model fields. However, when looking at things like precip, you might have two gridded fields of precip exceedance that you're dealing with. Perhaps naming the output fields "Gaussian" or "Gaussian smoothed" would be a good compromise that applies equally to the observation and model field datasets?
(3) I think our main applications for these would be 24-h and 1-h time periods, although there are a couple projects in the works here that look at time periods between 1-h and 24-h, and there could be some potential with Warn-on-Forecast to look at even shorter time frames. It would be really nice to be able to process multiple times without having to call the tool multiple times for shorter increments, especially if we start applying this to Warn-on-Forecast output. Would it be possible to designate different chunks of time, one smaller than the other? So the tool could accumulate over a 24-h period, but also spit out hourly fields simultaneously? This would be really useful functionality, but I'm not sure if it's too large of an ask!
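The nested-time-window request in (3) can be sketched as a single pass over hourly gridded obs that emits both the hourly fields and the 24-h accumulation. Everything here is illustrative (array shapes, the Poisson-sampled obs counts, the output naming); it is not how MET represents time windows.

```python
# Hypothetical sketch: from 24 hourly gridded obs-count fields, produce
# both hourly outputs and a 24-h accumulation in one pass.
import numpy as np

ny, nx = 20, 20
rng = np.random.default_rng(0)
# Stand-in for 24 hourly gridded obs counts (one grid per hour).
hourly = rng.poisson(0.05, size=(24, ny, nx)).astype(float)

# Emit each hourly field under an illustrative name...
hourly_fields = {f"hour_{h:02d}": hourly[h] for h in range(24)}
# ...and the 24-h accumulation over the same data.
daily_field = hourly.sum(axis=0)
```

Since the longer window is just a sum over the shorter ones, supporting both in one run mainly costs bookkeeping of the output timestamps, not recomputation.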
Hopefully these responses help in your development, and feel free to poke me with any continued questions. We really appreciate the implementation of this functionality into MET, and think it'll be very useful to the community as a whole. [MET-1078] created by johnhg