Merge feature/hafs_ensda branch back to develop #58
Conversation
… feature/hafs_vigsi
…y/HAFS into feature/hafs_vigsi
hafs_regional_da_C96s1n4_320x312.conf, which will be used as a starting point for the HAFS DA related developments.
…e model_configure.
b. Add the analysis (GSI) related script and parm changes.
successfully for a regional C96s1n4_320x312 configuration.
… analysis job. And enable handling of the prepbufr.nr file.
…ange in parm/hafs_hycom.conf.
… developed by Henry) and enable the workflow running in different cycling modes: coldstart, warmstart, gsi_vr, 3DVar, and 3DEnVar.
b. Check, update, and unify the GSI related namelist and fix files
c. Optimize the analysis/analysis_vr related scripts
d. Only do GSI based relocation if the forecast and observed storm centers are more than 0.2 degrees apart
…ion of observational bufr files.
…ailability for the 3DEnVar with GDAS ensemble DA configuration.
b. Fix a typo for delivering the atmos_static.nc file in the forecast job.
…configurations
b. For GSI vortex relocation, only relocate the storm if its observed intensity vmax >= 35 kt
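Taken together, the two relocation criteria described in the commits above (forecast and observed storm centers more than 0.2 degrees apart, and observed intensity vmax >= 35 kt) amount to a simple gate on the relocation step. Below is a minimal sketch with a hypothetical helper name and a simple flat-earth separation estimate; the actual GSI-based relocation scripts in HAFS are more involved:

```python
import math

def should_relocate(fcst_lat, fcst_lon, obs_lat, obs_lon, obs_vmax_kt,
                    dist_thresh_deg=0.2, vmax_thresh_kt=35.0):
    """Hypothetical gate for GSI-based vortex relocation: relocate only
    when the forecast and observed storm centers are more than 0.2
    degrees apart AND the observed intensity is at least 35 kt."""
    # Approximate separation in degrees, with a cos(lat) correction on
    # longitude; adequate at this small threshold scale.
    dlat = fcst_lat - obs_lat
    dlon = (fcst_lon - obs_lon) * math.cos(math.radians(obs_lat))
    separation = math.hypot(dlat, dlon)
    return separation > dist_thresh_deg and obs_vmax_kt >= vmax_thresh_kt
```

For example, a storm displaced 0.5 degrees in latitude with a 50 kt observed vmax would be relocated, while a 0.05 degree displacement or a 30 kt storm would not.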
…y/HAFS into feature/hafs_vigsi
observations from the forecast model restart files in hafs_obs_preproc. This is useful when dealing with high-resolution forecast configurations. Also need to use more cores for the analysis/gsi job.
…reproc when sampling the synobs for GSI based relocation.
The regression tests completed successfully on WCOSS-CRAY
The upgrade from esmf/810bs27 to esmf/810bs47 introduced an issue that substantially slows down the forecast model when using OMP threading (by resetting the thread count to 1 in the forecast model). More details can be seen in this issue. A newer version, esmf/811bs03, has been tested and fixes the issue. So, we need to wait for the new esmf/811bs03 to be installed and then switch HAFS to use it. Otherwise, the HAFS forecast job is substantially slowed down and will run out of the wallclock limit in most cases.
…hich seems to help reduce the forecast job failure.
… version of esmf/811bs03 on Orion, Hera, Jet, and wcoss_dell_p3.
Hello Bin,
RT on Orion passed (code checked out on 4/20). There were several commits made after the test started, so I am building and testing it again.
RT has passed on Hera; no changes needed. This test did not include @BinLiu-NOAA's two most recent commits (clean up jobs/JHAFS_PRODUCT; update .gitmodules to point support/HAFS branch for hafs_gsi.fd), but included all of the others. Should I repeat the RTs?
Thanks @evankalina! As for the last two commits, you do not need to test them again; they do not change results. I am also waiting for esmf/811 to be installed under hpc-stack. Again, that will not change results either. Once that's updated, we can merge this PR into the develop branch.
I cloned the feature/hafs_ensda branch this afternoon and did a ufs-model-level regression test. All cpld tests failed the results check. I believe all of the latest commits are included in this copy.
Thanks @ChunxiZhang-NOAA! As you can see from the PR description, this branch actually has some changes ahead of the ufs-weather-model develop branch. So, as long as the regression tests run through successfully, we do not expect all of them to produce identical results. The next time we sync with the ufs-weather-model develop branch, we should then expect every regression test to reproduce.
@BinLiu-NOAA I see, thanks. |
…ommunity/HAFS into feature/hafs_ensda_202104
Now esmf/8_1_1 is available on all HAFS supported platforms. I think we are ready to merge this PR. Waiting for at least two review approvals. Thanks!
I am good with this being merged. RTs have passed on Hera and Orion (among other platforms). It is very exciting to have all of these new DA capabilities in HAFS.
Looks good!
This Pull Request enables using the HWRF CCPP physics in the HAFS system.
* Initial workflow level changes came from the feature/hwrf_physics branch of @mzhangw
* Import hwrf_physics_fix: DETAMPNEW_DATA.expanded_rain_LE is needed for the F-A microphysics scheme, and the following files are needed for the HWRF version of NOAH LSM (note: other versions of NOAH LSM may need different versions of these files): GENPARM.TBL, SOILPARM.TBL, VEGPARM.TBL
* Add two HWRF physics suite related workflow level regression tests
* Workflow level (parm/script/build) cleanups and unification to support both the HWRF CCPP suite and the existing HAFS suite

All workflow level regression tests passed, and this PR addresses issue #58.

Note: Enabling the HWRF CCPP suite in HAFS is a collaborative group effort among DTC, GSL, and EMC, including: Man Zhang, Grant Firl, Mrinal Biswas, Dominikus Heinzeller, Chunxi Zhang, Eric Aligo, Weiguo Wang, Qingfu Liu, Bin Liu, etc.
Features and enhancements with this PR:
* The following HAFS DA workflow capabilities for the single regional domain configuration with a regular gnomonic grid or Extended Schmidt Gnomonic (ESG) grid are now available:
  - Cold-start and warm-start capabilities
  - GSI-based vortex relocation (originally developed by Henry Winterbottom, not fully tested though)
  - 3DVar and 3DEnVar with GDAS ensembles (EMC hurricane and fv3cam project teams)
  - FGAT capability (OU)
  - 3DEnVar with dual-resolution self-cycled EnKF ensembles (EMC/OU)
  - Assimilating all observations ingested in HWRF/GDAS/GFS (tempdrop drifting not considered yet)
* Generalized HAFS workflow task/job names according to the latest HAFS workflow schematic as of 04/16/2021, together with substantial cleanup and optimization of the workflow scripts.
* HAFS submodules (hafs_forecast.fd, hafs_utils.fd, hafs_post.fd, and hafs_gsi.fd) were synced with their latest authoritative branches as of 04/16/2021. The hafs_forecast.fd submodule (the support/HAFS branch of ufs-weather-model) also picked up the following two upcoming PRs (which have not yet gone into its develop branch):
  - Add 'netcdf' and 'grib2' to input 'source' options in regional IC/LBC routines PR
  - Add new PBL diagnostic variables dkt and dku PR
* Clean up hafs_tools.fd and use CMake to build HAFS-specific components, including hafs_tools.fd, hafs_vortextracker.fd, and hafs_hycom_utils.fd. (Biju)
* Use hpc-stack version libraries when applicable. (Biju)
* Switch to the GFSv16 version of the scale-aware TKE EDMF PBL scheme as the default PBL scheme. (Chunxi, Weiguo, JungHoon)
* Switch to the latest esmf/8_1_1 version on Orion, Jet, Hera, wcoss_cray, and wcoss_dell_p3.
Notes:
* HAFS DA workflow capabilities were mainly developed by the HAFS DA Development Team, including: the EMC hurricane project team in close collaboration with the FV3CAM group; OU collaborators; UMD collaborators; HRD and UM CIMAS collaborators; DTC collaborators; University at Albany collaborators; etc.
* This version of the feature/hafs_ensda branch will be used in the HAFSv0.2a phase-2 combined (H2PC) experiment, as well as the HAFSv0.2a DA baseline and control experiments (H2DB and H2DC).
* A rocoto_utils.sh script is now available under the rocoto dir, which can be used to easily check the status of rocoto tasks and/or rewind failed rocoto tasks. Usage:
    ./rocoto_utils.sh [-a | -f | -r | -s]       # Loop over all *.xml files
    ./rocoto_utils.sh [-a | -f | -r | -s] hafs  # Loop over all *.xml files starting with hafs
    -a: check all active tasks
    -f: check all failed tasks
    -r: check and rewind all failed tasks
    -s: check status for all tasks
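The failed-task check that rocoto_utils.sh performs can be approximated as follows. This is a hypothetical sketch only (the real utility is a shell script and its internals are not shown here), assuming the standard `rocotostat -w <xml> -d <db>` CLI and rocoto's convention of pairing each workflow XML with a database file:

```python
import glob
import subprocess

def failed_task_report(prefix=""):
    """Collect rocotostat output lines for DEAD (failed) tasks across all
    workflow XML files matching the optional prefix, mimicking what
    ./rocoto_utils.sh -f [prefix] does (hypothetical reimplementation)."""
    report = []
    for xml in sorted(glob.glob(prefix + "*.xml")):
        db = xml[:-4] + ".db"  # assumed naming: hafs_foo.xml -> hafs_foo.db
        try:
            out = subprocess.run(
                ["rocotostat", "-w", xml, "-d", db],
                capture_output=True, text=True, check=True,
            ).stdout
        except (FileNotFoundError, subprocess.CalledProcessError):
            continue  # rocotostat unavailable or workflow not yet initialized
        report.extend(line for line in out.splitlines() if "DEAD" in line)
    return report
```

A rewind variant would pass each DEAD task's cycle and task name to `rocotorewind` in the same loop.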