
Physics update in cumulus convection, PBL & surface layer for UFS_P7 #665

Merged 58 commits on Jul 15, 2021

Conversation

@JongilHan66 (Collaborator) commented May 24, 2021

Cumulus convection:
a) Stricter convection trigger
b) Reduced entrainment rate below cloud base
c) Enhanced downdraft detrainment starting from 60 mb above the ground surface
d) Reduced rain evaporation
e) Modification of the cloud depth separating shallow convection from deep convection

PBL and surface layer:
a) Inclusion of a wind shear effect that reduces the characteristic mixing length
b) Reduction of background diffusivity in the inversion layers as a function of surface roughness and green vegetation fraction
c) PBL updraft overshooting limited by the bulk-Richardson-number-based PBL depth
d) Inclusion of a new canopy heat storage parameterization
e) Modification of the thermal roughness calculation over land
f) Increase of the momentum roughness length over sea
g) Inclusion of a sea spray effect parameterization

The verification results for the physics updates above can be found at https://www.emc.ncep.noaa.gov/gmb/jhan/vsdbw/ccaa14
A more detailed description of the updates and the ccaa14 results is given in the attached PowerPoint file.
Tests_physics_MRF_MJO_JHan_WLi.pptx

Verification results for additional physics updates are in:

  1. Increased sea spray effects: https://www.emc.ncep.noaa.gov/gmb/jhan/vsdbw/ccaa27
  2. Inclusion of new maximum z/L parameterization in the GFS surface layer scheme:
    Winter: https://www.emc.ncep.noaa.gov/gmb/jhan/vsdbw/ccaa41
    Summer: https://www.emc.ncep.noaa.gov/gmb/jhan/vsdbw/ccaa41x

The final update is based on 'ccaa41'.

A description of the new maximum z/L parameterization can be found in the attached PowerPoint file.
PBL_SL_inv_JHan.pptx

Dependencies

NOAA-EMC/fv3atm#314
ufs-community/ufs-weather-model#592

@grantfirl (Collaborator)

@JongilHan66 Please consider merging this PR from me (JongilHan66#1) into your PR branch. These changes WILL NOT affect FV3/UFS in any way; they are only to make sure that the SCM stays compatible with the changes in this PR when specified surface fluxes are used in the SCM. If JongilHan66#1 is not merged into your branch, it will need to go into ccpp-physics in a follow-up PR, so including it now will probably save some work for @climbfuji and me. Thanks!

@JongilHan66 (Collaborator, Author)

> Please consider merging this PR from me (JongilHan66#1) into your PR branch. […]

It is now being regression tested on each machine with a new baseline. Could you check the merging question with Jun Wang (Jun.Wang@noaa.gov), who has a PR schedule for each update?

@grantfirl (Collaborator)

> It is now being regression tested on each machine with a new baseline. Could you check the merging question with Jun Wang […]?

Sure, I can ask Jun. The changed files in ccpp-physics in JongilHan66#1 are not even parsed by the FV3 build system, so I think it is impossible for those changes to affect the UFS regression tests.

@climbfuji (Collaborator)

> Sure, I can ask Jun. The changed files in ccpp-physics in JongilHan66#1 are not even parsed by the FV3 build system, so I think it is impossible for those changes to affect the UFS regression tests.

I think I can speak for the UFS code managers: this PR can be merged as long as you update the submodule pointers in fv3atm and ufs-weather-model accordingly. I recommend taking a local backup of the entire ufs-weather-model directory, then merging in Grant's changes and updating the submodule pointers. Then confirm that only the two files have changed.

Then check out your ufs-weather-model PR recursively in a different directory and make sure it is identical to the directory where you did the merge and submodule pointer updates. I can do this last step for you if you prefer, but this way you are 100% certain that everything is as expected and that the ufs-weather-model will not be affected.

@grantfirl (Collaborator)

> I think I can speak for the UFS code managers: this PR can be merged as long as you update the submodule pointers in fv3atm and ufs-weather-model accordingly. […]

Thanks @climbfuji. The alternative (and what typically happens) is to do a follow-up PR with any SCM-only changes to ccpp-physics, to be merged along with a future PR. It's not a big deal to do it that way; there is just a lag for SCM users, and there has already been a request to use Jongil's changes with the SCM.

@climbfuji (Collaborator)

@JongilHan66 @grantfirl @junwang-noaa Here is what we will do: we merge Jongil's PR as is (assuming that all the testing is successful). Afterwards, we merge Grant's changes into ccpp-physics main directly without testing in the UFS, but the changes must be tested in the SCM (since that is the only model reading the two files that changed). The next PR has to update to the head of main anyway and will pick up Grant's SCM-only changes automatically.

@grantfirl (Collaborator)

> Here is what we will do: we merge Jongil's PR as is (assuming that all the testing is successful). […]

OK, @climbfuji, that is fine. I'll withdraw the PR into Jongil's branch and redirect it to main, to be updated once #665 is merged.

@SMoorthi-emc (Collaborator)

Well, Grant may have to update some more with my updates next.

      else
        ztmax1 = 99.0_kp
        xkzo   = gdx / xkgdx
      endif
Collaborator

Why not write this as just "xkzo = min(one, gdx/xkgdx)"?

Collaborator Author

Sure, that can be written as "xkzo = min(one, gdx/xkgdx)".
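
For context, a minimal sketch of the simplification under discussion. The if-branch is inferred from the else clause quoted above, so treat it as an assumption rather than the exact original code; "one" is taken to be a kind-parameterized constant equal to 1.

      ! Branch form (if-branch inferred, not verbatim from the PR):
      !   if (gdx >= xkgdx) then
      !     xkzo = one
      !   else
      !     xkzo = gdx / xkgdx
      !   endif
      ! Equivalent one-line form suggested in the review, since
      ! gdx/xkgdx >= 1 exactly when gdx >= xkgdx (for positive xkgdx):
      xkzo = min(one, gdx / xkgdx)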

!
! --- ... flag for open water
      do i = 1, im
        flag(i) = (wet(i) .and. flag_iter(i) .and. .not. use_flake(i))
      enddo
Collaborator

Why do you need to define this extra array "flag"? It is used only once!

Collaborator

I take that back. I didn't see the code below.
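
For reference, a minimal sketch of the pattern being discussed: the open-water mask is computed once and then reused by later loops. The follow-on loops here are hypothetical stand-ins for the "code below" mentioned in the reply; only the mask assignment itself comes from the PR.

      ! Compute the open-water mask once.
      do i = 1, im
        flag(i) = (wet(i) .and. flag_iter(i) .and. .not. use_flake(i))
      enddo
      ! Reuse it in subsequent loops (hypothetical examples), which is
      ! why a named array beats repeating the logical expression.
      do i = 1, im
        if (flag(i)) then
          ! e.g. compute roughness lengths over open water
        endif
      enddo
      do i = 1, im
        if (flag(i)) then
          ! e.g. compute exchange coefficients over open water
        endif
      enddo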

@climbfuji (Collaborator)

Since there were more problems with these code changes on Cheyenne (which uses a newer Intel compiler) than on other systems, I am going to test this code with Intel 2020.2 on Hera today.

@climbfuji (Collaborator)

> Since there were more problems with these code changes on Cheyenne (which uses a newer Intel compiler) than on other systems, I am going to test this code with Intel 2020.2 on Hera today.

I ran rt.conf on Hera with Intel 2020.2. The following two tests failed because they timed out towards the end of the run:

control_wrtGauss_netcdf_parallel 013 failed
control_wrtGauss_netcdf_parallel 013 failed in run_test
regional_quilt_netcdf_parallel 036 failed
regional_quilt_netcdf_parallel 036 failed in run_test

All other tests completed successfully within the walltime. @junwang-noaa @DeniseWorthen @DusanJovic-NOAA

@junwang-noaa (Collaborator) commented Jun 26, 2021 via email

@climbfuji (Collaborator) commented Jun 26, 2021 via email

@climbfuji (Collaborator)

Same for develop. Could also be a mistake on my end when installing hpc-stack for Intel 2020.2. Definitely something to keep in mind.

@@ -917,13 +950,23 @@ subroutine satmedmfvdifq_run(im,km,ntrac,ntcw,ntiw,ntke, &
          if(n == 1) then
            dz   = zl(i,1)
            tem1 = tsea(i)*(1.+fv*max(q1(i,1,1),qmin))
!           tem1 = 0.5 * (tem1 + thvx(i,n))
!!          tem1 = 0.5 * (tem1 + thlvx(i,n))
Collaborator

At some point (not this PR) we should remove all these comments.
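
For context on the line flagged above: tem1 is the virtual temperature at the sea surface, using the standard factor Tv = T*(1 + fv*q) with the humidity floored at qmin. A minimal standalone sketch follows; the numerical values of fv and qmin are assumptions for illustration (fv is nominally Rv/Rd - 1):

      real, parameter :: fv   = 0.608    ! Rv/Rd - 1 (assumed value)
      real, parameter :: qmin = 1.0e-8   ! specific humidity floor (assumed)
      ! Virtual temperature at the surface: Tv = T*(1 + fv*q);
      ! flooring q at qmin guards against zero or negative humidity.
      tem1 = tsea(i)*(1. + fv*max(q1(i,1,1), qmin))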

@climbfuji (Collaborator)

Regression testing completed; will merge this PR now.
