Stem area index AND canopy snow radiation fixes. #750
Conversation
biogeophys/EDSurfaceAlbedoMod.F90
```fortran
if(total_lai_sai(L,ft,iv).gt.0._r8)then
   frac_lai = currentPatch%elai_profile(L,ft,iv)/total_lai_sai(L,ft,iv)
   frac_sai = 1.0_r8 - frac_lai
   f_abs(L,ft,iv,ib) = 1.0_r8 - (frac_lai*(rhol(ft,ib) + taul(ft,ib))+&
```
Is this initialized when total_lai_sai <= 0?
If uninitialized, it could propagate NaNs, even in cases where it is multiplied by zero.
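For illustration, a guarded version might look like the sketch below (the completed f_abs expression and the else-branch default are my assumptions, not necessarily the PR's actual fix):

```fortran
if (total_lai_sai(L,ft,iv) > 0._r8) then
   frac_lai = currentPatch%elai_profile(L,ft,iv) / total_lai_sai(L,ft,iv)
   frac_sai = 1.0_r8 - frac_lai
   f_abs(L,ft,iv,ib) = 1.0_r8 - (frac_lai*(rhol(ft,ib) + taul(ft,ib)) + &
                                 frac_sai*(rhos(ft,ib) + taus(ft,ib)))
else
   ! No leaf or stem area in this bin: give f_abs a defined value so a
   ! later multiply-by-zero cannot pick up an uninitialized NaN.
   f_abs(L,ft,iv,ib) = 0._r8
end if
```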
Good catch @rgknox. That might explain some of the slightly odd behaviour I'm currently chasing...
I've taken that if statement out now...
Made some changes to simplify this PR, and went through the process of checking whether it gave identical answers to the main branch when frac_lai = 1.0, which it should, and now does. At the moment this makes SABV go up and GPP go down. My hypothesis is that this is to do with the vertical layering: the layers getting more radiation are light-saturated, and the layers getting less are not. Will plot out vertical profiles to check, then this will be good to go, I think.
On further consideration, it is in fact impossible to assess the impacts of this change in the absence, ironically, of SP mode. Either I need to merge this into the SP branch for testing, or wait until it is fixed/integrated. Not sure it's worth doing all of that merging, or if it's best to wait until the SP tag is ready. The impact of this change, in any case, will be a decline in GPP resulting from the non-use of PAR absorbed by stems (which is currently being used for PSN).
@rosiealice I think there are a few things here to consider:
I think the thing to do is to actually print out the error and then look at what the status quo is before we decide what to do with all these thresholds. Then we'll know whether there is an existing problem, and also how it will respond to changes in e.g. the SAI handling. I'd put in a placeholder to track the error; on my branch I am using an existing history field to output it. Need to make a new field to add to this PR.
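As a rough illustration of the kind of placeholder meant here (a sketch only; `incoming`, `absorbed`, and `reflected` stand in for the sums that would come from the actual two-stream solution):

```fortran
! Sketch: a per-patch residual of the canopy radiation balance,
! following "in = absorbed + reflected"; a nonzero value indicates
! energy is being created or lost in the two-stream solution.
currentPatch%radiation_error = incoming - (absorbed + reflected)
```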
Reviewing the other tests, particularly the non-debug tests, reveals that more of them are seeing a blow-up of the radiation_error:
@glemieux, it does look like the error term is being zeroed properly; I'm just wondering if the errors are growing because the canopy is growing, and with more stuff to reflect, there is more error.
In these lines:
EDIT: The descriptions on the variables are not correct; need to track these down, e.g. sabs_dir, sabs_dif, and the local abs_rad(). sabs_dir is filled by the local abs_dir variable, but abs_dir is tough to follow; not sure what it is supposed to be, i.e.: in = absorbed + reflected.
Also possible that the zeroing of the radiation_error is happening inside the "radtype" loop, but it seems like it should be outside of, and prior to, this loop?
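Schematically, the suggested ordering would be something like this (a sketch of the loop structure, not the actual code):

```fortran
! Zero the accumulated error once, before looping over the direct and
! diffuse streams, so the second pass does not re-zero the error
! recorded by the first.
currentPatch%radiation_error = 0._r8

do radtype = 1, 2   ! direct, then diffuse
   ! ... two-stream calculation for this stream ...
   ! accumulate this stream's residual into the patch total, e.g.:
   ! currentPatch%radiation_error = currentPatch%radiation_error + residual
end do
```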
This came up in discussion during the SE call. The consensus was that there needs to be a general clean-up of radiation, identified in issue #794. For this current PR the plan is to address the immediate problem without refactoring the radiation code, and then address a clean-up, refactor, and stand-alone unit test as part of a new issue or within #794. @glemieux @rgknox
I went down a bit of a rabbit hole today trying to understand these radiation errors that are behind the test failures that @glemieux ran into in #750 (comment). I think there are (at least) two distinct problems with different causes; and interestingly they both appear to be situations that don't happen in FATES-SP mode, and so don't show up when we look at that.

If we look at the radiation error in FATES-SP mode, we get something that looks well behaved, with values on the order of 10^-9 W/m2 and structure in space and time, visible in both the time-average map and the zonal-mean fields. If we run the same thing in full-FATES mode, the behavior is pretty different: first, I have to plot it as log(abs(RAD_ERROR)) to span the wide dynamic range.

As I see it, there are three categories of gridcells in full-FATES mode: some where the error is still order 10^-8 to 10^-9 W/m2, which are in arid/semi-arid ecosystems; some where the error is order 10^-1 W/m2, which are in forested ecosystems; and then a handful of gridcells in the high arctic where the error is order 10^20 W/m2. I think those last ones are the ones that are causing the model to fail when debug is turned on, and leading to the test failures.

Given that neither of the types of badly behaved gridcells shows up in FATES-SP mode, I was wondering which of the differences in model dynamics between those configurations might be driving things, e.g.:
I am pretty sure, just based on spatial patterns alone, that the 10^-1-error gridcells are from multiple canopy layers (see the map of mean number of canopy layers). I tried setting nclmax to one in order to confirm, but it crashed when I tried to reuse my old restart file, and I didn't pursue that any further. So we're going to have to look into that, but I'll set it aside for now as I don't think it's what's crashing the model.

For the 10^20 W/m2 errors, it's less clear to me. Since I initially suspected uninitialized values at patch creation, I tried running in both FATES-ST3 (i.e. no phenology, no growth, no disturbance) mode, and also in full-FATES with no patch creation, by setting mort_disturb_frac to zero and turning off fire. The FATES-ST3 run had none of the 10^20 W/m2 error gridcells but did have the 10^0 gridcells. The full-FATES run with no new patch creation had both types of bad gridcells. So that says the 10^20 error is not related to disturbance per se, but is related to something relevant to plant growth at high latitudes.

I turned off the snow occlusion of LAI by asserting ELAI = TLAI, and that didn't fix the problem. I looked at patterns of LAI, and it does seem that the really bad gridcells correspond to ones with really small LAI values (see the map of log(max(ELAI))).

So then I looked into the lai_min parameter, which was reinstated via #739 to prevent overflows (we had removed it in #733 but then reversed that, so #261 should really be reopened). I set the lai_min value from 0.1 to 10^-3 (any smaller than that and the model crashes), and that changed the value of the 10^20 W/m2 errors by a couple of orders of magnitude, which helped confirm that this is an edge case related to super-small LAI values.

So I think what's going on is that there is some divide-by-tiny-number error that is causing these radiation error blowups. Interestingly, they don't seem to show up in other downstream things, so possibly this only happens in cases where the plant-occupied fraction of the patch is also tiny, and so the signal is getting swamped by the better-behaved bare-ground patch? Anyway, I haven't fixed the problem, but I think I've narrowed it down a lot.
OK, I think I figured out what's going on that's causing the crashes, and fixed it. Basically, on patches with a really small fractional coverage by vegetation, the calculations are blowing up because of division by that tiny vegetated fraction. See maps and zonal-mean plots of log(abs(RAD_ERROR)) below. There is still clearly a problem that we need to fix for the multiple-canopy-layer case; some of these errors are not small. But that should, I think, be a separate PR from this one. The PR to the PR branch with this change is rosiealice#10.
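I haven't reproduced the exact change here, but schematically the guard is of this form (the threshold and variable names are illustrative assumptions, not the code in rosiealice#10):

```fortran
real(r8), parameter :: min_vegfrac = 1.e-6_r8  ! illustrative threshold

! Avoid dividing by a vanishingly small vegetated patch fraction, which
! amplifies a tiny absolute residual into a huge per-area error.
if (veg_frac > min_vegfrac) then
   radiation_error_per_area = radiation_error / veg_frac
else
   radiation_error_per_area = 0._r8
end if
```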
Phew. Thanks @ckoven! So, the upshot is that this PR with the radiation error will now not crash, but it will nonetheless tell us that the error is large in full-FATES cases? There is a lot of complexity in the multiple-canopy-layer representation relating to how light gets through layers which are partially filled, spatially. I do wonder if it makes sense to try a run with the maximum NCL set to 1 (from bare ground), such that one can isolate whether it's the multiple canopy layers per se, or just the n>1 cohorts, that is the source of the problem. Good thing that SP mode is behaving moderately sensibly to act as a backstop, even if this is troublesome... Thanks for doing this. Much appreciated.
Yep, and agreed. I'll start that run off to confirm the multi-layer hypothesis. Thank you @rosiealice for this PR!
fix for patch%radiation_error
Brief update to answer @rosiealice's question of what happens when I run with a single canopy layer from bare ground in the full-FATES configuration: it turns out that the radiation errors are still much higher (order 10^-1 W/m2) than in the FATES-SP mode case. So I'm not sure exactly what's behind them, but I guess multiple canopy levels are not required for that to happen. Also, there still seem to be some very-high-error (order 10^20) blips sneaking through. That is consistent with @glemieux's finding that the revised code is still running into crashing situations when in debug mode in the test suite.
This appears to be strongly correlated with the check on the patch variable (fates/biogeophys/EDSurfaceAlbedoMod.F90, lines 127 to 153 at e1cf819).
If the check does not pass, perhaps we can zero out this error within the above block.
After implementing fc4a301, I merged in the latest tags and retested the fates suite. All expected tests now pass, with DIFFs as expected. Testing directory can be found here:

Note that the ctsm aux_clm suite still needs to be exercised. This PR will be integrated after ctsm5.1.dev062 has been integrated.
Nice job Greg!!! On the multiple-layers thing, it's interesting that it's not the multiple layers themselves. I suspect it's to do with the complexity in the canopy structure when there are multiple cohorts of different LAI depths. That creates lots of oddities in how light is passed through. For example, if cohort 1 has an LAI of 4 and cohort 2 has 6, then for layers 5 and 6, light has to pass through space for cohort #1 and a leaf matrix for cohort #2, both on the way up and on the way down. I've been trying to figure out a test that will stop this happening in full-FATES mode, if only to check out this theory, i.e. by changing how the LAI is distributed into the boxes. One way might be to 'spread out' the leaf area in the spatially incomplete layers so that the canopy is always 'rectangular', as sketched below. That might not make scientific sense, but it would allow this to be checked at least!
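A minimal sketch of that 'spreading out' idea (purely illustrative variable names; this is the diagnostic experiment described above, not a proposed science change):

```fortran
! Sketch: flatten the leaf-area profile so every occupied layer is
! equally full, making the canopy "rectangular" so that light never
! passes through partially empty layers on the way up or down.
total_lai = sum(elai_profile(1:nlayers))
elai_profile(1:nlayers) = total_lai / real(nlayers, r8)
```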
All tests pass, with DIFFs as expected, on izumi:
Final testing is complete. All expected tests pass, with DIFFs as expected:
This PR addresses issue #744 by implementing leaf-layer-level optical properties, and using these instead of the previous, erroneous usage of rhol and taul in the radiative transfer code.
It also now seems likely that the absence of a proper treatment of the radiative effects of canopy snow interception in FATES is causing large discrepancies between FATES-SP and CLM5-SP simulations.
The implementation of snow on canopy layers needs to follow from the fix to the treatment of stem area, as it affects the reflectance and transmission parameters that are weighted by the other changes in this PR; hence, I have bundled them together.
The physics/energy/hydrology dynamics of canopy snow interception, sublimation, melting, etc. are handled by the HLM, and so here there is a modification to the interface to pass fcansno (the fraction of the canopy that has snow on it) into FATES.
This (fcansno) is currently a patch level property so is applied uniformly to all FATES layers.
In CLM5, the treatment of snow affects the 'omega' parameter of the big-leaf model, which is the reflectance + transmittance of the leaves. This is not applicable to the FATES RTM, so instead I have modified the rho and tau parameters directly to account for the snow. The reflectance parameters for snow are hard-wired here, but they could be put in the parameter file, be sent from the HLM, or be hard-wired somewhere more sensible (I couldn't figure out where that might be, though), as people see fit.
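A hedged sketch of that rho/tau modification (the snow optical-property values below are placeholders for the hard-wired constants described, not necessarily the ones in the PR):

```fortran
real(r8), parameter :: rho_snow = 0.8_r8  ! assumed snow reflectance
real(r8), parameter :: tau_snow = 0.0_r8  ! assumed snow transmittance

! Weight the leaf optical properties by the snow-covered canopy
! fraction passed in from the HLM; fcansno is patch-level, so the
! same weighting applies uniformly to every canopy layer.
rho_eff = (1.0_r8 - fcansno) * rhol(ft,ib) + fcansno * rho_snow
tau_eff = (1.0_r8 - fcansno) * taul(ft,ib) + fcansno * tau_snow
```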
Description:
This PR has a ctsm counter-part: ESCOMP/CTSM#1531
Collaborators:
Expectation of Answer Changes:
This is expected to be (slightly!) answer changing, where SAI>0
Checklist:
Test Results:
I have tested these changes in full-FATES mode (this PR being independent of SP mode).
Analysis of results is ongoing. I wanted to get the PR up for the sake of discussion.
CTSM (or) E3SM (specify which) test hash-tag:
CTSM (or) E3SM (specify which) baseline hash-tag:
FATES baseline hash-tag:
Test Output: