
Extrapolate from flowline #1490

Closed · wants to merge 3 commits

Conversation

anoukvlug (Collaborator)

This PR contains a notebook that shows one option for extrapolating the glacier-covered area, simulated along the flowline, back onto a 2-D grid; in addition, it redistributes the ice thickness. The notebook is far from perfect. One thing to be addressed is the run time, which is extremely long, so suggestions regarding that are especially welcome. The resolution is also still quite coarse; making it finer would be an option once the run time has decreased.

Generally speaking, I'm looking for constructive feedback and a discussion of the different options. (This notebook isn't meant to be merged.)

This addresses issue #1448

@fmaussion (Member)

Maybe some collective knowledge will help ;-)

cc @phiscu @lilianschuster @pat-schmitt @lvdlaan

@lilianschuster (Member)

Please write down here if you have a look at it (it would be bad if two people were working on it at the same time). I won't look at it this week, but could try next week.

@pat-schmitt (Member)

I looked at the code, and the most obvious runtime optimisation would be to avoid operations that loop over each pixel separately.
Such a loop is used, for example, in the function distribute_thick to get the first-order thickness of the points. This first-order thickness is also quite crude and could probably be replaced with an element-wise function. Pseudocode for how this could look:

import numpy as np
from scipy import interpolate

# thickness and surface height along the flowline for the current year
thick_fl = fl.thickness_m[dict(time=yr)]
surface_h_fl = fl.bed_h + thick_fl

# just make a linear interpolation between surface_h and thick_fl
# (splrep needs the x values in increasing order)
sort_index = np.argsort(surface_h_fl)
tck = interpolate.splrep(surface_h_fl[sort_index],
                         thick_fl[sort_index],
                         k=1,  # degree of the fit, 1 is linear
                         s=0)  # s=0: interpolate through all points (no smoothing)

# this function returns the ice thickness depending on the surface height
def lin_interpolate_h(surface_height):
    return interpolate.splev(surface_height, tck)

# now vectorize so it can be used elementwise on a numpy array
vec_lin_interpolate_h = np.vectorize(lin_interpolate_h)

# get the first-order thickness; topo_smoothed is the map with the
# 2D surface heights of the current glacier
thick = vec_lin_interpolate_h(topo_smoothed)

It would also need some safeguards during the interpolation to make sure that no two surface_h values along the flowline are the same (in that case one could maybe use the mean thickness of the two). I am also not sure how effective np.vectorize is; does anyone have experience with this? It is probably not very efficient according to the docs ("The vectorize function is provided primarily for convenience, not for performance. The implementation is essentially a for loop.", https://numpy.org/doc/stable/reference/generated/numpy.vectorize.html ;-), but at least it is nicer to read.
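
A minimal sketch of the duplicate-height safeguard could look like this (plain numpy arrays assumed; averaging the thickness is just one possible choice):

import numpy as np

def deduplicate_heights(surface_h_fl, thick_fl):
    # Average the thickness of flowline points that share a surface height,
    # so the x values passed to the interpolation are strictly increasing.
    unique_h, inverse = np.unique(surface_h_fl, return_inverse=True)
    mean_thick = np.zeros_like(unique_h, dtype=float)
    np.add.at(mean_thick, inverse, thick_fl)
    mean_thick /= np.bincount(inverse)
    return unique_h, mean_thick

Since np.unique already returns the heights in increasing order, the argsort step above would then no longer be needed.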

I can try to implement this in the upcoming days, but first I would like to create a minimal Python script that runs a bit faster and makes testing new things more fun. Maybe @anoukvlug can help with this, because I am not quite sure which parts of the notebook are for testing and which are essentially needed. If a minimal script is not possible, we could maybe save intermediate steps to disk and start from there, to reduce the run time for testing (because, if I understand correctly, the goal is to optimise distribute_thick and compute_extend?).

@fmaussion (Member)

Yes. The other day I also had an even simpler idea for this; let's discuss it, the three of us, tomorrow or maybe another day.

@pat-schmitt (Member)

To add to my comment above, if we want to vectorize some functions for real, we could have a look at Numba https://numba.pydata.org/ ;)
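
Just for illustration, a compiled pixel loop could look roughly like this (the band-lookup logic is made up for the example and is not the actual distribute_thick code):

import numpy as np
from numba import njit

@njit(cache=True)
def thickness_from_bands(surface_h_2d, band_lower_h, band_thick):
    # For every pixel, find the elevation band it falls into and take that
    # band's thickness. band_lower_h must be sorted in increasing order.
    ny, nx = surface_h_2d.shape
    out = np.zeros((ny, nx))
    for j in range(ny):
        for i in range(nx):
            h = surface_h_2d[j, i]
            idx = 0
            for b in range(band_lower_h.size):
                if h >= band_lower_h[b]:
                    idx = b
            out[j, i] = band_thick[idx]
    return out

The plain Python version of such a loop is exactly what is slow right now; with @njit it gets compiled to machine code on the first call.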

@fmaussion (Member)

Yes, although this should really just be the last step, done after the other possible optimisations.

@fmaussion (Member)

Two papers which might explain how the GloGEM model does the extrapolation (at least according to a text I just read):

  • doi:10.5194/esurf-10-723-2022
  • Compagno, L., Huss, M., Zekollari, H., Miles, E. S., & Farinotti, D. (2022). Future growth and decline of high mountain Asia's ice-dammed lakes and associated risk. Communications Earth & Environment, 3(1), 1-9

Can someone skim through them and check how this is done there?

@pat-schmitt (Member)

I understand that they extrapolate the ice thickness change of individual elevation bands to the corresponding 2D pixels. The idea is that each oggm-flowline grid point has a set of 2D pixels assigned to it. With this, the thickness change along the oggm-flowline can be extrapolated onto the 2D pixels belonging to each grid point.

What I think is needed for this:

  • an initial thickness map of the 2D glacier, together with the glacier bed
  • convert the 2D glacier (outline & DEM) into elevation bands and save, for each pixel, to which elevation band it belongs. This is already done in elevation_band_flowline and just needs to be stored in the gdir (the conversion with fixed_dx_elevation_band_flowlines could be tricky). Also, save the initial oggm-flowline for later.
  • sort the pixels inside each elevation band, e.g. order them by initial thickness and distance from the outline. This defines the order in which pixels inside the elevation band should melt first (which could be useful to conserve area and volume, see below)

All these steps only need to be done once, in a postprocessing step. In the end we have, for each 2D pixel, the initial thickness, the glacier bed, the corresponding oggm-flowline grid point, and the ranking inside the elevation band (see the sketch below).
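
A minimal sketch of that bookkeeping (hypothetical names; the band binning here is a crude stand-in for what elevation_band_flowline does):

import numpy as np

def build_band_bookkeeping(surface_h, initial_thick, dist_from_outline,
                           band_height=30.):
    # surface_h, initial_thick and dist_from_outline are 1D arrays over the
    # glacier pixels; band_height is the vertical band size in metres.
    band_id = np.floor((surface_h - surface_h.min()) / band_height).astype(np.int64)

    # Rank the pixels inside each band: thin pixels close to the outline
    # should become ice-free first.
    melt_rank = np.zeros_like(band_id)
    for band in np.unique(band_id):
        in_band = np.where(band_id == band)[0]
        order = np.lexsort((dist_from_outline[in_band], initial_thick[in_band]))
        melt_rank[in_band[order]] = np.arange(in_band.size)

    return band_id, melt_rank

Mapping band_id to the fixed-dx flowline grid points would still need the conversion mentioned above.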

The actual extrapolation could look like this:

  1. extrapolate the thickness change (compared to the initial oggm-flowline) of all oggm-flowline grid points onto the 2D map
  2. check for area conservation in each elevation band (e.g. add or remove grid cells following the band-internal pixel ordering)
  3. check for volume conservation in each elevation band (e.g. spread the remaining delta volume evenly over all grid cells, without creating or deleting any grid cell)

I think the actual extrapolation could be quite fast in the end and could be done for any timestep, without the need to compute preceding glacier states.
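
A rough sketch of steps 2 and 3 for one elevation band (hypothetical names again; the clipping is a simplification of "without deleting any pixel" and can leave a small residual volume error):

import numpy as np

def conserve_band(thick_extrap, melt_rank, target_area, target_volume, px_area):
    # thick_extrap: extrapolated thickness of the band's pixels (1D, m)
    # melt_rank: melt order of these pixels (0 melts first)
    # target_area, target_volume: values from the flowline run (m2, m3)
    # px_area: area of one pixel (m2)
    thick = thick_extrap.copy()

    # step 2: area conservation, keep only the pixels that melt last
    n_keep = int(round(target_area / px_area))
    keep = melt_rank >= (thick.size - n_keep)
    thick[~keep] = 0.

    # step 3: volume conservation, spread the volume mismatch evenly over
    # the remaining pixels without creating or deleting any pixel
    n_ice = int(keep.sum())
    if n_ice > 0:
        delta_h = (target_volume - (thick * px_area).sum()) / (n_ice * px_area)
        thick[keep] = np.clip(thick[keep] + delta_h, 1e-3, None)

    return thick

The pixel ordering from the preprocessing step is what makes step 2 deterministic.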

Note: this method only works if the resulting glacier is smaller than the initial glacier. If we also want to be able to handle advancing glaciers, we need to think of a way to connect ice-free 2D pixels to the oggm-flowline...

@fmaussion (Member)

I think I also understand what Anouk meant, that one cannot easily preserve area and volume. Do they mention this?

@pat-schmitt (Member)

No, they basically only mention the extrapolation in one sentence, with not much detail:

  • 'For computational reasons, the model is discretized into 10m elevation bands but results on area and thickness changes in individual bands are extrapolated to the same 10×10m grid'
  • 'Combining the surface DEM with the modelled rate of ice thickness change, we model the glacier evolution in three dimensions'

@pat-schmitt (Member) commented Jan 30, 2023

I think that with the idea I had over the weekend it is possible to preserve both area and volume per elevation band (first add/delete pixels to match the area, then add/remove the delta volume evenly over the remaining pixels without deleting any pixel).

anoukvlug mentioned this pull request on May 3, 2023
@anoukvlug (Collaborator, Author)

Thanks for all the ideas and feedback! I will now close this PR, as I just created a fresh one.

anoukvlug closed this on May 3, 2023