
Gradient descending coreg #346

Merged
merged 38 commits on Jun 13, 2023
Conversation

@liuh886 (Contributor) commented Mar 27, 2023

Adding support for point co-registration:

  • A new coregistration algorithm, gradient descending coregistration (GDS Coreg).
  • An updated fit_pts version of the NuthKaab coregistration.

To check it, run the following:

from tests.test_coreg import TestCoregClass
longyearbyen = TestCoregClass()
longyearbyen.test_coreg_example_gradiendescending(verbose=True, downsampling=20000)

# Shift a ref_dem on purpose, e.g. shift_px = (1, 1), then apply coregs to shift it back:
# (a) NuthKaab, (b) GradientDescending, (c) NuthKaab_pts
longyearbyen.test_coreg_example_shift_test(verbose=True)

Changes in environment.yml, dev-environment.yml

  • noisyopt

This package provides a pattern-search-enhanced gradient descent algorithm, which handles noisy cost functions better than scipy.optimize.minimize.
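To illustrate the idea, here is a minimal pure-Python sketch of a compass/pattern search of the kind noisyopt implements (the function name and structure are this example's own, not noisyopt's API):

```python
def compass_search(cost, x0, deltainit=2.0, deltatol=0.004, bounds=(-5, 5)):
    """Shrink a search pattern around the best (dx, dy) found so far.

    Probes the four compass directions at step size `delta`; when no probe
    improves the cost, the pattern is halved, down to `deltatol`.
    """
    x = list(x0)
    delta = deltainit
    best = cost(x)
    while delta > deltatol:
        improved = False
        for step in ((delta, 0), (-delta, 0), (0, delta), (0, -delta)):
            cand = [x[0] + step[0], x[1] + step[1]]
            # Stay inside the per-coordinate search bounds.
            if not all(bounds[0] <= c <= bounds[1] for c in cand):
                continue
            val = cost(cand)
            if val < best:
                x, best, improved = cand, val, True
        if not improved:
            delta /= 2  # no better neighbour: refine the pattern
    return x, best

# Example: recover a known shift of (1.25, -0.75) from a smooth cost surface.
xy, val = compass_search(lambda p: (p[0] - 1.25) ** 2 + (p[1] + 0.75) ** 2, (0, 0))
```

Unlike gradient-based minimizers, each move only requires comparing cost values, which is why this family of methods tolerates noisy objectives such as an NMAD computed on a random subsample.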

Changes in coreg.py

Add the following functions/classes required by point coregistration:

  • def df_sampling_from_dem(). Samples a DEM into a dataframe if a point cloud is not provided.
  • def residuals_df(). Calculates the dh between the point cloud and the DEM.
  • class GradientDescending. Solves the coregistration, using residuals_df as the cost function.
  • def apply_xyz_shift_df(). In both GDS and NuthKaab_pts, the point cloud is shifted instead of the DEM.

To make this work, also updated:

  • def _fit_pts_func(). Now nuth_kaab.fit_pts works.
  • def fit_pts(). GDC and NuthKaab can call fit_pts. It samples ref_dem into a point cloud (dataframe) via df_sampling_from_dem if a dataframe is not provided.
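As a rough illustration of what apply_xyz_shift_df does, here is a self-contained sketch (the column names 'E'/'N'/'z' and the sign convention are assumptions of this example, not necessarily the PR's actual code):

```python
import pandas as pd

def apply_xyz_shift_df(df: pd.DataFrame, dx: float, dy: float, dz: float,
                       z_name: str = "z") -> pd.DataFrame:
    """Shift the point cloud (not the DEM) by (dx, dy, dz).

    Assumes easting/northing columns named 'E' and 'N'; returns a copy so the
    input dataframe is left untouched.
    """
    shifted = df.copy()
    shifted["E"] = shifted["E"] + dx
    shifted["N"] = shifted["N"] + dy
    shifted[z_name] = shifted[z_name] + dz
    return shifted

# Usage: shift a tiny point cloud by one unit east, one unit south, 0.5 m up.
pts = pd.DataFrame({"E": [100.0, 101.0], "N": [200.0, 201.0], "z": [10.0, 11.0]})
shifted = apply_xyz_shift_df(pts, dx=1.0, dy=-1.0, dz=0.5)
```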

Known issues:

  1. The result of GDS Coreg is the minimal NMAD of the samples. Technically, the larger the number of samples N, the closer it gets to the minimal NMAD of the population. By adjusting the following parameters, we can reach the best results; here is an example:
gds = xdem.coreg.GradientDescending(downsampling=6000, x0=x0, bounds=bounds, deltainit=deltainit, deltatol=deltatol, feps=feps)
gds.fit_pts(ref_dem_or_dataframe, tba_dem, inlier_mask=tba_dem.inlier_mask, verbose=verbose, weights='column_w')

  • downsampling: the number of points to downsample to when running the coreg. Set to None to disable downsampling.
  • inlier_mask: same as before, excluding moving terrain.
  • weights: if ref_dem_or_dataframe is a dataframe, the column column_w can be used as weights when calculating residuals for the cost function. Weights are not yet supported when ref_dem is a DEM.

The following parameters normally do not need to be changed:

x0: tuple = (0, 0). The initial shift.
bounds: tuple = (-5, 5). The bounds to search within.
deltainit: int = 2. The initial pattern size for the search.
deltatol: float = 0.004. The target pattern size, i.e. the precision you want to achieve.
feps: float = 0.0001. The smallest difference in function value to resolve.

  2. For point clouds, change the dataframe input into an ndarray.
  3. df_sampling_from_dem can be replaced by functions from GeoUtils.
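For reference, the NMAD statistic that GDS Coreg minimizes can be sketched as follows (standard definition; the schematic cost in the comment is this example's own notation, not the PR's exact code):

```python
import numpy as np

def nmad(dh: np.ndarray) -> float:
    """Normalized median absolute deviation of elevation residuals dh.

    The 1.4826 factor makes the NMAD match the standard deviation for
    normally distributed residuals, while staying robust to outliers.
    """
    return 1.4826 * float(np.median(np.abs(dh - np.median(dh))))

# Schematically, the cost evaluated at a candidate (dx, dy) shift would be:
# cost(dx, dy) = nmad(z_points - dem_interpolated_at(x + dx, y + dy))
rng = np.random.default_rng(42)
residuals = rng.normal(0, 0.5, 10_000)
spread = nmad(residuals)  # close to 0.5 for Gaussian noise with sigma = 0.5
```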

@erikmannerfelt (Contributor) left a comment

Hi @liuh886, this is great stuff! I have comments but I think we can fix this together and merge it soon. This is a fantastic addition to xdem!

Before merging, we should have a proper description here of the functionality that is introduced. Technically, we should also add documentation, but that can come later too.

- geoutils==0.0.10
- noisyopt

We need to make sure this propagates to pypi and conda-forge (I'll help! This is more a reminder to me)

xdem/coreg.py
@@ -431,7 +597,7 @@ def __init__(self, meta: CoregDict | None = None, matrix: NDArrayf | None = None

def fit(
self: CoregType,
-        reference_dem: NDArrayf | MArrayf | RasterType,
+        reference_dem: NDArrayf | MArrayf | RasterType | pd.DataFrame,
Contributor

Let's have a discussion on dataframes. Is that not just a nicely wrapped ndarray? In other parts of xdem and geoutils, we use point clouds as ndarrays of shape=(N, 3) where the second dimension is (X, Y, Z). Unless there's a good use of dataframes, I really think we should make it simpler by only accepting ndarrays.

A compromise is to have the fit_pts function take dataframes, but have them converted to an ndarray that the internal function (_fit_pts_func) uses.
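A sketch of that compromise might look like this (the function name and column names are hypothetical, for illustration only):

```python
import numpy as np
import pandas as pd

def to_point_array(points, x: str = "E", y: str = "N", z: str = "z") -> np.ndarray:
    """Accept a DataFrame or an (N, 3) ndarray; always return an (N, 3) ndarray.

    The public fit_pts could accept either, while the internal _fit_pts_func
    only ever sees the ndarray form.
    """
    if isinstance(points, pd.DataFrame):
        return points[[x, y, z]].to_numpy(dtype=float)
    arr = np.asarray(points, dtype=float)
    if arr.ndim != 2 or arr.shape[1] != 3:
        raise ValueError("expected a point cloud of shape (N, 3)")
    return arr

# Both inputs normalize to the same (N, 3) array of (X, Y, Z) rows.
df = pd.DataFrame({"E": [0.0, 1.0], "N": [2.0, 3.0], "z": [4.0, 5.0]})
pc = to_point_array(df)
```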

Contributor

As with below, this signature should only be for fit_pts, not fit. fit is for rasters while fit_pts is for points (and thus dataframes)!

Contributor Author

So far, we use fit_pts to call point coregistration instead of fit, for example:

gds = xdem.coreg.GradientDescending(downsampling=6000)
gds.fit_pts(shifted_ref, self.tba, inlier_mask=self.inlier_mask, verbose=verbose)

nkp = xdem.coreg.NuthKaab()
nkp.fit_pts(shifted_ref, self.tba, inlier_mask=self.inlier_mask, verbose=verbose)

NDArray will replace DataFrame soon.

Contributor

My take here is that we should accept any type of point data (Vector, GeoDataFrame, NDArray), but this is more a problem for GeoUtils to tackle, to make things more easily consistent. For xdem.Coreg, I agree it's good to keep a standard input for now (+1 on NDArray).

@rhugonnet commented Mar 28, 2023

Great addition @liuh886! 😄
I will go through the details a bit later, for now I just have generic comments, for you but also for @erikmannerfelt and @adehecq as it concerns the structure of the Coreg module:

  • The coregistration structure/names are getting hard to grasp (for me, so I can't imagine for outsiders), maybe we should subclass Coreg by type of rigid transform instead: Translation, Rotation, Scaling? (don't know what we'd do for the ones that have several rigid transforms like ICP... could it be subclassed from both Translation and Rotation? would probably have to override some methods in the process)
  • Those classes Translation, Rotation, would then call a different method for solving the shift and, technically, these methods could use any optimization algorithm packaged in fit.py, for instance: coreg.Translation(method="NuthKaab", optimize="least-squares"),
  • Based on the above, I don't think GradientDescent should be a coregistration name, as it only concerns the optimization method; instead it could be called coreg.Translation(method="NuthKaab", optimize="gradient-descent") (but maybe we should clarify what kind of gradient descent, as least-squares can be solved by gradient descent as well),
  • The coreg.py module is getting quite busy (and it might only be the beginning), maybe we should create a folder coreg/ where we split into modules depending on the coregistration type coreg/translation.py, coreg/rotation.py, and call those through the classes in coreg/coreg.py?

What do you think?
Trying to think ahead so that we don't have to re-think everything later on!

@erikmannerfelt

Thanks a lot for your input, @rhugonnet!

I wholeheartedly agree on the need to abstract the coreg classes and potentially make it into a submodule. I certainly think we should consider this in the very near future and I would be more than happy to contribute. For now, however, I ask that we consider one step at a time. Perhaps the context of the timing of this PR will shed some light on the situation:

Marco Mazzolini and Désirée Treichler have both been working with different forks of xdem for months. Most notably, it's the fit_pts functionality that I believe you implemented first, @rhugonnet. This has more or less worked, but has hindered our attempts at optimization since the branches slowly diverge. @liuh886 required this for his MSc thesis, of which I have seen incredibly impressive results so far! I asked him to go ahead with this PR now, which also includes his shiny new co-registration approach, because I myself am in dire need of the fit_pts functionality!

I'm contributing to GlaMBIE together with Désirée which involves some heavy ICESat-2 processing. As you may know, the deadline is quite close (late May)... I need ICP and NuthKaab with points for it to work, or else we have to rescind our contribution. My options were either to fork @liuh886's branch which is a fork of your branch which is a fork of xdem main, or we could finally converge the functionality developed here in Oslo into mainline xdem. After this PR, I will jump straight into a fit_pts PR for ICP (which is significantly easier than this one!), whereby I can finally get on to my task at hand.

Because of these circumstances, I ask that we focus on the functionality that will be added with this PR, and save strong consideration of your suggestions (which I very much like the sound of) for after the GlaMBIE deadline!

Finally, while @liuh886 is the expert on his own work, I can at least say that GradientDescending is not NuthKaab with a different optimizer. NuthKaab uses aspect and slope to calculate the most likely direction of the shift. This approach shifts the dataset in the direction of falling uncertainty. They are therefore two distinctly different approaches, but of course with the same goal. We should all three have a meeting soon to discuss our co-registration intercomparison ideas!

@erikmannerfelt

Again, for clarification @rhugonnet, NuthKaab solves a cosine function between aspect and slope-normalized elevation using least squares, from which x/y shifts can be derived. GradientDescend (or should it maybe be called GradientDescent, @liuh886?) solves only for elevation change NMAD as a cost function using gradient descent. I thought this would just fall into local minima before he showed some compelling graphs that it's actually about as good as NuthKaab, and it can easily be an order of magnitude (about 10x) faster.

@liuh886, please let me know if you can explain it better! Or we'll just sit tight for your thesis ;) That is more than fine for me too! Your blog post is also great, but I seem to have lost the link!
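As a sketch of the least-squares step described above, the Nuth & Kääb (2011) relation dh/tan(slope) = a·cos(b − aspect) + c can be linearized and solved with ordinary least squares (synthetic data and variable names are this example's own, not xdem's implementation):

```python
import numpy as np

# Synthetic residuals following the Nuth & Kaab relation with known parameters:
# y = a*cos(b - aspect) + c, where a is the shift magnitude and b its direction.
rng = np.random.default_rng(0)
aspect = rng.uniform(0, 2 * np.pi, 2000)
a_true, b_true, c_true = 3.0, 0.8, 0.1
y = a_true * np.cos(b_true - aspect) + c_true + rng.normal(0, 0.05, aspect.size)

# Linearize: a*cos(b - aspect) + c == A*cos(aspect) + B*sin(aspect) + c,
# with A = a*cos(b) and B = a*sin(b), so ordinary least squares applies.
G = np.column_stack([np.cos(aspect), np.sin(aspect), np.ones_like(aspect)])
(A, B, c_fit), *_ = np.linalg.lstsq(G, y, rcond=None)
a_fit = float(np.hypot(A, B))    # recovered shift magnitude
b_fit = float(np.arctan2(B, A))  # recovered shift direction
```

This is the closed-form fit that NuthKaab iterates, in contrast to GradientDescending, which directly searches the shift that minimizes the NMAD of dh.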

@liuh886 commented Mar 28, 2023


Thanks, your explanation is very nice. And 'Gradient Descent' is the correct way to call it!

This PR aims to support 'usable' point co-registration. We will also update the pts version of NuthKaab (there are several improvements to make it run faster). And sure, we need to rethink how to organize the Coreg class...

I have used 'gradient descent coregistration (GDC)' heavily in recent months, and have done coreg of DTM1 and ICESat-2 on a national scale. So far, my main concern is the limitations of GDC:

It is fast, but needs as many points as possible (e.g. 5,000) to overcome the noise, because (1) NMAD is still a 'statistical' metric; (2) a half-pixel shift for a 10 m DEM is 5 meters, but just 0.5 m for a 1 m DEM, where the gradient is not steep on the scale of the shift; (3) GDC does not give a fixed result, because there are several routes to reach 'the same minimum'; in most cases, the differences between results are negligible.

There are several ways to overcome the noise: increasing the number of sample points, and assigning weights to 'high-quality measurements'.

I have a post about that.

@rhugonnet commented Mar 28, 2023

Thanks for the details @liuh886 and the amazing work, can't wait to dive into it! 😊

And also thanks @erikmannerfelt for explaining the background behind all this, it's good to understand all of this to best move forward :).
To simplify version control moving forward, we could:

  1. Keep this PR pretty much as it is (same structure, keep GradientDescending naming, and all the rest), only focus on passing tests and solving minor comments,
  2. Merge the PR into xDEM and make a release 0.0.8 associated with it (which you'll be able to use for GlaMBIE),
  3. Later, start another PR to do the restructuring described above.

For 3/, we'll need to figure out the most appropriate time. I'll be working on the Coreg comparison study most of April-May, and we don't want to have to merge forks at that point. So maybe it's better to keep the current structure for now and only make 3/ all at once during the summer (June-July)?

The study will be a good occasion to test all methods on the same benchmark! I will follow up by email on this in the next weeks @liuh886 @erikmannerfelt! 😉

@erikmannerfelt

@liuh886, let me know if you want me to help with my PR to your branch. I see that it was reverted in 0175ab8. Difficult merge conflict, I presume?

@liuh886 commented Apr 17, 2023

@erikmannerfelt Yes, I do need help. It looks like the test environment does not run under the latest version of geoutils, v0.0.10.

In geoutils v0.0.10, 'area_or_point' has been replaced with 'shift_area_or_point' to solve the 'upper left corner or center' problem, which is the actual origin of the half-pixel issue.

I saw that your PR addresses 'area_or_point' again to fix the half-pixel issue. However, it does not run under geoutils v0.0.10 on my computer. So, can you double-check the test environment?

@erikmannerfelt erikmannerfelt added the enhancement Feature improvement or request label Apr 18, 2023
@erikmannerfelt

Hi @liuh886. @rhugonnet seems to have figured out the problem with the build. Try his hack from 8defc14; move richdem in the dev-environment.yml to make it use pip instead of conda.

It's not the first time that richdem has been an issue...
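The suggested fix might look like this in dev-environment.yml (a hypothetical fragment to illustrate the conda pip-section convention, not the actual file contents):

```yaml
name: xdem-dev
channels:
  - conda-forge
dependencies:
  # ... other conda packages, with richdem removed from this list ...
  - pip
  - pip:
      # richdem installed via pip instead of conda, per the suggested hack
      - richdem
```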

@rhugonnet commented May 30, 2023

Unfortunately, in the end this didn't work: with pip, the richdem wheels build on Unix but not on Windows (GDAL version problems). 😢

I had to do those two commits instead: ba3bc92 fb83de2. I will open an issue!

@rhugonnet

@liuh886 If you have nothing to add specific to your method, I can help merge this into the package and solve the conflicts! 😉

@liuh886 commented Jun 7, 2023


Yes, please. I don‘t have any new functions to add 🙂

@rhugonnet rhugonnet marked this pull request as ready for review June 13, 2023 19:08
@rhugonnet

Alright, updated with main and solved all linting/typing.

It was in a good state, but it still took a bit of effort to make Mypy (and especially its NumPy plugin) happy with the new code (see 933cb78)! 😅

Thanks again for the great contribution @liuh886! 🙂

Labels
enhancement Feature improvement or request