Incremental commit
rhugonnet committed May 22, 2024
1 parent fe1c9f0 commit 461d61e
Showing 2 changed files with 57 additions and 25 deletions.
25 changes: 20 additions & 5 deletions doc/source/coregistration.md
@@ -96,8 +96,10 @@ function are included in coregistration methods, which include:
- rotations, reflections,
- scalings.

## Using a coregistration

(coreg_object)=
## The {class}`~xdem.coreg.Coreg` object
### The {class}`~xdem.coreg.Coreg` object

Each coregistration method implemented in xDEM inherits its interface from the {class}`~xdem.coreg.Coreg` class<sup>1</sup>, and has the following methods:
- {func}`~xdem.coreg.Coreg.fit` for estimating the transform.
@@ -109,7 +111,7 @@ Each coregistration method implemented in xDEM inherits their interface from the
<sup>1</sup>In a style inspired by [scikit-learn's pipelines](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LinearRegression.html#sklearn-linear-model-linearregression).
```

First, {func}`~xdem.coreg.Coreg.fit` is called to estimate the transform, and then this transform can be used or exported using the subsequent methods.
{func}`~xdem.coreg.Coreg.fit` is called to estimate the transform, and then this transform can be used or exported using the subsequent methods.
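
The fit-then-apply pattern can be sketched with a minimal stand-in class. This is a toy vertical-shift estimator for illustration only, not xDEM's implementation (real usage goes through the {class}`~xdem.coreg.Coreg` subclasses); the class and variable names are made up:

```python
from statistics import median

class ToyVerticalShift:
    """Toy Coreg-style estimator: fit a transform, then apply or export it."""

    def __init__(self):
        self.shift = None

    def fit(self, ref, tba):
        # Estimate the vertical shift as the median elevation difference
        self.shift = median(r - t for r, t in zip(ref, tba))
        return self

    def apply(self, dem):
        # Apply the estimated transform to elevation values
        return [z + self.shift for z in dem]

    def to_matrix(self):
        # Export the transform as a 4x4 affine matrix (shift in the z-translation slot)
        m = [[float(i == j) for j in range(4)] for i in range(4)]
        m[2][3] = self.shift
        return m

ref = [10.0, 12.0, 11.0, 13.0]
tba = [8.0, 10.0, 9.0, 11.0]  # the reference minus a constant 2 m shift
coreg = ToyVerticalShift().fit(ref, tba)
aligned = coreg.apply(tba)  # [10.0, 12.0, 11.0, 13.0]
```

The `fit` step estimates and stores the transform; `apply` and `to_matrix` then reuse it, which is what allows fitting once and applying to several rasters.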

**Inheritance diagram of implemented coregistrations:**

@@ -120,6 +122,10 @@

See {ref}`biascorr` for more information on non-rigid transformations ("bias corrections").

### Accessing coregistration metadata



## Coregistration methods

```{important}
@@ -350,9 +356,12 @@ ax[1].set_title("After ICP")
_ = ax[1].set_yticklabels([])
```

## The {class}`~xdem.coreg.CoregPipeline` object
## Building coregistration pipelines

### The {class}`~xdem.coreg.CoregPipeline` object

Often, more than one coregistration approach is necessary to obtain the best results. For example, ICP works poorly with large initial vertical shifts, so a {class}`~xdem.coreg.CoregPipeline` can be constructed to perform both sequentially:
Often, more than one coregistration approach is necessary to obtain the best results, and several need to be combined
sequentially. A {class}`~xdem.coreg.CoregPipeline` can be constructed for this:

```{code-cell} ipython3
# We can list sequential coregistration methods to apply
@@ -363,7 +372,7 @@ pipeline = xdem.coreg.ICP() + xdem.coreg.NuthKaab()
```

The {class}`~xdem.coreg.CoregPipeline` object exposes the same interface as the {class}`~xdem.coreg.Coreg` object.
The results of a pipeline can be used in other programs by exporting the combined transformation matrix using {func}`~xdem.coreg.CoregPipeline.to_matrix`.
The results of a pipeline can be used in other programs by exporting the combined transformation matrix using {func}`~xdem.coreg.Coreg.to_matrix`.
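
Combining sequential affine steps into one exportable matrix is plain linear algebra: the combined matrix is the product of the individual matrices in application order. A generic sketch with hypothetical translation matrices (not xDEM internals):

```python
def matmul4(a, b):
    """Product of two 4x4 matrices (row of a times column of b)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(dx, dy, dz):
    """4x4 affine matrix holding a pure translation."""
    m = [[float(i == j) for j in range(4)] for i in range(4)]
    m[0][3], m[1][3], m[2][3] = dx, dy, dz
    return m

step1 = translation(0.0, 0.0, 2.0)  # e.g., a 2 m vertical shift
step2 = translation(5.0, 0.0, 0.0)  # e.g., a 5 m easting shift
combined = matmul4(step2, step1)    # the later step composes on the left
```

The combined matrix carries both translations at once, so a downstream program only needs to apply one transform.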

```{margin}
<sup>2</sup>Here again, this class is heavily inspired by SciKit-Learn's [Pipeline](https://scikit-learn.org/stable/modules/generated/sklearn.pipeline.Pipeline.html#sklearn-pipeline-pipeline) and [make_pipeline()](https://scikit-learn.org/stable/modules/generated/sklearn.pipeline.make_pipeline.html#sklearn.pipeline.make_pipeline) functionalities.
@@ -411,3 +420,9 @@ Additionally, ICP tends to fail with large initial vertical differences, so a pr
```{code-cell} ipython3
pipeline = xdem.coreg.VerticalShift() + xdem.coreg.ICP() + xdem.coreg.NuthKaab()
```
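
The `+` syntax can be mimicked with operator overloading. A sketch of how sequential steps might be chained into a pipeline exposing the same interface as a single step (illustrative only; the class names are made up and this is not xDEM's code):

```python
class ToyStep:
    """A single toy coregistration step wrapping an apply function."""
    def __init__(self, name, func):
        self.name, self.func = name, func

    def __add__(self, other):
        # Adding two steps promotes them into a pipeline
        return ToyPipeline([self]) + other

class ToyPipeline:
    """Sequential container exposing the same apply() interface as a step."""
    def __init__(self, steps):
        self.steps = list(steps)

    def __add__(self, other):
        extra = other.steps if isinstance(other, ToyPipeline) else [other]
        return ToyPipeline(self.steps + extra)

    def apply(self, dem):
        # Run each step on the output of the previous one
        for step in self.steps:
            dem = step.func(dem)
        return dem

shift = ToyStep("VerticalShift", lambda dem: [z + 2.0 for z in dem])
scale = ToyStep("Scaling", lambda dem: [z * 1.1 for z in dem])
pipeline = shift + scale  # steps run left to right
result = pipeline.apply([10.0, 20.0])
```

Because `__add__` always returns a pipeline, chains of any length (`a + b + c`) fold into a single flat sequence of steps.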

## Dividing a coregistration between blocks

### The {class}`~xdem.coreg.BlockwiseCoreg` object


57 changes: 37 additions & 20 deletions doc/source/uncertainty.md
@@ -23,18 +23,18 @@ pyplot.rcParams['savefig.dpi'] = 600

# Uncertainty analysis

xDEM integrates spatial uncertainty analysis tools from the recent literature that **rely on joint methods from two
xDEM integrates uncertainty analysis tools from the recent literature that **rely on joint methods from two
scientific fields: spatial statistics and uncertainty quantification**.

While uncertainty analysis technically refers to both systematic and random errors, systematic errors of elevation data
are corrected using {ref}`coregistration` and {ref}`biascorr`, so we here refer to **uncertainty analysis for quantifying and
propagating random errors**.

In detail, we provide tools to:
In detail, xDEM provides tools to:

1. Account for elevation **heteroscedasticity** (e.g., varying precision such as with terrain slope or stereo-correlation),
2. Quantify the **spatial correlation of random errors** (e.g., from native spatial resolution or instrument noise),
3. Perform an **error propagation to elevation derivatives** (e.g., spatial average, or more complex derivatives such as slope and aspect).
1. Estimate and model elevation **heteroscedasticity, i.e. variable random errors** (e.g., varying with terrain slope or stereo-correlation),
2. Estimate and model the **spatial correlation of random errors** (e.g., from native spatial resolution or instrument noise),
3. Perform **error propagation to elevation derivatives** (e.g., spatial average, or more complex derivatives such as slope and aspect).
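
As a toy illustration of point 1 (not xDEM's estimator, which uses robust metrics and N-D binning on real rasters), the magnitude of random errors can be summarized per slope bin from made-up samples:

```python
# Toy (slope in degrees, elevation difference in metres) samples: errors grow with slope
samples = [(5, 0.5), (10, -0.9), (12, 1.1), (25, -2.4), (30, 3.1), (35, -2.9)]

def binned_error(samples, edges):
    """Median absolute elevation difference per slope bin."""
    out = {}
    for lo, hi in zip(edges[:-1], edges[1:]):
        vals = sorted(abs(dh) for slope, dh in samples if lo <= slope < hi)
        if vals:
            out[(lo, hi)] = vals[len(vals) // 2]
    return out

errors = binned_error(samples, [0, 15, 45])  # {(0, 15): 0.9, (15, 45): 2.9}
```

A larger error on steep terrain than on flat terrain is exactly the heteroscedastic behaviour that a constant-error model would miss.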

:::{admonition} More reading
:class: tip
@@ -85,26 +85,38 @@ print("Random elevation errors at a distance of 1 km are correlated at {:.2f} %.

## Summary of available methods

Our methods for modelling the structure of error in DEMs and propagating errors to spatial derivatives analytically
are primarily based on [Rolstad et al. (2009)]() and [Hugonnet et al. (2022)]().
Methods for modelling the structure of error are based on [spatial statistics](https://en.wikipedia.org/wiki/Spatial_statistics), and methods for
propagating errors to spatial derivatives analytically rely on [uncertainty propagation](https://en.wikipedia.org/wiki/Propagation_of_uncertainty).

These frameworks are generic and thus encompass that of most other studies on the topic (e.g., Anderson et al. (2020),
others), referred to as "traditional" below. This is because accounting for possible multiple correlation ranges also
works for the case of single correlation range, or accounting for potential heteroscedasticity also works on
homoscedastic elevation data.
To improve the robustness of the uncertainty analysis, we provide refined frameworks for application to elevation data based on
[Rolstad et al. (2009)](http://dx.doi.org/10.3189/002214309789470950) and [Hugonnet et al. (2022)](http://dx.doi.org/10.1109/JSTARS.2022.3188922),
both for modelling the structure of error and for efficiently performing error propagation.
**These frameworks are generic, simply extending an aspect of the uncertainty analysis to better work on elevation data**,
and thus generally encompass methods described in other studies on the topic (e.g., [Anderson et al. (2019)](http://dx.doi.org/10.1002/esp.4551)).

The tables below summarize the characteristics of these three category of methods.
The tables below summarize the characteristics of these methods.

### Estimating and modelling the structure of error

Traditionally, in spatial statistics, a single correlation range is considered ("traditional" method below).
However, elevation data often contains errors with correlation ranges spanning different orders of magnitude.
For this, [Rolstad et al. (2009)](http://dx.doi.org/10.3189/002214309789470950) and
[Hugonnet et al. (2022)](http://dx.doi.org/10.1109/JSTARS.2022.3188922) consider
potential multiple ranges of spatial correlation (instead of a single one). In addition, [Hugonnet et al. (2022)](http://dx.doi.org/10.1109/JSTARS.2022.3188922)
considers potential heteroscedasticity or variable errors (instead of homoscedasticity, or constant errors), also common in elevation data.

Because accounting for possible multiple correlation ranges also works if you have a single correlation range in your data,
and accounting for potential heteroscedasticity also works on homoscedastic data, **there is nothing to lose by using
a more advanced framework!**
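
A multi-range model of spatial correlation can be sketched as a sum of exponential variogram components. The sills and ranges below are hypothetical values for illustration; xDEM fits these quantities from data:

```python
import math

def correlation(lag, partial_sills, ranges):
    """Correlation at a given lag for a sum of exponential variogram models."""
    total_sill = sum(partial_sills)
    covariance = sum(s * math.exp(-3.0 * lag / r)  # ~95% decorrelated at the range
                     for s, r in zip(partial_sills, ranges))
    return covariance / total_sill

# A short-range component (100 m, e.g. resolution-related noise)
# plus a long-range one (10 km, e.g. instrument noise)
rho = correlation(1000.0, partial_sills=[4.0, 1.0], ranges=[100.0, 10_000.0])
```

With a single 100 m range the correlation at a 1 km lag would be essentially zero; the long-range component is what keeps errors correlated at that distance, which matters greatly when averaging over large areas.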

```{list-table}
:widths: 1 1 1 1 1
:header-rows: 1
:stub-columns: 1
:align: center
* - Method
- Heteroscedasticity
- Heteroscedasticity (i.e. variable error)
- Correlations (single-range)
- Correlations (multi-range)
- Outlier-robust
@@ -127,6 +139,11 @@ The tables below summarize the characteristics of these three category of method

### Propagating errors to spatial derivatives

Exact uncertainty propagation scales quadratically with sample size (computing every pairwise combination, for potentially millions of elevation data points).
To remedy this, [Rolstad et al. (2009)](http://dx.doi.org/10.3189/002214309789470950) and [Hugonnet et al. (2022)](http://dx.doi.org/10.1109/JSTARS.2022.3188922)
both provide approximations of exact uncertainty propagation for spatial derivatives, avoiding long
computing times. **These approximations are valid in different contexts**, described below.

```{list-table}
:widths: 1 1 1 1
:header-rows: 1
@@ -136,19 +153,19 @@ The tables below summarize the characteristics of these three category of method
* - Method
- Accuracy
- Computing time
- Remarks
- Validity
* - Exact discretized
- Exact
- Slow on large samples
- Complexity scales exponentially
- Slow on large samples (quadratic complexity)
- Always
* - R2009
- Conservative
- Instantaneous
- Only valid for near-circular contiguous areas
- Instantaneous (numerical integration)
- Only for near-circular contiguous areas
* - H2022 (default)
- Accurate
- Fast
- Complexity scales linearly
- Fast (linear complexity)
- As long as variance is nearly stationary
```
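
The speed-up of the approximate methods comes from replacing the pairwise sum with a closed form. A generic sketch using a number of effective samples — an illustrative simplification in the spirit of these approximations, not either paper's exact formula:

```python
import math

def stderr_spatial_mean(sigma, area, corr_area):
    """Standard error of a spatial mean via an effective number of samples.

    sigma: random error of a single pixel (m); area: averaging area;
    corr_area: area over which errors are fully correlated (same units as area).
    """
    n_eff = max(area / corr_area, 1.0)  # effective number of independent samples
    return sigma / math.sqrt(n_eff)

# A 2 m pixel error averaged over 100 km2, with a correlation area of ~0.8 km2
err = stderr_spatial_mean(sigma=2.0, area=100.0, corr_area=math.pi * 0.5**2)
```

Dividing by the effective rather than the total number of pixels is what makes the result conservative: correlated pixels carry less independent information than their raw count suggests.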

(spatialstats-heterosc)=
