Update documentation and warnings before 0.1 release
#502
base: main
Conversation
Wow, such an improvement to the documentation!! Super great job @rhugonnet!
I have some questions, generally pertaining to the non-documentation part of the PR (perhaps in the future changes like this could be a separate PR? :) )
> ### What are accuracy and precision?
>
> [Accuracy and precision](https://en.wikipedia.org/wiki/Accuracy_and_precision) describe random and systematic errors, respectively.
Flip "random and systematic", as accuracy is systematic and precision is random.
> ### Translating these concepts for elevation data
>
> However, elevation data rarely consists of a single independent measurement but of a **series of measurement** (image grid,
"series of measurements"
> Accuracy is generally considered from two focus points:
>
> - **Absolute elevation accuracy** describes systematic errors to the true positioning, usually important when analysis focuses on the exact location of topographic features at a specific epoch.
> - **Relative elevation accuracy** describes systematic errors with reference to other elevation data that does not necessarily matches the true positioning, important for analyses interested in topographic change over time.
"does not necessarily match the true positioning"
```diff
@@ -25,8 +25,8 @@
 # -- Project information -----------------------------------------------------

 project = "xDEM"
-copyright = "2021, Erik Mannerfelt, Romain Hugonnet, Amaury Dehecq and others"
-author = "Erik Mannerfelt, Romain Hugonnet, Amaury Dehecq and others"
+copyright = "2020, GlacioHack"
```
I see some packages leaving the original year while others keep it updated. Any ideas on implications of this?
> (ecosystem)=
>
> # Ecosystem
Maybe not here, but somewhere; should we refer to demcompare as that's still used quite much? Sorry if it already says somewhere.
> Elevation data benefits from an uncommon asset, which is that **large proportions of planetary surface elevations usually remain virtually unchanged through time**. Those static surfaces, sometimes also referred to as "stable terrain", generally refer to bare-rock, grasslands, and are often isolated by excluding dynamic surfaces such as glaciers,
..., rivers, human settlements, lakes, ....
```diff
@@ -460,7 +460,7 @@ def __init__(
         | Literal["norder_polynomial"]
         | Literal["nfreq_sumsin"] = "norder_polynomial",
         fit_optimizer: Callable[..., tuple[NDArrayf, Any]] = scipy.optimize.curve_fit,
-        bin_sizes: int | dict[str, int | Iterable[float]] = 100,
+        bin_sizes: int | dict[str, int | Iterable[float]] = 1000,
```
Does this link to an issue? I'm sure it's a good update but it's a bit off topic!
```diff
@@ -2627,7 +2701,7 @@ def copy(self: CoregType) -> CoregType:
         """Return an identical copy of the class."""
         new_coreg = self.__new__(type(self))

-        new_coreg.__dict__ = {key: copy.deepcopy(value) for key, value in self.__dict__.items() if key != "pipeline"}
+        new_coreg.__dict__ = {key: copy.copy(value) for key, value in self.__dict__.items() if key != "pipeline"}
```
What does this change do?
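For context on the question, a minimal sketch of what switching from `copy.deepcopy` to `copy.copy` changes in general (hypothetical `Coreg` class for illustration, not xDEM code): a shallow copy duplicates only the outermost container, so nested mutable state stays shared with the original.

```python
import copy


class Coreg:
    """Hypothetical class with nested mutable state, for illustration only."""

    def __init__(self) -> None:
        self.meta = {"offsets": [1.0, 2.0]}


a = Coreg()

# copy.deepcopy duplicates nested objects recursively:
deep = copy.deepcopy(a.meta)
deep["offsets"].append(3.0)
assert a.meta["offsets"] == [1.0, 2.0]  # original unaffected

# copy.copy duplicates only the outermost dict; the inner list
# is still the same object as in the original:
shallow = copy.copy(a.meta)
shallow["offsets"].append(3.0)
assert a.meta["offsets"] == [1.0, 2.0, 3.0]  # original mutated too
```

So the PR's change would make copying cheaper, at the cost of copies sharing any nested mutable attributes with the source object.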
```diff
@@ -848,6 +788,34 @@ def to_matrix(self) -> NDArrayf:
         """Convert the transform to a 4x4 transformation matrix."""
         return self._to_matrix_func()

+    def to_translations(self) -> tuple[float, float, float]:
+        """
+        Convert the affine transformation matrix to only its X/Y/Z translations.
```
I think something funky will happen here if scaling is involved (I don't remember which order the components are applied in). I feel like a warning should be added to this and to `to_rotations()`, saying something like "Warning: Information may be lost".
And the conversion is less of a conversion and more of an extraction, I would say! Since there's no validation that scaling won't mess with the result.
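To illustrate the concern, a small sketch (assumed composition order, plain NumPy, not xDEM code): naively reading the X/Y/Z translations off the last column of a 4x4 affine matrix gives different answers depending on where a scaling sits in the composition.

```python
import numpy as np

# A uniform scaling by 2 and a translation of (10, 20, 30), as 4x4 affines:
S = np.diag([2.0, 2.0, 2.0, 1.0])
T = np.eye(4)
T[:3, 3] = [10.0, 20.0, 30.0]

M1 = T @ S  # scale first, then translate
M2 = S @ T  # translate first, then scale

# "Extracting" the translations as the last column is order-dependent:
print(M1[:3, 3])  # [10. 20. 30.]
print(M2[:3, 3])  # [20. 40. 60.] -- the translations absorbed the scaling
```

Hence the suggestion that the docstring warn about information loss rather than call it a conversion.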
```diff
@@ -960,8 +960,7 @@ def test_blockwise_coreg_large_gaps(self) -> None:
         ddem_post = (aligned - self.ref).data.compressed()
         ddem_pre = (tba - self.ref).data.compressed()
         assert abs(np.nanmedian(ddem_pre)) > abs(np.nanmedian(ddem_post))
+        # TODO: Figure out why STD here is larger since PR #530
+        # assert np.nanstd(ddem_pre) > np.nanstd(ddem_post)
```
Why is an identical assertion commented out here?
Ongoing TO-DO:
- `BlockwiseCoreg` to coregistration page,
- `.meta` keys for users (and make the attribute public in a separate PR),
- `.info()` method for `CoregPipeline` and `BlockwiseCoreg`,
- 0.1,

Resolves #477 (last step)
Resolves #505
Resolves #464
Resolves #434
Resolves #285
Resolves #275
Resolves #431
Resolves #532
Resolves #528
Resolves #562
Resolves #577
Resolves #583