Poincare Manifold methods #78
Changes from all commits: 0095f36, 289c705, 7b09e05, 3454302, 78e224d, 62d5bd4, a25b81d, 5add5a1, 142dfe6
```diff
@@ -1273,6 +1273,33 @@ def _parallel_transport0(y, v, c, dim: int = -1):
     return v * (1 - c * y.pow(2).sum(dim=dim, keepdim=True)).clamp_min(MIN_NORM)


+def parallel_transport0back(x, v, *, c=1.0, dim: int = -1):
+    r"""
+    Special case of parallel transport, with the last point at zero, that
+    can be computed more efficiently and in a numerically stable way.
+
+    Parameters
+    ----------
+    x : tensor
+        target point
+    v : tensor
+        vector to be transported
+    c : float|tensor
+        ball negative curvature
+    dim : int
+        reduction dimension for operations
+
+    Returns
+    -------
+    tensor
+    """
+    return _parallel_transport0back(x, v, c=c, dim=dim)
+
+
+def _parallel_transport0back(x, v, c, dim: int = -1):
+    return v / (1 - c * x.pow(2).sum(dim=dim, keepdim=True)).clamp_min(MIN_NORM)
+
+
 def egrad2rgrad(x, grad, *, c=1.0, dim=-1):
     r"""
     Translate Euclidean gradient to Riemannian gradient on tangent space of :math:`x`
```

Review comments (on `def parallel_transport0back`):

- Perhaps
- Parallel transport is a special case of vector transport that is an exact one. That's the purpose of having
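For intuition, here is a minimal NumPy sketch of the two transport maps (the NumPy port and the `MIN_NORM` value are assumptions here; the PR itself operates on PyTorch tensors), showing that `_parallel_transport0back` is the inverse of `_parallel_transport0`:

```python
import numpy as np

MIN_NORM = 1e-15  # assumed clamp floor, standing in for the module-level constant


def _parallel_transport0(y, v, c, dim=-1):
    # Transport v from the origin to y: scale by (1 - c * ||y||^2).
    factor = np.maximum(1 - c * np.sum(y**2, axis=dim, keepdims=True), MIN_NORM)
    return v * factor


def _parallel_transport0back(x, v, c, dim=-1):
    # Transport v from x back to the origin: divide by the same factor.
    factor = np.maximum(1 - c * np.sum(x**2, axis=dim, keepdims=True), MIN_NORM)
    return v / factor


x = np.array([0.3, -0.2, 0.1])   # a point inside the unit ball
v = np.array([0.5, 0.4, -0.7])   # a tangent vector at the origin
round_trip = _parallel_transport0back(x, _parallel_transport0(x, v, c=1.0), c=1.0)
print(np.allclose(round_trip, v))  # True: the two maps are mutually inverse
```

Since both maps are pure scalings by the same conformal factor, composing them recovers the original vector exactly (up to the clamp).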
Review discussion (on `egrad2rgrad` and the `v, *more` signature):

- I have concerns about whether this method should work with non-matching `v, *more`. The easy way is to do this in a loop and not care.
- Or maybe check and raise a warning? (Still need to loop, though.)
- What is the point of `*more` anyway, and where will we use it? I have completely missed that part.
- A flag in the parameters may save nerves and add more clarity about what's going on.
- Like `stack=True`.
- The point is to be able to transport multiple vectors in one pass, as this might be computationally much cheaper. E.g., on Stiefel manifolds this allows performing the operation with a single LU decomposition.
- This is how the flag is supposed to be implemented
- However, the tuple approach is not that bad.
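To make the `*more` discussion concrete, here is a hedged sketch (all names are hypothetical, not the PR's actual API) of transporting several vectors in one pass, so the shared denominator is computed only once, with the tuple-return convention the thread mentions:

```python
import numpy as np

MIN_NORM = 1e-15  # assumed clamp floor


def transp0back_many(x, v, *more, c=1.0, dim=-1):
    """Hypothetical helper: transport v (and any extra vectors) from x back
    to the origin, reusing one denominator computation for all of them."""
    denom = np.maximum(1 - c * np.sum(x**2, axis=dim, keepdims=True), MIN_NORM)
    results = tuple(u / denom for u in (v,) + more)
    # Mirror the flag-vs-tuple debate from the thread: return a bare array
    # for a single input, a tuple when extra vectors were passed.
    return results[0] if not more else results


x = np.array([0.3, -0.2, 0.1])
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 2.0, 0.0])
single = transp0back_many(x, v1)
both = transp0back_many(x, v1, v2)
print(isinstance(both, tuple), np.allclose(both[0], single))  # True True
```

On the Poincaré ball the per-vector saving is trivial, but the same shape of API pays off on manifolds such as Stiefel, where the shared work (e.g. one LU decomposition) dominates.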