
Make LieTensor a subclass of torch.Tensor and override function/operators #452

Merged
luisenp merged 17 commits into main on Mar 8, 2023

Conversation

@luisenp luisenp (Contributor) commented Jan 28, 2023

This PR is mostly meant to illustrate some ideas on how to safely overload operators in the LieTensor class. It'd be better to start by looking at the example script, and if that looks OK, you can take a quick look at what's happening under the hood. This is not meant to be a final implementation.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Jan 28, 2023
@luisenp luisenp changed the base branch from main to lep.add_lie_tensor January 28, 2023 17:30
@mhmukadam mhmukadam (Member) left a comment

The example script looks good to me!

@luisenp luisenp force-pushed the lep.hacky_lie_example branch 2 times, most recently from 3df7e53 to 52b05d9 Compare January 31, 2023 21:30
@luisenp luisenp marked this pull request as draft January 31, 2023 21:37
@luisenp luisenp (Contributor, Author) commented Feb 1, 2023

One other detail to consider for discussion: I did some quick tests with torch optimizers, and could only get the subclass-based version to work by doing the following, which requires an extra step. We also need some tests to make sure backprop is doing the right thing. Curious to hear thoughts from @albanD.

import torch

import theseus.labs.lie as lie
import theseus.labs.lie.functional as lieF

# Leaf tensor needs to be a regular tensor, so we need to explicitly pass the tensor data
g1_data = lieF.se3.rand(1, requires_grad=True)
g1 = lie.cast(g1_data, ltype=lie.SE3)
g2 = lie.rand(1, lie.SE3)
opt = torch.optim.Adam([g1_data])

opt.zero_grad()
d = g1.inv().compose(g2).log()
loss = torch.sum(d**2)
loss.backward()
opt.step()
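The "extra step" above boils down to a generic autograd fact that can be illustrated with plain torch (no theseus): the parameters handed to an optimizer must be leaf tensors, and anything derived from a leaf (like the cast LieTensor) is non-leaf, so gradients accumulate on the original data tensor. A minimal stand-in, with made-up variable names:

```python
import torch

# Plain-torch illustration of the leaf-tensor requirement discussed above.
data = torch.randn(1, 6, requires_grad=True)  # plays the role of g1_data
derived = data * 1.0                          # plays the role of the cast g1

opt = torch.optim.Adam([data], lr=0.1)
opt.zero_grad()
loss = (derived ** 2).sum()
loss.backward()   # gradient lands on `data`, the leaf, not on `derived`
opt.step()        # updates `data` in place
```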

@luisenp luisenp (Contributor, Author) commented Feb 1, 2023

cc @rmurai0610

@luisenp luisenp mentioned this pull request Feb 1, 2023
@exhaustin exhaustin self-requested a review February 10, 2023 19:45
@fantaosha fantaosha (Contributor) left a comment

Some comments. Not sure if we should override __add__() and use LieAsEuclidean.


# However, for safety, one cannot use the overridden + with torch tensors
try:
y = g + x
Contributor:

Honestly, I'm a little nervous about overriding __add__(), since it is dangerous. Also, one cannot use *, -, or / either in that case.

Contributor:

Do we have an operator like to_euclidean() that makes a PyTorch tensor copy (not a reference) of the current data?
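One possible reading of that request, sketched with plain torch (to_euclidean is the name proposed in the thread; the implementation below is only an illustration, not the actual theseus API): return a detached copy cast back to plain torch.Tensor, so mutating the result cannot touch the group element's data.

```python
import torch

# Illustrative sketch of a to_euclidean() that returns a plain-tensor *copy*
# (not a reference), as asked above.
def to_euclidean(t: torch.Tensor) -> torch.Tensor:
    return t.detach().clone().as_subclass(torch.Tensor)

g = torch.randn(3, 4)
e = to_euclidean(g)
e.add_(1.0)  # in-place edit of the copy leaves g untouched
```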

Contributor:

In summary, I would prefer explicit function names rather than overrides, to avoid confusion. I encountered lots of problems generalizing first-order optimizers to Lie groups.

Contributor:

To be consistent, if g + x = g.compose(x), then g - x should be g.compose(x.inverse()).
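As a sanity check of that consistency argument, here is a plain-torch sketch using 2-D rotation matrices as stand-ins for Lie group elements, where compose is the matrix product and inverse is the transpose; under the proposed convention, g - g comes out as the identity:

```python
import torch

# Consistency check for the proposed convention: if `g + x` means
# g.compose(x), then `g - x` = g.compose(x.inverse()) makes g - g = identity.
def compose(g, x):
    return g @ x

def inverse(g):
    return g.transpose(-2, -1)  # the inverse of a rotation matrix is its transpose

theta = torch.tensor(0.3)
c, s = torch.cos(theta), torch.sin(theta)
g = torch.stack([torch.stack([c, -s]), torch.stack([s, c])])

g_minus_g = compose(g, inverse(g))  # the proposed meaning of g - g
```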

print(e)

# But inside a lie.as_euclidean() context, everything is valid
with lie.as_euclidean():
Contributor:

I'm concerned about this. If the user wants to use a Lie group element as a Euclidean tensor, maybe it is safer to do something like g.to_euclidean().matmul(torch.rand(4, 7)). Maybe to_euclidean() should be a property, so that users cannot modify the data.

Contributor (Author):

Is the concern that users could overwrite the internal data (e.g., something like g.add_(something))? This seems like a reasonable concern, because I believe subclassing allows this. I'm OK with replacing this with to_euclidean().

Base automatically changed from lep.add_lie_tensor to main February 21, 2023 20:01
f"tensor of type {tensor.ltype}"
)
super().set_(tensor) # type: ignore

@luisenp luisenp (Contributor, Author) commented Feb 24, 2023

A less esoteric but perhaps less safe alternative for add_ could be to check the tensor shape: if it looks like (..., 3, 4) (for SE3), interpret it as a Euclidean gradient; if it has shape (..., 6) (for SE3), interpret it as a Riemannian gradient; and everything else throws an error.
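The shape heuristic described above could look roughly like this for the SE3 case (the function name and string tags are made-up labels for this sketch, not part of any real API):

```python
import torch

# Rough sketch of the shape-based dispatch proposed above, SE3 only.
def classify_se3_increment(t: torch.Tensor) -> str:
    if t.ndim >= 2 and t.shape[-2:] == (3, 4):
        return "euclidean"    # looks like a gradient w.r.t. the (..., 3, 4) data
    if t.ndim >= 1 and t.shape[-1] == 6:
        return "riemannian"   # looks like a (..., 6) tangent-space increment
    raise ValueError(f"unexpected shape {tuple(t.shape)} for an SE3 update")
```

The "less safe" caveat is visible here: the dispatch is purely by trailing dimensions, so a tensor that happens to end in 6 is silently treated as a tangent vector.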

@luisenp luisenp marked this pull request as ready for review February 27, 2023 14:53
@luisenp luisenp requested review from mhmukadam, albanD and fantaosha and removed request for albanD February 27, 2023 14:53
@luisenp luisenp changed the title Quick example of operator overloading in Lie class API Make LieTensor a subclass of torch.Tensor and override function/operators Mar 1, 2023
@mhmukadam mhmukadam (Member) left a comment
Looks great! Let's merge once the open comments are resolved.

@mhmukadam mhmukadam (Member) left a comment

LGTM!

@luisenp luisenp merged commit 16eb5b2 into main Mar 8, 2023
@luisenp luisenp deleted the lep.hacky_lie_example branch March 8, 2023 22:11