
Fix neighbor backward #179

Merged · 18 commits merged into torchmd:main on Jun 14, 2023
Conversation

RaulPPelaez
Collaborator

While trying to integrate OptimizedDistance into the equivariant transformer (ET), I realized my module did not support double backwards (DB), which ET requires when training with forces.

This PR adds double-backwards support to OptimizedDistance, along with tests for it using torch.autograd.gradgradcheck.
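For reference, this is roughly what a gradgradcheck-based test looks like. The `pairwise_dist` function below is a hand-rolled illustration, not the actual OptimizedDistance module; gradcheck/gradgradcheck compare analytic derivatives against finite differences, so double-precision inputs are needed for tight tolerances.

```python
import torch

def pairwise_dist(a, b):
    # All-pairs Euclidean distances, built from differentiable primitives
    # so autograd can compute first and second derivatives.
    return (a.unsqueeze(1) - b.unsqueeze(0)).pow(2).sum(-1).sqrt()

torch.manual_seed(0)
a = torch.randn(4, 3, dtype=torch.double, requires_grad=True)
b = torch.randn(5, 3, dtype=torch.double, requires_grad=True)

# gradcheck verifies first derivatives, gradgradcheck second derivatives.
assert torch.autograd.gradcheck(pairwise_dist, (a, b))
assert torch.autograd.gradgradcheck(pairwise_dist, (a, b))
```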

After some research I learned how to support DB in a custom extension. First I rewrote the backwards pass in pure PyTorch to take advantage of autograd for the second gradient. Then I discovered that Python custom torch.autograd.Function subclasses are not compatible with jit.script, but they are if you write them in C++. So I translated the pure-PyTorch backwards to C++. Quite the journey -.-
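The key idea here can be sketched in Python: if a custom Function saves its inputs and writes its backward using only differentiable torch ops, autograd can record a graph of the backward itself and differentiate it again. `Dist` below is a toy stand-in under that assumption, not the actual OptimizedDistance code (which implements this pattern in C++ to stay jit.script-compatible).

```python
import torch

class Dist(torch.autograd.Function):
    """Per-row Euclidean distance with a hand-written backward that
    supports double backwards."""

    @staticmethod
    def forward(ctx, a, b):
        # Save the *inputs* (not intermediates computed inside forward,
        # which carry no autograd history) so the second-order graph
        # stays connected to a and b.
        ctx.save_for_backward(a, b)
        return (a - b).norm(dim=-1)

    @staticmethod
    def backward(ctx, grad_out):
        a, b = ctx.saved_tensors
        diff = a - b
        dist = diff.norm(dim=-1, keepdim=True)
        # d|a-b|/da = (a-b)/|a-b|, expressed with plain tensor ops so
        # autograd can differentiate this backward a second time.
        g = grad_out.unsqueeze(-1) * diff / dist
        return g, -g

torch.manual_seed(0)
a = torch.randn(4, 3, dtype=torch.double, requires_grad=True)
b = torch.randn(4, 3, dtype=torch.double, requires_grad=True)
assert torch.autograd.gradgradcheck(Dist.apply, (a, b))
```

If instead the backward were implemented with non-differentiable code (e.g. a hand-written CUDA kernel), calling `.backward(create_graph=True)` would silently produce wrong or missing second gradients, which is what made force training fail.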

@RaulPPelaez
Collaborator Author

This is ready for review

@RaulPPelaez RaulPPelaez merged commit 0ea0b5b into torchmd:main Jun 14, 2023
1 check passed