RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation #9
Comments
Hi,

Yes, I got the same error using only s2conv in the following code.
The problem comes from s2_rft when we use torch.einsum. It can be reproduced by the following code:

```python
import torch

x = torch.randn(3, 3, requires_grad=True)
z1 = torch.einsum("ij,jk->ik", (x, torch.randn(3, 3)))
z2 = torch.einsum("ij,jk->ik", (x, torch.randn(3, 3)))
z1.sum().backward()
```
I can fix it with
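The exact fix isn't quoted above. As a hedged sketch, assuming the error comes from the two einsum calls sharing a saved operand in their backward graphs (an issue seen in older PyTorch releases), cloning the shared tensor so each call gets an independent copy is one workaround:

```python
import torch

x = torch.randn(3, 3, requires_grad=True)

# Hypothetical workaround: pass a clone of the shared operand to each
# einsum call so their backward graphs do not share a saved tensor.
z1 = torch.einsum("ij,jk->ik", (x.clone(), torch.randn(3, 3)))
z2 = torch.einsum("ij,jk->ik", (x.clone(), torch.randn(3, 3)))

# Backward through z1 alone now succeeds; gradients still flow to x
# because clone() is differentiable.
z1.sum().backward()
```

Note that in recent PyTorch releases the original snippet runs without error, so the clone is only needed on the affected versions.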
Thank you so much!
Hi, I tried to use your s2conv/so3conv in a multi-model setup like the following.
(The model includes your s2conv/so3conv.)
Then I got the following error.
There is no error when I use a mono-model like the following.
So I think this error is not caused by an in-place operation.
Do you know the details of this error?
P.S.
I found that this error doesn't occur when I use a past version of your s2conv/so3conv.
(Maybe this is because it was for PyTorch v0.3.1.)
If you can, please republish the past version of s2cnn (for PyTorch v0.3.1).