`print` statement causes inplace error (#99968)

Comments
I'm not sure off the top of my head.
Smaller repro:

```python
import torch

a = torch.rand(1, requires_grad=True)
with torch.no_grad():
    b = a[:]
    b += 1

# Doing any of the following produces an error
b.sin()    # (1)
b.grad_fn  # (2)
print(b)   # (3)
```

(3) fails because printing calls into `t.grad_fn`. The easy fix for (3) is to special-case views in no-grad in the printing code, but maybe there is a more general fix. See also #11390.
Hi, we are running into a similar issue while implementing an updated version of Distributed Shampoo. When we print the list of parameter views, or apply an operation to them, we hit the same error. Any suggestions on how to proceed? Thanks in advance!

Interestingly, we found that logging the tensor does not trigger this issue, but printing does.

cc: @tsunghsienlee @shintaro-iwasaki @minddrummer @csmiler @mlazos @bdhirsh @yuchenhao
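One plausible explanation for the logging-vs-printing difference (an assumption on my part, not confirmed in this thread): `logging` defers %-style formatting until a record is actually emitted, so a log call filtered out by the logger's level never computes `repr(tensor)` and therefore never touches `grad_fn`. A minimal illustration with a stand-in object whose `repr` raises:

```python
import logging

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("shampoo")


class Boom:
    # Stand-in for a tensor whose repr fails (e.g. via a grad_fn access).
    def __repr__(self):
        raise RuntimeError("grad_fn access failed")


t = Boom()

# Filtered out at WARNING level: the "%s" is never formatted, so
# repr(t) is never invoked and no error is raised.
log.debug("params: %s", t)

print("log.debug did not format the tensor")
```

By contrast, `print(t)` formats eagerly and would raise immediately, which matches the behavior described above.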
Note that here we can easily fix printing (try/except around the access to `grad_fn` and print an invalid `grad_fn`). The other errors are expected: doing this is undefined behavior, so we would rather raise an error.
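The try/except idea above could be sketched in user code as a small helper (a hypothetical function, not a PyTorch API) that falls back to printing the raw data when accessing the tensor's autograd state raises:

```python
import torch


def safe_repr(t):
    """Hypothetical helper: repr a tensor without crashing when its
    grad_fn is in a bad state (e.g. a view modified in-place under
    no_grad)."""
    try:
        return repr(t)
    except RuntimeError:
        # Fall back to the raw data, without touching grad_fn.
        return f"tensor({t.detach().tolist()}, grad_fn=<invalid>)"


a = torch.rand(1, requires_grad=True)
with torch.no_grad():
    b = a[:]
    b += 1

print(safe_repr(b))  # returns a string whether or not repr(b) errors
```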
Actionable to fix
…d in-place in no-grad" Fixes #99968 [ghstack-poisoned]
🐛 Describe the bug
Reported in: https://discuss.pytorch.org/t/error-with-view-no-grad-and-inplace-modify/173082
but I couldn't find the created GitHub issue and the author didn't follow up.
Code to reproduce the issue:
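The snippet itself did not survive this capture; the following is a minimal reconstruction of the failing pattern described in the forum thread (my sketch, not the author's exact code): a view created and modified in-place inside a `no_grad` block, then printed.

```python
import torch

param = torch.rand(4, requires_grad=True)

with torch.no_grad():
    view = param[:2]  # view of a leaf tensor that requires grad
    view += 1.0       # in-place update; no graph is recorded under no_grad

try:
    print(view)  # on affected builds this raised a RuntimeError via grad_fn
except RuntimeError as err:
    print("printing failed:", err)
```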
Comment the `print` statement in and the code will fail. I would assume the inplace operation is allowed, as it's in a `no_grad` block and no computation graph was ever created.

Also, maybe related to: https://discuss.pytorch.org/t/old-problem-but-strange-things-trying-to-backward-through-the-graph-a-second-time/178369
but no executable code snippet was posted yet.
Versions
Reproduced in a nightly build:
`2.1.0.dev20230407+cu118`

CC @albanD as we talked about this issue before.
cc @ezyang @gchanan @zou3519 @kadeng @albanD @gqchen @pearu @nikitaved @soulitzer @lezcano @Varal7