Call grad_mode.py context managers as decorators #7737
Conversation
torch/autograd/grad_mode.py
Outdated
@@ -29,13 +38,23 @@ def __exit__(self, *args):
         torch.set_grad_enabled(self.prev)
         return False
+
+    def __call__(self, func):
+        def decorate_no_grad(*args, **kwargs):
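For readers skimming the hunk, here is a self-contained sketch of how a `__call__` method can make the context manager usable as a decorator. The wrapper name `decorate_no_grad` comes from the diff above; everything else is an assumption, not the exact PR code:

```python
import torch

class no_grad:
    def __enter__(self):
        self.prev = torch.is_grad_enabled()
        torch._C._set_grad_enabled(False)

    def __exit__(self, *args):
        torch._C._set_grad_enabled(self.prev)
        return False

    def __call__(self, func):
        # Returning a wrapper that re-enters the context manager around
        # every invocation lets `@no_grad()` work as a decorator.
        def decorate_no_grad(*args, **kwargs):
            with self:
                return func(*args, **kwargs)
        return decorate_no_grad
```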
LGTM if test passes
test/test_autograd.py
Outdated
+            return x * 2
+
+        y = doubler_with(x)
+        self.assertTrue(y.requires_grad)
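Only the last lines of the test appear in the hunk above. A hedged reconstruction of what the surrounding test might look like; the `torch.ones` setup and the placement inside a `no_grad` block are assumptions:

```python
import torch

x = torch.ones(2, 2, requires_grad=True)

with torch.no_grad():
    # enable_grad used as a decorator should override the enclosing
    # no_grad block at call time.
    @torch.enable_grad()
    def doubler_with(x):
        return x * 2

    y = doubler_with(x)

assert y.requires_grad  # the suite uses self.assertTrue(y.requires_grad)
```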
@pytorchbot retest this please
@pytorchbot retest this please
@pytorchbot retest this please
@pytorchbot test this please
test failure looks legit:
ah, indeed. going to have to change up the set_grad_enabled class. nice catch
so, question. in order to maintain the current behavior, e.g. to use `torch.set_grad_enabled(False)` imperatively, I need to be able to save the input to `set_grad_enabled` as an attribute, then change `__enter__` to set the grad mode to that attribute's value.

however, if I do that, then any time it's instantiated, including when it wraps a function, the underlying grad mode will be changed. this seems like unwanted behavior, and maybe it's best to not use this one as a decorator? open to other suggestions.
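A sketch of the pattern being described, to make the conflict concrete. The structure and internal calls are assumptions, not the PR code:

```python
import torch

class set_grad_enabled:
    def __init__(self, mode):
        self.prev = torch.is_grad_enabled()
        self.mode = mode
        # The imperative form torch.set_grad_enabled(False) only runs
        # __init__, so the mode has to take effect here...
        torch._C._set_grad_enabled(mode)

    def __enter__(self):
        # ...while __enter__ re-applies the saved attribute for `with` use.
        torch._C._set_grad_enabled(self.mode)

    def __exit__(self, *args):
        torch._C._set_grad_enabled(self.prev)
        return False

# The conflict: decorating a function also constructs an instance, so the
# global grad mode would flip at definition time, not at call time.
```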
I think it's ok to forbid using `set_grad_enabled` as a decorator.
* origin:
  [Caffe2] Enabling AMD GPU Backend for Caffe2 (pytorch#7566)
  Call grad_mode.py context managers as decorators (pytorch#7737)
  catch CPU tensors in checkSameGPU (fixes pytorch#7689) (pytorch#7767)
  Mark stack as non-executable in NNPACK (pytorch#7752)
  small fixes in fusion_compiler (pytorch#7776)
  Run clang-format on c10d (pytorch#7791)
* call grad_mode.py context managers as decorators
* flake fixes
* switch to using context manager in wrapper
* fix set_grad_enabled test
* removed dumb github UI whitespace
* revert set_grad_enabled to normal, update tests
Extends the context manager classes `torch.no_grad`, `torch.enable_grad`, and `torch.set_grad_enabled` to function as decorators, so that users can wrap functions that will never require a call to `.backward()` downstream. I've modified the docs to reflect this change, and I've also added tests for the new functionality in each mode's respective test in `test_autograd.py`.

I also didn't find a unit test specifically for `torch.enable_grad`. Assuming that's intended, unless my ctrl+f missed it.
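For reference, the decorator form this PR enables looks like this (a minimal usage sketch):

```python
import torch

@torch.no_grad()
def doubler(x):
    # no_grad is entered around every call, so no autograd graph
    # is recorded for anything computed in this body.
    return x * 2

x = torch.ones(2, 2, requires_grad=True)
print(doubler(x).requires_grad)  # False
```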