Closed
Labels
- module: autograd (Related to torch.autograd, and the autograd engine in general)
- module: numpy (Related to numpy support, and also numpy compatibility of our operators)
- triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Description
The code below raises an error:

```python
import torch

a = torch.rand(10, requires_grad=True)
with torch.no_grad():
    b = a.numpy()
# RuntimeError: Can't call numpy() on Variable that requires grad. Use var.detach().numpy() instead.
```
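For reference, the workaround the message itself suggests does work today, since `detach()` returns a view that no longer requires grad:

```python
import torch

a = torch.rand(10, requires_grad=True)
with torch.no_grad():
    # detach() produces a tensor sharing storage with `a` but with
    # requires_grad=False, so .numpy() is permitted.
    b = a.detach().numpy()
```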
Two issues here:

- `Variable` should not be mentioned in that error message.
- Given that this is exactly what happens inside a custom `autograd.Function`, I think this should be allowed: running in no_grad mode means the user expects no gradients to flow. See the sketch after this list.
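A minimal sketch of the `autograd.Function` situation referred to above; `NumpyExp` is a hypothetical example, not code from this issue. Inside `forward()` the input arrives with its history stripped, so calling `.numpy()` there succeeds even though the original input requires grad:

```python
import numpy as np
import torch

class NumpyExp(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Inside forward() autograd history is not tracked and x arrives
        # without requires_grad set, so x.numpy() is allowed here.
        out = torch.from_numpy(np.exp(x.numpy()))
        ctx.save_for_backward(out)
        return out

    @staticmethod
    def backward(ctx, grad_output):
        # d/dx exp(x) = exp(x), which forward() already saved.
        out, = ctx.saved_tensors
        return grad_output * out

a = torch.rand(10, requires_grad=True)
b = NumpyExp.apply(a)   # works: no numpy() error inside forward()
b.sum().backward()      # gradients still flow via the custom backward()
```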