
Bad interaction between no_grad and numpy conversion #37000

@albanD

Description


The code below raises an error:

import torch

a = torch.rand(10, requires_grad=True)

with torch.no_grad():
    b = a.numpy()
# RuntimeError: Can't call numpy() on Variable that requires grad. Use var.detach().numpy() instead.
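
For what it's worth, the workaround suggested by the error message itself does work, since detach() returns a tensor with requires_grad=False:

import torch

a = torch.rand(10, requires_grad=True)

with torch.no_grad():
    b = a.detach().numpy()  # OK: the detached tensor does not require grad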

Two issues here:

  • Variable should not be mentioned in that error message
  • Given that this is exactly what happens inside a custom autograd.Function (see the sketch below this list), I think this should be allowed: running in no_grad mode means the user expects no gradients to flow.
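
Here is a minimal sketch of the autograd.Function case mentioned above (Scale2 is just an illustrative name): inputs to forward arrive detached and grad mode is disabled there, so the very same numpy() call succeeds.

import torch

class Scale2(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # x arrives detached (requires_grad=False) and grad mode is off,
        # so numpy() works even though the caller's tensor requires grad
        return torch.from_numpy(x.numpy() * 2)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output * 2

a = torch.rand(10, requires_grad=True)
b = Scale2.apply(a)  # no error from the numpy() call inside forward
b.sum().backward()   # a.grad is filled with 2s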

cc @ezyang @ssnl @albanD @zou3519 @gqchen


Labels

  • module: autograd (Related to torch.autograd, and the autograd engine in general)
  • module: numpy (Related to numpy support, and also numpy compatibility of our operators)
  • triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
