
Backward engine computes unnecessary dependencies #726

@apaszke

Description

import torch
from torch.autograd import Variable

x = Variable(torch.randn(5, 5), requires_grad=True)
y = Variable(torch.randn(5, 5), requires_grad=True)

a = x + y
# max returns (values, indices); the indices are non-differentiable
b = torch.max(a, 1)[1].repeat(1, 5).float()
o = (b + a).sum()
o.backward()  # raises, even though only the path through a needs gradients

The error is raised because the non-differentiable path is included in the dependency calculation (we only check requires_grad locally at the moment). It doesn't affect correctness and shouldn't cause problems in most use cases. I'm putting this on hold until #662 is merged.
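Until that lands, a minimal workaround sketch (assuming the current Variable API, where .data exposes the wrapped tensor) is to rebuild the non-differentiable branch from raw tensor data, so the engine never registers it as a dependency:

import torch
from torch.autograd import Variable

x = Variable(torch.randn(5, 5), requires_grad=True)
y = Variable(torch.randn(5, 5), requires_grad=True)

a = x + y
# Rebuild the index branch from raw tensor data; the fresh Variable has
# requires_grad=False, so the backward engine never sees this path.
b = Variable(torch.max(a, 1)[1].data.repeat(1, 5).float())
o = (b + a).sum()
o.backward()  # succeeds: only the differentiable path through a is traversed

Since b carries no grad history, only x and y accumulate gradients, which is what the original snippet would produce anyway given that the max indices are non-differentiable.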
