
autograd.Function should be consistent about returning the same Tensor object if mark_dirty was used. #90209

Closed

zou3519 opened this issue Dec 5, 2022 · 0 comments

Labels
module: autograd Related to torch.autograd, and the autograd engine in general
module: functorch Pertaining to torch.func or pytorch/functorch
triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

zou3519 (Contributor) commented Dec 5, 2022

🐛 Describe the bug

import torch

class MyRelu(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Mutate x in place and declare the mutation to autograd.
        result = x.relu_()
        ctx.mark_dirty(result)
        return result

    @staticmethod
    def backward(ctx, grad_output):
        # Backward is irrelevant to this repro; it is never called.
        pass

z = torch.tensor(1., requires_grad=True)
x = z.clone()
y = MyRelu.apply(x)
print(y is x)

If z requires grad, then y is x is True.
If z does not require grad, then y is x is False.

This is inconsistent. In-place PyTorch operators always return the same Tensor object, regardless of requires_grad.
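For contrast, a minimal sketch of the built-in in-place behavior being referenced: a native in-place op like Tensor.relu_ hands back the very same object it mutated, whether or not autograd is tracking the tensor.

```python
import torch

# Built-in in-place ops return the same Tensor object in both cases.
for requires_grad in (False, True):
    z = torch.tensor(1., requires_grad=requires_grad)
    x = z.clone()
    y = x.relu_()
    assert y is x  # holds regardless of requires_grad
```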

NB: this might be difficult to actually do.

Context

This inconsistency makes autograd.Function + ctx.mark_dirty behave inconsistently under functorch transforms.

Versions

main

cc @ezyang @albanD @gqchen @pearu @nikitaved @soulitzer @lezcano @Varal7 @Chillee @samdow @soumith

@jbschlosser jbschlosser added module: autograd Related to torch.autograd, and the autograd engine in general, triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module, and module: functorch Pertaining to torch.func or pytorch/functorch labels Dec 6, 2022
ShisuiUzumaki pushed a commit to ShisuiUzumaki/pytorch that referenced this issue Dec 23, 2022