
Remove custom function in no_grad block error message #33896

Closed · zou3519 wants to merge 2 commits

Conversation

@zou3519 (Contributor) commented Feb 27, 2020

Stack from ghstack:

Fixes #32625. Previously, we'd receive an error if a custom function
returned a view of an input inside a no_grad block:

```
class Alias(Function):
    @staticmethod
    def forward(ctx, x):
        return x[:]

    @staticmethod
    def backward(ctx, gx):
        return gx

inp = torch.rand(2, requires_grad=True)

with torch.no_grad():
    # Used to error out
    output = Alias.apply(inp)
```

After this change, the error no longer happens. The behavior becomes
consistent with having implemented a built-in operator that does the
same thing as the custom function:

  • the output requires_grad
  • we are able to detect (and error out) if the user tries to modify the
    output in-place outside of the no_grad block.

Test Plan:

  • new test

Differential Revision: D20345601
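The two bullet points above can be checked directly. The following is a sketch, not part of the PR's test suite; it assumes a PyTorch build that includes this fix, and restates the `Alias` function from the description:

```python
import torch
from torch.autograd import Function

class Alias(Function):
    @staticmethod
    def forward(ctx, x):
        return x[:]  # return a view of the input

    @staticmethod
    def backward(ctx, gx):
        return gx

inp = torch.rand(2, requires_grad=True)

with torch.no_grad():
    output = Alias.apply(inp)  # no longer raises after this change

# Consistent with a built-in view op: the output requires grad...
assert output.requires_grad

# ...and an in-place modification outside the no_grad block is detected.
try:
    output.add_(1)
except RuntimeError:
    print("in-place modification detected")
```

This mirrors what `output = inp[:]` under `torch.no_grad()` would do, which is the consistency the PR is after.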

zou3519 added a commit that referenced this pull request Feb 27, 2020
ghstack-source-id: 233a597f6e42069208a934316b34146db1073984
Pull Request resolved: #33896
@albanD (Collaborator) left a comment:

Looks good

dr-ci bot commented Feb 27, 2020

💊 CircleCI build failures summary and remediations

As of commit 4cc7919 (more details on the Dr. CI page):


  • 2/2 failures introduced in this PR

🕵️ 2 new failures recognized by patterns

The following build failures do not appear to be due to upstream breakages:

See CircleCI build pytorch_linux_xenial_py3_clang5_asan_test (1/2)

Step: "Test" (full log | pattern match details)

```
Mar 09 19:10:15 caused by: Connection refused (os error 111)
Mar 09 19:10:15 +++ eval 'extract_trap_cmd '
Mar 09 19:10:15 ++++ extract_trap_cmd
Mar 09 19:10:15 ++++ printf '%s\n' ''
Mar 09 19:10:15 +++ printf '%s\n' cleanup
Mar 09 19:10:15 ++ trap -- '
Mar 09 19:10:15 cleanup' EXIT
Mar 09 19:10:15 ++ which sccache
Mar 09 19:10:15 ++ sccache --stop-server
Mar 09 19:10:15 Stopping sccache server...
Mar 09 19:10:15 error: couldn't connect to server
Mar 09 19:10:15 caused by: Connection refused (os error 111)
Mar 09 19:10:15 ++ true
Mar 09 19:10:15 ++ rm /var/lib/jenkins/sccache_error.log
Mar 09 19:10:15 ++ SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log
Mar 09 19:10:15 ++ SCCACHE_IDLE_TIMEOUT=1200
Mar 09 19:10:15 ++ RUST_LOG=sccache::server=error
Mar 09 19:10:15 ++ sccache --start-server
Mar 09 19:10:15 Starting sccache server...
Mar 09 19:10:15 ++ sccache --zero-stats
Mar 09 19:10:15 Compile requests                 0
Mar 09 19:10:15 Compile requests executed        0
```

See CircleCI build pytorch_windows_vs2019_py36_cuda10.1_test2 (2/2)

Step: "Test" (full log | pattern match details)

```
ERROR: test_ctc_loss_cuda (__main__.TestAutogradDeviceTypeCUDA)
test_where_broadcast_all_cuda (__main__.TestAutogradDeviceTypeCUDA) ... ok
test_where_cuda (__main__.TestAutogradDeviceTypeCUDA) ... ok
test_where_functional_cuda (__main__.TestAutogradDeviceTypeCUDA) ... ok
test_where_scalar_broadcast_mask_cuda (__main__.TestAutogradDeviceTypeCUDA) ... ok
test_where_scalar_broadcast_non_mask_cuda (__main__.TestAutogradDeviceTypeCUDA) ... ok
test_where_scalar_cuda (__main__.TestAutogradDeviceTypeCUDA) ... ok
test_zero__cuda (__main__.TestAutogradDeviceTypeCUDA) ... ok
test_zero__scalar_cuda (__main__.TestAutogradDeviceTypeCUDA) ... ok

======================================================================
ERROR: test_ctc_loss_cuda (__main__.TestAutogradDeviceTypeCUDA)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\Users\circleci\project\build\win_tmp\build\torch\testing\_internal\common_device_type.py", line 207, in instantiated_test
    return test(self, device_arg)
  File "C:\Users\circleci\project\build\win_tmp\build\torch\testing\_internal\common_device_type.py", line 365, in dep_fn
    return fn(slf, device, *args, **kwargs)
  File "test_autograd.py", line 4416, in test_ctc_loss
    gradcheck(ctc_after_softmax, [x])
  File "C:\Users\circleci\project\build\win_tmp\build\torch\autograd\gradcheck.py", line 297, in gradcheck
    'The tolerance for nondeterminism was {}.'.format(nondet_tol))
```


zou3519 added a commit that referenced this pull request Mar 9, 2020
ghstack-source-id: 53a30c4975cec4f1637e813e72ec97b6c81f7d22
Pull Request resolved: #33896
@facebook-github-bot (Contributor) commented:

@zou3519 merged this pull request in f5ee46f.

@facebook-github-bot facebook-github-bot deleted the gh/zou3519/239/head branch March 14, 2020 14:16