Make nonzero non-differentiable, as it is supposed to be (#26980)
Summary:
Fixes: #26038

Somewhere between v1.1 and master, `nonzero` became `abstract` and was mistakenly marked as differentiable; we need to put it into the TH section of `tools/autograd/derivatives.yaml` to fix it.
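
For illustration, here is a minimal sketch of the intended behavior (not part of the commit), assuming a PyTorch build that includes this change: `nonzero` returns integer indices, so its output should not carry gradient history even when the input requires grad, while gradients still flow through ordinary differentiable ops.

import torch

# Input tensor that participates in autograd.
x = torch.randn(10, requires_grad=True)

# nonzero() returns int64 indices; indices are not a differentiable
# function of the input values, so the result must not require grad.
nz = x.nonzero()
print(nz.dtype)          # torch.int64
print(nz.requires_grad)  # False with this fix

# Gradients still flow through differentiable ops on x as usual.
(x * 2).sum().backward()
print(x.grad.shape)      # torch.Size([10])
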
Pull Request resolved: #26980

Differential Revision: D17632276

Pulled By: VitalyFedyunin

fbshipit-source-id: d6cabcc53348af6148cea5a1bd1af2ef12547373
VitalyFedyunin authored and soumith committed Oct 4, 2019
1 parent f2080fb commit 3c8ce2a
Showing 2 changed files with 8 additions and 0 deletions.
5 changes: 5 additions & 0 deletions test/test_torch.py
@@ -11638,6 +11638,11 @@ def gen_nontrivial_input(num_src, dtype, device):
for i in range(len(t)):
    self.assertEqual(t[i].cpu().numpy(), np1[i])

def test_nonzero_non_diff(self, device):
    x = torch.randn(10, requires_grad=True)
    nz = x.nonzero()
    self.assertFalse(nz.requires_grad)

def test_pdist_norm(self, device):
    def test_pdist_single(shape, device, p, dtype, trans):
        x = torch.randn(shape, dtype=dtype, device=device)
3 changes: 3 additions & 0 deletions tools/autograd/derivatives.yaml
@@ -1575,3 +1575,6 @@

- name: multinomial(Tensor self, int num_samples, bool replacement=False, *, Generator? generator=None) -> Tensor
  output_differentiability: [False]

- name: nonzero(Tensor self) -> Tensor
  output_differentiability: [False]
