🐛 Bug

I cannot backpropagate through GriffinLim due to in-place operations being used.

To Reproduce

Minimal example:
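(The snippet itself didn't survive this copy of the issue; a minimal sketch along these lines triggers the failure — the spectrogram shape and n_fft=512 are assumptions, picked to match the [1, 257, 101, 2] tensor in the error below.)

import torch
import torchaudio

# torch.autograd.set_detect_anomaly(True)  # uncomment for the fuller trace below

# Magnitude spectrogram of shape (batch, freq, time), with freq = n_fft // 2 + 1 = 257
specgram = torch.rand(1, 257, 101, requires_grad=True)

waveform = torchaudio.transforms.GriffinLim(n_fft=512)(specgram)

# Backward through Griffin-Lim's internal phase-update loop raises the error
waveform.sum().backward()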
Results in:

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.FloatTensor [1, 257, 101, 2]], which is output 0 of SubBackward0, is at version 1; expected version 0 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True).
With the suggested torch.autograd.set_detect_anomaly(True) enabled, the last few items on the call stack are:
File "/network/home/plantinp/pytorch/lib/python3.7/site-packages/torchaudio/transforms.py", line 161, in forward
self.normalized, self.n_iter, self.momentum, self.length, self.rand_init)
File "/network/home/plantinp/pytorch/lib/python3.7/site-packages/torchaudio/functional.py", line 373, in griffinlim
angles = angles.div_(complex_norm(angles).add_(1e-16).unsqueeze(-1).expand_as(angles))
File "/network/home/plantinp/pytorch/lib/python3.7/site-packages/torchaudio/functional.py", line 581, in complex_norm
return torch.norm(complex_tensor, 2, -1)
File "/network/home/plantinp/pytorch/lib/python3.7/site-packages/torch/functional.py", line 882, in norm
return _VF.norm(input, p, _dim, keepdim=keepdim)
(print_stack at /pytorch/torch/csrc/autograd/python_anomaly_mode.cpp:60)
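(For context — my illustration, not part of the original report: autograd keeps a version counter on every tensor it saves for backward, and any in-place op bumps that counter, so the same error class can be reproduced in isolation.)

import torch

a = torch.rand(3, requires_grad=True)
b = a - 1           # b is output 0 of SubBackward0
c = b ** 2          # pow's backward needs b, so autograd saves it
b.div_(2)           # in-place op bumps b's version counter
c.sum().backward()  # RuntimeError: ... is at version 1; expected version 0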
Expected behavior
Should allow backprop. Removing the in-place ops on line 373 of functional.py seems to do the trick:
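(The patched line also didn't survive the copy; one out-of-place rewrite of that line — my sketch, not necessarily the committed fix — would be:)

# torchaudio/functional.py, griffinlim(), line 373: out-of-place equivalents,
# so the tensors autograd saved for backward are no longer mutated each iteration
angles = angles.div(complex_norm(angles).add(1e-16).unsqueeze(-1).expand_as(angles))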
Environment

How you installed torchaudio (conda, pip, source): pip