Autograd not working for torch.exp(1j * phase) #43349
🐛 Bug

Complex tensor construction from magnitude and phase does not seem to support autograd when using the

mag * torch.exp(1j * phase)

notation (see the reproduction in the comments below).

Torch version: 1.7.0.dev20200819

cc @ezyang @gchanan @zou3519 @ssnl @albanD @gqchen @anjali411 @dylanbespalko
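A note on alternatives: recent PyTorch versions also expose torch.polar for exactly this magnitude-and-phase construction. The sketch below assumes real-valued (float) mag and phase, which torch.polar requires — unlike the complex128 leaves used in the reproduction below — and assumes torch.polar's backward is available in your build:

>>> import torch
>>> mag = torch.tensor(5., dtype=torch.float64, requires_grad=True)
>>> phase = torch.tensor(3., dtype=torch.float64, requires_grad=True)
>>> z = torch.polar(mag, phase)  # mag * exp(1j * phase), as a complex128 tensor
>>> z.real.backward()            # real-valued loss, so no explicit grad argument needed
>>> mag.grad, phase.grad         # gradients flow back to both real leaves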
Comments

As mentioned in the warning there, complex autograd is not fully supported right now. My guess here is that the formula for one of these ops has not been updated yet and does not produce a complex gradient like it should. @anjali411 you might want to open a single umbrella issue that tracks these as I expect there will be more...
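For concreteness, the kind of formula update being described — stated here as an assumption about PyTorch's conjugate-Wirtinger convention for holomorphic ops, not quoted from any particular patch — looks like this for y = exp(z):

# Real-only backward for y = exp(x):     grad_input = grad_output * y
# Complex-aware backward for y = exp(z): grad_input = grad_output * y.conj()
# The extra conjugation is what makes the vector-Jacobian product correct
# under the convention complex autograd uses.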
@jonashaag thanks for reporting the issue! As Alban mentioned, complex backward is not fully supported yet, but many of these functions will be fixed soon. I have been adding these issues to the umbrella issue.
Having the same issue as above. How's this going? Any way I can help?

same issue +1

This was fixed in #43208.
This is still an issue on nightly:

>>> import torch
>>> print(torch.__version__)
1.8.0.dev20201101
>>> mag = torch.tensor(5., requires_grad=True, dtype=torch.complex128)
>>> phase = torch.tensor(3., requires_grad=True, dtype=torch.complex128)
>>>
>>> complex_good = mag * (torch.cos(phase) + 1.j * torch.sin(phase))
>>> complex_good.backward()  # works
>>>
>>> complex_bad = mag * torch.exp(1j * phase)
>>> complex_bad.backward()
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-20-98d908b62931> in <module>
      8 complex_good.backward()  # works
      9
---> 10 complex_bad = mag * torch.exp(1j * phase)
     11 complex_bad.backward()
RuntimeError: exp does not support automatic differentiation for outputs with complex dtype.

torch.exp is not being tested for autograd backwards on master (lines 4935 to 4941 at 1cc1da5).
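On the missing test: a sketch of how such a backward could be exercised with torch.autograd.gradcheck, assuming gradcheck's complex support is present in the build being discussed. This is illustrative, not the actual test referenced above:

>>> import torch
>>> from torch.autograd import gradcheck
>>> z = torch.randn(4, dtype=torch.complex128, requires_grad=True)
>>> # Compares the analytic backward of exp against finite differences;
>>> # fails (or raises) when the complex formula is missing or wrong.
>>> gradcheck(torch.exp, (z,))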
Fixed in #47194.
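Assuming a build that includes that fix, both constructions from the reproduction above should agree in values (Euler's formula) and in gradients. A minimal check along those lines, using a real-valued loss (.real) so no explicit grad argument is needed:

>>> import torch
>>> mag = torch.tensor(5., requires_grad=True, dtype=torch.complex128)
>>> phase = torch.tensor(3., requires_grad=True, dtype=torch.complex128)
>>> good = mag * (torch.cos(phase) + 1j * torch.sin(phase))
>>> bad = mag * torch.exp(1j * phase)
>>> torch.allclose(good, bad)                     # identical values
>>> (g,) = torch.autograd.grad(good.real, phase)
>>> (b,) = torch.autograd.grad(bad.real, phase)
>>> torch.allclose(g, b)                          # identical gradients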