🐛 Bug
Complex tensor construction from magnitude and phase does not seem to support autograd when using the mag * torch.exp(1j * phase) notation:
import torch
mag, phase = torch.tensor(5., requires_grad=True), torch.tensor(3., requires_grad=True)
complex_good = torch.view_as_complex(torch.stack([mag * torch.cos(phase), mag * torch.sin(phase)], dim=-1))
complex_good.backward() # works
complex_bad = mag * torch.exp(1j * phase)
complex_bad.backward()  # raises RuntimeError, see traceback below
=>
.../torch/autograd/__init__.py:125: UserWarning: Complex backward is not fully supported yet and could lead to wrong gradients for functions we have not fixed yet (Triggered internally at /opt/conda/conda-bld/pytorch_1597820903894/work/torch/csrc/autograd/python_engine.cpp:172.)
Variable._execution_engine.run_backward(
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File ".../torch/tensor.py", line 214, in backward
torch.autograd.backward(self, gradient, retain_graph, create_graph)
File ".../torch/autograd/__init__.py", line 125, in backward
Variable._execution_engine.run_backward(
RuntimeError: Expected isFloatingType(grad.scalar_type()) || (input_is_complex == grad_is_complex) to be true, but got false. (Could this error message be improved? If so, please report an enhancement request to PyTorch.)
...
Torch version: 1.7.0.dev20200819
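
As a possible workaround (my assumption, not something I have verified on this nightly), torch.polar can build the complex tensor from magnitude and phase directly, and reducing to a real-valued scalar before calling backward sidesteps complex backward entirely:

import torch

# Sketch of a possible workaround, assuming torch.polar is available and
# differentiable in this build: torch.polar(abs, angle) constructs
# abs * (cos(angle) + 1j * sin(angle)) in a single call.
mag = torch.tensor(5., requires_grad=True)
phase = torch.tensor(3., requires_grad=True)

z = torch.polar(mag, phase)    # intended equivalent of mag * torch.exp(1j * phase)
loss = z.real + z.imag         # real-valued scalar, so backward() needs no complex grad
loss.backward()
print(mag.grad, phase.grad)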
cc @ezyang @gchanan @zou3519 @ssnl @albanD @gqchen @anjali411 @dylanbespalko