Add complex autograd support for torch.lerp #53606

@anjali411

Description

>>> import torch
>>> start = torch.empty(4, requires_grad=True).clone().fill_(1) + 0.5j
>>> end = torch.empty(4, requires_grad=True).clone().fill_(10) + 2j
>>> torch.lerp(start, end, 0.5)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
RuntimeError: lerp does not support automatic differentiation for outputs with complex dtype.
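For reference, `torch.lerp` computes `start + weight * (end - start)`, which is well defined for complex inputs. A minimal sketch of that formula using plain Python complex numbers (no autograd involved; function name and values are illustrative, not PyTorch's implementation):

```python
def lerp(start, end, weight):
    """Linear interpolation: start + weight * (end - start).

    Works for any values supporting +, -, * (floats, complex numbers).
    """
    return start + weight * (end - start)

# Complex endpoints, mirroring the repro above (values illustrative).
print(lerp(1 + 0.5j, 10 + 2j, 0.5))  # (5.5+1.25j)
```

Since only addition, subtraction, and multiplication are involved, nothing in the forward computation itself is real-only; the error above comes from the missing complex backward.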

cc @ezyang @anjali411 @dylanbespalko @mruberry @aocsa
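Supporting autograd here amounts to supplying the partial derivatives of lerp for complex dtypes. Because lerp is linear, they are simple constants; a hedged numerical check in plain Python (a sketch of the math only, not PyTorch's actual backward or its derivatives.yaml entry):

```python
def lerp(start, end, weight):
    return start + weight * (end - start)

# lerp is linear, so its partial derivatives are constants:
#   d(lerp)/d(start)  = 1 - weight
#   d(lerp)/d(end)    = weight
#   d(lerp)/d(weight) = end - start
# Check d/d(start) with a complex finite difference (exact here, since lerp is linear).
start, end, weight = 1 + 0.5j, 10 + 2j, 0.5
h = 1e-3 + 1e-3j
d_start = (lerp(start + h, end, weight) - lerp(start, end, weight)) / h
print(d_start)  # equals 1 - weight, i.e. 0.5
```

The same difference quotient applied in `end` or `weight` recovers the other two partials, so the backward for complex lerp follows the real-valued formulas unchanged.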

Labels

complex_autograd
module: complex (Related to complex number support in PyTorch)
triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
