
Add complex autograd support for rsub #53643

@anjali411

Description

```
>>> x = torch.randn(2, 3, dtype=torch.complex64, requires_grad=True)
>>> y = 1 - x
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/chourdiaanjali/pytorch/torch/tensor.py", line 534, in __rsub__
    return _C._VariableFunctions.rsub(self, other)
RuntimeError: rsub does not support automatic differentiation for outputs with complex dtype.
```
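A possible workaround (an assumption, not from the issue itself) is to rewrite `1 - x` as `-(x - 1)`, so the expression dispatches to `sub` and `neg`, which already support complex autograd, instead of `rsub`:

```python
import torch

# Reproduce the setup from the issue.
x = torch.randn(2, 3, dtype=torch.complex64, requires_grad=True)

# Workaround sketch: `-(x - 1)` is mathematically equal to `1 - x`
# but avoids the __rsub__ -> rsub code path.
y = -(x - 1)

# Complex autograd requires a real-valued scalar loss for .backward().
loss = y.abs().pow(2).sum()
loss.backward()

print(x.grad.shape)  # gradient has the same shape as x
```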

cc @ezyang @anjali411 @dylanbespalko @mruberry @aocsa
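Once complex autograd support lands for `rsub`, the fix can be verified with `torch.autograd.gradcheck`, which checks complex gradients numerically (sketch below; `complex128` is used because gradcheck needs double precision):

```python
import torch

# On a PyTorch build where this issue is fixed, `1 - x` (which calls
# rsub under the hood) should pass gradcheck for complex inputs.
x = torch.randn(2, 3, dtype=torch.complex128, requires_grad=True)
assert torch.autograd.gradcheck(lambda t: 1 - t, (x,))
```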

Metadata

Labels

complex_autograd
module: complex — Related to complex number support in PyTorch
triaged — This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
