
Inconsistency between torch.clamp() and numpy.clip() behavior for complex numbers #33568

@anjali411

Description


>>> import torch
>>> import numpy as np
>>> a = torch.tensor([3+4j])
>>> torch.clamp(a, 2+3j, 1+5j)
tensor([(3.0000 + 4.0000j)], dtype=torch.complex128)
>>> torch.clamp(a, 1+5j, 2)
tensor([(1.0000 + 5.0000j)], dtype=torch.complex128)
>>> np.clip(a.numpy(), 1+5j, 2)
array([2.+0.j])
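For reference, NumPy's result in the transcript follows its lexicographic ordering of complex numbers (compare real parts first, break ties on imaginary parts), which `np.clip` inherits from `np.minimum`/`np.maximum`. A minimal sketch reproducing that behavior on the NumPy side only (it makes no claim about what `torch.clamp` does internally):

```python
import numpy as np

# NumPy orders complex numbers lexicographically: real parts are
# compared first, and ties are broken on the imaginary parts.
a = np.array([3 + 4j])

# 3+4j is "above" the upper bound 2 (real part 3 > 2), so it is
# replaced by the bound itself, giving 2+0j.
print(np.clip(a, 1 + 5j, 2))  # [2.+0.j]

# Tie on the real part: 1+6j vs 1+5j is decided by the imaginary
# parts, so np.maximum keeps 1+6j.
print(np.maximum(np.array([1 + 6j]), 1 + 5j))  # [1.+6.j]
```

Under this ordering the clamped value can end up equal to one of the bounds (here 2+0j), which is exactly where it diverges from the `torch.clamp` output above.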

cc @ezyang @anjali411 @dylanbespalko

Metadata


    Labels

    module: complex (Related to complex number support in PyTorch)
    module: numpy (Related to numpy support, and also numpy compatibility of our operators)
    triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
