
Loss functions for complex tensors  #46642


🚀 Feature

Loss functions in the torch.nn module should support complex tensors whenever the underlying operations make sense for complex numbers.
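For illustration only (not part of the original request), the snippet below probes whether a couple of common losses currently accept complex inputs; which losses work and which raise an error depends on the PyTorch version, which is exactly what this issue tracks.

```python
import torch
import torch.nn as nn

# Illustrative complex inputs (complex64); shapes are arbitrary.
pred = torch.randn(4, dtype=torch.cfloat)
target = torch.randn(4, dtype=torch.cfloat)

# Probe a few losses; behavior for complex tensors varies by loss and
# by PyTorch version.
for loss_fn in (nn.MSELoss(), nn.L1Loss()):
    try:
        print(type(loss_fn).__name__, loss_fn(pred, target).item())
    except RuntimeError as err:
        print(type(loss_fn).__name__, "rejects complex inputs:", err)
```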

Motivation

Complex neural nets are an active area of research, and there are a few issues on GitHub (for example, #46546 (comment)) which suggest that we should add complex number support for loss functions.

Pitch

NOTE: As of now, we have decided to add complex support only for real-valued loss functions (i.e. losses that produce a real-valued output from complex inputs), so please make sure your chosen loss function has that property before you start working on a PR to add complex support.

These loss functions should be updated to support complex numbers (both forward and backward operations). If a loss function doesn't make sense for complex numbers, it should throw an error clearly stating that. This covers the loss functions that exist as of the time this issue was written; we still need to figure out which ones we want to support and which should throw errors.

If a loss function uses an operation that is feasible for complex numbers but not yet supported, we should prioritize adding that support.
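As a point of reference, here is a sketch (assuming the elementwise ops involved already have complex forward and backward support in the installed PyTorch build) of the "real-valued loss over complex inputs" semantics the note above refers to: a complex MSE written by hand as the mean squared magnitude of the residual.

```python
import torch

# Hypothetical complex predictions and targets (complex64).
pred = torch.randn(8, dtype=torch.cfloat, requires_grad=True)
target = torch.randn(8, dtype=torch.cfloat)

# Real-valued "complex MSE": mean |pred - target|^2. abs() of a complex
# tensor returns its real magnitude, so the loss itself is a real scalar.
loss = (pred - target).abs().pow(2).mean()
loss.backward()

print(loss.dtype)       # real (e.g. torch.float32)
print(pred.grad.dtype)  # complex, matching the input (e.g. torch.complex64)
```

This is only meant to illustrate the requested semantics; it is not a proposed implementation for torch.nn.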

cc @ezyang @anjali411 @dylanbespalko @mruberry @albanD


Labels

complex_autograd, module: complex (Related to complex number support in PyTorch), module: nn (Related to torch.nn), triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
