[PyTorch][Feature Request] Label Smoothing for CrossEntropyLoss #7455

@kaiyuyue

Description

Solved 🎉

Starting from v1.10.0, torch.nn.CrossEntropyLoss() has an argument label_smoothing=0.0 (see the API link).
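A minimal usage sketch of the argument added in v1.10.0 (the logits and target values below are made up for illustration):

```python
import torch

# CrossEntropyLoss with label smoothing: each hard target is mixed with a
# uniform distribution over classes, so the loss never pushes a logit to
# infinity. label_smoothing=0.0 (the default) recovers the usual behavior.
criterion = torch.nn.CrossEntropyLoss(label_smoothing=0.1)

logits = torch.tensor([[2.0, 0.5, 0.1],
                       [0.2, 2.5, 0.3]])  # (batch, num_classes)
targets = torch.tensor([0, 1])            # class indices, torch.LongTensor

loss = criterion(logits, targets)
```

Note that targets are still passed as class indices; the smoothing is applied internally.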


Hi, guys. The torch.LongTensor type of target hinders implementations like some of the methods in the reference. So, is it possible to add an argument label_smoothing to torch.nn.CrossEntropyLoss()? Or maybe simply add docs showing how to convert the target into a one-hot vector to work with torch.nn.CrossEntropyLoss(), or any other simple way? Thanks.
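The conversion being asked about can be sketched in plain Python: turn the integer target into a one-hot vector, smooth it, and take the cross entropy against the softmax probabilities. The helper names below (smooth_one_hot, smoothed_ce) are hypothetical, not part of any PyTorch API:

```python
import math

def smooth_one_hot(target, num_classes, eps=0.1):
    # Smoothed one-hot: (1 - eps) on the true class plus eps/K spread
    # uniformly over all K classes. The entries still sum to 1.
    off = eps / num_classes
    return [off + (1.0 - eps) if c == target else off
            for c in range(num_classes)]

def softmax(logits):
    # Numerically stable softmax over a single example's logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def smoothed_ce(logits, target, eps=0.1):
    # Cross entropy between the smoothed target distribution q and the
    # predicted distribution p: -sum_i q_i * log(p_i).
    probs = softmax(logits)
    q = smooth_one_hot(target, len(logits), eps)
    return -sum(qi * math.log(pi) for qi, pi in zip(q, probs))
```

With eps=0.0 this reduces to the ordinary cross entropy -log(p[target]), which is one way to sanity-check a hand-rolled implementation.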

cc @ezyang @gchanan @zou3519 @bdhirsh @albanD @mruberry

Metadata

Assignees

No one assigned

    Labels

    enhancement (Not as big of a feature, but technically not a bug. Should be easy to fix)
    high priority
    module: loss (Problem is related to loss functions)
    module: nn (Related to torch.nn)
    triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
