[PyTorch][Feature Request] Label Smoothing for CrossEntropyLoss #7455
Closed
Labels
enhancement (not as big of a feature, but technically not a bug; should be easy to fix) · high priority · module: loss (problem is related to loss function) · module: nn (related to torch.nn) · triaged (this issue has been looked at by a team member and triaged into an appropriate module)
Description
Solved 🎉
Starting from v1.10.0, torch.nn.CrossEntropyLoss() has an arg label_smoothing=0.0 - API link.
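Since the issue is now solved, the built-in argument can be used directly with integer class targets (a minimal sketch of the v1.10+ API; the model and tensor shapes are illustrative):

```python
import torch
import torch.nn as nn

# Label smoothing is now a constructor argument of CrossEntropyLoss
# (available from PyTorch 1.10.0 onward).
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)

logits = torch.randn(8, 10)            # (batch, num_classes) raw scores
target = torch.randint(0, 10, (8,))    # plain LongTensor of class indices

loss = criterion(logits, target)       # scalar loss with smoothing applied
```

Note that the target stays a LongTensor of class indices; no one-hot conversion is needed.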
Hi, guys. The torch.LongTensor type of the target hinders implementations like some of the methods in the reference. So would it be possible to add an arg label_smoothing to torch.nn.CrossEntropyLoss(), or maybe simply add docs showing how to convert the target into a one-hot vector to work with torch.nn.CrossEntropyLoss(), or any other simple way? Thanks.
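For versions before 1.10, a common workaround is to build the smoothed target distribution by hand and take the cross entropy against log-softmax probabilities. This is a minimal sketch (the function name and epsilon value are illustrative, not from the issue); it follows the usual formulation where the true class gets weight 1 - epsilon and the remaining epsilon is spread uniformly over all classes:

```python
import torch
import torch.nn.functional as F

def smoothed_cross_entropy(logits, target, epsilon=0.1):
    """Cross entropy with label smoothing, for class-index LongTensor targets.

    The smoothed target distribution is
        (1 - epsilon) * one_hot(target) + epsilon / num_classes.
    """
    n_classes = logits.size(-1)
    # Convert the LongTensor of indices into a one-hot float tensor.
    one_hot = F.one_hot(target, n_classes).float()
    # Mix the one-hot targets with a uniform distribution.
    smoothed = (1.0 - epsilon) * one_hot + epsilon / n_classes
    log_probs = F.log_softmax(logits, dim=-1)
    # Per-sample cross entropy against the smoothed distribution, then mean.
    return -(smoothed * log_probs).sum(dim=-1).mean()
```

With epsilon=0 this reduces to the ordinary mean-reduced cross entropy, and it matches the label_smoothing argument that later shipped in PyTorch 1.10.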