Fix log_softmax and softmax if any dimension is 0-d #17651
Conversation
ssnl
left a comment
@pytorchbot merge this please
apaszke
left a comment
Instead of modifying device-specific kernels, can't we skip those cases in top-level functions?
@bhushan23 I agree with @apaszke. Do you think you could make this change?
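A minimal sketch of the suggested approach, assuming a hypothetical Python-level wrapper (the actual fix lives in the dispatch code; the function name below is illustrative, not the PR diff):

```python
import torch

def softmax_with_empty_check(input, dim):
    # Hypothetical top-level guard: if any dimension is 0 the tensor is
    # empty and there is nothing to normalize, so return an empty result
    # of the same shape and dtype and never reach the device-specific kernel.
    if input.numel() == 0:
        return torch.empty_like(input)
    return torch.softmax(input, dim)
```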
- Test added - test_dim_function_empty: softmax and log_softmax on last dimension
@apaszke, done
facebook-github-bot
left a comment
@gchanan has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Would be nice to add test cases for this.
The only missing test case in the forward pass was accessing the last dimension (which we have added now). Adding a test case to ensure future coverage.
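Roughly what the added coverage exercises, sketched with guessed shapes (the committed test_dim_function_empty may structure this differently):

```python
import torch

# Tensor whose last dimension is 0; both ops should return an
# empty tensor of the same shape instead of erroring.
x = torch.randn(3, 0)
assert torch.softmax(x, dim=-1).shape == x.shape
assert torch.log_softmax(x, dim=-1).shape == x.shape
```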
@pytorchbot rebase this please
Summary:
- Test added - test_dim_function_empty: softmax and log_softmax on last dimension

fixes: #17262
Pull Request resolved: pytorch/pytorch#17651
Differential Revision: D14349009
Pulled By: gchanan
fbshipit-source-id: b6f728f5c6be8ae7615749e3f0c201886632923e