
Conversation

@bhushan23 (Contributor)

  • Test added
  • test_dim_function_empty: softmax and log_softmax on last dimension

fixes: #17262
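A minimal sketch of what such a test might look like (the function name and the `[3, 0]` shape are illustrative, not the PR's actual test code):

```python
import torch

def test_dim_function_empty_last_dim():
    # Illustrative only: softmax and log_softmax over the last
    # dimension of an empty [N, 0] tensor should return an empty
    # tensor of the same shape instead of crashing.
    x = torch.randn(3, 0)
    for fn in (torch.softmax, torch.log_softmax):
        out = fn(x, dim=-1)
        assert out.shape == x.shape
```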

@ssnl (Collaborator) left a comment

@pytorchbot merge this please

@ssnl (Collaborator) commented Mar 4, 2019

@pytorchbot merge this please

@pytorchbot added the merge-this-please label Mar 4, 2019
@apaszke (Contributor) left a comment

Instead of modifying device-specific kernels, can't we skip those cases in top-level functions?
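A rough Python-level illustration of the suggested approach (the actual change would live in ATen's C++ dispatch code; `log_softmax_with_empty_guard` is a hypothetical name used only to show the early-return pattern):

```python
import torch

def log_softmax_with_empty_guard(input, dim):
    # Hypothetical top-level guard: skip the device-specific
    # kernel entirely when there is nothing to compute.
    if input.numel() == 0:
        return torch.empty_like(input)
    return torch.log_softmax(input, dim)
```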

@ssnl removed the merge-this-please label Mar 4, 2019
@ezyang (Contributor) commented Mar 4, 2019

@bhushan23 I agree with @apaszke. Do you think you could make this change?

@bhushan23 (Contributor, Author) commented Mar 4, 2019

@apaszke, done

@facebook-github-bot left a comment

@gchanan has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@apaszke (Contributor) commented Mar 10, 2019

It would be nice to add test cases for this.

@bhushan23 (Contributor, Author) commented

> It would be nice to add test cases for this.

The only missing test case in the forward pass was accessing the last dimension, which we have now added.
Now that the forward pass returns a 0-d tensor, we also get a 0-d tensor during backward, and autograd raises `RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn`.

Added a test case to ensure future coverage.
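A sketch of the backward case the new test is meant to cover, assuming post-fix behavior where the forward pass keeps the autograd graph intact:

```python
import torch

# With the fix in place, an empty input that requires grad should
# survive a full forward/backward round trip instead of raising
# "element 0 of tensors does not require grad ...".
x = torch.randn(3, 0, requires_grad=True)
out = torch.log_softmax(x, dim=-1)
out.sum().backward()
assert x.grad is not None and x.grad.shape == x.shape
```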

@bhushan23 (Contributor, Author) commented

@pytorchbot rebase this please

@bhushan23 deleted the log-softmax branch March 10, 2019 22:53
zdevito pushed a commit to zdevito/ATen that referenced this pull request Mar 10, 2019
Summary:
- Test added
- test_dim_function_empty: softmax and log_softmax on last dimension

fixes: #17262
Pull Request resolved: pytorch/pytorch#17651

Differential Revision: D14349009

Pulled By: gchanan

fbshipit-source-id: b6f728f5c6be8ae7615749e3f0c201886632923e


Development

Successfully merging this pull request may close these issues:

- CPU log_softmax on [N, 0] tensor is broken (#17262)
