Make logsumexp_out inplace #9755
b = torch.zeros(5, 2)
c = b[:, 0]
torch.logsumexp(a, 1, out=c)
self.assertTrue(np.allclose(expected, b[:, 0].numpy()))
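For context, the test excerpt above checks that `logsumexp` with `out=` actually writes through a non-contiguous view (`b[:, 0]`) into its base tensor `b`. A self-contained sketch of the same check; the tensors `a` and `expected` are not shown in the excerpt, so the definitions below are illustrative assumptions:

```python
import numpy as np
import torch

# Illustrative input; the actual `a` and `expected` from the test are not shown above.
a = torch.randn(5, 3)
expected = np.log(np.exp(a.numpy()).sum(axis=1))  # NumPy reference result

b = torch.zeros(5, 2)
c = b[:, 0]                    # non-contiguous view into column 0 of b
torch.logsumexp(a, 1, out=c)   # the fix makes this write back into b through the view
assert np.allclose(expected, b[:, 0].numpy())
```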
Sorry, I did not mean to delete that comment. For context, I was asking why the …
aten/src/ATen/native/ReduceOps.cpp (outdated)

AT_CHECK(self.numel() != 0, "logsumexp only works on nonempty tensors");
auto maxes = at::max_values(self, dim, true);
auto maxes_squeezed = (keepdim ? maxes : maxes.squeeze(dim));
maxes_squeezed.masked_scatter_(maxes_squeezed.abs() == INFINITY, at::zeros({1}, maxes_squeezed.options()).expand_as(maxes_squeezed));
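To make the C++ above easier to follow, here is a rough Python sketch of the numerically stable reduction it implements (a sketch only, not the actual ATen code; it assumes the usual shift-by-max formulation):

```python
import torch

def logsumexp_sketch(x, dim, keepdim=False):
    # Shift by the per-slice max so exp() cannot overflow.
    maxes = x.max(dim=dim, keepdim=True).values
    # Where the max is +/-inf, x - max would give NaN (inf - inf), so replace
    # those maxes with 0 before subtracting (the masked_scatter_ above).
    safe_maxes = torch.where(maxes.abs() == float("inf"),
                             torch.zeros_like(maxes), maxes)
    out = (x - safe_maxes).exp().sum(dim=dim, keepdim=keepdim).log()
    # Add the shift back, squeezed to match the output shape when keepdim is False.
    return out + (safe_maxes if keepdim else safe_maxes.squeeze(dim))
```

The `torch.zeros_like(maxes)` spelling here mirrors the `at::zeros_like` simplification suggested in the review below.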
zou3519 left a comment:
LGTM, but I would prefer `at::zeros({1}, maxes_squeezed.options()).expand_as(maxes_squeezed)` written as `at::zeros_like(maxes_squeezed)`, if that works.
zou3519 left a comment:
lgtm, thanks @t-vi!
facebook-github-bot left a comment:
@zou3519 has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@pytorchbot retest this please
Summary: Fixes #9754. Maybe this could also make its way into 0.4.1; it is a severe debugging headache if you hit this...
Pull Request resolved: pytorch/pytorch#9755
Reviewed By: ezyang
Differential Revision: D8967178
Pulled By: zou3519
fbshipit-source-id: 151ed24e3a15a0c67014e411ac808fb893929a42
pytorch#9755 broke this, but it was only tested if size zero dims were turned on.
Let's try not to reduce functionality (or question why and decide it's worth it); logsumexp worked on empty tensors before this and now it doesn't (see #9825).
@gchanan sorry! I was still dizzy from the non-inplacedness...
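For concreteness, a minimal illustration of the empty-tensor case discussed above (illustrative only; the exact reproduction from #9825 is not quoted here):

```python
import torch

# A size-[0] tensor has numel() == 0, so the AT_CHECK quoted above rejects it.
x = torch.empty(0)

# Before this PR (and again after #9825), this returns a result instead of raising;
# mathematically, the log of a sum over zero exp terms is -inf.
print(torch.logsumexp(x, dim=0))  # tensor(-inf)
```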
Summary: #9755 broke this, but it was only tested if size zero dims were turned on (it can still happen even if that isn't turned on, because we support size [0] tensors).
Pull Request resolved: #9825
Differential Revision: D8997303
Pulled By: gchanan
fbshipit-source-id: 911dce112f73fad0f3980a7f4f9423df0f2d923d
Fixes: #9754
Maybe this could also make its way into 0.4.1; it is a severe debugging headache if you hit this...