
Conversation

@t-vi (Collaborator) commented Jul 24, 2018

Fixes: #9754

Maybe this could also make its way into 0.4.1; it is a severe debugging headache if you hit this...

# a and expected come from the surrounding test setup (not shown here)
b = torch.zeros(5, 2)
c = b[:, 0]  # non-contiguous column view of b, passed as out=
torch.logsumexp(a, 1, out=c)
# before this fix, the result was never written back into b's storage
self.assertTrue(np.allclose(expected, b[:, 0].numpy()))
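
The failure mode is easiest to see with a plain in-place copy. A minimal, hypothetical sketch (variable names are illustrative, not from the PR) of why rebinding a freshly allocated result instead of copying into the out= tensor never reaches the caller's view:

import torch

b = torch.zeros(5, 2)
c = b[:, 0]             # non-contiguous view into b's first column
src = torch.ones(5)

c = src                 # rebinding the name leaves b untouched
print(b[:, 0])          # still all zeros

c = b[:, 0]
c.copy_(src)            # an in-place copy writes through the view
print(b[:, 0])          # now all ones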

@pytorch pytorch deleted a comment from t-vi Jul 24, 2018
@zou3519 (Contributor) commented Jul 24, 2018

Sorry, I did not mean to delete that comment. For context, I was asking why the expand_as was necessary; it was needed to match the size of maxes_squeezed. Thanks @t-vi for replying!

// reject empty inputs (this check was later revisited in #9825)
AT_CHECK(self.numel() != 0, "logsumexp only works on nonempty tensors");
// per-slice maxima along the reduced dimension, kept for broadcasting
auto maxes = at::max_values(self, dim, true);
auto maxes_squeezed = (keepdim ? maxes : maxes.squeeze(dim));
// zero out infinite maxima so that subtracting them later cannot produce NaN (e.g. for all -inf slices)
maxes_squeezed.masked_scatter_(maxes_squeezed.abs() == INFINITY, at::zeros({1}, maxes_squeezed.options()).expand_as(maxes_squeezed));
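
For readers following the review, a minimal Python sketch of the same max-subtraction stabilization (illustrative only, not the ATen implementation; the function name is made up):

import torch

def logsumexp_sketch(x, dim, keepdim=False):
    # subtract the per-slice maximum so exp() cannot overflow
    maxes, _ = x.max(dim, keepdim=True)
    # zero out infinite maxima, mirroring the masked_scatter_ above,
    # so an all -inf slice gives -inf rather than NaN
    maxes = torch.where(maxes.abs() == float('inf'), torch.zeros_like(maxes), maxes)
    out = (x - maxes).exp().sum(dim, keepdim=keepdim).log()
    return out + (maxes if keepdim else maxes.squeeze(dim))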

@zou3519 (Contributor) left a review comment

LGTM, would prefer at::zeros({1}, maxes_squeezed.options()).expand_as(maxes_squeezed) written as at::zeros_like(maxes_squeezed) if that works
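
As a point of comparison, a small Python analogue of the two spellings (illustrative; the review comment is about the ATen C++ calls): torch.zeros(1).expand_as(t) broadcasts a single zero element without allocating a full tensor, whereas torch.zeros_like(t) materializes one; the resulting values are identical either way.

import torch

t = torch.randn(3, 4)
a = torch.zeros(1).expand_as(t)   # one element of storage, viewed at t's shape
b = torch.zeros_like(t)           # a freshly allocated tensor of zeros
assert torch.equal(a, b)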

@zou3519 (Contributor) left a review comment

lgtm, thanks @t-vi!

@facebook-github-bot (Contributor) left a comment

@zou3519 has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@yf225 (Contributor) commented Jul 24, 2018

@pytorchbot retest this please

zdevito pushed a commit to zdevito/ATen that referenced this pull request Jul 24, 2018
Summary:
Fixes: #9754

Maybe this could also make its way into 0.4.1; it is a severe debugging headache if you hit this...
Pull Request resolved: pytorch/pytorch#9755

Reviewed By: ezyang

Differential Revision: D8967178

Pulled By: zou3519

fbshipit-source-id: 151ed24e3a15a0c67014e411ac808fb893929a42
gchanan added a commit to gchanan/pytorch that referenced this pull request Jul 25, 2018
pytorch#9755 broke this, but it was only tested if size zero dims were turned on.
@gchanan (Contributor) commented Jul 25, 2018

Let's try not to reduce functionality (or question why and decide it's worth it); logsumexp worked on empty tensors before this and now it doesn't (see #9825).
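
Concretely, a hypothetical repro of the regression being pointed out (assuming the AT_CHECK added in this PR is in place):

import torch

x = torch.empty(0)       # size [0] tensors exist even without zero-size dims enabled
torch.logsumexp(x, 0)    # now raises: logsumexp only works on nonempty tensors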

@t-vi (Collaborator, Author) commented Jul 25, 2018

@gchanan sorry! I was still dizzy from the non-inplacedness...

facebook-github-bot pushed a commit that referenced this pull request Jul 25, 2018
Summary:
#9755 broke this, but it was only tested if size zero dims were turned on (it can still happen even if that isn't turned on, because we support size [0] tensors).
Pull Request resolved: #9825

Differential Revision: D8997303

Pulled By: gchanan

fbshipit-source-id: 911dce112f73fad0f3980a7f4f9423df0f2d923d
zdevito pushed a commit to zdevito/ATen that referenced this pull request Jul 25, 2018
jramseyer pushed a commit to jramseyer/pytorch that referenced this pull request Jul 30, 2018
jramseyer pushed a commit to jramseyer/pytorch that referenced this pull request Jul 30, 2018
goodlux pushed a commit to goodlux/pytorch that referenced this pull request Aug 15, 2018
goodlux pushed a commit to goodlux/pytorch that referenced this pull request Aug 15, 2018

Development

Successfully merging this pull request may close these issues.

Logsumexp is broken with out, as it is not inplace
