Conversation

@macaodha
Contributor

Previously it was not possible to set a value for the margin in the HingeEmbeddingLoss in the constructor. This patch fixes the issue and makes the loss behave as it is described in the docs.

A discussion of this issue can be viewed here:
https://discuss.pytorch.org/t/issue-with-setting-margin-for-hingeembeddingloss/2088
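With this patch, the margin argument passed to the constructor is actually used, matching the documented behavior. A minimal sketch of the fixed usage (the values here are illustrative, not from the PR):

```python
import torch
import torch.nn as nn

# After the fix, margin set in the constructor takes effect (default is 1.0).
loss_fn = nn.HingeEmbeddingLoss(margin=0.5)

x = torch.tensor([0.3, 0.8])   # input values (e.g. pairwise distances)
y = torch.tensor([1.0, -1.0])  # +1 for similar pairs, -1 for dissimilar

# Per the docs: loss is x when y == 1, and max(0, margin - x) when y == -1.
loss = loss_fn(x, y)
print(loss.item())  # mean of [0.3, max(0, 0.5 - 0.8)] = 0.15
```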

@soumith soumith merged commit b93b525 into pytorch:master Apr 28, 2017
@soumith
Contributor

soumith commented Apr 28, 2017

thanks for fixing this!

@fmassa
Member

fmassa commented Apr 28, 2017

Hi,
Thanks for the bugfix!
Can I add two more comments?

  • could you make HingeEmbeddingLoss inherit from Module? I don't think there is a need anymore for it to inherit from _Loss.
  • could you assert that target doesn't require gradients? There is a helper function for that.
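The two suggestions above can be sketched together. This is an illustrative standalone version, not the actual PR code: the class name and implementation are hypothetical, and the plain `assert` stands in for PyTorch's internal `_assert_no_grad` helper, whose exact form has varied across versions:

```python
import torch
from torch import nn

class HingeEmbeddingLossSketch(nn.Module):
    """Hypothetical loss inheriting directly from nn.Module (not _Loss)."""

    def __init__(self, margin=1.0):
        super().__init__()
        self.margin = margin

    def forward(self, input, target):
        # Stand-in for the internal _assert_no_grad helper: criterions
        # should never compute gradients with respect to targets.
        assert not target.requires_grad, \
            "nn criterions don't compute the gradient w.r.t. targets"
        # loss is x when target == 1, max(0, margin - x) when target == -1
        pos = input
        neg = (self.margin - input).clamp(min=0)
        return torch.where(target == 1, pos, neg).mean()

x = torch.tensor([0.3, 0.8])
y = torch.tensor([1, -1])
print(HingeEmbeddingLossSketch(margin=0.5)(x, y).item())  # 0.15
```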

@macaodha
Contributor Author

I can look into this.

Do we also need to add _assert_no_grad to a bunch of the other losses as well? e.g. CosineEmbeddingLoss, MarginRankingLoss, MultiMarginLoss, TripletMarginLoss

@apaszke
Contributor

apaszke commented Apr 28, 2017

Yes, they should all use it

Jiaming-Liu pushed a commit to Jiaming-Liu/pytorch that referenced this pull request May 18, 2017
jjsjann123 added a commit to jjsjann123/pytorch that referenced this pull request Jan 26, 2022
1. Extend buildShapeExpression to handle squeeze_copy/unsqueeze_copy ops.
2. Patch the broadcastSizes insertion point in buildShapeExpression to avoid the graph::copy() linter assert.
3. Add tests.
4. Support no-op squeeze (squeezing a dimension that's not size 1).

TODO (in follow-up PRs):
1. Extend buildShapeExpression to view_copy and reshape_copy as well.
2. Refactor broadcastSizesExpression to allow graceful failure instead of a hard assert.
