[ONNX] Export weight_norm #28618
Conversation
torch/onnx/symbolic_opset9.py (Outdated)
    # W = g * ((v) / ||v||)
    # Compute norm_except_dim for l2 norm
    axes = list(range(rank))
    axes.remove(dim)
We should support dim=None as well (computes a norm over the entire weight tensor).
Could you verify if axes.remove(dim) fails when dim=None, and add a test case with dim=None as well?
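(For reference, list.remove does raise when the value is absent, so dim=None fails here without an explicit guard; a plain-Python illustration:)

    axes = list(range(4))   # e.g. rank == 4
    axes.remove(None)       # raises ValueError: list.remove(x): x not in list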
Thanks, Lara, for mentioning that.
It looks like there is conflicting behavior in torch.nn.utils.weight_norm regarding a None dim:
dim can be negative to index axes from the end, but when dim = None it is overwritten to -1.
pytorch/torch/nn/utils/weight_norm.py
Line 10 in 0c48092
    if dim is None:
So:
weight_norm(x, dim=-2) computes the norm over dimension rank-2, while
weight_norm(x, dim=-1) computes the norm over all dimensions (equivalent to dim=None).
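A minimal script demonstrating that difference (the Linear shapes are arbitrary; weight_g is the magnitude parameter that weight_norm registers):

    import torch
    import torch.nn as nn

    # dim=0 (the default): one norm per slice along dimension 0.
    m0 = nn.utils.weight_norm(nn.Linear(4, 3), dim=0)
    print(m0.weight_g.shape)

    # dim=None is rewritten to -1 inside weight_norm.py, so dim=None and
    # dim=-1 both reduce over the entire weight tensor.
    m_none = nn.utils.weight_norm(nn.Linear(4, 3), dim=None)
    m_last = nn.utils.weight_norm(nn.Linear(4, 3), dim=-1)
    print(m_none.weight_g.shape, m_last.weight_g.shape)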
cc @houseroad to review this issue, please.
This is not blocking the export. For now, I've updated the symbolic to match the torch function's behavior.
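For context, a rough sketch of the shape such a symbolic can take (illustrative only, not the merged code; it assumes dim has already been rewritten to -1 for the None case, as the torch module does, and uses ONNX ReduceL2 for the norm):

    def _weight_norm(g, weight_v, weight_g, dim):
        rank = weight_v.type().dim()
        # dim == -1 is torch's encoding of dim=None: reduce over every
        # axis instead of excluding one.
        axes = list(range(rank))
        if dim != -1:
            axes.remove(dim)
        norm_v = g.op("ReduceL2", weight_v, axes_i=axes, keepdims_i=1)
        # W = g * (v / ||v||)
        return g.op("Mul", g.op("Div", weight_v, norm_v), weight_g)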
@BowenBao Could you please take a look as well? Thanks.
lara-hdr
left a comment
LGTM
BowenBao
left a comment
one slight typo
facebook-github-bot
left a comment
@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@pytorchbot rebase this please
@houseroad I just rebased this PR. CI looks fine and it is approved. Could you please take a look?
facebook-github-bot
left a comment
@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@houseroad Can we merge this? Thanks!
@houseroad merged this pull request in 7da11f4.
Export _weight_norm
Caffe2 tests are in place
It looks like there is conflicting behavior in torch.nn.utils.weight_norm regarding a None dim:
dim can be negative to index axes from the end, but when dim = None it is overwritten to -1.
pytorch/torch/nn/utils/weight_norm.py
Line 10 in 0c48092
For now, the symbolic matches the current behavior, but this might need to be changed in the torch module.
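With the symbolic in place, exporting a weight-normalized module should work along these lines (a minimal usage sketch; the module, shapes, and file name are illustrative):

    import torch
    import torch.nn as nn

    model = nn.utils.weight_norm(nn.Linear(4, 3), dim=0)
    dummy_input = torch.randn(2, 4)
    torch.onnx.export(model, dummy_input, "weight_norm.onnx", opset_version=9)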