Conversation

@neginraoof (Contributor) commented Oct 24, 2019

Export _weight_norm
Caffe2 tests are in place.

Looks like there is conflicting behavior in torch.nn.utils.weight_norm regarding dim=None: dim can be negative to index axes from the back, but when dim=None it is overwritten to -1:

if dim is None:

For now, this symbolic matches the current behavior, but this might need to be changed in the torch module.

# W = g * (v / ||v||)
# Compute norm_except_dim for the L2 norm
axes = list(range(rank))
axes.remove(dim)
Contributor
We should support dim=None as well (it computes a norm over the entire weight tensor).
Could you verify whether axes.remove(dim) fails when dim=None, and add a test case with dim=None as well?
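For reference, list.remove raises ValueError when the value is absent, so the axes.remove(dim) line would indeed fail for dim=None. A minimal standalone illustration (not the actual symbolic code):

```python
# axes.remove(dim) with dim=None raises, because None is never in the
# list of axis indices produced by list(range(rank)).
axes = list(range(4))
try:
    axes.remove(None)
    removal_failed = False
except ValueError:
    removal_failed = True
```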

@neginraoof (Contributor Author) Oct 25, 2019

Thanks, Lara, for mentioning that.
Looks like there is conflicting behavior in torch.nn.utils.weight_norm regarding dim=None: dim can be negative to index axes from the back, but when dim=None it is overwritten to -1:

if dim is None:

So:
weight_norm(x, dim=-2) computes the norm over dimension rank-2
weight_norm(x, dim=-1) computes the norm over all dimensions (equivalent to dim=None)
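The quirk above can be sketched in plain NumPy (a hypothetical stand-in for illustration, not the real torch/ONNX helper):

```python
import numpy as np

def norm_except_dim(v, dim):
    # Hypothetical sketch of the semantics described above:
    # dim == -1 means "L2 norm over the whole tensor" (the value that
    # dim=None is overwritten to); any other dim, positive or negative,
    # reduces over every axis except that one.
    if dim == -1:
        return np.linalg.norm(v)
    keep = dim % v.ndim  # normalize negative dims, e.g. -2 -> rank-2
    axes = tuple(a for a in range(v.ndim) if a != keep)
    return np.sqrt(np.sum(v ** 2, axis=axes, keepdims=True))

v = np.arange(6, dtype=float).reshape(2, 3)
whole = norm_except_dim(v, -1)     # scalar: norm over the entire tensor
per_dim0 = norm_except_dim(v, -2)  # shape (2, 1): reduces every axis but dim 0
```

Note that dim=-1 can never mean "the last axis" under these semantics, which is exactly the conflict flagged above.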

cc @houseroad to review the issue, please.
This is not blocking the export currently. For now, I've updated the symbolic to match the torch function.

@neginraoof (Contributor Author)

@BowenBao Could you please take a look as well? Thanks.

@lara-hdr (Contributor) left a comment

LGTM

@BowenBao (Collaborator) left a comment

One slight typo.

@facebook-github-bot (Contributor) left a comment

@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@neginraoof (Contributor Author)

@pytorchbot rebase this please

@neginraoof (Contributor Author)

@houseroad I just rebased this PR. CI looks fine and it is approved. Could you please take a look?

@facebook-github-bot (Contributor) left a comment

@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@neginraoof (Contributor Author)

@houseroad Can we merge this? Thanks!

@facebook-github-bot (Contributor)

@houseroad merged this pull request in 7da11f4.
