Conversation

@yf225 yf225 commented Jun 17, 2019

When the fn passed to nn.Module._apply() is an in-place operation, the correct behavior is to also bump the version counters of the module's parameters and their gradients. This PR fixes the old, incorrect behavior by applying fn to the parameters themselves (fn(param)) instead of to their data (fn(param.data)).
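The difference can be sketched directly on a Parameter. This is a minimal illustration (variable names are hypothetical) of why fn(param) and fn(param.data) behave differently with respect to version counters:

```python
import torch

# In-place op on the Parameter itself (the fn(param) path): the
# version counter is bumped, so autograd can detect the mutation.
p = torch.nn.Parameter(torch.ones(3))
v_before = p._version
with torch.no_grad():
    p.mul_(2.0)
v_after = p._version
assert v_after > v_before

# In-place op on .data (the old fn(param.data) path): .data does not
# share the Parameter's version counter, so the counter is NOT bumped
# and autograd cannot detect the mutation.
q = torch.nn.Parameter(torch.ones(3))
w_before = q._version
q.data.mul_(2.0)
w_after = q._version
assert w_after == w_before
```

This is why the old path could silently produce incorrect gradients: autograd had no record that the parameter's values had changed.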

Note that this PR is BC-breaking in the following way:

Previously, passing an in-place operation to nn.Module._apply() did not bump the version counters of the module's parameters or their gradients. After this PR, those version counters are correctly bumped by the in-place operation, which invalidates the parameters and gradients in any autograd graph they previously participated in.

@pytorchbot pytorchbot added the module: nn Related to torch.nn label Jun 17, 2019
@yf225 yf225 changed the title [DO NOT MERGE] Test fn(param) Use fn(param) instead of fn(param.data) in nn.Module._apply Jun 18, 2019
@yf225 yf225 changed the title Use fn(param) instead of fn(param.data) in nn.Module._apply [BC-breaking] Use fn(param) instead of fn(param.data) in nn.Module._apply Jun 18, 2019
@yf225 yf225 added the module: bc-breaking Related to a BC-breaking change label Jun 18, 2019
@facebook-github-bot facebook-github-bot left a comment


@yf225 has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@yf225 merged this pull request in 4b1df5c.
