
Conversation

@yf225 (Contributor) commented Jul 12, 2019

As part of the Variable/Tensor merge, variable.tensor_data() should be removed in favor of variable.detach(). This PR removes tensor_data() call sites in Python Variable() and nn.Parameter() constructor paths.
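
For background, `Tensor.detach()` returns a view that shares both storage and the version counter with the source tensor, which is how the constructors now pick up later in-place updates. A minimal sketch of the shared-counter behavior (inspecting the internal `Tensor._version` attribute, used here only for illustration):

import torch

t = torch.ones(2, 3)
d = t.detach()                 # `d` shares storage and the version counter with `t`
print(t._version, d._version)  # 0 0
t.add_(1)                      # the in-place update bumps the shared counter
print(t._version, d._version)  # 1 1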

Note that this PR is BC-breaking in the following way:

  • For Python Variable() constructor:
    Previously, in-place updating a tensor after it had been used to create a Variable did not bump the Variable's version counter, which caused the following problem:
t = torch.ones(2, 3)
v = torch.autograd.Variable(t).requires_grad_()
y = v * v
t.add_(1)  # This bumps the version counter of `t`
y.sum().backward()  # This computes `v`'s gradient incorrectly before this patch, and throws an error after this patch

After this patch, in-place updating a tensor after it has been used to create a Variable also bumps the Variable's version counter, keeping that counter correct: the example above now throws an error instead of silently computing a wrong gradient.

  • For Python nn.Parameter() constructor:
    Previously, in-place updating a tensor after it had been used to create an nn.Parameter did not bump the nn.Parameter's version counter, which caused the following problem:
t = torch.ones(2, 3)
v = torch.nn.Parameter(t)
y = v * v
t.add_(1)  # This bumps the version counter of `t`
y.sum().backward()  # This computes `v`'s gradient incorrectly before this patch, and throws an error after this patch

After this patch, in-place updating a tensor after it has been used to create an nn.Parameter also bumps the nn.Parameter's version counter, keeping that counter correct: the example above now throws an error instead of silently computing a wrong gradient.
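
For user code that relied on the old behavior (updating the source tensor in place after constructing the Variable/Parameter, without affecting autograd), one straightforward adjustment is to construct from a clone so the two tensors no longer share storage or a version counter. This is a sketch, not something the PR itself requires:

import torch

t = torch.ones(2, 3)
v = torch.nn.Parameter(t.clone())  # `v` gets its own storage and version counter
y = v * v
t.add_(1)            # bumps only `t`'s version counter; `v` is unaffected
y.sum().backward()   # `v.grad` is computed correctly and no error is thrown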

@yf225 yf225 requested a review from gchanan July 12, 2019 20:47
@pytorchbot added the module: autograd and module: pybind labels Jul 12, 2019
@gchanan added the module: bc-breaking label Jul 12, 2019
@yf225 (Contributor, Author) commented Jul 15, 2019

@pytorchbot rebase this please

@facebook-github-bot (Contributor) left a comment

@yf225 has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented

@yf225 merged this pull request in 317cf7c.

yf225 pushed a commit to yf225/pytorch that referenced this pull request Jul 15, 2019
