Remove usage of legacy autograd function #22925
Conversation
You should kill uses of NestedIOFunction too, as it is also in legacy style.

I would like to scope out
Force-pushed from 0f5b10d to d85b098
    rx, ry = NoneGradientFunction.apply(x, y)
    rx.register_hook(hook)
    ry.register_hook(hook)
    sum(rx, ry).sum().backward()
Good!
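For context, a new-style autograd function like the NoneGradientFunction exercised above might look like the following. This is a minimal sketch, not the test's actual implementation: it shows the static forward/backward plus ctx pattern, invoked via .apply(), which replaces the legacy instance-method style.

    import torch
    from torch.autograd import Function

    # Hypothetical sketch of NoneGradientFunction in the new style.
    # forward/backward are static methods, state is carried on ctx, and
    # the function is invoked via NoneGradientFunction.apply(x, y).
    class NoneGradientFunction(Function):
        @staticmethod
        def forward(ctx, x, y):
            # Pass both inputs through unchanged.
            return x, y

        @staticmethod
        def backward(ctx, grad_x, grad_y):
            # Return a real gradient for x and None for y, exercising the
            # None-gradient path that the quoted test targets.
            return grad_x, None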
ezyang left a comment:
Looks good, thank you. Note that this PR reduces our test coverage of legacy autograd functions, which means we are less likely to notice breakages of the legacy path once we land it.
facebook-github-bot left a comment:
@yf225 has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
We are planning to put up a deprecation warning for legacy autograd functions in 1.2: #22922. This PR removes all usage of legacy functions from PyTorch core and the test suite, to prepare for the eventual removal of legacy functions.
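To illustrate the migration this PR performs, here is a minimal sketch contrasting the two styles. Exp is an illustrative example following the pattern in the PyTorch docs, not code taken from this PR's diff.

    import torch
    from torch.autograd import Function

    # Legacy style (being deprecated): forward/backward are instance
    # methods, saved state lives on self, and the function is called by
    # instantiating it, e.g. LegacyExp()(x). This is the pattern that
    # #22922 adds a deprecation warning for.
    class LegacyExp(Function):
        def forward(self, i):
            result = i.exp()
            self.save_for_backward(result)
            return result

        def backward(self, grad_output):
            result, = self.saved_tensors
            return grad_output * result

    # New style: static methods with an explicit ctx, invoked via .apply().
    class Exp(Function):
        @staticmethod
        def forward(ctx, i):
            result = i.exp()
            ctx.save_for_backward(result)
            return result

        @staticmethod
        def backward(ctx, grad_output):
            result, = ctx.saved_tensors
            return grad_output * result

    x = torch.randn(4, requires_grad=True)
    y = Exp.apply(x)  # new-style invocation
    y.sum().backward()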