fix as_strided_scatter_backward #87646
Conversation
🔗 Helpful links: 🧪 see artifacts and rendered test results at hud.pytorch.org/pr/87646
Note: links to docs will display an error until the docs builds have completed. ❌ 2 failures as of commit 874f298.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
albanD left a comment:
Great catch!
Also, I guess this means we don't have OpInfos (and thus autograd testing) for these ops... we should add them!
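For reference, the kind of check that OpInfo-based autograd testing runs boils down to torch.autograd.gradcheck, which compares the analytical derivative formula against numerical finite differences. A standalone sketch (the shapes and strides here are arbitrary choices, not taken from the test suite):

```python
import torch
from torch.autograd import gradcheck

# Double-precision inputs, as gradcheck requires for numerical stability.
self_ = torch.randn(4, dtype=torch.double, requires_grad=True)
src = torch.randn(2, dtype=torch.double, requires_grad=True)

def fn(a, b):
    # Scatter the 2-element `b` into the strided window
    # (size (2,), stride (2,)) of the 4-element `a`.
    return torch.as_strided_scatter(a, b, (2,), (2,))

# Raises if the analytical gradient disagrees with finite differences,
# e.g. if the backward multiplies by a mask full of garbage values
# (though with uninitialized memory the failure can be nondeterministic).
gradcheck(fn, (self_, src))
```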
as_strided_scatter's derivative formula was broken: instead of making a "mask" of 1's and 0's, it would effectively make a mask of 1's and uninitialized memory.
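To make the failure mode concrete, here is a minimal Python sketch of the mask trick (illustrative only; the actual formula lives in the C++ autograd code, and grad_self_sketch is a made-up name):

```python
import torch

# `mask` is 1 inside the strided window that `src` was scattered into and
# 0 everywhere else, so the gradient w.r.t. `self` keeps the incoming
# gradient outside the window and zeroes it inside, where `self`'s values
# were overwritten by `src`.
def grad_self_sketch(grad, size, stride, storage_offset=0):
    mask = torch.zeros_like(grad)  # correct: fully 0-initialized background
    mask.as_strided(size, stride, storage_offset).fill_(1.0)
    return grad * (1 - mask)

# The bug, in this sketch's terms: building `mask` with
# torch.empty_like(grad) instead. Only the strided window gets 1's written
# into it; every other element is whatever happened to be in memory rather
# than 0, so the gradient is multiplied by garbage.
```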
We had a few skipped OpInfos for these ops.
If this goes in after https://github.com/pytorch/pytorch/pull/85583/files, then it should also review the skips on the OpInfos.
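For context, an OpInfo skip is roughly an entry like the sketch below. This is an illustrative reconstruction, not the actual database entry: the sample-inputs function and the test class/name strings are assumptions.

```python
import unittest

import torch
from torch.testing._internal.common_methods_invocations import (
    OpInfo,
    DecorateInfo,
    SampleInput,
)

# Hypothetical sample-inputs function; the real one in the OpInfo database
# may generate different shapes, strides, and offsets.
def sample_inputs_as_strided_scatter(op_info, device, dtype, requires_grad, **kwargs):
    inp = torch.randn(4, device=device, dtype=dtype, requires_grad=requires_grad)
    src = torch.randn(2, device=device, dtype=dtype, requires_grad=requires_grad)
    yield SampleInput(inp, args=(src, (2,), (2,)))

# Illustrative entry: each DecorateInfo pins a decorator to a test class
# and test name, and deleting it re-enables that test for the op.
op = OpInfo(
    'as_strided_scatter',
    sample_inputs_func=sample_inputs_as_strided_scatter,
    skips=(
        DecorateInfo(unittest.skip("backward is broken, see #88105"),
                     'TestGradients', 'test_fn_grad'),
    ),
)
```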
@pytorchbot revert -m 'Sorry for reverting your PR but I think this one or one of the PR in the stack break bionic-cuda11.7 on trunk https://hud.pytorch.org/pytorch/pytorch/commit/70782981f06a042796d4604df2ec1491f4f5b194' -c nosignal
@pytorchbot successfully started a revert job. Check the current status here.
@bdhirsh your PR has been successfully reverted.
This reverts commit f9d7985. Reverted #87646 on behalf of https://github.com/huydhn due to Sorry for reverting your PR but I think this one or one of the PR in the stack break bionic-cuda11.7 on trunk https://hud.pytorch.org/pytorch/pytorch/commit/70782981f06a042796d4604df2ec1491f4f5b194
This reverts commit 71fb763.
as_strided_scatter's derivative formula was broken: instead of making a "mask" of 1's and 0's, it would effectively make a mask of 1's and uninitialized memory.
Fixes pytorch#88105
Pull Request resolved: pytorch#87646
Approved by: https://github.com/albanD
This reverts commit 71fb763.
Pull Request resolved: #88342
Approved by: https://github.com/zou3519
Looks like this PR hasn't been updated in a while, so we're going to go ahead and mark this as Stale.
as_strided_scatter's derivative formula was broken: instead of making a "mask" of 1's and 0's, it would effectively make a mask of 1's and uninitialized memory.
Fixes #88105
Stack from ghstack (oldest at bottom):
cc @albanD