Conversation
t-vi (Collaborator) commented May 27, 2019

gradcheck currently includes a determinism check, although only a weak one: it runs the backward twice and checks whether the two results match exactly.
This can lead to flaky tests, e.g. #20971, but also #13818.
This PR adds a nondet_tol argument to both gradcheck and gradgradcheck. It does not change or re-enable any tests yet.
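To illustrate the idea, here is a minimal sketch of a determinism check with a tolerance, in the spirit of what the PR adds. This is not PyTorch's actual implementation; the helper names (`check_determinism`, `noisy_backward`) are made up for the example, while `nondet_tol` is the real parameter name introduced by the PR.

```python
import random

def check_determinism(backward_fn, grad_output, nondet_tol=0.0):
    """Run the backward twice and compare the results.

    With nondet_tol=0.0 (the old behavior), any discrepancy between the
    two runs fails the check; a positive nondet_tol tolerates small
    nondeterministic differences (e.g. from atomic adds on the GPU).
    """
    a = backward_fn(grad_output)
    b = backward_fn(grad_output)
    max_diff = max(abs(x - y) for x, y in zip(a, b))
    if max_diff > nondet_tol:
        raise RuntimeError(
            f"Backward is not reentrant: max difference {max_diff} "
            f"exceeds nondet_tol {nondet_tol}"
        )
    return True

# A backward whose result jitters slightly between runs,
# mimicking nondeterministic GPU kernels.
def noisy_backward(g):
    return [g * 2.0 + random.uniform(-1e-7, 1e-7)]

# With the default nondet_tol=0.0 this would usually fail;
# a small positive tolerance lets the jitter through.
check_determinism(noisy_backward, 1.0, nondet_tol=1e-5)
```

In PyTorch itself the tolerance is passed directly to the checker, e.g. `torch.autograd.gradcheck(fn, inputs, nondet_tol=1e-5)`.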

pytorchbot added the label module: autograd (Related to torch.autograd, and the autograd engine in general) on May 27, 2019
ssnl (Collaborator) commented May 27, 2019

Hmm why should a backward not be reentrant?

t-vi (Collaborator, Author) commented May 27, 2019 via email

t-vi (Collaborator, Author) commented May 28, 2019

@pytorchbot rebase this please

facebook-github-bot (Contributor) left a comment

@soumith is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

facebook-github-bot (Contributor) commented
@soumith merged this pull request in d23d04f.


Labels: Merged · module: autograd (Related to torch.autograd, and the autograd engine in general) · open source

7 participants