Open
Labels: module: autograd, module: data parallel, module: norms and normalization, triaged
Description
I got this failure on a CI run:
Nov 09 21:56:23 ======================================================================
Nov 09 21:56:23 ERROR: test_spectral_norm (__main__.TestNN)
Nov 09 21:56:23 ----------------------------------------------------------------------
Nov 09 21:56:23 Traceback (most recent call last):
Nov 09 21:56:23 File "/var/lib/jenkins/workspace/test/common_utils.py", line 116, in wrapper
Nov 09 21:56:23 fn(*args, **kwargs)
Nov 09 21:56:23 File "test_nn.py", line 1864, in test_spectral_norm
Nov 09 21:56:23 torch.autograd.gradcheck(fn, (input.clone().requires_grad_(),))
Nov 09 21:56:23 File "/opt/conda/lib/python3.6/site-packages/torch/autograd/gradcheck.py", line 208, in gradcheck
Nov 09 21:56:23 return fail_test('Backward is not reentrant, i.e., running backward with same '
Nov 09 21:56:23 File "/opt/conda/lib/python3.6/site-packages/torch/autograd/gradcheck.py", line 185, in fail_test
Nov 09 21:56:23 raise RuntimeError(msg)
Nov 09 21:56:23 RuntimeError: Backward is not reentrant, i.e., running backward with same input and grad_output multiple times gives different values, although analytical gradient matches numerical gradient
Nov 09 21:56:23
cc @ezyang @albanD @zou3519 @gqchen @pearu @nikitaved @soulitzer @lezcano
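For context, here is a minimal sketch of the kind of gradcheck the failing test performs. Assumptions not taken from the log: a spectral_norm-wrapped nn.Linear with made-up shapes, run in eval mode; the exact module, shapes, and mode used by test_spectral_norm in test_nn.py may differ.

```python
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm

torch.manual_seed(0)

# Hypothetical setup: wrap a small Linear layer with spectral_norm;
# double precision is required for gradcheck's numerical comparison.
m = spectral_norm(nn.Linear(5, 7)).double()
# eval() keeps the power-iteration buffers (u, v) fixed across forward
# calls, so the function gradcheck sees is deterministic.
m.eval()

def fn(x):
    return m(x)

inp = torch.randn(3, 5, dtype=torch.double, requires_grad=True)

# gradcheck re-runs backward with the same input and grad_output and
# compares the results; the "Backward is not reentrant" error above means
# those repeated runs produced different gradients even though the
# analytical gradient matched the numerical one.
torch.autograd.gradcheck(fn, (inp,))
```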