Conversation

@vishwakftw (Contributor) commented Jan 29, 2019

Changelog:

  • Modify concatenation of [1] to a tuple by using separate cases for list and non-list types (see the sketch below).
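
A minimal sketch of the pattern this changelog entry describes (the helper name `_append_unit_dim` is hypothetical; this is not the exact PR diff):

```python
def _append_unit_dim(output_size):
    # Append the extra unit dimension using the same kind of sequence
    # the caller passed, so that list and tuple (e.g. torch.Size)
    # inputs both concatenate without a TypeError.
    if isinstance(output_size, list):
        return output_size + [1]          # list + list
    return tuple(output_size) + (1,)      # tuple / torch.Size + tuple
```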

Test plan:

  • Added a test case comparing the results of passing output_size as a list versus a tuple (here torch.Size()); a sketch of the idea follows below.
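
A hedged sketch of the comparison the test plan describes; the real test lives in the PyTorch test suite and may differ in shapes and structure:

```python
import torch
import torch.nn.functional as F

inp = torch.randn(1, 1, 4)
out, indices = F.max_pool1d(inp, kernel_size=2, return_indices=True)

# Passing output_size as a list and as a torch.Size (a tuple subclass)
# should give identical results instead of raising a TypeError.
ref = F.max_unpool1d(out, indices, kernel_size=2, output_size=[1, 1, 4])
res = F.max_unpool1d(out, indices, kernel_size=2,
                     output_size=torch.Size([1, 1, 4]))
assert torch.equal(ref, res)
```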

Fixes #16486 (TypeError when specifying output_shape in nn.MaxUnpool1d())

Changelog:
- Modify concatenation of [1] to a tuple by wrapping it in the same type as that of the output_size passed
@soumith (Contributor) commented Jan 29, 2019

genuine test failures:

Jan 29 06:24:03 ======================================================================
Jan 29 06:24:03 ERROR: test_nn_max_unpool1d (__main__.TestJitGeneratedFunctional)
Jan 29 06:24:03 ----------------------------------------------------------------------
Jan 29 06:24:03 Traceback (most recent call last):
Jan 29 06:24:03   File "test_jit.py", line 11846, in wrapper
Jan 29 06:24:03     return fn(*args, **kwargs)
Jan 29 06:24:03   File "test_jit.py", line 11891, in do_test
Jan 29 06:24:03     run_test()
Jan 29 06:24:03   File "test_jit.py", line 11887, in run_test
Jan 29 06:24:03     check_against_reference(self, script_fn, fn, f_args_variable, kwargs_variable, no_grad=no_grad)
Jan 29 06:24:03   File "test_jit.py", line 10492, in check_against_reference
Jan 29 06:24:03     outputs_test = self.runAndSaveRNG(func, nograd_inputs, kwargs)
Jan 29 06:24:03   File "test_jit.py", line 540, in runAndSaveRNG
Jan 29 06:24:03     results = func(*inputs, **kwargs)
Jan 29 06:24:03   File "test_jit.py", line 10440, in script_fn
Jan 29 06:24:03     CU = torch.jit.CompilationUnit(script)
Jan 29 06:24:03   File "/opt/python/3.5/lib/python3.5/site-packages/torch/jit/__init__.py", line 654, in __init__
Jan 29 06:24:03     self.define(lang, _frames_up=_frames_up + 1)
Jan 29 06:24:03   File "/opt/python/3.5/lib/python3.5/site-packages/torch/jit/__init__.py", line 660, in define
Jan 29 06:24:03     self.module._define(lang, rcb, False)
Jan 29 06:24:03   File "/opt/python/3.5/lib/python3.5/site-packages/torch/jit/__init__.py", line 687, in _try_compile_weak_script
Jan 29 06:24:03     compiled_fn = torch.jit.script(fn, True, 0, entry["rcb"])
Jan 29 06:24:03   File "/opt/python/3.5/lib/python3.5/site-packages/torch/jit/__init__.py", line 703, in script
Jan 29 06:24:03     _jit_script_compile(mod, ast, _rcb, get_default_args(fn))
Jan 29 06:24:03 RuntimeError: 
Jan 29 06:24:03 
Jan 29 06:24:03 for operator (Tensor 0) -> Tensor:
Jan 29 06:24:03 expected a value of type Tensor for argument '0' but found int[]
Jan 29 06:24:03     kernel_size = _single(kernel_size)
Jan 29 06:24:03     if stride is not None:
Jan 29 06:24:03         _stride = _single(stride)
Jan 29 06:24:03     else:
Jan 29 06:24:03         _stride = kernel_size
Jan 29 06:24:03     padding = _single(padding)
Jan 29 06:24:03     output_size = _unpool_output_size(input, kernel_size, _stride, padding,
Jan 29 06:24:03                                       output_size)
Jan 29 06:24:03     return torch._C._nn.max_unpool2d(input.unsqueeze(3), indices.unsqueeze(3),
Jan 29 06:24:03                                      output_size + type(output_size)([1])).squeeze(3)
Jan 29 06:24:03                                                         ~~~~~~~~~~~ <--- HERE
Jan 29 06:24:03 :
Jan 29 06:24:03     kernel_size = _single(kernel_size)
Jan 29 06:24:03     if stride is not None:
Jan 29 06:24:03         _stride = _single(stride)
Jan 29 06:24:03     else:
Jan 29 06:24:03         _stride = kernel_size
Jan 29 06:24:03     padding = _single(padding)
Jan 29 06:24:03     output_size = _unpool_output_size(input, kernel_size, _stride, padding,
Jan 29 06:24:03                                       output_size)
Jan 29 06:24:03     return torch._C._nn.max_unpool2d(input.unsqueeze(3), indices.unsqueeze(3),
Jan 29 06:24:03                                      output_size + type(output_size)([1])).squeeze(3)
Jan 29 06:24:03                                                    ~~~~ <--- HERE
Jan 29 06:24:03 
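The failure above comes from TorchScript rejecting `type(output_size)([1])`: dynamic `type()` construction is not scriptable, so the compiler misreads the call. A minimal sketch of a compiler-friendly alternative, assuming output_size is a List[int] inside scripted code (the helper name is hypothetical):

```python
import torch
from typing import List

@torch.jit.script
def _append_unit_dim_jit(output_size: List[int]) -> List[int]:
    # Plain List[int] concatenation is supported by the TorchScript
    # compiler, unlike a dynamic type(...) constructor call.
    return output_size + [1]

print(_append_unit_dim_jit([1, 1, 4]))  # [1, 1, 4, 1]
```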

@vishwakftw (Contributor, Author) commented:

Thanks for letting me know; I'm working on them.

@vishwakftw (Contributor, Author) commented:

All tests should pass now; however, the builds failed due to a lock issue.

@vishwakftw force-pushed the max_unpool1d-output-size-fix branch from 501040e to f0169e1 on January 29, 2019 at 07:11
@vishwakftw (Contributor, Author) commented:

@soumith is this good to go?

@facebook-github-bot (Contributor) left a comment:


@soumith is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@vishwakftw deleted the max_unpool1d-output-size-fix branch on February 3, 2019 at 17:00