
Conversation

@jamesr66a
Collaborator

@jamesr66a jamesr66a commented Jun 25, 2019

Stack from ghstack:

Resolves #21183

Demo:

import torch
import torch.nn as nn

class M(torch.jit.ScriptModule):
    def __init__(self):
        super(M, self).__init__()
        self.conv = nn.Conv2d(5, 5, 2)

    @torch.jit.script_method
    def forward(self, x):
        return self.conv(x)

M().save("m.pt")
loaded = torch.jit.load("m.pt")
loaded(torch.ones(20, 20))
Traceback (most recent call last):
  File "demo.py", line 15, in <module>
    loaded(torch.ones(20, 20))
  File "/home/jamesreed/pytorch/torch/nn/modules/module.py", line 531, in __call__
    result = self.forward(*input, **kwargs)
RuntimeError: Expected 4-dimensional input for 4-dimensional weight 5 5, but got 2-dimensional input of size [20, 20] instead
The above operation failed in interpreter, with the following stack trace:
at code/m.py:248:10
            ret10 = x
          ret5 = ret10
        ret0 = ret5
      ret = ret0
    _122 = _0.weight
    _123 = _0.bias
    _124 = [0, 0]
    _1 = torch.conv2d(ret, _122, _123, [1, 1], _124, [1, 1], 1)
  else:
    _1 = torch.conv2d(x, _0.weight, _0.bias, [1, 1], [0, 0], [1, 1], 1)
         ~~~~~~~~~~~~ <--- HERE
  return _1
Compiled from code at /home/jamesreed/pytorch/torch/nn/modules/conv.py:344:16
    @weak_script_method
    def forward(self, input):
        if self.padding_mode == 'circular':
            expanded_padding = ((self.padding[1] + 1) // 2, self.padding[1] // 2,
                                (self.padding[0] + 1) // 2, self.padding[0] // 2)
            return F.conv2d(F.pad(input, expanded_padding, mode='circular'),
                            self.weight, self.bias, self.stride,
                            _pair(0), self.dilation, self.groups)
        return F.conv2d(input, self.weight, self.bias, self.stride,
               ~~~~~~~~ <--- HERE
                        self.padding, self.dilation, self.groups)
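The point of the demo above is that the error raised from the loaded module points back at the original Python source (`conv.py:344`), not just the serialized code. A minimal sketch of the underlying idea, with hypothetical names (this is not the actual PyTorch implementation): store a table of (byte offset in generated code, original source range) pairs alongside the generated source, and binary-search that table when an error points into the generated code.

```python
# Hypothetical sketch of offset-to-source-range mapping: alongside generated
# code, keep a sorted table mapping byte offsets in the generated source to
# ranges in the original source; at error time, look up the offset.
import bisect
from dataclasses import dataclass

@dataclass
class SourceRange:
    filename: str
    start: int   # byte offset in the original file
    end: int

def build_table(ranges):
    """ranges: list of (generated_offset, SourceRange), sorted by offset."""
    offsets = [off for off, _ in ranges]
    return offsets, [r for _, r in ranges]

def original_range(offsets, originals, error_offset):
    """Find the original range covering a byte offset in the generated code."""
    i = bisect.bisect_right(offsets, error_offset) - 1
    return originals[i] if i >= 0 else None

offsets, originals = build_table([
    (0,   SourceRange("conv.py", 344, 360)),
    (120, SourceRange("conv.py", 361, 380)),
])
print(original_range(offsets, originals, 130).filename)  # conv.py
```

An error at offset 130 in the generated code falls in the second entry, so the lookup reports the second original range; the real serializer records one such entry per emitted statement.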

Differential Revision: D15981425

@pytorchbot pytorchbot added the oncall: jit and module: pybind labels on Jun 25, 2019
[JIT] Load original SourceRanges on import

gh-metadata: pytorch pytorch 22180 gh/jamesr66a/9/head
@jamesr66a jamesr66a requested review from suo and zdevito June 25, 2019 01:56
James Reed added 6 commits June 24, 2019 19:03
[JIT] Load original SourceRanges on import

gh-metadata: pytorch pytorch 22180 gh/jamesr66a/9/head
Contributor

@zdevito zdevito left a comment


Cool -- see inline comments for some suggestions. We need to be careful about SourceRanges continuing to be value types (caching has to go at the Source level, if at all). Also, I didn't immediately see it in the code: what happens if you query a source range whose originating code comes from many different locations? Does it return the source location at the beginning of the range?

[JIT] Load original SourceRanges on import

gh-metadata: pytorch pytorch 22180 gh/jamesr66a/9/head
@pytorchbot pytorchbot added the caffe2 and module: build labels on Jun 26, 2019
@jamesr66a jamesr66a changed the title [JIT] Load original SourceRanges on import [WIP][JIT] Load original SourceRanges on import Jun 26, 2019
James Reed added 3 commits June 26, 2019 10:16
[JIT] Load original SourceRanges on import

gh-metadata: pytorch pytorch 22180 gh/jamesr66a/9/head
@jamesr66a jamesr66a changed the title [WIP][JIT] Load original SourceRanges on import [JIT] Load original SourceRanges on import Jun 26, 2019
@jamesr66a jamesr66a requested a review from zdevito June 26, 2019 19:11
James Reed added 2 commits June 26, 2019 14:57
[JIT] Load original SourceRanges on import

gh-metadata: pytorch pytorch 22180 gh/jamesr66a/9/head
Contributor

@zdevito zdevito left a comment


Looks good! Combined with @ZolotukhinM's change to remember the inlining stack, this will enable a full description of the stack. Keep in mind that we are going to end up with two separate traces: the one from the loaded code and the original one. Their depths are not the same, and we need to do a good job reporting both. We should resolve this when we merge the components together.

out << getImports();
int64_t source_offset = out.tellp();
out << body_.str();
for (const auto& x : body_.ranges()) {
Contributor


I know this is small, but this repeats the internal details of source-range tracking outside of the TaggedStream object. It will be a nasty, hard-to-find bug if someone changes the representation and doesn't realize the logic is also repeated here.
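The fix the reviewer is suggesting can be sketched in a few lines, with hypothetical names: keep the offset/range bookkeeping inside one writer object, so callers write through it and never reimplement the offset arithmetic themselves.

```python
# Hypothetical sketch: a single writer object owns both the output stream
# and the offset/range bookkeeping, so the representation lives in one place.
import io

class TaggedWriter:
    def __init__(self):
        self._buf = io.StringIO()
        self._ranges = []   # (offset_in_stream, tag)

    def write(self, text, tag=None):
        # Record the current offset before writing, if the text is tagged.
        if tag is not None:
            self._ranges.append((self._buf.tell(), tag))
        self._buf.write(text)

    def dump(self):
        return self._buf.getvalue(), list(self._ranges)

w = TaggedWriter()
w.write("import torch\n")
w.write("x = conv(x)\n", tag="conv.py:344")
text, ranges = w.dump()
assert ranges == [(13, "conv.py:344")]
```

If the representation of the range table changes later, only `TaggedWriter` has to change, which is exactly the hard-to-find-bug scenario the comment warns about.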


private:
std::shared_ptr<Source> deserialize_source(const c10::IValue& iv) {
// TODO: cache these?
Contributor


Yeah, we need to do this one, because otherwise it will copy the entire source string over and over again.

Contributor


The key is just the pointer stored inside the IValue. I think we have a method that accesses it (see the pickler, which does something similar).
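The caching suggested in this thread can be sketched as follows, with hypothetical names; Python's `id()` stands in for the pointer stored inside the IValue, so each distinct serialized record is deserialized only once.

```python
# Hypothetical sketch of deserialize_source caching: key the cache on the
# identity of the serialized record (standing in for the IValue's pointer),
# so the expensive source-string copy happens at most once per record.
_source_cache = {}

def deserialize_source(record):
    key = id(record)                   # stand-in for the stored pointer
    cached = _source_cache.get(key)
    if cached is not None:
        return cached
    text, filename = record            # pretend this is the expensive copy
    source = {"text": text, "filename": filename}
    _source_cache[key] = source
    return source

rec = ("def forward(self, x): ...", "m.py")
s1 = deserialize_source(rec)
s2 = deserialize_source(rec)
assert s1 is s2   # second lookup hits the cache; no second copy
```

Keying on identity rather than the string contents avoids hashing the whole source text on every lookup, which matters when the same record is referenced by many ranges.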

James Reed added 2 commits June 28, 2019 10:33
[JIT] Load original SourceRanges on import

gh-metadata: pytorch pytorch 22180 gh/jamesr66a/9/head
James Reed added 4 commits June 28, 2019 12:04
[JIT] Load original SourceRanges on import

gh-metadata: pytorch pytorch 22180 gh/jamesr66a/9/head
@pytorchbot pytorchbot added the module: internals label on Jun 28, 2019
James Reed added 6 commits June 28, 2019 17:21
[JIT] Load original SourceRanges on import

gh-metadata: pytorch pytorch 22180 gh/jamesr66a/9/head
@zou3519 zou3519 deleted the gh/jamesr66a/9/head branch July 2, 2019 04:17
@facebook-github-bot
Contributor

@jamesr66a merged this pull request in ffa15d2.

xzhu1900 pushed a commit to xzhu1900/pytorch that referenced this pull request Jul 5, 2019
Summary:
Pull Request resolved: pytorch#22180
ghimport-source-id: efa46dc

Test Plan: Imported from OSS

Differential Revision: D15981425

Pulled By: jamesr66a

fbshipit-source-id: bef682bd13c1a5be95bdb97e025690c6f2d523d3
6 participants