Conversation

@suo suo commented Sep 23, 2019

Stack from ghstack:

Putting this up now that tests (should) pass. Still lots of cleanup to do, and we can
change any of the actual concepts introduced here now that we have a working baseline.

Changes:

  • Introduce a ConcreteModuleType concept. This acts both as the key into the type
    cache, and as the source of truth for ModuleValue::attr queries. It needs
    to do both jobs because that's how we ensure correctness (if the types are
    different, it's because ModuleValue::attr would return different things).
  • Now recursive_script will first construct a ConcreteModuleType and search for a
    pre-existing type before starting compilation.
  • All previous paths to creating a ScriptModule (including inheriting from
    ScriptModule) are now rewritten to go through create_script_module, so
    that we have only a single place where construction happens.
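
The caching scheme can be sketched in plain Python. All names below (`ConcreteTypeKey`, `get_or_create_type`) are illustrative stand-ins, not the actual implementation:

```python
from dataclasses import dataclass

# Illustrative sketch: the cache key captures everything that affects
# compilation, so equal keys imply the compiled types are interchangeable.

@dataclass(frozen=True)
class ConcreteTypeKey:
    py_class: type     # the original nn.Module subclass
    attributes: tuple  # (name, type_str, is_parameter) triples
    constants: tuple   # (name, value) pairs

_TYPE_CACHE = {}

def get_or_create_type(key):
    """Return the cached compiled type for this key, compiling on a miss."""
    if key not in _TYPE_CACHE:
        _TYPE_CACHE[key] = object()  # stand-in for a freshly compiled JIT type
    return _TYPE_CACHE[key]
```

Two modules whose keys compare equal share one compiled type; any difference in attributes (e.g. parameter status) produces a distinct key and therefore a fresh type.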

Behavioral changes:

  • Big change to torch.jit.ScriptModule inheritance: all attributes are now
    recursively scripted if possible, matching recursive scripting semantics.
    This makes it hard to keep something from being scripted (for example, a
    Python submodule). Possibly we'll need an ignore() type thing for
    attributes. In particular, this adds self.training to every ScriptModule, since
    it's present on every nn.Module.
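
A toy illustration of that attribute behavior (`FakeModule` and `collect_scriptable_attrs` are made-up names; the real logic lives in the recursive-scripting infrastructure):

```python
class FakeModule:
    """Stand-in for nn.Module: `training` exists on every instance."""
    def __init__(self):
        self.training = True

class MyModule(FakeModule):
    def __init__(self):
        super().__init__()
        self.scale = 2.0    # scriptable attribute, picked up automatically
        self.helper = print  # not scriptable; would need an ignore()-style escape hatch

def collect_scriptable_attrs(mod):
    # Everything scriptable is picked up, `training` included.
    scriptable = (bool, int, float, str)
    return {k: v for k, v in vars(mod).items() if isinstance(v, scriptable)}
```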

TODO

  • Actually add tests verifying types are shared.
  • Make it clearer when you're constructing a bogus ScriptModule.
  • test for attributes that differ only on parameter status
  • Fix check_trace. Currently we're not checking anything because we never re-compile the traced function.
  • Test overloads + static inheritance
  • Test constants + static inheritance
  • Test that tracing a function in init properly generates fresh types
  • torch.jit.export cannot find types. Add the following to test_nn_lstm:
@torch.jit.export
def foo(self, input):
    # type: (PackedSequence) -> Tuple[PackedSequence, Tuple[Tensor, Tensor]]  # noqa
    return self.x(input)

PackedSequence will not be found

  • Fix up all the random TODOs here.
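
Roughly why the name is not found: annotation types are resolved through a lookup built from the function's surrounding scope, so a type that is not visible there (like `PackedSequence` in the snippet above) cannot be resolved. A hypothetical sketch of that resolution step:

```python
# Illustrative only: the compiler resolves annotation names through a lookup
# callback built from the function's surrounding scope, so a type that is
# not visible in that scope cannot be resolved.

def make_resolver(scope):
    def resolve(name):
        if name not in scope:
            raise NameError(f"{name} will not be found")
        return scope[name]
    return resolve

resolver = make_resolver({"Tensor": object})
```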

Differential Revision: [D17551196](https://our.internmc.facebook.com/intern/diff/D17551196)

@suo suo requested a review from apaszke as a code owner September 23, 2019 20:55
@pytorchbot pytorchbot added labels Sep 23, 2019: `oncall: jit` (add this issue/PR to JIT oncall triage queue), `module: cpp` (related to C++ API), `module: internals` (related to internal abstractions in c10 and ATen), `module: pybind` (related to our Python bindings / interactions with other Python libraries)
suo added a commit that referenced this pull request Sep 23, 2019
ghstack-source-id: 74b149f
Pull Request resolved: #26666
@suo suo requested a review from zdevito September 23, 2019 21:24

Putting this up now that tests (should) pass. Still lots of cleanup to do, and we can
change any of the actual concepts introduced here now that we have a working baseline.

Changes:
- Introduce a `ModuleMeta` concept. This acts both as the key into the type
  cache, and as the source of truth for `ModuleValue::attr` queries. It needs
  to do both jobs because that's how we ensure correctness (if the types are
  different, it's because `ModuleValue::attr` would return different things).
- Now `recursive_script` will first construct a `ModuleMeta` and search for a
  pre-existing type before starting compilation.
- All previous paths to creating a `ScriptModule` (including inheriting from
  `ScriptModule`) are now rewritten to go through `create_script_module`, so
  that we have only a single place where construction happens. Unfortunately
  they are all pretty different now but hopefully we can refactor that away.

Behavioral changes:
- Big change to `torch.jit.ScriptModule` inheritance: all attributes are now
  recursively scripted if possible, matching recursive scripting semantics.
  This makes it hard to keep something from being scripted (for example, a
  Python submodule). Possibly we'll need an `ignore()` type thing for
  attributes.
  - In particular, this adds `self.training` to *every* ScriptModule, since
    it's present on every `nn.Module`.
- `forward()` is always compiled when inheriting from `ScriptModule`, matching
  recursive scripting's semantics.
- Constructing `torch.jit.ScriptModule` directly will lead to an unusable object. The
  docs already warn against it, but I need to figure out a way to make it more
  obvious that this won't work.
- Related to the fact that we compile *after* `__init__` for the "inherit from
  `ScriptModule`" path:
  - `define()` can no longer be used in a ScriptModule's `__init__`. I added
    `lazy_define()` to make up for it.
  - Constants/attributes/etc. are now lazily resolved after `__init__` instead
    of being processed eagerly on `__setattr__`. This moves some failures
    around but otherwise doesn't matter.
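
A rough sketch of what `lazy_define()` amounts to — record the source during `__init__`, compile only once construction is finished (illustrative only; `exec` here stands in for actual TorchScript compilation):

```python
class LazyScriptModule:
    """Hypothetical sketch of deferred method definition."""
    def __init__(self):
        self._pending_defs = []

    def lazy_define(self, src):
        # Record the source now; compilation happens after __init__ is done.
        self._pending_defs.append(src)

    def finalize(self):
        # Called by the framework after __init__: compile everything at once.
        compiled = {}
        for src in self._pending_defs:
            exec(src, {}, compiled)  # stand-in for real TorchScript compilation
        self._pending_defs.clear()
        return compiled
```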


## TODO
- [ ] Actually add tests verifying types are shared.
- [ ] Make it clearer when you're constructing a bogus ScriptModule.
- [ ] test for attributes that differ only on parameter status
- [ ] Fix check_trace. Currently we're not checking anything because we never re-compile the traced function.
- [ ] Test overloads + static inheritance
- [ ] Test constants + static inheritance
- [ ] Test that tracing a function in __init__ properly generates fresh types
- [ ] torch.jit.export cannot find types. Add the following to `test_nn_lstm`:
```
@torch.jit.export
def foo(self, input):
    # type: (PackedSequence) -> Tuple[PackedSequence, Tuple[Tensor, Tensor]]  # noqa
    return self.x(input)
```
PackedSequence will not be found
- [ ] Fix up all the random TODOs here.
suo added a commit that referenced this pull request Sep 23, 2019
ghstack-source-id: 50153cf
Pull Request resolved: #26666
@suo suo commented Sep 23, 2019

Notes from offline review with @zdevito:

  • ModuleMeta name is not meta enough. MetaModuleMeta preferred.
  • Try to simplify callees of create_script_module by having an "impl" object that is always created out of place. Potentially this allows us to retain the existing ScriptModule behavior.
  • Need to work out @ignore and @parameter list in a way that lets us avoid using pyModule_. Potentially it should create a temporary python object that wraps the cpp module and use that as self.
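
The last point might look roughly like this — entirely hypothetical names, just to show the shape of a temporary wrapper standing in for `self`:

```python
class CppModuleStub:
    """Stand-in for the underlying C++ module handle."""
    def __init__(self):
        self._attrs = {"weight": 1.0}

    def get_attr(self, name):
        return self._attrs[name]

class TempSelfWrapper:
    # Forwards attribute access to the wrapped C++ module, so compiled
    # methods never need to touch pyModule_ directly.
    def __init__(self, cpp_module):
        object.__setattr__(self, "_cpp", cpp_module)

    def __getattr__(self, name):
        return self._cpp.get_attr(name)
```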

@suo suo commented Sep 23, 2019

cc @ZolotukhinM this is the PR that I was talking about


suo added a commit that referenced this pull request Sep 24, 2019
ghstack-source-id: d2accfa
Pull Request resolved: #26666

@pytorchbot pytorchbot added the `module: nn` (related to torch.nn) label Sep 24, 2019
suo added a commit that referenced this pull request Sep 24, 2019
ghstack-source-id: 381185d
Pull Request resolved: #26666


suo added a commit that referenced this pull request Sep 24, 2019
ghstack-source-id: 8e227f7
Pull Request resolved: #26666

suo added a commit that referenced this pull request Sep 25, 2019
ghstack-source-id: d313fc5
Pull Request resolved: #26666
suo added a commit that referenced this pull request Oct 10, 2019
ghstack-source-id: ef5a9ce
Pull Request resolved: #26666

Putting this up now that tests (should) pass. Still lots of cleanup to do, and we can
change any of the actual concepts introduced here now that we have a working baseline.

Changes:
- Introduce a `ConcreteModuleType` concept. This acts both as the key into the type
  cache, and as the source of truth for `ModuleValue::attr` queries. It needs
  to do both jobs because that's how we ensure correctness (if the types are
  different, it's because `ModuleValue::attr` would return different things).
- Now `recursive_script` will first construct a `ConcreteModuleType` and search for a
  pre-existing type before starting compilation.
- All previous paths to creating a `ScriptModule` (including inheriting from
  `ScriptModule`) are now rewritten to go through `create_script_module`, so
  that we have only a single place where construction happens.

Behavioral changes:
- Big change to `torch.jit.ScriptModule` inheritance: all attributes are now
  recursively scripted if possible, matching recursive scripting semantics.
  This makes it hard to keep something from being scripted (for example, a
  Python submodule). Possibly we'll need an `ignore()` type thing for
  attributes. In particular, this adds `self.training` to *every* ScriptModule, since
  it's present on every `nn.Module`.

## TODO
- [x] Actually add tests verifying types are shared.
- [x] Make it clearer when you're constructing a bogus ScriptModule.
- [x] test for attributes that differ only on parameter status
- [x] Fix check_trace. Currently we're not checking anything because we never re-compile the traced function.
- [ ] Test overloads + static inheritance
- [ ] Test constants + static inheritance
- [x] Test that tracing a function in __init__ properly generates fresh types
- [x] torch.jit.export cannot find types. Add the following to `test_nn_lstm`:
```
@torch.jit.export
def foo(self, input):
    # type: (PackedSequence) -> Tuple[PackedSequence, Tuple[Tensor, Tensor]]  # noqa
    return self.x(input)
```
PackedSequence will not be found
- [x] Fix up all the random TODOs here.

Differential Revision: [D17551196](https://our.internmc.facebook.com/intern/diff/D17551196)

suo added a commit that referenced this pull request Oct 10, 2019
ghstack-source-id: 1904618
Pull Request resolved: #26666

suo added a commit that referenced this pull request Oct 10, 2019
ghstack-source-id: f987c2e
Pull Request resolved: #26666

suo added a commit that referenced this pull request Oct 12, 2019
ghstack-source-id: adb5005
Pull Request resolved: #26666
suo added a commit that referenced this pull request Oct 12, 2019
ghstack-source-id: 659c4e9
Pull Request resolved: #26666
zdevito pushed a commit to zdevito/ATen that referenced this pull request Oct 12, 2019
Summary:
Pull Request resolved: pytorch/pytorch#26666

- I believe this change is transparent to existing users of the inheritance API: if you had an unscriptable attribute that you never used, there is no error. In some cases we will create new attributes (even if they are unused), which will increase serialized model size compared to before.

Test Plan: Imported from OSS

Differential Revision: D17551196

Pulled By: suo

fbshipit-source-id: b476d1c9feb3ddfd63406d90989aaf9dfe890591
@facebook-github-bot

@suo merged this pull request in 3412627.

@facebook-github-bot facebook-github-bot deleted the gh/suo/174/head branch October 28, 2019 22:20
jerryzh168 added a commit that referenced this pull request Nov 20, 2019
Summary:
The previous implementation of `clone` in `script::Module` copies both the module instance and the
class type. Now that type sharing is enabled (#26666), we also
need a function that clones the instance only and shares the underlying class type.

Test Plan:
tbd

Reviewers:
mvz
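The distinction that commit draws can be sketched as follows. This is a hypothetical stand-in, not `script::Module` itself: `_type` represents the shared JIT class type that #26666 made shareable, and `_state` the per-instance attributes.

```python
import copy

class Module:
    """Toy model of a scripted module: a class type plus instance state."""

    def __init__(self, type_, state):
        self._type = type_    # shared class type (dict used as a stand-in)
        self._state = state   # per-instance attributes

    def clone(self):
        # Old behavior: deep-copy the instance AND its class type.
        return Module(copy.deepcopy(self._type), copy.deepcopy(self._state))

    def clone_instance(self):
        # New behavior: copy only the instance, sharing the class type.
        return Module(self._type, copy.deepcopy(self._state))
```

`clone_instance` is the operation the commit adds: after it, both modules answer type queries against the very same type object, while their states remain independent.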
jerryzh168 added a commit that referenced this pull request Nov 20, 2019
ghstack-source-id: 5c8bc71
Pull Request resolved: #30168

jerryzh168 added a commit that referenced this pull request Nov 21, 2019
ghstack-source-id: 452d038
Pull Request resolved: #30168

facebook-github-bot pushed a commit that referenced this pull request Nov 21, 2019
Pull Request resolved: #30168
Differential Revision: D18631324
fbshipit-source-id: dbadcf19695faee0f755f45093b24618c047b9d1
thiagocrepaldi pushed a commit to thiagocrepaldi/pytorch that referenced this pull request Feb 4, 2020
Pull Request resolved: pytorch#26666
Labels: Merged, module: build, module: cpp, module: internals, module: nn, module: pybind, oncall: jit