[jit] module dedupe #26666
Putting this up now that tests (should) pass. There is still lots of cleanup to do, and we can
change any of the concepts introduced here now that we have a working baseline.
Changes:
- Introduce a `ModuleMeta` concept. This acts both as the key into the type
cache, and as the source of truth for `ModuleValue::attr` queries. It needs
to do both jobs because that's how we ensure correctness (if the types are
different, it's because `ModuleValue::attr` would return different things).
- Now `recursive_script` will first construct a `ModuleMeta` and search for a
pre-existing type before starting compilation.
- All previous paths to creating a `ScriptModule` (including inheriting from
`ScriptModule`) are now rewritten to go through `create_script_module`, so
that we have only a single place where construction happens. Unfortunately
they are all still pretty different, but hopefully we can refactor that away.
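To make the caching idea concrete, here is a minimal plain-Python sketch. All the names here (`ModuleMeta`, `TYPE_CACHE`, `get_or_create_type`) are illustrative stand-ins, not the real JIT API: two modules whose attributes would answer `ModuleValue::attr` queries identically end up sharing a single cached type.

```
# Hypothetical sketch of the type cache; names are illustrative only.
TYPE_CACHE = {}

class ModuleMeta:
    """Fingerprint of everything that affects attr lookup on a module.

    The real version would also cover constants, submodule types,
    compiled methods, etc.
    """
    def __init__(self, module):
        self.attrs = tuple(sorted(
            (name, type(value).__name__) for name, value in vars(module).items()
        ))

    def __eq__(self, other):
        return self.attrs == other.attrs

    def __hash__(self):
        return hash(self.attrs)

def get_or_create_type(module):
    meta = ModuleMeta(module)
    if meta in TYPE_CACHE:
        # Dedupe: a structurally identical module reuses the existing type.
        return TYPE_CACHE[meta]
    new_type = ("CompiledType", meta.attrs)  # stand-in for actual compilation
    TYPE_CACHE[meta] = new_type
    return new_type
```

The key point is that the cache key and the attr-lookup source of truth are the same object, so a cache hit is correct by construction.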
Behavioral changes:
- Big change to `torch.jit.ScriptModule` inheritance: all attributes are now
recursively scripted if possible, matching recursive scripting semantics.
This makes it hard to keep something from being scripted (for example, a
Python submodule). Possibly we'll need an `ignore()` type thing for
attributes.
- In particular, this adds `self.training` to *every* ScriptModule, since
it's present on every `nn.Module`.
- `forward()` is always compiled when inheriting from `ScriptModule`, matching
recursive scripting's semantics.
- Constructing `torch.jit.ScriptModule` directly now leads to an unusable object.
  The docs already warn against this, but I need to figure out a way to make
  the failure more obvious. This relates to the fact that we compile *after*
  `__init__` for the "inherit from `ScriptModule`" path.
- `define()` can no longer be used in a ScriptModule's `__init__`. I added
`lazy_define()` to make up for it.
- Constants/attributes/etc. are now lazily resolved after `__init__` instead
  of being processed eagerly on `__setattr__`. This moves some failures
  around but otherwise doesn't matter.
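The eager-vs-lazy distinction can be sketched in plain Python. This is a hypothetical illustration (the class names and the `finalize`/`process` helpers are invented for the example, not the real implementation): instead of validating each attribute at assignment time, we record raw values and resolve them in one pass once `__init__` has finished.

```
def process(value):
    # Stand-in for constant/attribute resolution.
    return ("resolved", value)

class EagerModule:
    def __setattr__(self, name, value):
        # Old behavior: validate/convert on every assignment during __init__,
        # so a bad attribute fails at the assignment site.
        object.__setattr__(self, name, process(value))

class LazyModule:
    def __init__(self):
        object.__setattr__(self, "_raw", {})

    def __setattr__(self, name, value):
        # New behavior: just record the raw value during __init__...
        self._raw[name] = value

    def finalize(self):
        # ...and resolve everything in one pass after __init__ is done,
        # which is why some failures move from __setattr__ to later.
        return {name: process(v) for name, v in self._raw.items()}
```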
## TODO
- [ ] Actually add tests verifying types are shared.
- [ ] Make it clearer when you're constructing a bogus ScriptModule.
- [ ] Test for attributes that differ only on parameter status.
- [ ] Fix check_trace. Currently we're not checking anything because we never re-compile the traced function.
- [ ] Test overloads + static inheritance.
- [ ] Test constants + static inheritance.
- [ ] Test that tracing a function in `__init__` properly generates fresh types.
- [ ] `torch.jit.export` cannot find types. Adding the following to `test_nn_lstm`:
```
@torch.jit.export
def foo(self, input):
    # type: (PackedSequence) -> Tuple[PackedSequence, Tuple[Tensor, Tensor]]  # noqa
    return self.x(input)
```
fails because `PackedSequence` will not be found.
- [ ] Fix up all the remaining TODOs here.
Notes from offline review with @zdevito:

cc @ZolotukhinM this is the PR that I was talking about
Putting this up now that tests (should) pass. There is still lots of cleanup to do, and we can
change any of the concepts introduced here now that we have a working baseline.
Changes:
- Introduce a `ConcreteModuleType` concept. This acts both as the key into the type
cache, and as the source of truth for `ModuleValue::attr` queries. It needs
to do both jobs because that's how we ensure correctness (if the types are
different, it's because `ModuleValue::attr` would return different things).
- Now `recursive_script` will first construct a `ConcreteModuleType` and search for a
pre-existing type before starting compilation.
- All previous paths to creating a `ScriptModule` (including inheriting from
`ScriptModule`) are now rewritten to go through `create_script_module`, so
that we have only a single place where construction happens.
Behavioral changes:
- Big change to `torch.jit.ScriptModule` inheritance: all attributes are now
recursively scripted if possible, matching recursive scripting semantics.
This makes it hard to keep something from being scripted (for example, a
Python submodule). Possibly we'll need an `ignore()` type thing for
attributes. In particular, this adds `self.training` to *every* ScriptModule, since
it's present on every `nn.Module`.
## TODO
- [x] Actually add tests verifying types are shared.
- [x] Make it clearer when you're constructing a bogus ScriptModule.
- [x] Test for attributes that differ only on parameter status.
- [x] Fix check_trace. Currently we're not checking anything because we never re-compile the traced function.
- [ ] Test overloads + static inheritance.
- [ ] Test constants + static inheritance.
- [x] Test that tracing a function in `__init__` properly generates fresh types.
- [x] `torch.jit.export` cannot find types. Adding the following to `test_nn_lstm`:
```
@torch.jit.export
def foo(self, input):
    # type: (PackedSequence) -> Tuple[PackedSequence, Tuple[Tensor, Tensor]]  # noqa
    return self.x(input)
```
fails because `PackedSequence` will not be found.
- [x] Fix up all the remaining TODOs here.
Differential Revision: [D17551196](https://our.internmc.facebook.com/intern/diff/D17551196)
Putting this up now that tests (should) pass. Still lots of cleanup to do, and we can
change the any of the actual concepts introduced here now that we have a working baseline.
Changes:
- Introduce a `ConcreteModuleType` concept. This acts both as the key into the type
cache, and as the source of truth for `ModuleValue::attr` queries. It needs
to do both jobs because that's how we ensure correctness (if the types are
different, it's because `ModuleValue::attr` would return different things).
- Now `recursive_script` will first construct a `ConcreteModuleType` and search for a
pre-existing type before starting compilation.
- All previous paths to creating a `ScriptModule` (including inheriting from
`ScriptModule`) are now rewritten to go through `create_script_module`, so
that we have only a single place where construction happens.
Behavioral changes:
- Big change to `torch.jit.ScriptModule` inheritance: all attributes are now
recursively scripted if possible, matching recursive scripting semantics.
This makes it hard to keep something from being scripted (for example, a
Python submodule). Possibly we'll need an `ignore()` type thing for
attributes. In particular, this adds `self.training` to *every* ScriptModule, since
it's present on every `nn.Module`.
## TODO
- [x] Actually add tests verifying types are shared.
- [x] Make it clearer when you're constructing a bogus ScriptModule.
- [x] test for attributes that differ only on parameter status
- [x] Fix check_trace. Currently we're not checking anything because we never re-compile the traced function.
- [ ] Test overloads + static inheritance
- [ ] Test constants + static inheritance
- [x] Test that tracing a function in __init__ properly generates fresh types
- [x] torch.jit.export cannot find types. Add the following to `test_nn_lstm`:
```
@torch.jit.export
def foo(self, input):
# type: (PackedSequence) -> Tuple[PackedSequence, Tuple[Tensor, Tensor]] # noqa
return self.x(input)
```
PackedSequence will not be found
- [x] Fix up all the random TODOs here.
Differential Revision: [D17551196](https://our.internmc.facebook.com/intern/diff/D17551196)
Putting this up now that tests (should) pass. Still lots of cleanup to do, and we can
change the any of the actual concepts introduced here now that we have a working baseline.
Changes:
- Introduce a `ConcreteModuleType` concept. This acts both as the key into the type
cache, and as the source of truth for `ModuleValue::attr` queries. It needs
to do both jobs because that's how we ensure correctness (if the types are
different, it's because `ModuleValue::attr` would return different things).
- Now `recursive_script` will first construct a `ConcreteModuleType` and search for a
pre-existing type before starting compilation.
- All previous paths to creating a `ScriptModule` (including inheriting from
`ScriptModule`) are now rewritten to go through `create_script_module`, so
that we have only a single place where construction happens.
Behavioral changes:
- Big change to `torch.jit.ScriptModule` inheritance: all attributes are now
recursively scripted if possible, matching recursive scripting semantics.
This makes it hard to keep something from being scripted (for example, a
Python submodule). Possibly we'll need an `ignore()` type thing for
attributes. In particular, this adds `self.training` to *every* ScriptModule, since
it's present on every `nn.Module`.
## TODO
- [x] Actually add tests verifying types are shared.
- [x] Make it clearer when you're constructing a bogus ScriptModule.
- [x] test for attributes that differ only on parameter status
- [x] Fix check_trace. Currently we're not checking anything because we never re-compile the traced function.
- [ ] Test overloads + static inheritance
- [ ] Test constants + static inheritance
- [x] Test that tracing a function in __init__ properly generates fresh types
- [x] torch.jit.export cannot find types. Add the following to `test_nn_lstm`:
```
@torch.jit.export
def foo(self, input):
# type: (PackedSequence) -> Tuple[PackedSequence, Tuple[Tensor, Tensor]] # noqa
return self.x(input)
```
PackedSequence will not be found
- [x] Fix up all the random TODOs here.
Differential Revision: [D17551196](https://our.internmc.facebook.com/intern/diff/D17551196)
zdevito pushed a commit to zdevito/ATen that referenced this pull request on Oct 12, 2019

Summary: Pull Request resolved: pytorch/pytorch#26666

Changes:
- Introduce a `ConcreteModuleType` concept. This acts both as the key into the type cache and as the source of truth for `ModuleValue::attr` queries. It needs to do both jobs because that's how we ensure correctness (if the types are different, it's because `ModuleValue::attr` would return different things).
- `recursive_script` will now first construct a `ConcreteModuleType` and search for a pre-existing type before starting compilation.
- All previous paths to creating a `ScriptModule` (including inheriting from `ScriptModule`) are now rewritten to go through `create_script_module`, so that we have only a single place where construction happens.

Behavioral changes:
- Big change to `torch.jit.ScriptModule` inheritance: all attributes are now recursively scripted if possible, matching recursive scripting semantics. This makes it hard to keep something from being scripted (for example, a Python submodule). Possibly we'll need an `ignore()`-style mechanism for attributes. In particular, this adds `self.training` to *every* ScriptModule, since it's present on every `nn.Module`.
- I believe this change to be transparent to existing users of the inheritance API, since if you had an unscriptable attribute that you never used, there is no error. In some cases we will create new attributes (even if they are unused), which will increase serialized model size from before.

Test Plan: Imported from OSS
Differential Revision: D17551196
Pulled By: suo
fbshipit-source-id: b476d1c9feb3ddfd63406d90989aaf9dfe890591
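The commit message above floats the idea of an `ignore()`-style escape hatch for attributes. As a toy illustration in plain Python (not a real `torch.jit` API; the decorator and helper names are hypothetical), recursive attribute collection could consult a per-class set of ignored names:

```python
# Hypothetical sketch of an ignore()-style marker that recursive attribute
# scripting could consult; none of these names exist in torch.jit.
IGNORED = "_jit_ignored_attrs"


def ignore_attrs(*names):
    """Class decorator marking attributes that scripting should skip."""
    def wrap(cls):
        setattr(cls, IGNORED, frozenset(names))
        return cls
    return wrap


def scriptable_attrs(obj):
    """Collect the attributes that would be recursively scripted."""
    ignored = getattr(type(obj), IGNORED, frozenset())
    return {n: v for n, v in vars(obj).items() if n not in ignored}


@ignore_attrs("py_helper")
class MyModule:
    def __init__(self):
        self.training = True       # present on every nn.Module
        self.scale = 2.0
        self.py_helper = object()  # plain-Python attribute to keep unscripted


m = MyModule()
attrs = scriptable_attrs(m)
# attrs contains 'training' and 'scale' but not 'py_helper'
```

This is only a model of the behavior change: `training` is always collected (matching "this adds `self.training` to every ScriptModule"), while explicitly ignored attributes are left as plain Python.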
jerryzh168 added a commit that referenced this pull request on Nov 20, 2019

Summary: The previous implementation of `clone` in `script::Module` copies both the module instance and the class type. After we enabled type sharing in #26666, we also need a function that clones the instance only and shares the underlying class type. Test Plan: tbd Reviewers: mvz Subscribers: Tasks: Tags: [ghstack-poisoned]
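The clone-with-shared-type behavior in the commit above can be modeled in plain Python (an analogy only, not the `script::Module` C++ API): copy the per-instance state while keeping the same class object, rather than duplicating the class as well.

```python
# Plain-Python analogy: clone instance state, share the class (type) object.
import copy


class Module:
    def __init__(self, weight):
        self.weight = weight


def clone_instance(m):
    # copy.copy shallow-copies the instance __dict__;
    # __class__ is shared between the original and the clone.
    return copy.copy(m)


a = Module([1, 2, 3])
b = clone_instance(a)
assert type(a) is type(b)      # the "class type" is shared, not duplicated
b.weight = [4, 5, 6]           # per-instance state stays independent
```

The design point mirrors the commit: after type dedupe, duplicating the class on every clone would defeat the sharing, so only instance state is copied.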
facebook-github-bot pushed a commit that referenced this pull request on Nov 21, 2019

Summary: Pull Request resolved: #30168. The previous implementation of `clone` in `script::Module` copies both the module instance and the class type. After we enabled type sharing in #26666, we also need a function that clones the instance only and shares the underlying class type. Test Plan: tbd Imported from OSS Differential Revision: D18631324 fbshipit-source-id: dbadcf19695faee0f755f45093b24618c047b9d1
thiagocrepaldi pushed a commit to thiagocrepaldi/pytorch that referenced this pull request on Feb 4, 2020
Labels
Merged
module: build
Build system issues
module: cpp
Related to C++ API
module: internals
Related to internal abstractions in c10 and ATen
module: nn
Related to torch.nn
module: pybind
Related to our Python bindings / interactions with other Python libraries
oncall: jit
Add this issue/PR to JIT oncall triage queue