
Conversation

@driazati
Contributor

@driazati driazati commented May 6, 2019

  • Constructs a new type at runtime so that isinstance checks work for
    weak modules assigned to ScriptModules
  • Fix some extraneous names in __constants__
  • Add in_features and out_features to nn.Linear __constants__

Fixes #19363

Differential Revision: D15302350

* Constructs a new type at runtime so that `isinstance` checks work for weak modules assigned to `ScriptModule`s
* Register `int`, `float`, `str`, and `bool` properties of `nn.Module` classes as attributes of the corresponding weak module
* Fix some issues where `bias` was added to `__constants__` even though it's a parameter
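The first bullet can be sketched in plain Python (hypothetical names; the actual PR does this inside TorchScript's weak-module machinery): building a new class with `type()` that inherits from both the proxy class and the wrapped module's class makes `isinstance` checks against the original class succeed.

```python
class Linear:  # stands in for nn.Linear
    def __init__(self, in_features, out_features):
        self.in_features = in_features
        self.out_features = out_features

class WeakWrapper:  # stands in for the weak-module proxy
    def __init__(self, original):
        self.__dict__["_original"] = original

def make_weak_wrapper(original):
    # Construct a new type at runtime that subclasses both the proxy
    # and the wrapped module's class, so isinstance(w, Linear) holds.
    cls = type("Weak" + type(original).__name__,
               (WeakWrapper, type(original)), {})
    return cls(original)

w = make_weak_wrapper(Linear(20, 30))
assert isinstance(w, Linear)
assert isinstance(w, WeakWrapper)
```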
@pytorchbot pytorchbot added oncall: jit Add this issue/PR to JIT oncall triage queue module: nn Related to torch.nn labels May 6, 2019
@driazati driazati requested review from eellison and suo May 6, 2019 23:15
@driazati driazati requested a review from eellison May 9, 2019 20:54
Contributor

@eellison eellison left a comment


Looks good, have a couple of questions

>>> print(output.size())
torch.Size([128, 30])
"""
__constants__ = ['bias', 'in_features', 'out_features']
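Names listed in `__constants__` are baked into the compiled module as compile-time constants rather than looked up dynamically. A rough pure-Python analogy (the `freeze_constants` helper is hypothetical, not the real script compiler):

```python
class Linear:  # stands in for nn.Linear
    __constants__ = ['bias', 'in_features', 'out_features']

    def __init__(self, in_features, out_features, bias=True):
        self.in_features = in_features
        self.out_features = out_features
        self.bias = bias

def freeze_constants(module):
    # Snapshot each name in __constants__ into a plain dict,
    # roughly how the script compiler treats listed attributes
    # as fixed values instead of mutable Python attributes.
    return {name: getattr(module, name) for name in module.__constants__}

consts = freeze_constants(Linear(128, 30))
# consts: {'bias': True, 'in_features': 128, 'out_features': 30}
```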
Contributor


Not related to this diff but: how did this work without constants previously? Should there have been a test failure somewhere that didn't happen?

Contributor Author


They're not actually used in `nn.Linear` except in `__repr__`

return getattr(self.__dict__["_original"](), attr)
# unwrap the original
original_module = self.__dict__["_original"]()
if original_module and self.__dict__["_initialized"]:
Contributor


Why do we need the "initialized" check here?

Contributor Author


So that `__getattr__` only looks at the original module after `__init__` is done
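A simplified sketch of that guard (condensed from the diff; a `weakref` holds the original module so the proxy doesn't keep it alive, and the `_initialized` flag keeps attribute lookups during construction from seeing partially built state):

```python
import weakref

class WeakProxy:
    def __init__(self, original):
        self.__dict__["_original"] = weakref.ref(original)
        self.__dict__["_initialized"] = False
        # ... proxy setup would go here ...
        self.__dict__["_initialized"] = True

    def __getattr__(self, attr):
        # unwrap the original (a weakref, so it may have been collected)
        original_module = self.__dict__["_original"]()
        # Only forward to the original module once __init__ has finished.
        if original_module is not None and self.__dict__["_initialized"]:
            return getattr(original_module, attr)
        raise AttributeError(attr)
```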

@facebook-github-bot
Contributor

@driazati merged this pull request in 3a39ce0.



Development

Successfully merging this pull request may close these issues.

[JIT] Cannot access nn.Linear.in_features in ScriptModule

6 participants