
Conversation

@BowenBao (Collaborator) commented May 8, 2019

No description provided.

@pytorchbot added the module: onnx (Related to torch.onnx) label May 8, 2019
@BowenBao force-pushed the onnx_script_tensor_factories2 branch from 2650c07 to 8470fc0 on May 8, 2019 00:44
@houseroad self-requested a review May 8, 2019 04:24
@BowenBao (Collaborator, Author) commented May 8, 2019

There is a common change in test_pytorch_onnx_caffe2.py, needed to enable tests for models using ScriptModule, that is also part of PR #20256. Let's get that one in first, and I'll rebase this one accordingly.

@BowenBao changed the title from "[ONNX] Support exporting tensor factories from scripting" to "[ONNX][WIP] Support exporting tensor factories from scripting" on May 9, 2019
@BowenBao force-pushed the onnx_script_tensor_factories2 branch from 8470fc0 to 87b4beb on May 13, 2019 18:07
@houseroad (Member) commented May 20, 2019

Confused here: why change pin_memory to a value instead of a bool?

And why remove the pin_memory check?

@BowenBao (Collaborator, Author) replied

In short, since pin_memory is not used, changing it from a bool to a value matches how other unused parameters like layout and device are handled, and unblocks the issue above.

=============

In more detail, this is related to the default-argument issue we discussed regarding ScriptModule models.

If pin_memory is not provided, it is created as an empty prim::Constant in the PyTorch IR graph.

For most ops, that case is not handled today. The ONNX exporter has no idea what the default argument value is, so the default has to be added explicitly in the symbolic method.

Some ops resolve this by declaring that parameter as 'v' (a value) and explicitly checking whether it is empty, e.g.:

if weight is None or weight.node().mustBeNone():

if bias is None or bias.node().mustBeNone():

Again, since pin_memory is not supported in ONNX anyway, the explicit check is dropped here.
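
To make that pattern concrete, below is a minimal illustrative sketch (not the code from this PR) of a tensor-factory symbolic that declares pin_memory as 'v' and only verifies it is unset, the same way layout and device are ignored. The name zeros_like_sketch and the exact signature are assumptions for illustration.

from torch.onnx.symbolic_helper import parse_args

@parse_args('v', 'v', 'v', 'v', 'v')
def zeros_like_sketch(g, input, dtype=None, layout=None, device=None, pin_memory=None):
    # layout, device and pin_memory have no ONNX equivalent. In a scripted
    # graph an omitted default arrives as an empty prim::Constant, so the
    # check uses mustBeNone() instead of comparing against a Python bool.
    if pin_memory is not None and not pin_memory.node().mustBeNone():
        raise RuntimeError("ONNX export does not support pin_memory=True")
    # ConstantOfShape fills with zeros by default; dtype handling is omitted
    # to keep the sketch short.
    shape = g.op("Shape", input)
    return g.op("ConstantOfShape", shape)

Declaring the parameter as 'v' avoids having the exporter coerce an empty prim::Constant into a bool, which is the situation described above.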

@BowenBao force-pushed the onnx_script_tensor_factories2 branch 2 times, most recently from c16ee07 to 3e067e9, on May 23, 2019 01:21
@BowenBao changed the title from "[ONNX][WIP] Support exporting tensor factories from scripting" to "[ONNX] Support exporting tensor factories from scripting" on May 23, 2019
@BowenBao force-pushed the onnx_script_tensor_factories2 branch 2 times, most recently from 8e0fc04 to 27c443a, on May 24, 2019 17:28
@BowenBao force-pushed the onnx_script_tensor_factories2 branch from 27c443a to 2993c43 on May 28, 2019 16:57
@houseroad (Member) left a comment

Looks good.

@facebook-github-bot (Contributor) left a comment

@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented
@houseroad merged this pull request in 12b0ded.
