
Conversation

@neginraoof
Contributor

When creating the ONNX graph, we overwrite the output type with the output type of the PT graph.
In some special cases, when using scripting, the PT graph does not have type information. We want to avoid overwriting the output type in these cases.
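The decision described above can be sketched as follows. This is an illustrative, standalone sketch of the logic, not the real `torch.onnx` internals; the function name `merge_output_type` and the string type representations are hypothetical.

```python
def merge_output_type(onnx_type, pt_type):
    """Pick the type to record on an ONNX graph output.

    onnx_type: type already present on the ONNX output (may be None)
    pt_type:   type carried by the PT (TorchScript) graph output; None
               when scripting produced no type information
    """
    if pt_type is None:
        # A scripted graph may lack type info; keep whatever the ONNX
        # graph already has rather than overwriting it with "unknown".
        return onnx_type
    # Otherwise the PT graph's type wins, as before this change.
    return pt_type
```

For example, `merge_output_type("tensor(float)", None)` keeps the existing `tensor(float)` instead of erasing it.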

@neginraoof neginraoof requested a review from apaszke as a code owner September 10, 2019 08:27
@pytorchbot pytorchbot added oncall: jit Add this issue/PR to JIT oncall triage queue module: onnx Related to torch.onnx labels Sep 10, 2019
@neginraoof neginraoof changed the title do not overwrite the ouput type for ONNX model if it already has a type [ONNX] Avoid overwriting the model output type Sep 11, 2019
@neginraoof
Contributor Author

@pytorchbot rebase this please

@neginraoof neginraoof force-pushed the neraoof/fixSliceScripting branch from 7357faa to 0747ada Compare September 17, 2019 07:10
@pytorchbot pytorchbot added module: autograd Related to torch.autograd, and the autograd engine in general module: internals Related to internal abstractions in c10 and ATen module: pybind Related to our Python bindings / interactions with other Python libraries labels Sep 17, 2019
@neginraoof neginraoof force-pushed the neraoof/fixSliceScripting branch from 0747ada to 5a9d9b6 Compare September 17, 2019 07:24
@yf225 yf225 added the triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module label Sep 25, 2019
@BowenBao
Collaborator

BowenBao commented Oct 1, 2019

Edit: Consider adding an example test case in test_pytorch_onnx_onnxruntime.py (rather than test_jit.py).

@BowenBao
Collaborator

There seems to be a failed test case:
test/onnx/test_onnx_opset.py::TestONNXOpset::test_std_along_dims FAILED [ 0%]
The caffe2_onnx test also segfaults at the end; not sure if that is related.

@neginraoof
Contributor Author

@BowenBao I've updated the PR. This build failure:
ci/circleci: pytorch_xla_linux_xenial_py3_6_clang7_build
looks unrelated.

@neginraoof neginraoof changed the title [ONNX] Avoid overwriting the model output type [ONNX] Avoid overwriting model output type Nov 21, 2019
@neginraoof neginraoof changed the title [ONNX] Avoid overwriting model output type [ONNX] Avoid overwriting output type in onnx graph Nov 21, 2019
@houseroad houseroad requested a review from BowenBao November 21, 2019 21:57
Contributor

@facebook-github-bot facebook-github-bot left a comment


@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

