
Conversation

@karljang (Contributor) commented Apr 2, 2019

The ONNX MaxPool operator now supports the 'dilations' attribute as of this commit:
onnx/onnx@b22041c
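
For reference, a minimal sketch of what that ONNX-side change allows, assuming the onnx Python package with opset 10 support is installed; the input/output names are placeholders:

```python
from onnx import helper

# As of ONNX opset 10, MaxPool accepts a 'dilations' attribute
# alongside kernel_shape, strides, and pads.
node = helper.make_node(
    "MaxPool",
    inputs=["x"],
    outputs=["y"],
    kernel_shape=[3, 3],
    strides=[1, 1],
    dilations=[2, 2],
)
print(node)  # the printed proto should list the dilations attribute
```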

@ezyang added the module: onnx (Related to torch.onnx) label on Apr 4, 2019
@ezyang requested a review from houseroad on Apr 4, 2019 at 18:36
@spandantiwari commented

@karljang - Could you add a test for this as well?

@houseroad (Member) left a comment


@karljang (Contributor, Author) commented Apr 9, 2019

@houseroad
Thank you for the link to the test script file.

I've added dilation test cases to the file, but they are failing because the ONNX export defaults to opset version 9, which does not support the 'dilations' attribute.

I'm not sure how to resolve this. Can I choose the opset version while the tests are running, or should I mark these tests as skipped until ONNX opset 10 is available? Please let me know what you think.

Thank you,

@houseroad (Member) commented

Yes, you can specify the opset version when you export a model from PyTorch; here is an example: https://github.com/pytorch/pytorch/blob/master/test/onnx/test_operators.py#L552
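
For illustration, a minimal sketch of exporting a dilated max pool at an explicit opset, assuming torch.onnx.export accepts the opset_version keyword as in the linked test; the output file name is made up:

```python
import torch
import torch.nn as nn

# A module that uses dilated max pooling, which only exports cleanly
# once the MaxPool 'dilations' attribute exists (ONNX opset 10).
model = nn.MaxPool2d(kernel_size=3, stride=1, dilation=2)
x = torch.randn(1, 3, 32, 32)

# Request opset 10 explicitly instead of relying on the default (opset 9).
torch.onnx.export(model, x, "maxpool_dilations.onnx", opset_version=10)
```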

@karljang (Contributor, Author) commented

Thank you, I'll take a look this week.
The link you shared was actually helpful for my work today!

@karljang (Contributor, Author) commented Apr 16, 2019

The failing check is due to an overly long line from a recent commit in test/test_jit.py, not from my changes :)
https://github.com/pytorch/pytorch/blob/master/test/test_jit.py#L11267

@houseroad
I've added a test case to test_operators.py instead of test_pytorch_onnx_caffe2.py.
Does that make sense?

Caffe2 does not seem to support dilated pooling operations:

RuntimeError: [enforce fail at pool_op.h:25] dilation_[i] == 1. 2 vs 1. Pooling op does not support dilation right now.

Still, I believe it makes sense to enable this in the exporter, because PyTorch supports dilated pooling and it has applications, just like dilated convolutions.
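
For context, a test in test/onnx/test_operators.py might look roughly like the sketch below; it assumes that file's assertONNX helper forwards opset_version to the exporter and compares the exported graph against an expect file rather than running the model through Caffe2 (the test name and tensor shape here are made up):

```python
import torch
import torch.nn as nn

# Sketch of a method inside the TestOperators class in test_operators.py.
def test_maxpool_dilations(self):
    x = torch.randn(20, 16, 50)
    # Export at opset 10 so the 'dilations' attribute is legal in the graph.
    self.assertONNX(nn.MaxPool1d(2, stride=1, dilation=2), x, opset_version=10)
```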

@karljang (Contributor, Author) commented

@houseroad @spandantiwari
Finally, all checks have passed!
Could I get this reviewed again?

@facebook-github-bot (Contributor) left a comment


@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented

@houseroad merged this pull request in 6f7a315.
