This repository was archived by the owner on Apr 8, 2025. It is now read-only.

Add ONNX support for Question Answering#288

Merged
tanaysoni merged 9 commits into master from onnx-runtime
Mar 26, 2020

Conversation

@tanaysoni
Contributor

@tanaysoni tanaysoni commented Mar 20, 2020

This PR adds conversion of a PyTorch model to ONNX and support for inference using ONNX Runtime.

To convert a PyTorch-based Question Answering AdaptiveModel, use AdaptiveModel.convert_to_onnx(). It exports a converted ONNX model to the supplied path.

The exported ONNX model can be used with the FARM Inferencer. Under the hood, the ONNXAdaptiveModel class uses ONNX Runtime for the forward pass on the model.

This example contains a basic guide for conversion to ONNX and Inference on the converted model.

This thread has preliminary benchmarks for inference with ONNX Runtime.

@tanaysoni tanaysoni requested a review from tholor March 20, 2020 11:51
Member

@tholor tholor left a comment


Looking good. Please just add basic doc strings for the major classes / methods (BasicAdaptiveModel, ONNXAdaptiveModel, convert_to_onnx(), OnnxWrapper).

@tanaysoni tanaysoni changed the title WIP: Add ONNX support for Question Answering Add ONNX support for Question Answering Mar 25, 2020
@tanaysoni tanaysoni requested a review from tholor March 25, 2020 08:50