This repository was archived by the owner on Apr 8, 2025. It is now read-only.

Fix batching in ONNX forward pass #559

Merged
tanaysoni merged 1 commit into master from onnx-batch on Sep 28, 2020

Conversation

@tanaysoni (Contributor)

The current NumPy transformation only works for a single input (model batch_size=1). This PR generalizes it for any batch size.
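A minimal sketch of the idea behind the fix (the function name `to_numpy_batch` and the sample data are hypothetical, not the PR's actual code): rather than converting a single sample's features, per-sample arrays are stacked into a `(batch_size, seq_len)` NumPy array of the integer type ONNX Runtime expects, which works for any batch size, including 1.

```python
import numpy as np

def to_numpy_batch(samples):
    """Stack per-sample token-id lists into one (batch_size, seq_len) int64 array."""
    return np.stack([np.asarray(s, dtype=np.int64) for s in samples])

# Two samples of equal length -> a batch of shape (2, 4)
batch = to_numpy_batch([[101, 2009, 2003, 102],
                        [101, 2307, 999, 102]])
print(batch.shape)  # (2, 4)
```

A batched array like this can then be passed as a feed to an ONNX Runtime session in a single forward pass instead of looping over samples one at a time.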

@tanaysoni tanaysoni requested a review from tholor September 28, 2020 09:39
@tanaysoni tanaysoni merged commit e8afb7f into master Sep 28, 2020
@tanaysoni tanaysoni deleted the onnx-batch branch September 28, 2020 10:44

2 participants