This repository was archived by the owner on Apr 8, 2025. It is now read-only.

Add model optimization to inference loading#415

Merged
Timoeller merged 3 commits into master from multigpu_inference
Jun 23, 2020

Conversation

@Timoeller (Contributor)
No description provided.

@Timoeller Timoeller merged commit 56b6095 into master Jun 23, 2020
@Timoeller Timoeller deleted the multigpu_inference branch June 23, 2020 14:12


1 participant