
inisis/OnnxSlim



OnnxSlim helps you slim your ONNX model: fewer operators, the same accuracy, and better inference speed.

Benchmark

[Benchmark results image]

Installation

Using Prebuilt

pip install onnxslim

Install From Source

pip install git+https://github.com/inisis/OnnxSlim@main

Install From Local

git clone https://github.com/inisis/OnnxSlim && cd OnnxSlim/
pip install .

How to use

Bash

onnxslim your_onnx_model slimmed_onnx_model

Python

import onnx
import onnxslim

# Load the model, slim it, and save the result back to disk
model = onnx.load("model.onnx")
slimmed_model = onnxslim.slim(model)
onnx.save(slimmed_model, "slimmed_model.onnx")

For more usage, run onnxslim -h or refer to our examples.

Projects using OnnxSlim

NVIDIA/TensorRT-Model-Optimizer
alibaba/MNN
ultralytics/ultralytics
Mozilla/smart_autofill
alibaba/MNN-LLM
huggingface/transformers.js
huggingface/optimum
PaddlePaddle/PaddleOCR
ModelScope/FunASR
CVCUDA/CV-CUDA
THU-MIG/yolov10
sunsmarterjie/yolov12
nndeploy/nndeploy
deepghs/imgutils

Contact

Discord: https://discord.gg/nRw2Fd3VUS
QQ Group: 873569894
