
Pinned

  1. vllm Public

    A high-throughput and memory-efficient inference and serving engine for LLMs (a minimal usage sketch follows this list)

    Python · 65.8k stars · 12.1k forks

  2. llm-compressor Public

    Transformers-compatible library for applying various compression algorithms to LLMs for optimized deployment with vLLM

    Python · 2.4k stars · 325 forks

  3. recipes Public

    Common recipes to run vLLM

    Jupyter Notebook · 283 stars · 105 forks

  4. speculators Public

    A unified library for building, evaluating, and storing speculative decoding algorithms for LLM inference in vLLM

    Python · 168 stars · 22 forks

  5. semantic-router Public

    Intelligent Router for Mixture-of-Models

    Go · 2.5k stars · 333 forks
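
To make the pinned vLLM entry concrete, here is a minimal offline-inference sketch following the pattern in vLLM's quickstart, using its `LLM` and `SamplingParams` API. The model ID and prompts are placeholder examples, not anything prescribed by the project.

```python
# Minimal offline-inference sketch with vLLM's LLM / SamplingParams API.
# The model ID below is only an example; any supported Hugging Face causal LM
# can be substituted.
from vllm import LLM, SamplingParams

prompts = [
    "The capital of France is",
    "vLLM is a library for",
]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

llm = LLM(model="facebook/opt-125m")            # loads the model and allocates the KV cache
outputs = llm.generate(prompts, sampling_params)

for output in outputs:
    print(f"Prompt: {output.prompt!r}")
    print(f"Completion: {output.outputs[0].text!r}")
```

The same engine can also be launched as an OpenAI-compatible server with the `vllm serve` CLI for online serving.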

Repositories

Showing 10 of 30 repositories
  • vllm-gaudi Public

    Community-maintained hardware plugin for vLLM on Intel Gaudi

    Python · 21 stars · Apache-2.0 · 85 forks · 1 open issue · 66 open PRs · Updated Dec 19, 2025
  • vllm Public

    A high-throughput and memory-efficient inference and serving engine for LLMs

    Python · 65,770 stars · Apache-2.0 · 12,066 forks · 1,867 open issues (37 need help) · 1,290 open PRs · Updated Dec 19, 2025
  • vllm-omni Public

    A framework for efficient model inference with omni-modality models

    Python · 1,000 stars · Apache-2.0 · 136 forks · 78 open issues (31 need help) · 37 open PRs · Updated Dec 19, 2025
  • speculators Public

    A unified library for building, evaluating, and storing speculative decoding algorithms for LLM inference in vLLM

    Python · 168 stars · Apache-2.0 · 22 forks · 8 open issues (4 need help) · 12 open PRs · Updated Dec 19, 2025
  • semantic-router Public

    Intelligent Router for Mixture-of-Models

    Go · 2,495 stars · Apache-2.0 · 333 forks · 95 open issues (13 need help) · 30 open PRs · Updated Dec 19, 2025
  • llm-compressor Public

    Transformers-compatible library for applying various compression algorithms to LLMs for optimized deployment with vLLM (a rough usage sketch follows this list)

    Python · 2,431 stars · Apache-2.0 · 325 forks · 70 open issues (17 need help) · 50 open PRs · Updated Dec 19, 2025
  • recipes Public

    Common recipes to run vLLM

    Jupyter Notebook · 283 stars · Apache-2.0 · 105 forks · 11 open issues · 29 open PRs · Updated Dec 19, 2025
  • vllm-ascend Public

    Community-maintained hardware plugin for vLLM on Ascend

    Python · 1,485 stars · Apache-2.0 · 670 forks · 822 open issues (8 need help) · 262 open PRs · Updated Dec 19, 2025
  • ci-infra Public

    Code for vLLM's CI and performance benchmark infrastructure

    HCL · 27 stars · Apache-2.0 · 53 forks · 0 open issues · 26 open PRs · Updated Dec 19, 2025
  • tpu-inference Public

    TPU inference for vLLM, with unified JAX and PyTorch support.

    Python · 199 stars · Apache-2.0 · 62 forks · 18 open issues (1 needs help) · 86 open PRs · Updated Dec 19, 2025
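
As a rough illustration of the llm-compressor entry above, the sketch below mirrors the one-shot quantization pattern from the project's quickstart. The import paths, the `GPTQModifier` arguments, the model ID, and the calibration dataset are assumptions that may differ between llm-compressor releases.

```python
# Rough sketch of one-shot weight quantization with llm-compressor,
# based on the project's published quickstart pattern. Import paths and
# argument names are assumptions and may vary across releases.
from llmcompressor import oneshot
from llmcompressor.modifiers.quantization import GPTQModifier

# Quantize linear layers with GPTQ (weight-only W4A16), keeping lm_head in full precision.
recipe = GPTQModifier(targets="Linear", scheme="W4A16", ignore=["lm_head"])

oneshot(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",    # example model ID
    dataset="open_platypus",                        # example calibration dataset
    recipe=recipe,
    output_dir="TinyLlama-1.1B-Chat-v1.0-W4A16",    # compressed checkpoint directory
    max_seq_length=2048,
    num_calibration_samples=512,
)
```

The intent is that the saved checkpoint directory can then be loaded by vLLM, e.g. via `LLM(model=...)` or `vllm serve`, for optimized deployment.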