Closed
Labels: feature request (New feature or request)
Description
I'm currently exploring various model backends for agentic workflows and would love to see support for vLLM integrated into AgentIQ.
Motivation
vLLM's compatibility with Hugging Face models, together with features like speculative decoding and continuous batching, makes it a strong fit for agentic systems that depend on real-time, multi-agent interactions. Integrating vLLM could significantly improve AgentIQ's performance and flexibility in production environments.
Desired Outcome
- Option to configure AgentIQ to use vLLM as a backend for supported models (a rough sketch of one possible path is below)
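
Since vLLM already exposes an OpenAI-compatible API server, one possible integration path (purely a sketch, not an existing AgentIQ API) would be to let AgentIQ point its OpenAI-compatible client at a running vLLM endpoint. The base URL, model name, and placeholder API key below are assumptions for illustration only:

```python
# Sketch only: assumes a vLLM OpenAI-compatible server is already running, e.g.:
#   vllm serve meta-llama/Llama-3.1-8B-Instruct --port 8000
# AgentIQ could then treat it like any other OpenAI-compatible backend.
from openai import OpenAI

# vLLM's server does not require a real API key by default, so a placeholder works.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",  # must match the model being served
    messages=[{"role": "user", "content": "Why does continuous batching help agent workloads?"}],
)
print(response.choices[0].message.content)
```

A dedicated vLLM backend option could go further than the OpenAI shim by exposing vLLM-specific settings (e.g. tensor parallelism or speculative decoding) directly in AgentIQ's configuration, but the above shows the minimal plumbing.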
Please let me know whether this work, or anything related to it, can be picked up. I'm happy to help where I can!