fix(llm): /model list returns 404 — list_models_remote appends /v1/models to base_url that already contains /v1 #1903
Closed
Labels
bug (Something isn't working) · llm (zeph-llm crate (Ollama, Claude))
Description
Bug
`/model` (list models) always returns `Error fetching models: OpenAI list models failed: 404 Not Found` when using the standard OpenAI config.
Root Cause
In `crates/zeph-llm/src/openai.rs:149`:
`let url = format!("{}/v1/models", self.base_url);`
`base_url` in `config.toml` is `https://api.openai.com/v1` (it already contains `/v1`), so the constructed URL becomes:
`https://api.openai.com/v1/v1/models` ← 404
All other endpoints in the same file use the correct pattern without doubling:
- `format!("{}/chat/completions", self.base_url)` → `https://api.openai.com/v1/chat/completions` ✓
- `format!("{}/embeddings", self.base_url)` → `https://api.openai.com/v1/embeddings` ✓
Fix
Change line 149 from:
`let url = format!("{}/v1/models", self.base_url);`
to:
`let url = format!("{}/models", self.base_url);`
Also update the doc comment on line 141, which says `GET {base_url}/v1/models`.
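The before/after construction can be sketched in isolation. Note this uses a minimal stand-in struct, not the real `OpenAiProvider`; only the two `format!` lines correspond to the actual code at lines 149 and the chat/embeddings pattern:

```rust
// Minimal stand-in for the provider; the real OpenAiProvider lives in
// crates/zeph-llm/src/openai.rs and carries more fields.
struct Provider {
    base_url: String,
}

impl Provider {
    // Buggy version: appends "/v1/models" even though base_url already ends in /v1.
    fn models_url_buggy(&self) -> String {
        format!("{}/v1/models", self.base_url)
    }

    // Fixed version: same join pattern as the chat/completions and embeddings endpoints.
    fn models_url_fixed(&self) -> String {
        format!("{}/models", self.base_url)
    }
}

fn main() {
    let p = Provider {
        base_url: "https://api.openai.com/v1".to_string(),
    };
    // Doubled /v1 segment is what the API rejects with 404.
    assert_eq!(p.models_url_buggy(), "https://api.openai.com/v1/v1/models");
    assert_eq!(p.models_url_fixed(), "https://api.openai.com/v1/models");
    println!("ok");
}
```

The fix keeps `base_url` as the single place where the API version lives, which is also why the same change transparently supports OpenAI-compatible servers that mount their API under a different prefix.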
Affected
- `crates/zeph-llm/src/openai.rs:149`
- `CompatibleProvider::list_models_remote` delegates to `OpenAiProvider`, so it is also affected
- Reproduces 100% in `testing.toml` with `base_url = "https://api.openai.com/v1"`
Severity
Medium: the `/model` command is broken for all users whose `base_url` contains the standard `/v1` suffix; chat and embeddings are unaffected.