fix(llm): /model list returns 404 — list_models_remote appends /v1/models to base_url that already contains /v1 #1903

@bug-ops

Description

Bug

/model (list models) always returns Error fetching models: OpenAI list models failed: 404 Not Found when using the standard OpenAI config.

Root Cause

In crates/zeph-llm/src/openai.rs:149:

let url = format!("{}/v1/models", self.base_url);

base_url in config.toml is https://api.openai.com/v1 (already contains /v1), so the constructed URL becomes:

https://api.openai.com/v1/v1/models  ← 404

All other endpoints in the same file use the correct pattern without doubling:

  • format!("{}/chat/completions", self.base_url) → https://api.openai.com/v1/chat/completions
  • format!("{}/embeddings", self.base_url) → https://api.openai.com/v1/embeddings
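
The mismatch can be reproduced with nothing but format! (a standalone sketch, not the actual provider code):

```rust
// Standalone sketch of the URL construction in openai.rs with the standard
// configured base_url; only string formatting is involved, no HTTP calls.
fn main() {
    let base_url = "https://api.openai.com/v1";

    // Correct pattern used by the other endpoints: a single /v1 segment.
    assert_eq!(
        format!("{}/chat/completions", base_url),
        "https://api.openai.com/v1/chat/completions"
    );

    // Buggy pattern from line 149: /v1 gets appended a second time,
    // producing the URL that the API answers with 404 Not Found.
    assert_eq!(
        format!("{}/v1/models", base_url),
        "https://api.openai.com/v1/v1/models"
    );
}
```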

Fix

Change line 149 from:

let url = format!("{}/v1/models", self.base_url);

to:

let url = format!("{}/models", self.base_url);

Also update the doc comment on line 141, which currently says `GET {base_url}/v1/models`.
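
A regression-test sketch for the fix (the helper name models_url is illustrative, not a function that exists in openai.rs):

```rust
// Hypothetical regression test for the corrected pattern; models_url is an
// illustrative helper standing in for the URL construction on line 149.
fn models_url(base_url: &str) -> String {
    format!("{}/models", base_url)
}

fn main() {
    // With the standard config value the fixed pattern yields a single /v1.
    assert_eq!(
        models_url("https://api.openai.com/v1"),
        "https://api.openai.com/v1/models"
    );
}
```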
Affected

  • crates/zeph-llm/src/openai.rs:149
  • CompatibleProvider::list_models_remote delegates to OpenAiProvider — also affected
  • Reproduces 100% in testing.toml with base_url = "https://api.openai.com/v1"

Severity

Medium — /model command broken for all users with standard base_url containing /v1; chat/embeddings unaffected.

Metadata


    Labels: bug (Something isn't working), llm (zeph-llm crate (Ollama, Claude))
