[Bug]: Ollama model discovery always runs at startup even when explicit models are configured, causing 5s timeout for remote Ollama hosts #28762
Description
When Ollama is configured with a remote host via models.providers.ollama.baseUrl, the gateway always attempts auto-discovery of models at startup via /api/tags, even when models.providers.ollama.models is already explicitly provided in the config.
This causes a 5-second startup delay on every gateway restart whenever the remote Ollama host is unreachable or slow to respond during the brief service-startup window (e.g. before Tailscale DNS has resolved in a systemd service context).
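For illustration, the discovery path presumably looks something like the sketch below (function shape and names are assumed, not copied from the codebase): a single fetch of Ollama's `/api/tags` endpoint with a 5-second timeout, which blocks startup for the full timeout when the host is unreachable at boot.

```javascript
// Hypothetical sketch of the discovery call (names assumed): one fetch of
// /api/tags with a 5 s timeout. On an unreachable host, startup waits the
// full 5 s before the TimeoutError shown in the log below is raised.
async function discoverOllamaModels(baseUrl) {
  const res = await fetch(`${baseUrl}/api/tags`, {
    signal: AbortSignal.timeout(5000), // the 5 s delay observed at startup
  });
  const body = await res.json();
  // Ollama's /api/tags responds with { models: [{ name, ... }] }
  return body.models.map((m) => ({ id: m.name, name: m.name }));
}
```

The point is that this network round-trip is unconditional, even when the config already names the models.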
Steps to Reproduce
- Configure a remote Ollama host in `openclaw.json`:

  ```json
  {
    "models": {
      "providers": {
        "ollama": {
          "baseUrl": "http://192.168.x.x:11434",
          "models": [
            {
              "id": "qwen2.5:7b",
              "name": "qwen2.5:7b",
              "reasoning": false,
              "input": ["text"],
              "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
              "contextWindow": 128000,
              "maxTokens": 8192
            }
          ]
        }
      }
    }
  }
  ```

- Add an Ollama auth profile
- Restart the gateway
- Observe the following logged after 5 seconds, even though models are already defined statically:

  ```
  [agents/model-providers] Failed to discover Ollama models: TimeoutError
  ```
Expected Behaviour
When `models.providers.ollama.models` is explicitly provided in config (non-empty array), skip auto-discovery and use the static model list, the same way vllm is handled:

```js
// vllm correctly skips discovery when explicitly configured:
if (!params.explicitProviders?.vllm) {
  // ... discover vllm
}
```

Ollama has no equivalent guard and always calls `buildOllamaProvider()` → `discoverOllamaModels()`.
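The missing guard can be stated as a small, unit-testable predicate (a sketch; the helper name is hypothetical and not part of the codebase): discovery should run only when no explicit model list is configured.

```javascript
// Hypothetical helper mirroring the vllm-style guard: returns true only
// when no non-empty explicit model list is configured for Ollama.
function shouldDiscoverOllama(explicitProviders) {
  const models = explicitProviders?.ollama?.models;
  return !(Array.isArray(models) && models.length > 0);
}
```

With this, a non-empty explicit `models` array means the network round-trip can be skipped entirely, matching the vllm behaviour above.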
Suggested Fix
In `model-selection-*.js`, wrap the `buildOllamaProvider` call:

```js
if (ollamaKey) {
  const ollamaBaseUrl = params.explicitProviders?.ollama?.baseUrl;
  if (params.explicitProviders?.ollama?.models?.length) {
    // Use static config, skip network discovery
    providers.ollama = {
      baseUrl: resolveOllamaApiBase(ollamaBaseUrl),
      api: "ollama",
      models: params.explicitProviders.ollama.models,
      apiKey: ollamaKey,
    };
  } else {
    providers.ollama = { ...(await buildOllamaProvider(ollamaBaseUrl)), apiKey: ollamaKey };
  }
}
```

Environment
- openclaw v2026.2.25
- Ollama on a remote Windows host (Tailscale network), accessed via IP
- Gateway running as a systemd user service on Ubuntu 22.04