Description
When using the Ollama provider with `ollama launch openclaw`, the model runs successfully but produces no visible output to the user.
Steps to Reproduce
- Install and run `ollama serve` as a systemd service
- Sign in to Ollama cloud: `ollama signin`
- Pull a cloud model: `ollama run kimi-k2.5:cloud` (works fine in the CLI)
- Configure OpenClaw: `ollama launch openclaw`
- Select `kimi-k2.5:cloud` and confirm
- Restart the gateway: `openclaw gateway restart`
- Switch model: `/model kimi`
- Send a message: "tell me a joke"
Expected Behavior
The model should respond with output visible to the user.
Actual Behavior
- Logs show the run completes successfully (`durationMs=5236`)
- No errors in logs
- User sees "(no output)"
- Session shows tokens used (47k/131k), confirming the model ran
- A direct curl to the Ollama API works fine: `curl http://localhost:11434/v1/chat/completions` returns proper responses
Relevant Logs
```
{"subsystem":"agent/embedded"} embedded run start: runId=bf6d427c-8a3a-4c07-a02d-10eb93b9aadb provider=ollama model=kimi-k2.5:cloud thinking=off
{"subsystem":"agent/embedded"} embedded run agent start: runId=bf6d427c-8a3a-4c07-a02d-10eb93b9aadb
{"subsystem":"agent/embedded"} embedded run agent end: runId=bf6d427c-8a3a-4c07-a02d-10eb93b9aadb
{"subsystem":"agent/embedded"} embedded run prompt end: durationMs=5236
{"subsystem":"agent/embedded"} embedded run done: durationMs=5470 aborted=false
```
No errors; the run completes normally, but the output never reaches the user.
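For anyone correlating runs while debugging this, the log lines above appear to be a JSON prefix followed by space-separated `key=value` fields. A quick parsing sketch, with the line format inferred from this excerpt rather than from any OpenClaw documentation:

```python
import json
import re

def parse_log_line(line: str) -> dict:
    """Split an OpenClaw-style log line (format inferred from the excerpt
    above) into its JSON prefix plus any trailing key=value fields."""
    prefix, _, rest = line.partition("} ")
    record = json.loads(prefix + "}")
    record["message"] = rest
    record.update(dict(re.findall(r"(\w+)=(\S+)", rest)))
    return record

line = ('{"subsystem":"agent/embedded"} embedded run done: '
        'durationMs=5470 aborted=false')
rec = parse_log_line(line)
print(rec["subsystem"], rec["durationMs"], rec["aborted"])
# agent/embedded 5470 false
```

Filtering on `runId` this way makes it easy to line up the gateway's view of a run with what the client rendered.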
Config
```json
"ollama": {
  "baseUrl": "http://127.0.0.1:11434/v1",
  "apiKey": "ollama-local",
  "api": "openai-completions",
  "models": [
    {
      "id": "kimi-k2.5:cloud",
      "name": "kimi-k2.5:cloud",
      "reasoning": true,
      "input": ["text"],
      "contextWindow": 131072,
      "maxTokens": 16384
    }
  ]
}
```
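As a sanity check on the provider block, here is a small sketch that parses the provider object above (without its `"ollama":` key, so it is standalone JSON) and runs a couple of generic checks. These checks are assumptions about OpenAI-compatible providers in general, not OpenClaw's own validation rules:

```python
import json

config = json.loads("""
{
  "baseUrl": "http://127.0.0.1:11434/v1",
  "apiKey": "ollama-local",
  "api": "openai-completions",
  "models": [
    {
      "id": "kimi-k2.5:cloud",
      "name": "kimi-k2.5:cloud",
      "reasoning": true,
      "input": ["text"],
      "contextWindow": 131072,
      "maxTokens": 16384
    }
  ]
}
""")

# Generic sanity checks (assumptions, not OpenClaw validation rules):
assert config["baseUrl"].endswith("/v1"), "OpenAI-compatible base URL usually ends in /v1"
for model in config["models"]:
    assert model["maxTokens"] <= model["contextWindow"]
    print(model["id"], "ok")  # kimi-k2.5:cloud ok
```

Both checks pass here, which suggests the config itself is not the obvious culprit.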
Environment
- OpenClaw: 2026.2.1 (ed4529e)
- Node: v22.22.0
- OS: Linux 6.8.0-1044-aws (x64)
- Ollama: latest
- Model: kimi-k2.5:cloud (Ollama cloud model)
Workaround
Direct Ollama CLI usage works fine. Issue only affects OpenClaw integration.
Notes
Tried with both `reasoning: true` and `reasoning: false`; same result.
Auth was initially missing (got a `No API key found for provider ollama` error). Adding a dummy key, `ollama-local`, resolved the auth error, but output still doesn't appear.