
Ollama provider: model runs but produces no output #8505

@ewijaya

Description


When using the Ollama provider (configured via ollama launch openclaw), the model runs successfully but produces no output visible to the user.

Steps to Reproduce

  1. Install and run ollama serve as a systemd service
  2. Sign in to Ollama cloud: ollama signin
  3. Pull a cloud model: ollama run kimi-k2.5:cloud (works fine in CLI)
  4. Configure OpenClaw: ollama launch openclaw
  5. Select kimi-k2.5:cloud and confirm
  6. Restart gateway: openclaw gateway restart
  7. Switch model: /model kimi
  8. Send a message: "tell me a joke"

Expected Behavior

The model should respond with output visible to the user.

Actual Behavior

  • Logs show run completes successfully (durationMs=5236)
  • No errors in logs
  • User sees "(no output)"
  • Session shows tokens used (47k/131k), confirming model ran
  • Direct curl to the Ollama API works fine: curl http://localhost:11434/v1/chat/completions returns proper responses
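
The direct-API check in the last bullet can be sketched in Python, assuming the standard OpenAI-compatible /v1/chat/completions endpoint (the model name, port, and dummy key match this report; this is an illustrative sketch, not OpenClaw's own request path):

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completions request for a local Ollama server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer ollama-local",  # dummy key, as in this report
        },
        method="POST",
    )

req = build_chat_request("http://localhost:11434/v1", "kimi-k2.5:cloud", "tell me a joke")
# urllib.request.urlopen(req) returns the JSON completion when ollama serve is running
```

If urlopen on this request returns a normal completion while OpenClaw shows "(no output)", the problem is on the OpenClaw side of the wire.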

Relevant Logs

{"subsystem":"agent/embedded"} embedded run start: runId=bf6d427c-8a3a-4c07-a02d-10eb93b9aadb provider=ollama model=kimi-k2.5:cloud thinking=off
{"subsystem":"agent/embedded"} embedded run agent start: runId=bf6d427c-8a3a-4c07-a02d-10eb93b9aadb
{"subsystem":"agent/embedded"} embedded run agent end: runId=bf6d427c-8a3a-4c07-a02d-10eb93b9aadb
{"subsystem":"agent/embedded"} embedded run prompt end: durationMs=5236
{"subsystem":"agent/embedded"} embedded run done: durationMs=5470 aborted=false

No errors; the run completes normally, but the output never reaches the user.

Config

"ollama": {
  "baseUrl": "http://127.0.0.1:11434/v1",
  "apiKey": "ollama-local",
  "api": "openai-completions",
  "models": [
    {
      "id": "kimi-k2.5:cloud",
      "name": "kimi-k2.5:cloud",
      "reasoning": true,
      "input": ["text"],
      "contextWindow": 131072,
      "maxTokens": 16384
    }
  ]
}
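
As a sanity check, the provider block above parses cleanly and its values are internally consistent. A short Python sketch (the checks are illustrative assumptions, not OpenClaw's actual schema validation):

```python
import json

# The provider block from this report (field names as shown above)
config = json.loads("""
{
  "baseUrl": "http://127.0.0.1:11434/v1",
  "apiKey": "ollama-local",
  "api": "openai-completions",
  "models": [
    {
      "id": "kimi-k2.5:cloud",
      "name": "kimi-k2.5:cloud",
      "reasoning": true,
      "input": ["text"],
      "contextWindow": 131072,
      "maxTokens": 16384
    }
  ]
}
""")

def check_provider(cfg: dict) -> list[str]:
    """Return a list of problems found in a provider config block."""
    problems = []
    if not cfg.get("baseUrl", "").rstrip("/").endswith("/v1"):
        problems.append("baseUrl should point at the OpenAI-compatible /v1 root")
    for m in cfg.get("models", []):
        if m.get("maxTokens", 0) > m.get("contextWindow", 0):
            problems.append(f"{m['id']}: maxTokens exceeds contextWindow")
    return problems

print(check_provider(config))  # → [] for the config in this report
```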

Environment

  • OpenClaw: 2026.2.1 (ed4529e)
  • Node: v22.22.0
  • OS: Linux 6.8.0-1044-aws (x64)
  • Ollama: latest
  • Model: kimi-k2.5:cloud (Ollama cloud model)

Workaround

Direct Ollama CLI usage works fine; the issue only affects the OpenClaw integration.

Notes

Tried with both reasoning: true and reasoning: false. Same result.
Auth was initially missing (got a "No API key found for provider ollama" error); adding the dummy key ollama-local resolved the auth error, but output still doesn't appear.
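
One speculative way to debug the "(no output)" symptom: check whether the completion text is landing in a non-standard response field. Some reasoning models put text in a reasoning field while leaving message.content empty, which a client that only reads content would render as empty. A minimal sketch, assuming an OpenAI-style response shape (the fallback field names are guesses, not confirmed OpenClaw or Ollama behavior):

```python
def extract_text(response: dict) -> str:
    """Pull visible text from an OpenAI-style chat completion, falling back
    to reasoning fields that some reasoning models populate instead."""
    msg = response["choices"][0]["message"]
    # Field names other than "content" are assumptions; inspect the raw
    # response from your server to see which ones actually appear.
    for key in ("content", "reasoning_content", "reasoning"):
        text = msg.get(key)
        if text:
            return text
    return "(no output)"

# A hypothetical response where the text landed in a reasoning field
sample = {"choices": [{"message": {"content": "", "reasoning_content": "Why did..."}}]}
print(extract_text(sample))  # → Why did...
```

Capturing one raw response (e.g. with the curl command above) and inspecting which fields are populated would confirm or rule out this hypothesis.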

Metadata

Labels

bug