Ollama provider not working - Unknown model error #2021

@cuantics

Description

Ollama integration doesn't work as documented. The provider doesn't appear in the configure wizard, and manual configuration fails.

Environment

  • Clawdbot: 2026.1.24-3
  • macOS
  • Ollama running on localhost:11434
  • Model: deepseek-r1:latest

Steps to Reproduce

  1. Set OLLAMA_API_KEY
  2. Configure model: ollama/deepseek-r1:latest
  3. Start gateway
  4. Send message

Error

Unknown model: ollama/deepseek-r1:latest

Notes

  • Ollama itself responds correctly to direct curl requests on localhost:11434
  • Ollama does not appear as a provider in the configure wizard
  • The OpenRouter provider works fine with the same setup
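To rule out Ollama itself, the curl check from the first note can be sketched in Python against Ollama's standard `/api/tags` endpoint, which lists locally installed models. The helper name `model_installed` is mine, not part of any project API:

```python
# Sanity check that Ollama serves the model, independent of the gateway.
# Assumes the standard Ollama REST API: GET /api/tags returns
# {"models": [{"name": "deepseek-r1:latest", ...}, ...]}.
import json
from urllib.request import urlopen

def model_installed(tags_payload: dict, name: str) -> bool:
    """Return True if `name` appears in an Ollama /api/tags response."""
    return any(m.get("name") == name for m in tags_payload.get("models", []))

if __name__ == "__main__":
    with urlopen("http://localhost:11434/api/tags") as resp:
        tags = json.load(resp)
    print(model_installed(tags, "deepseek-r1:latest"))
```

If this prints True, the model is present in Ollama and the "Unknown model" error is coming from the provider layer, not from Ollama.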
