Feature request: Add Ollama as a local model provider in QuickStart #8239

@fabricioartur

Description

Context

I’m trying to set up OpenClaw locally on a Linux notebook.

The current QuickStart flow only offers remote providers (OpenAI, Anthropic, etc.), which blocks a local-first setup.

Problem

  • No option to select a local model provider
  • No Ollama integration in the onboarding wizard
  • Difficult or unclear local setup on Linux

As a result, I haven’t been able to complete a fully local installation using the wizard.

Requested feature

Add Ollama as a first-class model provider in the QuickStart flow.

Suggested behavior

  • Allow selecting ollama as a provider during onboarding
  • Automatically detect a running Ollama instance at localhost:11434, if available (see the detection sketch after this list)
  • Provide basic local model selection (e.g. llama3, mistral, codellama)
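
To illustrate the detection step, here is a minimal sketch of how the wizard could probe for a local instance, relying on Ollama's standard HTTP API (GET /api/tags on port 11434 returns the locally pulled models). The detect_ollama function name and the surrounding flow are hypothetical and not part of OpenClaw today:

    # Illustrative sketch only: probe a local Ollama server and list its installed models.
    # Ollama serves GET /api/tags on port 11434 and returns {"models": [{"name": ...}, ...]}.
    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434"

    def detect_ollama(timeout: float = 1.0) -> list[str]:
        """Return names of locally available Ollama models, or [] if no server responds."""
        try:
            with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=timeout) as resp:
                data = json.load(resp)
        except (OSError, ValueError):
            return []  # Ollama not running, unreachable, or returned unexpected data
        return [model["name"] for model in data.get("models", [])]

    if __name__ == "__main__":
        models = detect_ollama()
        if models:
            print("Ollama detected; local models:", ", ".join(models))
        else:
            print("No Ollama instance reachable at", OLLAMA_URL)

If the probe succeeds, the wizard could offer the returned model names for selection; if it fails, it could fall back to the existing remote-provider options.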

Expected benefit

  • Improved Linux support
  • Fully local, offline-capable setup
  • Better alignment with a local-first workflow

Thanks for the great project 🚀
