Feature request: Add Ollama as a local model provider in QuickStart #8239
Context
I’m trying to set up OpenClaw locally on a Linux notebook.
The current QuickStart flow only allows selecting remote providers (OpenAI, Anthropic, etc.), which blocks a local-first setup.
Problem
- No option to select a local model provider
- No Ollama integration in the onboarding wizard
- Difficult or unclear local setup on Linux
As a result, I haven’t been able to complete a fully local installation using the wizard.
Requested feature
Add Ollama as a first-class model provider in the QuickStart flow.
Suggested behavior
- Allow selecting `ollama` as a provider during onboarding
- Automatically detect a running Ollama instance at `localhost:11434` (if available); a rough detection sketch follows below
- Provide basic local model selection (e.g. llama3, mistral, codellama)
Expected benefit
- Improved Linux support
- Fully local, offline-capable setup
- Better alignment with a local-first workflow
Thanks for the great project 🚀