Fix ollama support with AI SDK #1504
Conversation
Ollama models now work with the AI SDK provider format (e.g., `ollama/qwen3:1.7b`). The fix addresses two issues:

1. Filter out undefined values when checking if `clientOptions` has meaningful data
2. Skip error logging for providers that intentionally don't need API keys

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Haiku 4.5 <[email protected]>
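The first fix can be sketched as follows. `hasMeaningfulValues` and the `ClientOptions` shape are illustrative stand-ins, not Stagehand's actual internals; only the `Object.values(...).some(v => v !== undefined)` check comes from this PR.

```typescript
// Illustrative sketch of fix #1 (helper name and ClientOptions shape are
// assumptions; the .some(v => v !== undefined) check is from the PR).
type ClientOptions = { apiKey?: string; baseURL?: string };

function hasMeaningfulValues(opts?: ClientOptions): boolean {
  if (!opts) return false;
  // Previously, { apiKey: undefined } counted as real configuration;
  // filtering out undefined values keeps ollama on the keyless path.
  return Object.values(opts).some((v) => v !== undefined);
}

console.log(hasMeaningfulValues(undefined));             // false
console.log(hasMeaningfulValues({}));                    // false
console.log(hasMeaningfulValues({ apiKey: undefined })); // false
console.log(hasMeaningfulValues({ baseURL: "http://localhost:11434" })); // true
```

Note that `Object.values({ apiKey: undefined })` yields `[undefined]`, which is why the explicit `!== undefined` filter matters.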
No issues found across 2 files
Greptile Summary
Fixes Ollama support by properly detecting when `clientOptions` contains no meaningful (non-undefined) values. The root cause was that when using a format like `ollama/qwen3:1.7b`, a `clientOptions` object holding only undefined values was treated as real configuration, routing the lookup to the API-key provider map, where ollama does not appear.
Confidence Score: 5/5
Important Files Changed
Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant LLMProvider
    participant getAISDKLanguageModel
    participant AISDKProviders
    participant AISDKProvidersWithAPIKey
    participant loadApiKeyFromEnv
    User->>LLMProvider: getClient("ollama/qwen3:1.7b", clientOptions)
    LLMProvider->>getAISDKLanguageModel: call with subProvider="ollama", clientOptions
    Note over getAISDKLanguageModel: NEW: Check if clientOptions<br/>has meaningful values
    getAISDKLanguageModel->>getAISDKLanguageModel: Object.values(clientOptions)<br/>.some(v => v !== undefined)
    alt clientOptions has meaningful values
        getAISDKLanguageModel->>AISDKProvidersWithAPIKey: lookup provider with API key
        AISDKProvidersWithAPIKey-->>getAISDKLanguageModel: throw error (ollama not in map)
    else no meaningful values (ollama case)
        getAISDKLanguageModel->>AISDKProviders: lookup default provider
        AISDKProviders-->>getAISDKLanguageModel: return ollama provider
    end
    getAISDKLanguageModel-->>LLMProvider: return language model
    User->>loadApiKeyFromEnv: load API key for "ollama"
    Note over loadApiKeyFromEnv: NEW: Check if provider<br/>needs API key
    loadApiKeyFromEnv->>loadApiKeyFromEnv: providersWithoutApiKey.has("ollama")
    alt provider needs API key
        loadApiKeyFromEnv->>loadApiKeyFromEnv: log warning if env var not found
    else provider doesn't need API key (ollama)
        loadApiKeyFromEnv->>loadApiKeyFromEnv: skip warning log
    end
    loadApiKeyFromEnv-->>User: return undefined (no error)
```
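The second half of the diagram (the warning skip in `loadApiKeyFromEnv`) can be sketched like this. The `PROVIDER_API_KEY` env-var naming convention is an assumption; only the `providersWithoutApiKey.has("ollama")` check is taken from the diagram.

```typescript
// Hedged sketch of the loadApiKeyFromEnv branch (env-var naming is an
// assumption; the providersWithoutApiKey.has(...) check is from the PR).
const providersWithoutApiKey = new Set<string>(["ollama"]);

function loadApiKeyFromEnv(provider: string): string | undefined {
  const envVar = `${provider.toUpperCase()}_API_KEY`;
  const key = process.env[envVar];
  if (key === undefined && !providersWithoutApiKey.has(provider)) {
    // Only providers that actually require a key get the warning.
    console.warn(`No API key found in ${envVar} for provider "${provider}"`);
  }
  // For ollama this returns undefined silently: no warning, no error.
  return key;
}
```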
Tests verify that `getAISDKLanguageModel` correctly handles:

- ollama without clientOptions
- ollama with empty clientOptions
- ollama with undefined/null apiKey values

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Haiku 4.5 <[email protected]>
This ensures ollama works even when users mistakenly provide an API key or want to set a custom baseURL for a remote ollama server. 🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Haiku 4.5 <[email protected]>
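A remote-ollama setup along these lines might look like the following. The `modelName` and `modelClientOptions` names follow Stagehand's config shape, but the host URL is a placeholder and the exact `baseURL` handling here is an assumption, not confirmed by this PR.

```typescript
// Illustrative config (option names follow Stagehand's config shape;
// the remote host URL is a placeholder, not a real endpoint).
const config = {
  modelName: "ollama/qwen3:1.7b", // AI SDK provider format
  modelClientOptions: {
    baseURL: "http://remote-ollama-host:11434", // custom remote server
    // no apiKey needed for ollama
  },
};

console.log(config.modelName); // "ollama/qwen3:1.7b"
```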
cb197ac to d3001d5
Summary
Ollama models now work with the AI SDK provider format (e.g., `ollama/qwen3:1.7b`). Previously, using ollama would fail with `UnsupportedAISDKModelProviderError`. Closes #1164

Changes
- Filter out undefined values when checking if `clientOptions` has meaningful data in `getAISDKLanguageModel`
- Skip the missing-API-key warning for providers that intentionally don't need API keys

Testing
Tested with `model: "ollama/qwen3:1.7b"` and verified Stagehand initializes successfully without errors.

Summary by cubic
Enable Ollama models to work with the AI SDK provider format (e.g., ollama/qwen3:1.7b). Fixes initialization errors and removes unnecessary API key warnings.
Written for commit d3001d5. Summary will update on new commits.