fix: support local providers (ollama, new-api, aihubmix) for extended thinking and API server #12796
Merged
kangfenmao merged 6 commits into main on Mar 12, 2026
What this PR does
Before this PR:

- aihubmix exposed only Claude models as Anthropic-compatible, via the filter `m.id.includes('claude')`
- ollama had no `anthropicApiHost` configuration, so its models could not use extended thinking or be served through the API server

After this PR:

- ollama gets an `anthropicApiHost` configuration pointing at the local server
- aihubmix's model checker becomes `() => true`, allowing all its Anthropic-compatible models

Fixes: #13247
Why we need it and why it was done in this way
**Ollama extended thinking:** Local AI models through Ollama should be able to access extended thinking features like other Anthropic-compatible providers. The implementation adds ollama to the Anthropic-compatible provider list and configures it with an `anthropicApiHost` pointing to the local Ollama server.

**API server provider expansion:** `new-api` already had Anthropic model support via `getProviderAnthropicModelChecker` but was missing from `supportedTypes`, so its models weren't exposed through the API server. Similarly, ollama needs to be in the supported providers list for `validateModelId` to resolve ollama models for agents.

**aihubmix model checker:** aihubmix now supports all LLM models as Anthropic-compatible (not just Claude models), so the previous Claude-only filter was overly restrictive.
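The checker change can be sketched as follows. This is a minimal illustration, not the actual Cherry Studio code — `Model`, `anthropicModelCheckers`, and `anthropicCompatibleModels` are hypothetical names standing in for the per-provider checker machinery described above:

```typescript
// Hypothetical sketch; identifiers are illustrative, not the real source.
type Model = { id: string }

// Per-provider predicate deciding which models count as Anthropic-compatible.
const anthropicModelCheckers: Record<string, (m: Model) => boolean> = {
  // Previously aihubmix used a Claude-only filter:
  //   aihubmix: (m) => m.id.includes('claude')
  // After this PR every aihubmix model qualifies:
  aihubmix: () => true,
  // All local ollama models are exposed as Anthropic-compatible:
  ollama: () => true
}

function anthropicCompatibleModels(provider: string, models: Model[]): Model[] {
  const check = anthropicModelCheckers[provider]
  // Providers without a checker expose no Anthropic-compatible models.
  return check ? models.filter(check) : []
}
```

With the old Claude-only filter, an aihubmix model list like `[qwen3, claude-3-5-sonnet]` would expose only the Claude entry; with `() => true` both pass through.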
**API key bypass:** Local providers (ollama, lmstudio) don't require real API keys, but downstream SDK calls reject empty keys. The validation method now sets a placeholder key for these providers, documented via JSDoc.
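A minimal sketch of that placeholder logic, assuming a simplified `Provider` shape (the real function and its surrounding validation loop differ):

```typescript
// Hypothetical sketch of the API-key bypass; names are illustrative.
interface Provider {
  id: string
  apiKey: string
}

// Hoisted to module scope so it is not re-allocated per loop iteration.
const localProvidersWithoutApiKey = new Set(['ollama', 'lmstudio'])

/**
 * Side effect: mutates `provider.apiKey` for local providers, because
 * downstream SDK clients reject an empty API key string.
 */
function ensureApiKey(provider: Provider): void {
  if (localProvidersWithoutApiKey.has(provider.id) && provider.apiKey === '') {
    provider.apiKey = 'placeholder' // never sent to a real remote service
  }
}
```

Remote providers with an empty key are deliberately left untouched, so their normal "missing API key" validation errors still fire.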
Breaking changes
None — this is a backwards-compatible enhancement.
Special notes for your reviewer
- Existing ollama configs get an `anthropicApiHost`, with a fallback to `http://localhost:11434` if `apiHost` is empty
- The `validateAgentModels` JSDoc now documents the side effect of mutating `provider.apiKey` for local providers
- `localProvidersWithoutApiKey` is hoisted outside the loop to avoid per-iteration allocation
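The host fallback can be sketched like this; `OllamaConfig` and `migrateOllamaConfig` are assumed names for illustration, not the actual migration code:

```typescript
// Hypothetical sketch of the anthropicApiHost fallback; field names are assumptions.
interface OllamaConfig {
  apiHost?: string
  anthropicApiHost?: string
}

const DEFAULT_OLLAMA_HOST = 'http://localhost:11434'

// Existing configs get an anthropicApiHost derived from apiHost,
// falling back to the default local Ollama address when apiHost is empty.
function migrateOllamaConfig(config: OllamaConfig): OllamaConfig {
  return {
    ...config,
    anthropicApiHost: config.anthropicApiHost ?? (config.apiHost || DEFAULT_OLLAMA_HOST)
  }
}
```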