feat(code-tools): add LMStudio and Ollama support for Claude Code #13518
Conversation
Reviewed the changes. This PR adds LMStudio and Ollama as local Anthropic-compatible providers for Claude Code. The implementation is clean and focused:

1. Added the `LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS` constant
2. Updated the provider filter in `CLI_TOOL_PROVIDER_MAP` to include local services
3. Added a model predicate check for local services

The change is small (2 files, +18/-2) with no breaking changes and addresses issue #13487. No blocking issues found.
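The three changes the review lists can be sketched as follows. The constant and map names come from the PR's commit messages, but the surrounding types and function names here are assumptions for illustration, not the actual diff:

```typescript
// Illustrative sketch only; Provider, isClaudeCodeProvider, and
// modelPredicate's exact shape are assumptions, not the PR's code.
interface Provider {
  id: string
  apiHost: string
}

// 1. Constant listing local Anthropic-compatible providers
const LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS: string[] = ['lmstudio', 'ollama']

// 2. Provider filter: local services now qualify for the Claude Code tool
function isClaudeCodeProvider(p: Provider): boolean {
  return p.id === 'anthropic' || LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS.includes(p.id)
}

// 3. Model predicate: models from local services become selectable
function modelPredicate(providerId: string): boolean {
  return providerId === 'anthropic' || LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS.includes(providerId)
}
```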
Note: This comment was translated by Claude.

Should we first add an Anthropic apiHost to LMStudio?
Mainly considering that the user's port/host may change; using the Anthropic host could affect the detection.
It shouldn't change; it defaults to the apiHost the user filled in.
OK, I misread it 😭 I thought it was static.
Is it sufficient to simply add the anthropicApiHost endpoint to Ollama and LMStudio?
Force-pushed from 60a7992 to 62f5d3b
Force-pushed from 62f5d3b to fa3ce0c
- Add `LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS` constant for local services
- Update `CLI_TOOL_PROVIDER_MAP` to include LMStudio and Ollama
- Update `modelPredicate` to allow selecting LMStudio/Ollama models
- Use `apiHost` directly for local services to support custom ports

Fixes #13487
Remove `as const` from `LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS` to match the other provider array patterns. This eliminates the need for `as any` type assertions when calling `includes()`.
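The `as const` point can be illustrated in isolation. With a const assertion the array is typed as a readonly tuple, so `includes()` only accepts the literal union `'lmstudio' | 'ollama'`; checking an arbitrary `string` then needs an `as any` escape hatch in strict mode. This is a generic TypeScript sketch, not the PR's actual diff:

```typescript
// With `as const`, the array is readonly ['lmstudio', 'ollama'] and
// includes() only accepts that literal union:
const LOCKED = ['lmstudio', 'ollama'] as const
const id: string = 'ollama'
// LOCKED.includes(id)         // compile error under strict mode
// LOCKED.includes(id as any)  // the assertion the commit removes the need for
console.log(LOCKED.length) // 2

// Without `as const`, the array is string[], so includes() accepts any string:
const OPEN = ['lmstudio', 'ollama']
console.log(OPEN.includes(id)) // true
```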
… provider list

Replace `LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS` with a dynamic check for `anthropicApiHost` presence. This is more maintainable and allows any provider with `anthropicApiHost` configured to work with Claude Code.
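The dynamic check replacing the hard-coded list might look like this. The `anthropicApiHost` field name comes from the commit message; the `ProviderConfig` shape and `supportsClaudeCode` name are illustrative assumptions:

```typescript
// Sketch of the dynamic check; field name from the commit message,
// everything else hypothetical.
interface ProviderConfig {
  id: string
  apiHost: string
  anthropicApiHost?: string
}

// No hard-coded provider list to maintain: any provider exposing an
// Anthropic-compatible endpoint qualifies for Claude Code.
function supportsClaudeCode(p: ProviderConfig): boolean {
  return Boolean(p.anthropicApiHost)
}
```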
Force-pushed from fa3ce0c to 78c7b48
### What this PR does

Before this PR: users could not select LMStudio or Ollama models when using Claude Code in the Code Tools feature, even though both local services now support Anthropic API endpoints (`/v1/messages`).

After this PR:

- LMStudio and Ollama providers are available for selection in the Claude Code tool
- Environment variables are correctly generated using `apiHost` (respecting user-configured ports)
- Users can configure custom ports, and the settings will be properly passed to Claude Code

Fixes #13487

### Why we need it and why it was done in this way

The following tradeoffs were made:

- For local services (LMStudio, Ollama), we use `apiHost` directly instead of a separate `anthropicApiHost`, because these services share the same host for both OpenAI and Anthropic APIs, and users often customize ports
- Added explicit type assertions for readonly array checks to satisfy TypeScript strict mode

The following alternatives were considered:

- Using a static `anthropicApiHost` for Ollama (rejected: doesn't reflect user port changes)
- Adding `anthropicApiHost` to the LMStudio config (rejected: redundant, since it would duplicate `apiHost`)

Links to places where the discussion took place:

- Issue: #13487
- LMStudio docs: https://lmstudio.ai/docs/integrations/claude-code

### Breaking changes

None. This is a pure addition; no existing functionality is affected.

### Special notes for your reviewer

- LMStudio requires starting the server with Anthropic compatibility mode enabled
- Ollama needs models that support the Anthropic format (newer models like qwen3-coder, glm-4.7)
- Users should use models with >25k context for the best Claude Code experience

### Checklist

This checklist is not enforced, but it is a reminder of items that could be relevant to every PR. Approvers are expected to review this list.

- [x] PR: The PR description is expressive enough and will help future contributors
- [x] Code: [Write code that humans can understand](https://en.wikiquote.org/wiki/Martin_Fowler#code-for-humans) and [Keep it simple](https://en.wikipedia.org/wiki/KISS_principle)
- [x] Refactor: You have [left the code cleaner than you found it (Boy Scout Rule)](https://learning.oreilly.com/library/view/97-things-every/9780596809515/ch08.html)
- [x] Upgrade: Impact of this change on upgrade flows was considered and addressed if required
- [x] Documentation: A [user-guide update](https://docs.cherry-ai.com) was considered and is present (link) or not required. Check this only when the PR introduces or changes a user-facing feature or behavior.
- [x] Self-review: I have reviewed my own code (e.g., via [`/gh-pr-review`](/.claude/skills/gh-pr-review/SKILL.md), `gh pr diff`, or the GitHub UI) before requesting review from others

### Release note

```release-note
feat(code-tools): add LMStudio and Ollama support for Claude Code

Users can now select LMStudio and Ollama models in Code Tools when using Claude Code. Both local AI services support Anthropic API endpoints, enabling seamless integration. The implementation correctly handles custom port configurations by using the provider's `apiHost` directly.
```
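The env-var generation described above could be sketched as follows. `ANTHROPIC_BASE_URL` is Claude Code's standard base-URL override, but `buildClaudeCodeEnv` and the `LocalProvider` shape are illustrative assumptions, not the PR's actual code:

```typescript
// Illustrative sketch: pass the user-configured apiHost straight through so
// custom ports survive into the Claude Code child process.
interface LocalProvider {
  id: 'lmstudio' | 'ollama'
  apiHost: string // user-configured, e.g. http://localhost:1234
}

function buildClaudeCodeEnv(p: LocalProvider): Record<string, string> {
  // Strip a trailing slash so path joining stays predictable.
  return { ANTHROPIC_BASE_URL: p.apiHost.replace(/\/$/, '') }
}

console.log(buildClaudeCodeEnv({ id: 'ollama', apiHost: 'http://localhost:11434/' }))
// → { ANTHROPIC_BASE_URL: 'http://localhost:11434' }
```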