
feat(code-tools): add LMStudio and Ollama support for Claude Code#13518

Merged
DeJeune merged 5 commits into main from feat/lmstudio-ollama-claude-code on Mar 25, 2026
Conversation

@GeorgeDong32
Collaborator

What this PR does

Before this PR:

Users could not select LMStudio or Ollama models when using Claude Code in the Code Tools feature, even though both local services now support Anthropic API endpoints (/v1/messages).

After this PR:

  • LMStudio and Ollama providers are now available for selection in Claude Code tool
  • Environment variables are correctly generated using apiHost (respecting user-configured ports)
  • Users can configure custom ports and the settings will be properly passed to Claude Code
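
The environment-variable wiring described above can be sketched roughly as follows. This is an illustrative sketch, not the repository's actual code: the `Provider` shape and the `buildClaudeCodeEnv` helper are hypothetical, while `ANTHROPIC_BASE_URL` / `ANTHROPIC_API_KEY` / `ANTHROPIC_MODEL` are the standard overrides Claude Code reads.

```typescript
// Hypothetical sketch: derive Claude Code env vars from a provider config.
// The Provider shape and helper name are illustrative, not Cherry Studio's code.
interface Provider {
  id: string
  apiHost: string // e.g. "http://localhost:1234" (LMStudio) or "http://localhost:11434" (Ollama)
  apiKey?: string
}

function buildClaudeCodeEnv(provider: Provider, model: string): Record<string, string> {
  // Using apiHost directly means a user-configured port
  // (say http://localhost:8080) is respected as-is.
  return {
    ANTHROPIC_BASE_URL: provider.apiHost.replace(/\/+$/, ''), // strip trailing slash
    ANTHROPIC_API_KEY: provider.apiKey ?? 'not-needed', // local servers generally ignore the key
    ANTHROPIC_MODEL: model
  }
}
```

The point of routing through `apiHost` is visible here: whatever host and port the user typed into the provider settings flows straight into `ANTHROPIC_BASE_URL`.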

Fixes #13487

Why we need it and why it was done in this way

The following tradeoffs were made:

  • For local services (LMStudio, Ollama), we use apiHost directly instead of a separate anthropicApiHost because these services share the same host for both OpenAI and Anthropic APIs, and users often customize ports
  • Added explicit type assertions for readonly array checks to satisfy TypeScript strict mode
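
The readonly-array friction mentioned in the second tradeoff is a common TypeScript strict-mode issue. A minimal illustration (the constant name matches this PR's commits; the surrounding function is hypothetical):

```typescript
// With `as const`, the element type is the literal union 'lmstudio' | 'ollama',
// so `includes` rejects a plain string argument under strict mode.
const LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS = ['lmstudio', 'ollama'] as const

// Hypothetical helper illustrating the workaround: widen the tuple to
// readonly string[] before calling includes().
function isLocalAnthropicProvider(providerId: string): boolean {
  return (LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS as readonly string[]).includes(providerId)
}
```

A later commit in this PR takes the other route: dropping `as const` so the array is a plain `string[]`, which removes the need for any assertion at the call site.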

The following alternatives were considered:

  • Using static anthropicApiHost for Ollama (rejected: doesn't reflect user port changes)
  • Adding anthropicApiHost to LMStudio config (rejected: redundant since it would duplicate apiHost)

Links to places where the discussion took place:

  • Issue: #13487
  • LMStudio docs: https://lmstudio.ai/docs/integrations/claude-code

Breaking changes

None. This is a pure addition - no existing functionality is affected.

Special notes for your reviewer

  • LMStudio requires starting the server with Anthropic compatibility mode enabled
  • Ollama needs models that support Anthropic format (newer models like qwen3-coder, glm-4.7)
  • Users should use models with >25k context for optimal Claude Code experience
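
Reviewers who want to confirm that a local server actually speaks the Anthropic messages format can probe it directly. This sketch assumes Ollama's default port 11434 and a `qwen3-coder` model; the host, model, and helper names are all placeholders to adjust for your setup:

```typescript
// Sketch: probe a local server's Anthropic-compatible /v1/messages endpoint.
// Host and model below are assumptions; substitute your own configuration.
function buildMessagesRequest(model: string, prompt: string) {
  return {
    method: 'POST',
    headers: {
      'content-type': 'application/json',
      'anthropic-version': '2023-06-01',
      'x-api-key': 'not-needed' // local servers generally ignore the key
    },
    body: JSON.stringify({
      model,
      max_tokens: 64,
      messages: [{ role: 'user', content: prompt }]
    })
  }
}

async function probe(host: string, model: string): Promise<boolean> {
  const res = await fetch(`${host}/v1/messages`, buildMessagesRequest(model, 'ping'))
  return res.ok // a 404 here suggests Anthropic compatibility is not enabled
}

// Example: probe('http://localhost:11434', 'qwen3-coder').then(ok => console.log(ok))
```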

Checklist

This checklist is not enforcing, but it's a reminder of items that could be relevant to every PR.
Approvers are expected to review this list.

Release note

feat(code-tools): add LMStudio and Ollama support for Claude Code

Users can now select LMStudio and Ollama models in Code Tools when using Claude Code. Both local AI services support Anthropic API endpoints, enabling seamless integration. The implementation correctly handles custom port configurations by using the provider's `apiHost` directly.

Copy link
Copy Markdown
Contributor

@cherry-ai-bot (bot) left a comment


Reviewed the changes. This PR adds LMStudio and Ollama as local Anthropic-compatible providers for Claude Code. The implementation is clean and focused:

1. Added the LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS constant
2. Updated the provider filter in CLI_TOOL_PROVIDER_MAP to include local services
3. Added a model predicate check for local services

The change is small (2 files, +18/-2) with no breaking changes. The fix addresses Issue #13487. No blocking issues found.

@DeJeune
Collaborator

DeJeune commented Mar 16, 2026

Should we first add anthropic apihost to lmstudio?



@GeorgeDong32
Collaborator Author

GeorgeDong32 commented Mar 16, 2026


> Should we first add anthropic apihost to lmstudio?

Mainly considering that the user's port/host may change, and using anthropic host may affect the judgment.



@DeJeune
Collaborator

DeJeune commented Mar 16, 2026


> Should we first add anthropic apihost to lmstudio?
> Mainly considering that the user's port/host may change, and using anthropic host may affect the judgment.

It shouldn't change, it defaults to the apiHost filled by the user.



@GeorgeDong32
Collaborator Author

GeorgeDong32 commented Mar 16, 2026


> Should we first add anthropic apihost to lmstudio?
> Mainly considering that the user's port/host may change, and using anthropic host may affect the judgment.
> It shouldn't change, it defaults to the apiHost filled by the user.

OK, I misread it 😭 I thought it was static



@kangfenmao
Collaborator

Is it sufficient to simply add the anthropicApiHost endpoint to ollama and lmstudio?

@GeorgeDong32 GeorgeDong32 requested a review from 0xfullex as a code owner March 19, 2026 06:26
@GeorgeDong32 GeorgeDong32 force-pushed the feat/lmstudio-ollama-claude-code branch from 60a7992 to 62f5d3b on March 19, 2026 10:36
@GeorgeDong32 GeorgeDong32 requested a review from kangfenmao March 19, 2026 10:55
@GeorgeDong32 GeorgeDong32 force-pushed the feat/lmstudio-ollama-claude-code branch from 62f5d3b to fa3ce0c on March 22, 2026 13:20
- Add LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS constant for local services
- Update CLI_TOOL_PROVIDER_MAP to include LMStudio and Ollama
- Update modelPredicate to allow selecting LMStudio/Ollama models
- Use apiHost directly for local services to support custom ports

Fixes #13487
Remove as const from LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS to match
other provider array patterns. This eliminates the need for as any
type assertions when calling includes().
… provider list

Replace LOCAL_ANTHROPIC_COMPATIBLE_PROVIDERS with dynamic check for
anthropicApiHost presence. This is more maintainable and allows any
provider with anthropicApiHost configured to work with Claude Code.
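
The dynamic check this commit message describes can be sketched as follows. The `Provider` shape and predicate name are illustrative, not the repository's actual code; the logic mirrors the commit's description of keying off `anthropicApiHost` presence:

```typescript
// Hypothetical sketch of the dynamic provider check described above.
interface Provider {
  id: string
  apiHost: string
  anthropicApiHost?: string
}

// Instead of a hard-coded list, any provider with an Anthropic-style
// endpoint configured qualifies for Claude Code.
function supportsClaudeCode(provider: Provider): boolean {
  return typeof provider.anthropicApiHost === 'string' && provider.anthropicApiHost.length > 0
}
```

Compared with a fixed provider list, this means a newly added provider works with Claude Code as soon as its config gains an `anthropicApiHost`, with no filter to update.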
@GeorgeDong32 GeorgeDong32 force-pushed the feat/lmstudio-ollama-claude-code branch from fa3ce0c to 78c7b48 on March 22, 2026 13:49
@DeJeune DeJeune merged commit 816cfe6 into main Mar 25, 2026
7 checks passed
@DeJeune DeJeune deleted the feat/lmstudio-ollama-claude-code branch March 25, 2026 09:59
MyPrototypeWhat pushed a commit that referenced this pull request Mar 30, 2026
…3518)

### What this PR does

Before this PR:

Users could not select LMStudio or Ollama models when using Claude Code
in the Code Tools feature, even though both local services now support
Anthropic API endpoints (`/v1/messages`).

After this PR:

- LMStudio and Ollama providers are now available for selection in
Claude Code tool
- Environment variables are correctly generated using `apiHost`
(respecting user-configured ports)
- Users can configure custom ports and the settings will be properly
passed to Claude Code

Fixes #13487

### Why we need it and why it was done in this way

The following tradeoffs were made:

- For local services (LMStudio, Ollama), we use `apiHost` directly
instead of a separate `anthropicApiHost` because these services share
the same host for both OpenAI and Anthropic APIs, and users often
customize ports
- Added explicit type assertions for readonly array checks to satisfy
TypeScript strict mode

The following alternatives were considered:

- Using static `anthropicApiHost` for Ollama (rejected: doesn't reflect
user port changes)
- Adding `anthropicApiHost` to LMStudio config (rejected: redundant
since it would duplicate `apiHost`)

Links to places where the discussion took place:

- Issue: #13487
- LMStudio docs: https://lmstudio.ai/docs/integrations/claude-code

### Breaking changes

None. This is a pure addition - no existing functionality is affected.

### Special notes for your reviewer

- LMStudio requires starting the server with Anthropic compatibility
mode enabled
- Ollama needs models that support Anthropic format (newer models like
qwen3-coder, glm-4.7)
- Users should use models with >25k context for optimal Claude Code
experience

### Checklist

This checklist is not enforcing, but it's a reminder of items that could
be relevant to every PR.
Approvers are expected to review this list.

- [x] PR: The PR description is expressive enough and will help future
contributors
- [x] Code: [Write code that humans can
understand](https://en.wikiquote.org/wiki/Martin_Fowler#code-for-humans)
and [Keep it simple](https://en.wikipedia.org/wiki/KISS_principle)
- [x] Refactor: You have [left the code cleaner than you found it (Boy
Scout
Rule)](https://learning.oreilly.com/library/view/97-things-every/9780596809515/ch08.html)
- [x] Upgrade: Impact of this change on upgrade flows was considered and
addressed if required
- [x] Documentation: A [user-guide update](https://docs.cherry-ai.com)
was considered and is present (link) or not required. Check this only
when the PR introduces or changes a user-facing feature or behavior.
- [x] Self-review: I have reviewed my own code (e.g., via
[`/gh-pr-review`](/.claude/skills/gh-pr-review/SKILL.md), `gh pr diff`,
or GitHub UI) before requesting review from others

### Release note

```release-note
feat(code-tools): add LMStudio and Ollama support for Claude Code

Users can now select LMStudio and Ollama models in Code Tools when using Claude Code. Both local AI services support Anthropic API endpoints, enabling seamless integration. The implementation correctly handles custom port configurations by using the provider's `apiHost` directly.
```


Development

Successfully merging this pull request may close these issues.

[Feature]: Can LM-Studio support be added for claude code in code tools?
