
Tracking: Custom Provider Base URLs #3731

@sebslight

Description

Custom Provider Base URLs Tracking Issue

This issue tracks requests for custom API endpoint support across different model providers.

Related Issues

By Provider

Related TTS Issue

Use Cases

  1. Internal LLM gateways — Cost tracking, security, compliance, audit logging
  2. Local proxies — LiteLLM for smart routing (Ollama → Gemini → Claude fallback)
  3. Self-hosted models — Route to local inference servers
  4. Rate limit management — Distribute load across multiple endpoints
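The local-proxy use case above can be sketched with a LiteLLM proxy config. This is illustrative only: model names, the port, and the fallback pairing are placeholder choices, not part of this issue.

```yaml
# Illustrative LiteLLM proxy config — model names and ports are examples
model_list:
  - model_name: local-llama
    litellm_params:
      model: ollama/llama3              # served by a local Ollama instance
      api_base: http://localhost:11434  # custom base URL for the local server
```

The proxy then exposes an OpenAI-compatible endpoint, which is exactly the kind of custom base URL the client would need to point at.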

Current State

| Provider  | baseUrl support | Notes |
|-----------|-----------------|-------|
| OpenAI    | Partial | `OPENAI_BASE_URL` env var works for some features |
| Anthropic | No      | Crashes with "Unhandled API in mapOptionsForApi" |
| Ollama    | Yes     | Works, but other bugs (#3597, #3153) |
| Custom    | Partial | Works for tools, not always for model providers |
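For the partial OpenAI support noted above, the workaround today is the environment variable. The URL here is a hypothetical local proxy address:

```shell
# Hypothetical: route OpenAI-compatible calls through a local proxy
export OPENAI_BASE_URL="http://localhost:4000/v1"
```

This only covers the features that read the env var, which is why the proposal below asks for first-class `baseUrl` config across all providers.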

Proposed Solution

  1. All providers should support baseUrl in config
  2. API type detection should not assume official endpoints
  3. Validation should be relaxed when custom baseUrl is configured
  4. Document which features work with custom endpoints

Acceptance Criteria

  • models.providers.<provider>.baseUrl works for all built-in providers
  • No crashes when using custom endpoints
  • Clear documentation for proxy/gateway setups
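A config shape satisfying the first criterion might look like the following. The structure is inferred from the `models.providers.<provider>.baseUrl` key above; the gateway URLs are hypothetical:

```json
{
  "models": {
    "providers": {
      "openai":    { "baseUrl": "https://llm-gateway.internal.example/openai" },
      "anthropic": { "baseUrl": "https://llm-gateway.internal.example/anthropic" }
    }
  }
}
```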
