Add a Mistral AI provider adapter #1

Description

@jstuart0

Goal

Let users plug in a Mistral API key as an LLM provider (alongside Anthropic, OpenAI, OpenRouter, Google, and OpenAI-compatible).

Background

The AI watcher calls an LLM adapter to decide on next prompts / summaries / HITL asks. Adapters live in src/server/services/ai/llm/. Each one implements the LlmAdapter interface (types.ts). Mistral speaks a simple JSON API over HTTPS, similar to OpenAI chat completions.
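For orientation, the request/response shapes involved look roughly like this. This is a hedged sketch: the field names follow Mistral's OpenAI-style chat completions API, and the real `LlmAdapter` interface in `types.ts` is authoritative.

```typescript
// Sketch of the wire shapes for a Mistral chat completion call.
// The authoritative adapter contract lives in types.ts.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface MistralChatRequest {
  model: string;
  messages: ChatMessage[];
  temperature?: number;
}

interface MistralChatResponse {
  choices: { message: ChatMessage }[];
  usage: { prompt_tokens: number; completion_tokens: number };
}

// Example request body as it would be serialized:
const req: MistralChatRequest = {
  model: "mistral-small-latest",
  messages: [{ role: "user", content: "Summarize this session." }],
};
console.log(JSON.stringify(req));
```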

Where to start

  • Read src/server/services/ai/llm/anthropic.ts as a reference (also openai-compatible.ts which covers the OpenAI chat shape).
  • Create src/server/services/ai/llm/mistral.ts exporting createMistralAdapter({ apiKey, baseUrl? }).
  • Wire it in src/server/services/ai/llm/registry.ts — add "mistral" to ProviderKind in types.ts, return the new adapter from getAdapter().
  • Add pricing in src/server/services/ai/llm/pricing.ts (mistral-small, mistral-large, mistral-nemo input/output cents/1M).
  • Add a PROVIDER_KINDS entry in src/server/routes/ai.ts and the UI dropdown in src/web/components/settings/AiSettingsPanel.tsx.
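A minimal sketch of what `mistral.ts` could look like, assuming a `complete`/`listModels` shape — the actual `LlmAdapter` interface in `types.ts` dictates the real method names and signatures:

```typescript
// Hedged sketch of src/server/services/ai/llm/mistral.ts.
// Method names (complete, listModels) are assumptions; match types.ts.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

interface MistralAdapterOptions {
  apiKey: string;
  baseUrl?: string; // override for proxies/tests; defaults to the public endpoint
}

function createMistralAdapter({
  apiKey,
  baseUrl = "https://api.mistral.ai",
}: MistralAdapterOptions) {
  const headers = {
    Authorization: `Bearer ${apiKey}`,
    "Content-Type": "application/json",
  };
  return {
    kind: "mistral" as const,
    // Send a chat completion request and return the assistant text.
    async complete(model: string, messages: ChatMessage[]): Promise<string> {
      const res = await fetch(`${baseUrl}/v1/chat/completions`, {
        method: "POST",
        headers,
        body: JSON.stringify({ model, messages }),
      });
      if (!res.ok) throw new Error(`Mistral API error: ${res.status}`);
      const data = await res.json();
      return data.choices[0].message.content;
    },
    // Backs "Load available models" in the provider form.
    async listModels(): Promise<string[]> {
      const res = await fetch(`${baseUrl}/v1/models`, { headers });
      if (!res.ok) throw new Error(`Mistral API error: ${res.status}`);
      const data = await res.json();
      return data.data.map((m: { id: string }) => m.id);
    },
  };
}

const adapter = createMistralAdapter({ apiKey: "test-key" });
console.log(adapter.kind);
```

Taking `baseUrl` as an option keeps the adapter testable against a local stub, mirroring how the OpenAI-compatible adapter handles custom endpoints.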

Acceptance criteria

  • POST /api/v1/ai/providers with kind: "mistral" persists cleanly.
  • The existing watcher test (src/server/services/ai/llm/llm.test.ts) is extended with a Mistral mock covering happy-path + 429 + 5xx.
  • "Load available models" in the provider form works (Mistral has GET /v1/models).
  • Typecheck + Biome + tests all green.
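One way the Mistral mock for `llm.test.ts` might be sketched, stubbing responses by hand — the real test should reuse the project's existing harness, and all names here are illustrative:

```typescript
// Hedged sketch: a factory for fake fetch implementations covering the
// happy path plus 429 and 5xx, per the acceptance criteria.
type FetchLike = typeof fetch;

function mistralMock(status: number, body: unknown): FetchLike {
  return async () =>
    new Response(JSON.stringify(body), {
      status,
      headers: { "Content-Type": "application/json" },
    });
}

// Happy path: 200 with a single assistant choice.
const ok = mistralMock(200, {
  choices: [{ message: { role: "assistant", content: "done" } }],
});

// Error paths the watcher must survive: rate limit and server error.
const rateLimited = mistralMock(429, { message: "rate limited" });
const serverError = mistralMock(503, { message: "unavailable" });

async function demo() {
  const res = await ok("https://api.mistral.ai/v1/chat/completions");
  const data = await res.json();
  console.log(data.choices[0].message.content, (await rateLimited("-")).status);
}
demo();
```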


Labels

  • ai-provider — LLM provider adapters (Anthropic, OpenAI, Ollama, etc.)
  • enhancement — New feature or request
  • good first issue — Good for newcomers
