Goal
Let users plug in a Mistral API key as an LLM provider (alongside Anthropic, OpenAI, OpenRouter, Google, and OpenAI-compatible).
Background
The AI watcher calls an LLM adapter to decide on next prompts, summaries, and human-in-the-loop (HITL) asks. Adapters live in src/server/services/ai/llm/; each one implements the LlmAdapter interface (types.ts). Mistral exposes a simple JSON API over HTTPS, similar to OpenAI chat completions.
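For orientation, here is a minimal sketch of what an adapter contract in this style tends to look like. This is an assumption for illustration only: the real LlmAdapter interface lives in types.ts and may expose different method and field names.

```typescript
// Hypothetical adapter contract -- the real interface is in
// src/server/services/ai/llm/types.ts and may differ.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

interface LlmAdapter {
  // Takes a conversation, returns the model's text plus token usage.
  complete(messages: ChatMessage[]): Promise<{
    text: string;
    usage: { inputTokens: number; outputTokens: number };
  }>;
}

// Trivial in-memory adapter showing the shape; real adapters call HTTPS APIs.
const echoAdapter: LlmAdapter = {
  async complete(messages) {
    const last = messages[messages.length - 1];
    return {
      text: `echo: ${last.content}`,
      usage: { inputTokens: last.content.length, outputTokens: last.content.length },
    };
  },
};
```

The point of the shared interface is that the watcher never cares which provider is behind it; a Mistral adapter only has to satisfy this contract.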
Where to start
- Read src/server/services/ai/llm/anthropic.ts as a reference (also openai-compatible.ts, which covers the OpenAI chat shape).
- Create src/server/services/ai/llm/mistral.ts exporting createMistralAdapter({ apiKey, baseUrl? }).
- Wire it in src/server/services/ai/llm/registry.ts: add "mistral" to ProviderKind in types.ts and return the new adapter from getAdapter().
- Add pricing in src/server/services/ai/llm/pricing.ts (mistral-small, mistral-large, and mistral-nemo input/output cents per 1M tokens).
- Add a PROVIDER_KINDS entry in src/server/routes/ai.ts and the UI dropdown entry in src/web/components/settings/AiSettingsPanel.tsx.
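A hedged sketch of what mistral.ts could look like. Only createMistralAdapter({ apiKey, baseUrl? }) is specified by this ticket; everything else here (the complete() method shape, the fetchFn test seam) is an assumption. Mistral's chat endpoint follows the OpenAI chat-completions shape (POST /v1/chat/completions, choices/usage in the response).

```typescript
// Sketch only -- align the return shape with the real LlmAdapter interface.
type Msg = { role: "system" | "user" | "assistant"; content: string };

export function createMistralAdapter(opts: {
  apiKey: string;
  baseUrl?: string;
  fetchFn?: typeof fetch; // injection point for tests; not part of the ticket's signature
}) {
  const base = opts.baseUrl ?? "https://api.mistral.ai";
  const doFetch = opts.fetchFn ?? fetch;
  return {
    async complete(model: string, messages: Msg[]) {
      const res = await doFetch(`${base}/v1/chat/completions`, {
        method: "POST",
        headers: {
          "content-type": "application/json",
          authorization: `Bearer ${opts.apiKey}`,
        },
        body: JSON.stringify({ model, messages }),
      });
      if (!res.ok) throw new Error(`mistral: HTTP ${res.status}`);
      const body = await res.json();
      return {
        text: body.choices[0].message.content as string,
        usage: {
          inputTokens: body.usage.prompt_tokens as number,
          outputTokens: body.usage.completion_tokens as number,
        },
      };
    },
  };
}
```

Keeping the HTTP call behind an injectable fetch makes the 429/5xx acceptance tests cheap to write without touching the network.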
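The registry change is mechanical. The sketch below is illustrative: the real ProviderKind union and getAdapter() live in types.ts and registry.ts, and the stand-in factories here would be the actual create*Adapter exports.

```typescript
// Illustrative shapes only; match the real types.ts / registry.ts.
type ProviderKind =
  | "anthropic"
  | "openai"
  | "openrouter"
  | "google"
  | "openai-compatible"
  | "mistral"; // <- new member

type AdapterFactory = (cfg: { apiKey: string; baseUrl?: string }) => { name: string };

// Stand-ins; in the repo these are the real adapter constructors.
const factories: Record<ProviderKind, AdapterFactory> = {
  anthropic: () => ({ name: "anthropic" }),
  openai: () => ({ name: "openai" }),
  openrouter: () => ({ name: "openrouter" }),
  google: () => ({ name: "google" }),
  "openai-compatible": () => ({ name: "openai-compatible" }),
  mistral: () => ({ name: "mistral" }), // would call createMistralAdapter(cfg)
};

function getAdapter(kind: ProviderKind, cfg: { apiKey: string; baseUrl?: string }) {
  const factory = factories[kind];
  if (!factory) throw new Error(`unknown provider kind: ${kind}`);
  return factory(cfg);
}
```

Using a Record keyed by ProviderKind means the compiler flags any kind added to the union but missing from the registry.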
Acceptance criteria
- A provider config with kind: "mistral" persists cleanly.
- The adapter test suite (src/server/services/ai/llm/llm.test.ts) is extended with a Mistral mock covering happy-path + 429 + 5xx.
- API key validation works (e.g. via GET /v1/models).
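One way to cover the happy-path/429/5xx criterion without the network is a scripted fetch stub plus a small retry helper. This is a sketch under assumptions: the real suite in llm.test.ts has its own mocking helpers, and the function names here are illustrative.

```typescript
// Scripted stub: answers successive calls with the given statuses,
// repeating the last one once the script runs out.
function scriptedFetch(statuses: number[]): typeof fetch {
  let i = 0;
  return (async () => {
    const status = statuses[Math.min(i++, statuses.length - 1)];
    if (status !== 200) return new Response("err", { status });
    return new Response(
      JSON.stringify({
        choices: [{ message: { content: "ok" } }],
        usage: { prompt_tokens: 1, completion_tokens: 1 },
      }),
      { status: 200 }
    );
  }) as typeof fetch;
}

// Minimal retry policy: retry on 429 and 5xx, up to `max` attempts,
// returning the last response if every attempt fails.
async function fetchWithRetry(f: typeof fetch, url: string, max = 3): Promise<Response> {
  let last: Response | undefined;
  for (let attempt = 0; attempt < max; attempt++) {
    last = await f(url);
    if (last.status !== 429 && last.status < 500) return last;
  }
  return last!;
}
```

Feeding the adapter scriptedFetch([429, 503, 200]) exercises all three cases in a single test.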