This document provides a high-level introduction to DeepChat, explaining its purpose, core capabilities, architectural foundations, and major subsystems. It serves as an entry point for understanding the codebase structure and design philosophy.
For detailed information about specific subsystems, see the dedicated subsystem pages.
DeepChat is a powerful open-source AI agent platform that unifies models, tools, and agent runtimes into a single desktop application (README.md 5-7). It provides a seamless interface for interacting with cloud APIs (OpenAI, Gemini, Anthropic, DeepSeek), local models (Ollama), and external agent workflows (README.md 50-54).
The platform is built on a multi-process Electron architecture, using a "Presenter" pattern to coordinate logic between the main process and the Vue-based renderer (CONTRIBUTING.zh.md 120-123). It emphasizes extensibility through standardized protocols such as MCP (Model Context Protocol) and ACP (Agent Client Protocol) (README.md 54-55).
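The Presenter idea can be sketched in plain TypeScript: the renderer addresses a presenter by name and method, and a central registry in the main process routes the call. This is a minimal illustration only; the registry class, presenter names, and the `call` signature here are hypothetical stand-ins for DeepChat's actual IPC bridge (which would go through `ipcMain.handle` in Electron).

```typescript
// Hypothetical sketch of Presenter dispatch; all names are illustrative.
type Presenter = Record<string, (...args: unknown[]) => unknown>

class PresenterRegistry {
  private presenters = new Map<string, Presenter>()

  register(name: string, presenter: Presenter): void {
    this.presenters.set(name, presenter)
  }

  // Stands in for an ipcMain.handle('presenter-call', ...) bridge:
  // the renderer sends (name, method, args) and gets the result back.
  call(name: string, method: string, ...args: unknown[]): unknown {
    const presenter = this.presenters.get(name)
    if (!presenter || typeof presenter[method] !== 'function') {
      throw new Error(`Unknown presenter call: ${name}.${method}`)
    }
    return presenter[method](...args)
  }
}

const registry = new PresenterRegistry()
registry.register('configPresenter', {
  getSetting: (key: unknown) => (key === 'theme' ? 'dark' : undefined)
})

console.log(registry.call('configPresenter', 'getSetting', 'theme')) // prints: dark
```

Routing every renderer request through one dispatch point keeps business logic in the main process and leaves the renderer with only a thin, serializable calling surface.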
For details on setting up the development environment and building the project, see Getting Started.
DeepChat integrates three primary capability domains:
The application supports a wide array of LLM providers, including DeepSeek, OpenAI, Kimi, Grok, Gemini, Anthropic, and Ollama (README.md 82-85). It manages provider metadata and model-specific capabilities through a unified AI SDK runtime layer introduced in v1.0.3 (CHANGELOG.md 4-5).
The Model Context Protocol (MCP) integration allows LLMs to use external tools for tasks like code execution, web searching, and file manipulation (README.md 97-99). DeepChat supports multiple transport layers (stdio, SSE, HTTP) and includes built-in in-memory servers for immediate utility (README.md 104-105).
DeepChat implements the Agent Client Protocol (ACP), enabling external agent runtimes to be treated as first-class "models" with dedicated workspace UIs for structured plans and terminal outputs (README.md 121-124). A modular Skills system further extends these agents through a managed installation and execution runtime (CHANGELOG.md 101-102).
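The three MCP transports can be modeled as a discriminated union, with one connection path per variant. This is a sketch of the general shape only; the field names and example server entries below are hypothetical and do not reflect DeepChat's actual configuration schema.

```typescript
// Hypothetical MCP server config covering the three transports named above.
type McpServerConfig =
  | { type: 'stdio'; command: string; args: string[] } // spawn a local process
  | { type: 'sse'; url: string }                       // server-sent events stream
  | { type: 'http'; url: string; headers?: Record<string, string> } // plain HTTP

function describeTransport(cfg: McpServerConfig): string {
  switch (cfg.type) {
    case 'stdio':
      return `spawn ${cfg.command} ${cfg.args.join(' ')}`
    case 'sse':
      return `SSE stream from ${cfg.url}`
    case 'http':
      return `HTTP endpoint ${cfg.url}`
  }
}

// Illustrative entries only.
const servers: McpServerConfig[] = [
  { type: 'stdio', command: 'npx', args: ['-y', 'some-mcp-server'] },
  { type: 'sse', url: 'http://localhost:3000/sse' }
]
console.log(servers.map(describeTransport))
```

A discriminated union like this lets the client exhaustively handle every transport at compile time, so adding a fourth transport forces every call site to be updated.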
Sources: README.md 50-131, CONTRIBUTING.zh.md 120-158, CHANGELOG.md 3-40
DeepChat follows Electron's security model with strict process isolation. The Presenter layer in the main process acts as the central orchestrator for all business logic and IPC management (CONTRIBUTING.zh.md 122-123).
The following diagram associates high-level system concepts with the specific classes and files that implement them.
Sources: CONTRIBUTING.zh.md 108-149, package.json 65-116, src/renderer/src/main.ts 60-75, CLAUDE.md 3-10
Communication between main-process modules and the renderer is largely event-driven, using a centralized eventBus (CLAUDE.md 4). Events are categorized to ensure type safety and organized handling across configuration, session, and protocol layers.
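A categorized, type-safe event bus of the kind described above can be sketched as a thin typed wrapper over Node's `EventEmitter`. The event names and payload shapes below are hypothetical examples of the configuration/session grouping, not DeepChat's actual event catalog.

```typescript
import { EventEmitter } from 'node:events'

// Hypothetical event map, grouped by domain prefix (config:, session:).
interface AppEvents {
  'config:changed': { key: string; value: unknown }
  'session:created': { sessionId: string }
}

// Typed facade: handlers and payloads are checked against AppEvents.
class TypedEventBus {
  private emitter = new EventEmitter()

  on<K extends keyof AppEvents>(event: K, handler: (payload: AppEvents[K]) => void): void {
    this.emitter.on(event, handler)
  }

  emit<K extends keyof AppEvents>(event: K, payload: AppEvents[K]): void {
    this.emitter.emit(event, payload)
  }
}

const bus = new TypedEventBus()
bus.on('config:changed', ({ key, value }) => console.log(`config ${key} -> ${value}`))
bus.emit('config:changed', { key: 'theme', value: 'dark' }) // prints: config theme -> dark
```

Keying the event map on string names with a domain prefix gives both the categorization and the compile-time payload checking the section describes.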
Sources: CLAUDE.md 4, CONTRIBUTING.zh.md 123-125, AGENTS.md 4
| Subsystem | Primary Code Entity | Role |
|---|---|---|
| Configuration | ConfigPresenter | Manages persistence for app settings, providers, and models using electron-store (CONTRIBUTING.zh.md 108) |
| LLM Logic | LLMProviderPresenter | Standardizes requests across different LLM APIs using the AI SDK runtime (CHANGELOG.md 4) |
| Agent Execution | AgentRuntimePresenter | Core engine for agent logic, context building, and tool output guarding (CONTRIBUTING.zh.md 153) |
| MCP Management | McpPresenter | Handles MCP client lifecycles, tool discovery, and transport management (CONTRIBUTING.zh.md 154) |
| Storage | SQLitePresenter | Handles persistent message history and thread management using SQLite (CONTRIBUTING.zh.md 124) |
| Window/Tabs | WindowPresenter | Controls Electron BrowserWindow lifecycle and multi-window architecture (CONTRIBUTING.zh.md 123) |
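How these presenters might compose can be sketched as a single agent step: the agent runtime asks the LLM presenter for a completion and, if the model requests a tool, routes the call through the MCP presenter before the result re-enters the context. The interfaces and stub implementations below are purely illustrative; they are not DeepChat's actual APIs.

```typescript
// Hypothetical, simplified interfaces standing in for LLMProviderPresenter
// and McpPresenter from the table above.
interface LlmProvider {
  complete(prompt: string): { text: string; toolCall?: { name: string; args: string } }
}
interface McpTools {
  invoke(name: string, args: string): string
}

// One step of an agent loop: complete, then dispatch any requested tool call.
function runAgentStep(llm: LlmProvider, tools: McpTools, prompt: string): string {
  const reply = llm.complete(prompt)
  if (reply.toolCall) {
    // In a real runtime this is where tool output guarding would apply
    // before the result is fed back into the model's context.
    return tools.invoke(reply.toolCall.name, reply.toolCall.args)
  }
  return reply.text
}

// Stub implementations for demonstration only.
const llm: LlmProvider = {
  complete: (p) =>
    p.includes('search')
      ? { text: '', toolCall: { name: 'web_search', args: 'deepchat' } }
      : { text: 'done' }
}
const tools: McpTools = { invoke: (name, args) => `[${name}] results for ${args}` }

console.log(runAgentStep(llm, tools, 'search the web'))
```

Separating the "which model" question (LlmProvider) from the "which tool" question (McpTools) mirrors the division of responsibility between the presenters in the table.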
DeepChat leverages modern web and desktop technologies:
- Storage: better-sqlite3-multiple-ciphers (message history), electron-store (config), @duckdb/node-api (vector index) (package.json 76-92)
- Tooling: electron-vite (build), vitest (testing), oxlint (linting) (package.json 24-33)

Sources: package.json 1-117, electron.vite.config.ts 1-118, src/renderer/src/main.ts 1-75