fix: thinking time realtime realtime #1095
Conversation
Walkthrough
The LLM event handler now tracks reasoning content streaming duration by introducing a …
Sequence Diagram(s)
```mermaid
sequenceDiagram
    participant Handler
    participant State
    participant Message
    Handler->>State: Receive reasoning_content start
    State->>Message: Update reasoningStartTime & reasoningEndTime
    State->>State: Set lastReasoningTime = currentTime
    loop During streaming
        Handler->>State: Re-fetch lastBlock (current state)
        alt Block exists
            State->>State: Append to reasoning_content
            State->>State: Update reasoning_time.end = currentTime
        else Create new block
            State->>State: Initialize reasoning_time {start, end}
            State->>State: Set reasoning_time.start from reasoningStartTime
        end
    end
```
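A minimal TypeScript sketch of the flow the diagram describes; the state shape, block type, and function name below are illustrative assumptions, not the handler's actual API:

```typescript
// Illustrative types; the real handler state and block shapes may differ.
interface ReasoningTime {
  start: number
  end: number
}

interface MessageBlock {
  type: 'reasoning_content' | 'content' | 'image'
  content: string
  timestamp: number
  reasoning_time?: ReasoningTime
}

interface StreamState {
  blocks: MessageBlock[]
  reasoningStartTime?: number
  lastReasoningTime?: number
}

// Called for every reasoning_content chunk during streaming.
function handleReasoningChunk(state: StreamState, chunk: string, currentTime: number): void {
  state.lastReasoningTime = currentTime

  // Re-fetch the last block on every chunk: content or image blocks may have been appended since.
  const lastBlock = state.blocks[state.blocks.length - 1]

  if (lastBlock && lastBlock.type === 'reasoning_content') {
    // Existing reasoning block: append text and push reasoning_time.end forward in real time.
    lastBlock.content += chunk
    const reasoningTime = lastBlock.reasoning_time ?? { start: lastBlock.timestamp, end: currentTime }
    reasoningTime.end = currentTime
    lastBlock.reasoning_time = reasoningTime
  } else {
    // New reasoning block: start from the recorded reasoningStartTime, falling back to now.
    state.blocks.push({
      type: 'reasoning_content',
      content: chunk,
      timestamp: currentTime,
      reasoning_time: { start: state.reasoningStartTime ?? currentTime, end: currentTime }
    })
  }
}
```

With this shape, the renderer can compute the live duration as `end - start` while tokens are still streaming.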
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
Pre-merge checks and finishing touches
❌ Failed checks (1 inconclusive)
✅ Passed checks (2 passed)
Actionable comments posted: 0
🧹 Nitpick comments (1)
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts (1)
104-107: Consider the performance impact of frequent metadata updates.
Updating message metadata on every reasoning_content chunk could result in many database writes per second during active reasoning. While this enables real-time display of reasoning duration, consider whether the update frequency justifies the I/O cost.
Potential optimizations:
- Debounce metadata updates (e.g., update at most every 100-200ms); see the sketch below
- Batch metadata writes with the message content update at line 218
- Move the final metadata update to finalizeMessage (lines 341-343 already handle this)
If real-time updates are essential for UX, document the performance trade-off.
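For the debounce option, a minimal sketch in TypeScript; the interval, update shape, and the persist callback are illustrative assumptions, not the handler's real metadata API:

```typescript
// Illustrative debounce helper: flush reasoning-time metadata at most once per interval.
interface ReasoningMetadata {
  reasoningStartTime: number
  reasoningEndTime: number
}

const METADATA_FLUSH_INTERVAL_MS = 200 // assumed value from the 100-200ms suggestion

function createMetadataDebouncer(persist: (update: ReasoningMetadata) => void) {
  let lastFlush = 0
  let pending: ReasoningMetadata | null = null

  // Force a write immediately, e.g. from finalizeMessage, so the last value is never lost.
  const flush = (): void => {
    if (!pending) return
    lastFlush = Date.now()
    persist(pending)
    pending = null
  }

  // Record the latest values; only write through when the interval has elapsed.
  const queue = (update: ReasoningMetadata): void => {
    pending = update
    if (Date.now() - lastFlush >= METADATA_FLUSH_INTERVAL_MS) {
      flush()
    }
  }

  return { queue, flush }
}
```

During streaming the handler would call queue(...) per chunk and flush() once at finalization, collapsing per-chunk writes into at most a few per second.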
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts (2 hunks)
🧰 Additional context used
📓 Path-based instructions (10)
**/*.{js,jsx,ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)
**/*.{js,jsx,ts,tsx}: Use OxLint for code linting
Write logs and comments in English
Files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
src/{main,renderer}/**/*.ts
📄 CodeRabbit inference engine (.cursor/rules/electron-best-practices.mdc)
src/{main,renderer}/**/*.ts: Use context isolation for improved security
Implement proper inter-process communication (IPC) patterns
Optimize application startup time with lazy loading
Implement proper error handling and logging for debugging
Files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
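A minimal sketch of the context-isolation and IPC pattern these rules point to; the channel name, returned data, and file paths are illustrative, not ones defined by this project:

```typescript
// Main process: create a window with context isolation and handle a typed IPC channel.
import { app, BrowserWindow, ipcMain } from 'electron'
import path from 'node:path'

app.whenReady().then(() => {
  ipcMain.handle('thread:get-reasoning-time', async (_event, messageId: string) => {
    // Illustrative handler; a real implementation would query the thread presenter.
    return { messageId, start: 0, end: 0 }
  })

  const win = new BrowserWindow({
    webPreferences: {
      contextIsolation: true, // keep renderer and preload worlds separated
      preload: path.join(__dirname, 'preload.js')
    }
  })
  win.loadFile('index.html')
})
```

The matching preload would use contextBridge.exposeInMainWorld to expose only this narrow call to the renderer instead of the raw ipcRenderer.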
src/main/**/*.ts
📄 CodeRabbit inference engine (.cursor/rules/electron-best-practices.mdc)
Use Electron's built-in APIs for file system and native dialogs
Files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
**/*.{ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/error-logging.mdc)
**/*.{ts,tsx}: Always wrap operations that may fail in try-catch
Provide meaningful error messages
Record detailed error logs
Degrade gracefully
Logs should include a timestamp, log level, error code, error description, stack trace (if applicable), and relevant context
Log levels should include ERROR, WARN, INFO, and DEBUG
Do not swallow errors
Provide user-friendly error messages
Implement an error retry mechanism
Avoid logging sensitive information
Use structured logging
Set appropriate log levels
Files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
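A minimal sketch of how the try-catch, retry, and structured-logging rules above could look in TypeScript; the log entry shape, error code, and logger are illustrative, not the project's actual logging API:

```typescript
// Illustrative structured-log entry matching the fields the guidelines call for.
interface LogEntry {
  timestamp: string
  level: 'ERROR' | 'WARN' | 'INFO' | 'DEBUG'
  code: string
  message: string
  stack?: string
  context?: Record<string, unknown>
}

const log = (entry: LogEntry): void => {
  // Structured output; a real logger could route this to a file or telemetry instead.
  console.error(JSON.stringify(entry))
}

// Retry a fallible async operation a few times before surfacing the error.
async function withRetry<T>(op: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown
  for (let i = 0; i < attempts; i++) {
    try {
      return await op()
    } catch (error) {
      lastError = error
      log({
        timestamp: new Date().toISOString(),
        level: i + 1 < attempts ? 'WARN' : 'ERROR',
        code: 'OP_RETRY', // illustrative error code
        message: `Attempt ${i + 1} of ${attempts} failed`,
        stack: error instanceof Error ? error.stack : undefined,
        context: { attempt: i + 1 } // keep sensitive data out of this object
      })
    }
  }
  throw lastError // do not swallow the error; let the caller show a user-friendly message
}
```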
src/main/**/*.{ts,js,tsx,jsx}
📄 CodeRabbit inference engine (.cursor/rules/project-structure.mdc)
Main-process code goes under src/main
Files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
**/*.{ts,tsx,js,vue}
📄 CodeRabbit inference engine (CLAUDE.md)
Use English for all logs and comments
Files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
**/*.{ts,tsx,vue}
📄 CodeRabbit inference engine (CLAUDE.md)
Enable and adhere to strict TypeScript typing (avoid implicit any, prefer precise types)
Use PascalCase for TypeScript types and classes
Files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
src/main/presenter/**/*.ts
📄 CodeRabbit inference engine (AGENTS.md)
Place Electron main-process presenters under src/main/presenter/ (Window, Tab, Thread, Mcp, Config, LLMProvider)
Files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
**/*.{ts,tsx,js,jsx,vue,css,scss,md,json,yml,yaml}
📄 CodeRabbit inference engine (AGENTS.md)
Prettier style: single quotes, no semicolons, print width 100; run pnpm run format
Files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
**/*.{ts,tsx,js,jsx,vue}
📄 CodeRabbit inference engine (AGENTS.md)
**/*.{ts,tsx,js,jsx,vue}: Use OxLint for JS/TS code; keep lint clean
Use camelCase for variables and functions
Use SCREAMING_SNAKE_CASE for constants
Files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
🧠 Learnings (10)
📓 Common learnings
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : Provider implementations should yield reasoning events in the standardized format when applicable.
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/provider-guidelines.mdc:0-0
Timestamp: 2025-09-04T11:03:30.184Z
Learning: Reasoning events are optional; if present, they should contain the complete chain
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : Provider implementations should yield reasoning events in the standardized format when applicable.
Applied to files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : All provider implementations must parse provider-specific data chunks and yield standardized events for text, reasoning, tool calls, usage, errors, stop reasons, and image data.
Applied to files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/index.ts : The main Agent loop should buffer text content, handle tool call events, format tool results for the next LLM call, and manage conversation continuation logic.
Applied to files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/index.ts : The main Agent loop should send standardized `STREAM_EVENTS` (`RESPONSE`, `END`, `ERROR`) to the frontend via `eventBus`.
Applied to files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/streamEvents.ts : Standardized stream events should conform to the `LLMCoreStreamEvent` interface, ideally defined in a shared file such as `src/main/presenter/llmProviderPresenter/streamEvents.ts`.
Applied to files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-10-14T08:02:59.495Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: AGENTS.md:0-0
Timestamp: 2025-10-14T08:02:59.495Z
Learning: Applies to src/main/presenter/LLMProvider/**/*.ts : Implement the two-layer LLM provider (Agent Loop + Provider) under src/main/presenter/LLMProvider
Applied to files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : Provider implementations should aggregate and yield usage events as part of the standardized stream.
Applied to files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : Provider implementations must use a `coreStream` method that yields standardized stream events to decouple the main loop from provider-specific details.
Applied to files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/index.ts : `src/main/presenter/llmProviderPresenter/index.ts` should manage the overall Agent loop, conversation history, tool execution via `McpPresenter`, and frontend communication via `eventBus`.
Applied to files:
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
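The learnings above repeatedly reference standardized stream events and the LLMCoreStreamEvent interface. As an illustration only, an assumed sketch of what such a discriminated union and a provider coreStream could look like (not the repository's actual definition):

```typescript
// Assumed event union covering the kinds the learnings mention:
// text, reasoning, tool calls, usage, errors, stop reasons, and image data.
type LLMCoreStreamEvent =
  | { type: 'text'; content: string }
  | { type: 'reasoning'; reasoning_content: string }
  | { type: 'tool_call'; id: string; name: string; arguments: string }
  | { type: 'usage'; prompt_tokens: number; completion_tokens: number }
  | { type: 'error'; message: string }
  | { type: 'stop'; reason: 'complete' | 'tool_use' | 'max_tokens' }
  | { type: 'image'; mimeType: string; data: string }

// A provider's coreStream would yield these regardless of its wire format,
// so the Agent loop never sees provider-specific chunks.
async function* exampleCoreStream(): AsyncGenerator<LLMCoreStreamEvent> {
  yield { type: 'reasoning', reasoning_content: 'Thinking about the question…' }
  yield { type: 'text', content: 'Here is the answer.' }
  yield { type: 'usage', prompt_tokens: 12, completion_tokens: 8 }
  yield { type: 'stop', reason: 'complete' }
}
```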
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: build-check (x64)
🔇 Additional comments (1)
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts (1)
187-216: Well-designed real-time reasoning duration tracking.
The implementation correctly handles multiple scenarios:
- Re-fetching the last block accounts for new image or content blocks added earlier
- reasoning_time initialization uses appropriate fallbacks (state.reasoningStartTime → currentTime for new blocks, currentLastBlock.timestamp for existing blocks)
- Real-time updates to reasoning_time.end during streaming provide live feedback
Summary by CodeRabbit
Release Notes