
Conversation


@zerob13 zerob13 (Collaborator) commented Nov 13, 2025

Summary by CodeRabbit

Release Notes

  • Improvements
    • Enhanced accuracy of AI reasoning timing information during streaming operations by implementing better state tracking and block management.


coderabbitai bot commented Nov 13, 2025

Walkthrough

The LLM event handler now tracks reasoning content streaming duration by introducing a reasoning_time field with start and end timestamps on reasoning blocks. The handler updates message metadata when reasoning begins and re-fetches the last block during streaming to maintain accurate state for content appending or block creation.

Changes

  • Cohort: Reasoning timing instrumentation
  • File(s): src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
  • Summary: Introduces a reasoning_time object on reasoning_content blocks to track streaming duration. Updates message metadata with reasoningEndTime when reasoning starts. Re-fetches the last block before handling reasoning_content to ensure correct state. Appends to existing blocks or creates new ones with initialized start/end times.

Sequence Diagram(s)

sequenceDiagram
    participant Handler
    participant State
    participant Message

    Handler->>State: Receive reasoning_content start
    State->>Message: Update reasoningStartTime & reasoningEndTime
    State->>State: Set lastReasoningTime = currentTime
    
    loop During streaming
        Handler->>State: Re-fetch lastBlock (current state)
        alt Block exists
            State->>State: Append to reasoning_content
            State->>State: Update reasoning_time.end = currentTime
        else Create new block
            State->>State: Initialize reasoning_time {start, end}
            State->>State: Set reasoning_time.start from reasoningStartTime
        end
    end

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

  • Focus areas:
    • Correctness of timing calculations and state synchronization between metadata and block-level reasoning_time objects
    • Ensure re-fetching lastBlock doesn't introduce race conditions or stale references
    • Validation of fallback logic when reasoningStartTime is unavailable
    • Edge cases in append vs. create logic for reasoning_content blocks

Poem

🐰 A rabbit hops through reasoning streams,
Tracking time in coding dreams,
With start and end so carefully placed,
Each reasoning block finds its space,
Timing flows with graceful ease—
The handler's gift to set minds at ease! ✨

Pre-merge checks and finishing touches

❌ Failed checks (1 inconclusive)
  • Title check (❓ Inconclusive): The title contains the repetitive, vague term 'realtime realtime', which appears to be a mistake, making it unclear what the actual fix addresses despite mentioning 'thinking time'. Resolution: Revise the title to be clear and specific, such as 'fix: track reasoning time in real-time during LLM streaming', removing the duplication and clarifying the exact fix.
✅ Passed checks (2 passed)
  • Description Check (✅ Passed): Check skipped; CodeRabbit's high-level summary is enabled.
  • Docstring Coverage (✅ Passed): No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts (1)

104-107: Consider the performance impact of frequent metadata updates.

Updating message metadata on every reasoning_content chunk could result in many database writes per second during active reasoning. While this enables real-time display of reasoning duration, consider whether the update frequency justifies the I/O cost.

Potential optimizations:

  • Debounce metadata updates (e.g., update at most every 100-200ms; see the sketch below)
  • Batch metadata writes with the message content update at line 218
  • Move the final metadata update to finalizeMessage (lines 341-343 already handle this)

If real-time updates are essential for UX, document the performance trade-off.
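
One way to realize the debounce suggestion above, sketched with a hypothetical write callback; the actual metadata update call in the handler is not shown and these names are assumptions.

```ts
// Illustrative throttle/debounce wrapper for metadata writes; names and shapes are assumed
type ReasoningMeta = { reasoningEndTime: number }

function throttleMetadataWrites(
  write: (meta: ReasoningMeta) => Promise<void>,
  intervalMs = 150
): (meta: ReasoningMeta) => void {
  let lastWrite = 0
  let pending: ReasoningMeta | null = null
  let timer: ReturnType<typeof setTimeout> | null = null

  const flush = () => {
    if (pending) {
      // Do not swallow errors: log failures instead of awaiting in the hot path
      write(pending).catch((err) => console.error('metadata write failed:', err))
      pending = null
      lastWrite = Date.now()
    }
    timer = null
  }

  return (meta) => {
    pending = meta
    const elapsed = Date.now() - lastWrite
    if (elapsed >= intervalMs) {
      // Leading edge: write immediately if enough time has passed since the last write
      flush()
    } else if (timer === null) {
      // Trailing edge: make sure the latest value still lands shortly after the last chunk
      timer = setTimeout(flush, intervalMs - elapsed)
    }
  }
}
```

This collapses per-chunk updates into at most one write per interval, while the trailing flush keeps the final reasoningEndTime accurate; batching with the content write at line 218 or deferring the final update to finalizeMessage are alternatives with the same goal.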

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between a9fb909 and ec459d8.

📒 Files selected for processing (1)
  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts (2 hunks)
🧰 Additional context used
📓 Path-based instructions (10)
**/*.{js,jsx,ts,tsx}

📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)

**/*.{js,jsx,ts,tsx}: Use OxLint for code linting
Write logs and comments in English

Files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
src/{main,renderer}/**/*.ts

📄 CodeRabbit inference engine (.cursor/rules/electron-best-practices.mdc)

src/{main,renderer}/**/*.ts: Use context isolation for improved security
Implement proper inter-process communication (IPC) patterns
Optimize application startup time with lazy loading
Implement proper error handling and logging for debugging

Files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
src/main/**/*.ts

📄 CodeRabbit inference engine (.cursor/rules/electron-best-practices.mdc)

Use Electron's built-in APIs for file system and native dialogs

Files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
**/*.{ts,tsx}

📄 CodeRabbit inference engine (.cursor/rules/error-logging.mdc)

**/*.{ts,tsx}: Always use try-catch to handle possible errors
Provide meaningful error messages
Record detailed error logs
Degrade gracefully
Logs should include timestamp, log level, error code, error description, stack trace (where applicable), and relevant context
Log levels should include ERROR, WARN, INFO, and DEBUG
Do not swallow errors
Provide user-friendly error messages
Implement error retry mechanisms
Avoid logging sensitive information
Use structured logging
Set appropriate log levels

Files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
src/main/**/*.{ts,js,tsx,jsx}

📄 CodeRabbit inference engine (.cursor/rules/project-structure.mdc)

Main-process code goes in src/main

Files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
**/*.{ts,tsx,js,vue}

📄 CodeRabbit inference engine (CLAUDE.md)

Use English for all logs and comments

Files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
**/*.{ts,tsx,vue}

📄 CodeRabbit inference engine (CLAUDE.md)

Enable and adhere to strict TypeScript typing (avoid implicit any, prefer precise types)

Use PascalCase for TypeScript types and classes

Files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
src/main/presenter/**/*.ts

📄 CodeRabbit inference engine (AGENTS.md)

Place Electron main-process presenters under src/main/presenter/ (Window, Tab, Thread, Mcp, Config, LLMProvider)

Files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
**/*.{ts,tsx,js,jsx,vue,css,scss,md,json,yml,yaml}

📄 CodeRabbit inference engine (AGENTS.md)

Prettier style: single quotes, no semicolons, print width 100; run pnpm run format

Files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
**/*.{ts,tsx,js,jsx,vue}

📄 CodeRabbit inference engine (AGENTS.md)

**/*.{ts,tsx,js,jsx,vue}: Use OxLint for JS/TS code; keep lint clean
Use camelCase for variables and functions
Use SCREAMING_SNAKE_CASE for constants

Files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
🧠 Learnings (10)
📓 Common learnings
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : Provider implementations should yield reasoning events in the standardized format when applicable.
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/provider-guidelines.mdc:0-0
Timestamp: 2025-09-04T11:03:30.184Z
Learning: Reasoning events are optional; if present, they should contain the complete chain
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : Provider implementations should yield reasoning events in the standardized format when applicable.

Applied to files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : All provider implementations must parse provider-specific data chunks and yield standardized events for text, reasoning, tool calls, usage, errors, stop reasons, and image data.

Applied to files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/index.ts : The main Agent loop should buffer text content, handle tool call events, format tool results for the next LLM call, and manage conversation continuation logic.

Applied to files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/index.ts : The main Agent loop should send standardized `STREAM_EVENTS` (`RESPONSE`, `END`, `ERROR`) to the frontend via `eventBus`.

Applied to files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/streamEvents.ts : Standardized stream events should conform to the `LLMCoreStreamEvent` interface, ideally defined in a shared file such as `src/main/presenter/llmProviderPresenter/streamEvents.ts`.

Applied to files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-10-14T08:02:59.495Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: AGENTS.md:0-0
Timestamp: 2025-10-14T08:02:59.495Z
Learning: Applies to src/main/presenter/LLMProvider/**/*.ts : Implement the two-layer LLM provider (Agent Loop + Provider) under src/main/presenter/LLMProvider

Applied to files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : Provider implementations should aggregate and yield usage events as part of the standardized stream.

Applied to files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : Provider implementations must use a `coreStream` method that yields standardized stream events to decouple the main loop from provider-specific details.

Applied to files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/index.ts : `src/main/presenter/llmProviderPresenter/index.ts` should manage the overall Agent loop, conversation history, tool execution via `McpPresenter`, and frontend communication via `eventBus`.

Applied to files:

  • src/main/presenter/threadPresenter/handlers/llmEventHandler.ts
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build-check (x64)
🔇 Additional comments (1)
src/main/presenter/threadPresenter/handlers/llmEventHandler.ts (1)

187-216: Well-designed real-time reasoning duration tracking.

The implementation correctly handles multiple scenarios:

  • Re-fetching the last block accounts for new image or content blocks added earlier
  • reasoning_time initialization uses appropriate fallbacks (state.reasoningStartTime, falling back to currentTime, for new blocks; currentLastBlock.timestamp for existing blocks)
  • Real-time updates to reasoning_time.end during streaming provide live feedback
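
A condensed, self-contained sketch of just that fallback chain; all identifiers are assumed from the review text and have not been checked against the file.

```ts
// Illustrative only: how the start/end fallbacks described above could fit together
type ReasoningTime = { start: number; end: number }
type Block = { timestamp: number; reasoning_time?: ReasoningTime }

function resolveReasoningTime(
  existingBlock: Block | undefined,
  reasoningStartTime: number | undefined,
  currentTime: number
): ReasoningTime {
  if (!existingBlock) {
    // New block: prefer the recorded reasoning start, else fall back to the current time
    return { start: reasoningStartTime ?? currentTime, end: currentTime }
  }
  // Existing block: reuse its reasoning_time, else fall back to the block's own timestamp
  const time =
    existingBlock.reasoning_time ?? { start: existingBlock.timestamp, end: currentTime }
  time.end = currentTime
  existingBlock.reasoning_time = time
  return time
}
```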

@zerob13 zerob13 merged commit ce66ab3 into dev Nov 13, 2025
2 checks passed
@zerob13 zerob13 deleted the bugfix/thought-time branch November 23, 2025 13:52