fix(opencode): implement line buffering for reliable JSON parsing #185
subsy merged 7 commits into subsy:main
Conversation
@put101 is attempting to deploy a commit to the plgeek Team on Vercel. A member of the Team first needs to authorize it.
Walkthrough

Adds a JSONL streaming buffer and public API, switching stdout handling from chunk-based to line-oriented parsing: partial data is buffered, and complete lines are parsed via `parseOpenCodeJsonLine`.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Stream as Stream (data chunks)
    participant Buffer as Line Buffer
    participant Parser as parseOpenCodeJsonLine
    participant JsonHandler as onJsonlMessage
    participant DisplaySeg as onStdoutSegments
    participant DisplayRaw as onStdout
    Stream->>Buffer: push(data chunk)
    Buffer->>Buffer: accumulate and split on '\n'
    alt complete line available
        Buffer->>Parser: parse(line)
        alt valid JSON object
            Parser->>JsonHandler: onJsonlMessage(message)
        end
        Parser->>DisplaySeg: onStdoutSegments(events)
        Parser->>DisplayRaw: onStdout(string)
    end
    Buffer->>Buffer: retain trailing partial line
    Buffer->>Buffer: onEnd -> flush remaining data
```
Actionable comments posted: 0

Caution: some comments are outside the diff and can't be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)

src/plugins/agents/builtin/opencode.ts (1)
390-439: Flush the trailing buffer on process end to avoid dropping the final JSONL record. If stdout ends without a newline, the last buffered line is never parsed, so the final message is lost. Add an `onEnd` wrapper (and reuse a shared line handler) to flush the remaining buffer before delegating to the original callback.

🔧 Suggested fix
```diff
 override execute(
   prompt: string,
   files?: AgentFileContext[],
   options?: AgentExecuteOptions
 ): AgentExecutionHandle {
   let buffer = '';
+  const handleLine = (line: string) => {
+    const trimmed = line.trim();
+    if (!trimmed) return;
+
+    // Parse raw JSONL lines and forward to onJsonlMessage for subagent tracing
+    if (options?.onJsonlMessage) {
+      if (trimmed.startsWith('{')) {
+        try {
+          const parsed = JSON.parse(trimmed);
+          options.onJsonlMessage(parsed);
+        } catch {
+          // Not valid JSON, skip
+        }
+      }
+    }
+
+    // Process for display events
+    const events = parseOpenCodeJsonLine(trimmed);
+    if (events.length > 0) {
+      if (options?.onStdoutSegments) {
+        const segments = processAgentEventsToSegments(events);
+        if (segments.length > 0) {
+          options.onStdoutSegments(segments);
+        }
+      }
+      if (options?.onStdout) {
+        const parsed = processAgentEvents(events);
+        if (parsed.length > 0) {
+          options.onStdout(parsed);
+        }
+      }
+    }
+  };
+
+  const flushBuffer = () => {
+    if (!buffer) return;
+    handleLine(buffer);
+    buffer = '';
+  };
+
   // Wrap callbacks to parse JSON events
   const parsedOptions: AgentExecuteOptions = {
     ...options,
     onStdout: (options?.onStdout || options?.onStdoutSegments || options?.onJsonlMessage)
       ? (data: string) => {
           buffer += data;
           // If no newline, wait for more data
           if (!buffer.includes('\n')) return;
           const lines = buffer.split('\n');
           buffer = lines.pop() ?? ''; // Keep the last partial line (or empty string) in buffer
           for (const line of lines) {
-            const trimmed = line.trim();
-            if (!trimmed) continue;
-
-            // Parse raw JSONL lines and forward to onJsonlMessage for subagent tracing
-            if (options?.onJsonlMessage) {
-              if (trimmed.startsWith('{')) {
-                try {
-                  const parsed = JSON.parse(trimmed);
-                  options.onJsonlMessage(parsed);
-                } catch {
-                  // Not valid JSON, skip
-                }
-              }
-            }
-
-            // Process for display events
-            const events = parseOpenCodeJsonLine(trimmed);
-            if (events.length > 0) {
-              // Call TUI-native segments callback if provided
-              if (options?.onStdoutSegments) {
-                const segments = processAgentEventsToSegments(events);
-                if (segments.length > 0) {
-                  options.onStdoutSegments(segments);
-                }
-              }
-              // Also call legacy string callback if provided
-              if (options?.onStdout) {
-                const parsed = processAgentEvents(events);
-                if (parsed.length > 0) {
-                  options.onStdout(parsed);
-                }
-              }
-            }
+            handleLine(line);
           }
         }
       : undefined,
+    onEnd: (result) => {
+      flushBuffer();
+      options?.onEnd?.(result);
+    },
   };
   return super.execute(prompt, files, parsedOptions);
 }
```
Pull request overview
This PR fixes an issue where partial JSON chunks from the opencode CLI caused JSON parse errors and data loss by implementing line buffering for reliable JSON parsing. The change ensures that stdout data is buffered and only complete lines (terminated by newlines) are parsed as JSON.
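The failure mode is straightforward to reproduce: a stdout chunk that ends mid-record is not valid JSON on its own, so parsing it directly throws. A minimal sketch (the chunk contents here are made up for illustration, not real opencode output):

```typescript
// A stdout chunk can end in the middle of a JSONL record. Feeding such a
// chunk straight to JSON.parse throws, which is the parse-error/data-loss
// bug this PR fixes. (Illustrative chunk; not real opencode output.)
const partialChunk = '{"type":"text","part":'; // first half of a record

let parseFailed = false;
try {
  JSON.parse(partialChunk);
} catch {
  parseFailed = true; // SyntaxError: the record is incomplete
}
console.log(parseFailed); // → true
```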
Changes:
- Added line buffering to accumulate stdout data until complete lines are available
- Refactored JSON parsing to process one line at a time instead of re-processing chunks
- Improved efficiency by calling `parseOpenCodeJsonLine` directly instead of `parseOpenCodeOutputToEvents`
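The buffering approach listed above can be sketched as follows. This is an illustrative stand-in, not the project's actual API: the factory name `createJsonlBuffer`, the `push`/`flush` methods, and the callback shape are all assumptions for the sketch.

```typescript
// Illustrative line buffer: partial chunks are held until a '\n' arrives,
// only complete lines reach onLine, and flush() drains a trailing line that
// never received a newline. (Names are assumptions, not the project's API.)
type JsonlBuffer = {
  push: (chunk: string) => void;
  flush: () => void;
};

function createJsonlBuffer(onLine: (line: string) => void): JsonlBuffer {
  let buffer = '';
  const handleLine = (line: string) => {
    const trimmed = line.trim();
    if (trimmed) onLine(trimmed);
  };
  return {
    push(chunk: string) {
      buffer += chunk;
      if (!buffer.includes('\n')) return; // wait for a complete line
      const lines = buffer.split('\n');
      buffer = lines.pop() ?? ''; // retain the trailing partial line
      for (const line of lines) handleLine(line);
    },
    flush() {
      if (buffer) handleLine(buffer);
      buffer = '';
    },
  };
}

// Usage: feed arbitrary chunk boundaries; only whole lines reach onLine.
const seen: string[] = [];
const buf = createJsonlBuffer((line) => seen.push(line));
buf.push('{"a":1}\n{"b"');  // emits {"a":1}, retains partial {"b"
buf.push(':2}\n{"c":3}');   // emits {"b":2}, retains {"c":3}
buf.flush();                // emits the trailing {"c":3}
```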
Add onEnd wrapper to process any remaining buffered content when the stdout stream closes. This ensures output that doesn't end with a newline is still processed correctly. Also extracts processLine helper to DRY up the line processing logic and removes unused parseOpenCodeOutputToEvents function.
Codecov Report

❌ Patch coverage is

Additional details and impacted files

```
@@            Coverage Diff             @@
##             main     #185      +/-   ##
==========================================
+ Coverage   44.09%   44.50%   +0.41%
==========================================
  Files          76       76
  Lines       21991    22027      +36
==========================================
+ Hits         9696     9804     +108
+ Misses      12295    12223      -72
```
Add comprehensive tests for createOpenCodeJsonlBuffer to achieve >50% coverage on the new buffer flushing code. Tests cover:
- Buffer flush on stream end without trailing newline
- Processing complete lines during streaming
- Handling multiple partial chunks
- JSONL message forwarding
- Display events callback
- Invalid JSON handling

Also refactors the buffering logic into an exported function for testability, following the pattern of createOpenCodeStreamingJsonlParser.

docs: add PR requirements to README and CONTRIBUTING

Document the PR requirements:
- >50% test coverage on new/changed lines (Codecov patch check)
- Documentation updates required for new/changed features
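The flush-on-end behavior those tests exercise can be sketched like this. The buffer below is a minimal local stand-in, since the real `createOpenCodeJsonlBuffer` signature is not shown in this excerpt; treat the names here as assumptions.

```typescript
// Minimal stand-in buffer to illustrate the flush-on-end test case:
// the final JSONL record arrives without a trailing '\n' and must still
// be delivered when the stream ends.
function makeBuffer(onLine: (line: string) => void) {
  let buf = '';
  return {
    push(chunk: string) {
      buf += chunk;
      const lines = buf.split('\n');
      buf = lines.pop() ?? ''; // keep the trailing partial line
      for (const line of lines) if (line.trim()) onLine(line.trim());
    },
    // Called when the stream ends: drain output lacking a trailing newline.
    end() {
      if (buf.trim()) onLine(buf.trim());
      buf = '';
    },
  };
}

const messages: unknown[] = [];
const buffer = makeBuffer((line) => {
  try {
    messages.push(JSON.parse(line));
  } catch {
    // non-JSON lines are skipped, mirroring the invalid-JSON test case
  }
});
buffer.push('{"id":1}\n{"id"');
buffer.push(':2}'); // stream ends here with no newline
buffer.end();
// messages now holds both records: [{id: 1}, {id: 2}]
```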
@put101 thanks for this PR! I made a quick fix for the buffer flushing issue, and added tests - merging 🤘
Add tests for:
- getSandboxRequirements (auth paths, binary paths, network)
- getPreflightSuggestion (suggestion text content)
- buildModelString (provider/model formatting)

Increases opencode.ts coverage from 63% to 65% line coverage.
Add integration tests that mock the base class execute to verify:
- onStdout wrapping with buffer and JSON parsing
- onEnd wrapping for buffer flush before callback
- Conditional wrapping based on output callback presence
- Preservation of other options during wrapping

Also add more edge case tests for createOpenCodeJsonlBuffer:
- Both onJsonlMessage and onDisplayEvents called for same line
- Non-JSON lines skipped for onJsonlMessage
- Whitespace-only buffer handled gracefully

Increases opencode.ts coverage from 65% to 76% line coverage.
fix(opencode): implement line buffering for reliable JSON parsing
Fixes an issue where partial JSON chunks from the opencode CLI caused JSON parse errors and data loss. This change buffers stdout and only parses complete lines.
Summary by CodeRabbit
- New Features
- Bug Fixes
- Tests
- Documentation