fix(opencode): implement line buffering for reliable JSON parsing #185

Merged
subsy merged 7 commits into subsy:main from put101:fix/opencode-buffering on Jan 22, 2026

Conversation


put101 (Contributor) commented on Jan 21, 2026

Fixes an issue where partial JSON chunks from the opencode CLI caused JSON parse errors and data loss. This change buffers stdout and only parses complete lines.
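
For context, a stdout "data" event can split a JSONL record at an arbitrary point, which is why chunk-based parsing threw and dropped records. A minimal TypeScript illustration of the problem and the buffered fix (placeholder data and names, not the actual code in src/plugins/agents/builtin/opencode.ts):

```ts
// Stdout chunks can split a JSONL record anywhere, so parsing each chunk
// directly throws a SyntaxError and the record is lost.
// Illustrative only; the data and names are placeholders.
const chunks = ['{"type":"text","par', 't":"hello"}\n'];

// Buffered approach: accumulate chunks and only parse complete lines.
let buffer = '';
for (const chunk of chunks) {
  buffer += chunk;
  const lines = buffer.split('\n');
  buffer = lines.pop() ?? ''; // the trailing partial line stays buffered
  for (const line of lines) {
    if (line.trim()) console.log(JSON.parse(line)); // { type: 'text', part: 'hello' }
  }
}
```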

Summary by CodeRabbit

  • New Features

    • Added a JSONL streaming buffer to accumulate partial stream data and emit complete lines.
    • Exposed a minimal public API for the new buffer and callbacks.
  • Bug Fixes

    • Improved streaming output handling with per-line processing and flushing on stream end.
    • Malformed or non-JSON lines are skipped gracefully; display events remain consistent.
  • Tests

    • Added tests covering buffering, line handling, flushing and display-event forwarding.
  • Documentation

    • Updated PR and README guidance to require documentation, CI and minimum test coverage.


Copilot AI review requested due to automatic review settings January 21, 2026 21:20

vercel bot commented Jan 21, 2026

@put101 is attempting to deploy a commit to the plgeek Team on Vercel.

A member of the Team first needs to authorize it.


coderabbitai bot commented Jan 21, 2026

Warning

Rate limit exceeded

@subsy has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 14 minutes and 42 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

Note

Other AI code review bot(s) detected

CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

Adds a JSONL streaming buffer and a minimal public API, switching stdout handling from chunk-based to line-oriented parsing: partial data is buffered, complete lines are parsed via parseOpenCodeJsonLine, each parsed JSON line is forwarded to onJsonlMessage, display events are forwarded per line, and the buffered remainder is flushed on stream end. Tests added.
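
As a rough illustration of the buffer described in this walkthrough, a push/flush factory could be wired up as below. The callback names and the push/flush shape are assumptions drawn from this summary; the actual createOpenCodeJsonlBuffer export may differ.

```ts
// Sketch only: createLineBuffer stands in for the PR's createOpenCodeJsonlBuffer,
// whose exact signature is not shown here.
interface LineBufferCallbacks {
  onJsonlMessage?: (message: unknown) => void; // parsed JSON object per complete line
  onLine?: (rawLine: string) => void;          // per-line display handling
}

function createLineBuffer(callbacks: LineBufferCallbacks) {
  let pending = '';

  const handleLine = (line: string) => {
    const trimmed = line.trim();
    if (!trimmed) return;
    if (trimmed.startsWith('{')) {
      try {
        callbacks.onJsonlMessage?.(JSON.parse(trimmed));
      } catch {
        // malformed JSON lines are skipped, not re-thrown
      }
    }
    callbacks.onLine?.(trimmed);
  };

  return {
    // Accumulate a stdout chunk and process any complete lines it contains.
    push(chunk: string): void {
      pending += chunk;
      const lines = pending.split('\n');
      pending = lines.pop() ?? ''; // trailing partial line waits for more data
      for (const line of lines) handleLine(line);
    },
    // Process whatever remains when the stream ends without a trailing newline.
    flush(): void {
      if (pending) handleLine(pending);
      pending = '';
    },
  };
}

// Example: a record split across two chunks is parsed only once it is complete.
const buf = createLineBuffer({ onJsonlMessage: (m) => console.log(m) });
buf.push('{"type":"text","par');
buf.push('t":"hello"}\n');
buf.flush(); // no-op here, but needed when output lacks a final newline
```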

Changes

Cohort / File(s) Change Summary
Buffered JSONL Processing
src/plugins/agents/builtin/opencode.ts
Introduces JsonlStreamingBuffer, createOpenCodeJsonlBuffer and wiring in OpenCodeAgentPlugin.execute: stdout segments pushed into the buffer, complete lines parsed, per-line JSON forwarded to onJsonlMessage, display events emitted per line, and buffer flushed on end.
Public API Exports
src/plugins/agents/builtin/opencode.ts
Exports JsonlBufferCallbacks, JsonlStreamingBuffer, and createOpenCodeJsonlBuffer types/functions.
Tests
tests/plugins/opencode-agent.test.ts
New tests for the JSONL buffer: partial-chunk handling, multi-line chunks, invalid/empty JSON, flush-on-end behaviour, and forwarding of display events; the tests rely on the new exported buffer API (a sketch of this style of test follows this table).
Docs / PR Guidance
CONTRIBUTING.md, README.md
Expanded Pull Request requirements: documentation updates required for new/changed functionality, CI must run tests (not just lint/typecheck), and Codecov patch check enforcing >50% test coverage on new/changed lines; PR requirements block duplicated in README.
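
The tests described in this table might look roughly like the sketch below, assuming a Vitest-style runner and a push/flush shape for createOpenCodeJsonlBuffer; the import path, API shape, and sample payloads are assumptions rather than excerpts from the diff.

```ts
import { describe, expect, it, vi } from 'vitest';
// Hypothetical import path; the real test file lives in tests/plugins/opencode-agent.test.ts.
import { createOpenCodeJsonlBuffer } from '../../src/plugins/agents/builtin/opencode';

describe('createOpenCodeJsonlBuffer', () => {
  it('parses a record split across chunks and flushes the trailing line on end', () => {
    const onJsonlMessage = vi.fn();
    const buffer = createOpenCodeJsonlBuffer({ onJsonlMessage });

    buffer.push('{"type":"text","va');
    buffer.push('lue":"hi"}\n{"type":"done"}'); // second record has no trailing newline

    expect(onJsonlMessage).toHaveBeenCalledTimes(1);
    expect(onJsonlMessage).toHaveBeenCalledWith({ type: 'text', value: 'hi' });

    buffer.flush(); // simulate stream end
    expect(onJsonlMessage).toHaveBeenCalledWith({ type: 'done' });
  });

  it('skips non-JSON lines without throwing', () => {
    const onJsonlMessage = vi.fn();
    const buffer = createOpenCodeJsonlBuffer({ onJsonlMessage });

    buffer.push('not json at all\n{"ok":true}\n');

    expect(onJsonlMessage).toHaveBeenCalledTimes(1);
    expect(onJsonlMessage).toHaveBeenCalledWith({ ok: true });
  });
});
```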

Sequence Diagram(s)

sequenceDiagram
  participant Stream as Stream (data chunks)
  participant Buffer as Line Buffer
  participant Parser as parseOpenCodeJsonLine
  participant JsonHandler as onJsonlMessage
  participant DisplaySeg as onStdoutSegments
  participant DisplayRaw as onStdout

  Stream->>Buffer: push(data chunk)
  Buffer->>Buffer: accumulate and split on '\n'
  alt complete line available
    Buffer->>Parser: parse(line)
    alt valid JSON object
      Parser->>JsonHandler: onJsonlMessage(message)
    end
    Parser->>DisplaySeg: onStdoutSegments(events)
    Parser->>DisplayRaw: onStdout(string)
  end
  Buffer->>Buffer: retain trailing partial line
  Buffer->>Buffer: onEnd -> flush remaining data

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~22 minutes

Poem

🐰 I nibble bytes and stitch each line,

Crumbs of JSON now neatly align,
Lines hop out, no fragments stray,
Events prance in order each day,
Buffer snug — the stream’s OK.

🚥 Pre-merge checks | ✅ 3 passed
  • Description Check: ✅ Passed. Check skipped because CodeRabbit’s high-level summary is enabled.
  • Title Check: ✅ Passed. The title accurately describes the main change: implementing line buffering for reliable JSON parsing in the opencode plugin, which is the primary focus of the changeset.
  • Docstring Coverage: ✅ Passed. Docstring coverage is 100.00%, which is sufficient; the required threshold is 80.00%.



coderabbitai bot left a comment


Actionable comments posted: 0

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/plugins/agents/builtin/opencode.ts (1)

390-439: Flush the trailing buffer on process end to avoid dropping the final JSONL record.

If stdout ends without a newline, the last buffered line is never parsed, so the final message is lost. Add an onEnd wrapper (and reuse a shared line handler) to flush the remaining buffer before delegating to the original callback.

🔧 Suggested fix
   override execute(
     prompt: string,
     files?: AgentFileContext[],
     options?: AgentExecuteOptions
   ): AgentExecutionHandle {
     let buffer = '';
 
+    const handleLine = (line: string) => {
+      const trimmed = line.trim();
+      if (!trimmed) return;
+
+      // Parse raw JSONL lines and forward to onJsonlMessage for subagent tracing
+      if (options?.onJsonlMessage) {
+        if (trimmed.startsWith('{')) {
+          try {
+            const parsed = JSON.parse(trimmed);
+            options.onJsonlMessage(parsed);
+          } catch {
+            // Not valid JSON, skip
+          }
+        }
+      }
+
+      // Process for display events
+      const events = parseOpenCodeJsonLine(trimmed);
+      if (events.length > 0) {
+        if (options?.onStdoutSegments) {
+          const segments = processAgentEventsToSegments(events);
+          if (segments.length > 0) {
+            options.onStdoutSegments(segments);
+          }
+        }
+        if (options?.onStdout) {
+          const parsed = processAgentEvents(events);
+          if (parsed.length > 0) {
+            options.onStdout(parsed);
+          }
+        }
+      }
+    };
+
+    const flushBuffer = () => {
+      if (!buffer) return;
+      handleLine(buffer);
+      buffer = '';
+    };
+
     // Wrap callbacks to parse JSON events
     const parsedOptions: AgentExecuteOptions = {
       ...options,
       onStdout: (options?.onStdout || options?.onStdoutSegments || options?.onJsonlMessage)
         ? (data: string) => {
             buffer += data;
 
             // If no newline, wait for more data
             if (!buffer.includes('\n')) return;
 
             const lines = buffer.split('\n');
             buffer = lines.pop() ?? ''; // Keep the last partial line (or empty string) in buffer
 
             for (const line of lines) {
-              const trimmed = line.trim();
-              if (!trimmed) continue;
-
-              // Parse raw JSONL lines and forward to onJsonlMessage for subagent tracing
-              if (options?.onJsonlMessage) {
-                if (trimmed.startsWith('{')) {
-                  try {
-                    const parsed = JSON.parse(trimmed);
-                    options.onJsonlMessage(parsed);
-                  } catch {
-                    // Not valid JSON, skip
-                  }
-                }
-              }
-
-              // Process for display events
-              const events = parseOpenCodeJsonLine(trimmed);
-              if (events.length > 0) {
-                // Call TUI-native segments callback if provided
-                if (options?.onStdoutSegments) {
-                  const segments = processAgentEventsToSegments(events);
-                  if (segments.length > 0) {
-                    options.onStdoutSegments(segments);
-                  }
-                }
-                // Also call legacy string callback if provided
-                if (options?.onStdout) {
-                  const parsed = processAgentEvents(events);
-                  if (parsed.length > 0) {
-                    options.onStdout(parsed);
-                  }
-                }
-              }
+              handleLine(line);
             }
           }
         : undefined,
+      onEnd: (result) => {
+        flushBuffer();
+        options?.onEnd?.(result);
+      },
     };
 
     return super.execute(prompt, files, parsedOptions);
   }


Copilot AI left a comment


Pull request overview

This PR fixes an issue where partial JSON chunks from the opencode CLI caused JSON parse errors and data loss by implementing line buffering for reliable JSON parsing. The change ensures that stdout data is buffered and only complete lines (terminated by newlines) are parsed as JSON.

Changes:

  • Added line buffering to accumulate stdout data until complete lines are available
  • Refactored JSON parsing to process one line at a time instead of re-processing chunks
  • Improved efficiency by calling parseOpenCodeJsonLine directly instead of parseOpenCodeOutputToEvents


subsy and others added 3 commits January 22, 2026 08:53
Add onEnd wrapper to process any remaining buffered content when the
stdout stream closes. This ensures output that doesn't end with a
newline is still processed correctly.

Also extracts processLine helper to DRY up the line processing logic
and removes unused parseOpenCodeOutputToEvents function.

vercel bot commented Jan 22, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

1 Skipped Deployment
  • Project: ralph-tui | Deployment: Ignored | Review: Preview | Updated (UTC): Jan 22, 2026 11:00am



codecov bot commented Jan 22, 2026

Codecov Report

❌ Patch coverage is 66.66667% with 26 lines in your changes missing coverage. Please review.
✅ Project coverage is 44.50%. Comparing base (1294e0c) to head (b6d1358).
⚠️ Report is 8 commits behind head on main.

Files with missing lines | Patch % | Lines
src/plugins/agents/builtin/opencode.ts | 66.66% | 26 Missing ⚠️
Additional details and impacted files

Impacted file tree graph

@@            Coverage Diff             @@
##             main     #185      +/-   ##
==========================================
+ Coverage   44.09%   44.50%   +0.41%     
==========================================
  Files          76       76              
  Lines       21991    22027      +36     
==========================================
+ Hits         9696     9804     +108     
+ Misses      12295    12223      -72     
Files with missing lines | Coverage Δ
src/plugins/agents/builtin/opencode.ts | 65.84% <66.66%> (+20.45%) ⬆️

Add comprehensive tests for createOpenCodeJsonlBuffer to achieve >50%
coverage on the new buffer flushing code. Tests cover:
- Buffer flush on stream end without trailing newline
- Processing complete lines during streaming
- Handling multiple partial chunks
- JSONL message forwarding
- Display events callback
- Invalid JSON handling

Also refactors the buffering logic into an exported function for
testability, following the pattern of createOpenCodeStreamingJsonlParser.

docs: add PR requirements to README and CONTRIBUTING

Document the PR requirements:
- >50% test coverage on new/changed lines (Codecov patch check)
- Documentation updates required for new/changed features

subsy (Owner) commented on Jan 22, 2026

@put101 thanks for this PR! I made a quick fix for the buffer flushing issue, and added tests - merging 🤘

AI Agent added 2 commits January 22, 2026 10:31
Add tests for:
- getSandboxRequirements (auth paths, binary paths, network)
- getPreflightSuggestion (suggestion text content)
- buildModelString (provider/model formatting)

Increases opencode.ts coverage from 63% to 65% line coverage.
Add integration tests that mock the base class execute to verify:
- onStdout wrapping with buffer and JSON parsing
- onEnd wrapping for buffer flush before callback
- Conditional wrapping based on output callback presence
- Preservation of other options during wrapping

Also add more edge case tests for createOpenCodeJsonlBuffer:
- Both onJsonlMessage and onDisplayEvents called for same line
- Non-JSON lines skipped for onJsonlMessage
- Whitespace-only buffer handled gracefully

Increases opencode.ts coverage from 65% to 76% line coverage.
subsy merged commit b0ae4a9 into subsy:main on Jan 22, 2026
9 checks passed
sakaman pushed a commit to sakaman/ralph-tui that referenced this pull request Feb 15, 2026
fix(opencode): implement line buffering for reliable JSON parsing