[ENHANCEMENT] Save oversized MCP responses to files with smart context preview #7042

@SleeperSmith

Description

Problem / Value

Large MCP responses (e.g., DB queries, logs) can exceed the model’s context window and stall the agent. Saving these to files instead keeps the AI responsive while preserving full data access.

Context

Affects users connecting to data sources that may return very large payloads. Today the full response is embedded in context, which can overwhelm the model and slow or halt iterations.

Proposal

  • Dynamically decide whether to inline or save to file based on available context, not a fixed byte threshold.
  • When saving to file:
    • Provide a short preview (first N lines/tokens) within the context.
    • Return a stable, accessible file path the agent can read via read_file.
    • Suggest follow-up actions (e.g., sampling, generating bash/python to process the file).
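The dynamic inline-vs-save decision described above could be sketched roughly as follows. This is illustrative only: the function names (`shouldInlineResponse`, `buildPreview`), the 20% safety margin, and the chars-per-token heuristic are all assumptions, not existing Roo Code APIs.

```typescript
// Illustrative sketch; names and constants are hypothetical, not Roo Code APIs.
const DEFAULT_SAFETY_MARGIN = 0.2; // assumed default: reserve 20% of the window
const APPROX_CHARS_PER_TOKEN = 4; // rough heuristic for token estimation

function estimateTokens(text: string): number {
  return Math.ceil(text.length / APPROX_CHARS_PER_TOKEN);
}

// Decide dynamically: inline only if the response fits the remaining budget,
// derived from the active model's context window minus tokens already used.
function shouldInlineResponse(
  response: string,
  contextWindow: number,
  tokensUsed: number,
  safetyMargin: number = DEFAULT_SAFETY_MARGIN,
): boolean {
  const budget = contextWindow * (1 - safetyMargin) - tokensUsed;
  return estimateTokens(response) <= budget;
}

// When saving to a file, keep only a short head-of-response preview in context.
function buildPreview(response: string, maxLines: number = 20): string {
  const lines = response.split("\n");
  const head = lines.slice(0, maxLines).join("\n");
  const omitted = lines.length - maxLines;
  return omitted > 0 ? `${head}\n… (${omitted} more lines saved to file)` : head;
}
```

A real implementation would use the provider's tokenizer rather than a character heuristic, but the shape of the decision is the same.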

Acceptance Criteria

  • System no longer stalls on large MCP responses.
  • Decision uses the active model’s context, conversation token usage, and a safety margin.
  • Saved files are accessible to the agent and organized predictably (e.g., .roo/tmp/…).
  • Optional cap/override to limit extremely large inlining even on big-context models.
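For the "organized predictably" criterion, one possible naming scheme under `.roo/tmp/` is sketched below. The function name `mcpResponsePath` and the file-name pattern are hypothetical, chosen only to show a stable, collision-resistant layout the agent could read back via read_file.

```typescript
// Hypothetical sketch of predictable file naming under .roo/tmp;
// mcpResponsePath and the naming pattern are illustrative, not existing APIs.
function mcpResponsePath(
  serverName: string,
  toolName: string,
  timestampMs: number,
): string {
  // Sanitize identifiers so server/tool names can't escape the directory
  // or produce invalid file names.
  const safe = (s: string) => s.replace(/[^a-zA-Z0-9_-]/g, "_");
  return [
    ".roo",
    "tmp",
    `mcp-${safe(serverName)}-${safe(toolName)}-${timestampMs}.json`,
  ].join("/");
}
```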

Notes

  • Align with existing dynamic context management (e.g., sliding window).
  • Avoid mandatory user configuration; provide an optional manual cap/override for power users.
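The optional manual cap mentioned above could compose with the dynamic budget like this. `maxInlineTokens` is a hypothetical setting name; the point is only that a power-user cap clamps the dynamic budget rather than replacing it.

```typescript
// Hedged sketch: an optional hard cap so even big-context models don't
// inline enormous payloads. `maxInlineTokens` is a hypothetical setting.
function effectiveInlineBudget(
  contextWindow: number,
  tokensUsed: number,
  maxInlineTokens?: number, // optional override; undefined = no cap (default)
  safetyMargin: number = 0.2, // assumed default margin
): number {
  const dynamicBudget = contextWindow * (1 - safetyMargin) - tokensUsed;
  return maxInlineTokens === undefined
    ? dynamicBudget
    : Math.min(dynamicBudget, maxInlineTokens);
}
```

Leaving the cap undefined preserves the zero-configuration default; setting it shrinks, but never enlarges, what the dynamic decision would allow.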
