Labels: Enhancement, Feature Request, Issue - Unassigned / Actionable, proposal
Description
Problem / Value
Large MCP responses (e.g., DB queries, logs) can exceed the model's context window and stall the agent. Saving these responses to files instead keeps the AI responsive while preserving full access to the data.
Context
This affects users connecting to data sources that can return very large payloads. Today the full response is embedded in the conversation context, which can overwhelm the model and slow or halt the agent's iterations.
Proposal
- Dynamically decide whether to inline the response or save it to a file based on available context, not a fixed byte threshold (see the sketch after this list).
- When saving to file:
  - Provide a short preview (first N lines/tokens) within the context.
  - Return a stable, accessible file path the agent can read via read_file.
  - Suggest follow-up actions (e.g., sampling, or generating bash/python to process the file).
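A minimal sketch of what this flow could look like in TypeScript. Everything here (the `ToolResponse` shape, the `estimateTokens` helper, the preview size, the file naming) is a hypothetical illustration, not existing Roo Code API:

```typescript
import * as fs from "fs/promises"
import * as path from "path"
import * as crypto from "crypto"

// Hypothetical shape of what the tool-call handler returns to the agent.
interface ToolResponse {
  inline: string        // text placed directly into the conversation context
  savedPath?: string    // set when the full payload was written to disk
}

const PREVIEW_LINES = 50 // illustrative preview size, not a fixed byte threshold

async function handleMcpResponse(
  raw: string,
  availableTokens: number, // remaining context budget, computed elsewhere
  estimateTokens: (s: string) => number, // hypothetical tokenizer helper
  workspaceRoot: string
): Promise<ToolResponse> {
  // Inline the response only when it comfortably fits the remaining budget.
  if (estimateTokens(raw) <= availableTokens) {
    return { inline: raw }
  }

  // Otherwise persist the full payload under a predictable location.
  const dir = path.join(workspaceRoot, ".roo", "tmp")
  await fs.mkdir(dir, { recursive: true })
  const file = path.join(dir, `mcp-response-${crypto.randomUUID()}.txt`)
  await fs.writeFile(file, raw, "utf8")

  // Keep a short preview in context plus pointers for follow-up actions.
  const preview = raw.split("\n").slice(0, PREVIEW_LINES).join("\n")
  const inline = [
    `Response was too large to inline; full output saved to ${file}.`,
    `Preview (first ${PREVIEW_LINES} lines):`,
    preview,
    `Next steps: read it with read_file, sample it, or generate a`,
    `bash/python script to process the file.`,
  ].join("\n")

  return { inline, savedPath: file }
}
```

Deciding by estimated tokens rather than raw bytes keeps the behavior consistent across models with different context windows.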
Acceptance Criteria
- System no longer stalls on large MCP responses.
- The inline/save decision uses the active model's context window, the conversation's current token usage, and a safety margin (see the budget sketch after this list).
- Saved files are accessible to the agent and organized predictably (e.g., .roo/tmp/…).
- Optional cap/override to limit extremely large inlining even on big-context models.
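One way the token budget feeding that decision could be derived. The field names, the default 20% safety margin, and the cap behavior are all assumptions for illustration:

```typescript
// Illustrative inputs; none of these names come from the existing codebase.
interface BudgetInputs {
  contextWindow: number      // active model's context window, in tokens
  conversationTokens: number // tokens already used by the conversation
  safetyMarginRatio?: number // reserved headroom, e.g. 0.2 for 20%
  inlineCapTokens?: number   // optional hard cap, even on big-context models
}

function availableForInlining(b: BudgetInputs): number {
  const margin = Math.floor(b.contextWindow * (b.safetyMarginRatio ?? 0.2))
  const remaining = b.contextWindow - b.conversationTokens - margin

  // Apply the optional override so huge-context models don't inline megabytes.
  const capped =
    b.inlineCapTokens !== undefined
      ? Math.min(remaining, b.inlineCapTokens)
      : remaining

  return Math.max(0, capped)
}

// Example: a 200k-token model with 150k tokens used and a 20% margin
// leaves 200,000 - 150,000 - 40,000 = 10,000 tokens available for inlining.
```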
Notes
- Align with existing dynamic context management (e.g., sliding window).
- Avoid mandatory user configuration; provide an optional manual cap/override for power users.