Feature Request: Direct MCP-to-MCP Communication
Background
Currently, chaining multiple MCP servers (such as Fetch and XPath) follows this flow:
- The AI receives the user request
- The AI invokes the Fetch MCP server
- The Fetch server returns the page data to the AI
- The AI processes the data (consuming tokens)
- The AI invokes the XPath MCP server with that data
- The XPath server processes it and returns results to the AI
- The AI presents the final results to the user
This creates inefficiencies when handling large datasets:
- Large web page content from the Fetch server must pass through the AI model
- This consumes a significant number of tokens, increasing costs
- It can exceed token limits, causing failures on large web pages
Proposed Solution
Implement direct MCP-to-MCP communication to optimize the data flow:
User Request → AI → Fetch MCP → XPath MCP → AI → User Response
Instead of:
User Request → AI → Fetch MCP → AI → XPath MCP → AI → User Response
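As a rough illustration, here is a minimal sketch of what a chained request could look like. Everything in it is hypothetical: the `PipelineRequest` shape, the `"$prev"` placeholder, and the server/tool names are assumptions for discussion, not part of the current protocol.

```typescript
// Hypothetical only: the PipelineRequest shape, the "$prev" placeholder, and
// the server/tool names below are illustrative, not part of the current spec.

interface PipelineStep {
  server: string;                // target MCP server, e.g. "fetch" or "xpath"
  tool: string;                  // tool to invoke on that server
  args: Record<string, unknown>; // "$prev" = substitute the previous step's output
}

interface PipelineRequest {
  steps: PipelineStep[];
}

// The AI emits a single request describing the whole chain; the host (or the
// servers themselves) moves the intermediate data without it re-entering the
// model's context.
const example: PipelineRequest = {
  steps: [
    { server: "fetch", tool: "fetch", args: { url: "https://example.com" } },
    { server: "xpath", tool: "query", args: { html: "$prev", query: "//title/text()" } },
  ],
};

console.log(JSON.stringify(example, null, 2));
```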
Benefits
- Reduced token usage and costs
- Ability to handle larger web pages without hitting token limits
- Improved performance and response times
- More efficient resource utilization
Implementation Requirements
- Define a communication protocol between MCP servers
- Allow the AI to specify chained MCP operations
- Implement data-passing mechanisms between MCP servers (a sketch follows this list)
- Update the documentation so developers know how to create and use chainable MCP servers
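One possible data-passing mechanism is a small pipeline executor in the host that resolves placeholders between steps. This is only a sketch under the assumptions above; `callTool` stands in for whatever transport the host uses to reach each server.

```typescript
// Hypothetical executor sketch. The step shape and the "$prev" convention are
// assumptions carried over from the request sketch above.

type CallTool = (
  server: string,
  tool: string,
  args: Record<string, unknown>,
) => Promise<unknown>;

interface PipelineStep {
  server: string;
  tool: string;
  args: Record<string, unknown>;
}

async function runPipeline(
  steps: PipelineStep[],
  callTool: CallTool,
): Promise<unknown> {
  let previous: unknown = undefined;

  for (const step of steps) {
    const resolvedArgs: Record<string, unknown> = {};
    for (const [key, value] of Object.entries(step.args)) {
      // Replace "$prev" placeholders with the previous step's raw output, so
      // large payloads flow server-to-server instead of through the model.
      resolvedArgs[key] = value === "$prev" ? previous : value;
    }
    previous = await callTool(step.server, step.tool, resolvedArgs);
  }

  // Only the final, much smaller result goes back to the model.
  return previous;
}
```

The key design point is that intermediate results never enter the model's context; whatever connection registry backs `callTool` only has to hand the final step's output to the AI.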
Use Case Example
A user wants to extract all links from a large website:
- Current approach: Fetch returns the entire HTML to the AI, consuming tokens, and the AI must then pass it to XPath
- Proposed approach: The AI instructs the Fetch MCP server to pass its output directly to the XPath MCP server, bypassing the token-consuming middle step (see the sketch below)
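Under the same assumptions as the earlier sketches, the link-extraction case might be expressed as the following chained request. The tool names and the `//a/@href` expression are illustrative only.

```typescript
// Hypothetical request for the link-extraction case, reusing the chained
// request shape and "$prev" convention sketched earlier in this issue.

const extractLinks = {
  steps: [
    // Step 1: the Fetch server retrieves the page; its (potentially huge)
    // HTML output never enters the model's context.
    { server: "fetch", tool: "fetch", args: { url: "https://example.com/docs" } },
    // Step 2: the XPath server receives that HTML directly and returns
    // only the matched href values to the AI.
    { server: "xpath", tool: "query", args: { html: "$prev", query: "//a/@href" } },
  ],
};

console.log(JSON.stringify(extractLinks, null, 2));
```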