DESIGN - Data Pipelining #11
Description
Problem
Conduit's data pipeline currently processes a single round of data at a time, which means the exporter sits idle every time a new block of data is fetched and processed.
In one test, the overhead was only 1.7ms per round, but over 25 million rounds that adds up to nearly 12 hours.
The filter processor's overhead is even larger, and potentially much larger for a complex filter.
Solution
Pipeline the data processing so that each plugin is always executing a round. If there are 5 plugins enabled, there should be at least 5 rounds in flight so that the slowest plugin is never starved for work.