DESIGN - Data Pipelining #11

@winder

Description

Problem

Conduit's data pipeline currently processes a single round of data at a time, which means the exporter sits idle every time a new block of data is fetched and processed.

In one test, the overhead was only 1.7ms per round, but over 25 million rounds that adds up to nearly 12 hours.

The filter processor adds even more overhead, potentially much more for a complex filter.

Solution

Pipeline data processing so that each plugin is always executing the next round. If there are 5 plugins enabled there should be at least 5 rounds in the pipeline so that the slowest plugin is never starved for work.
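A minimal sketch of the idea in Go, using a goroutine per stage connected by channels so each stage can work on a different round concurrently. The names here (`Round`, `stage`, the importer/processor/exporter roles) are illustrative, not Conduit's actual types or APIs:

```go
package main

import "fmt"

// Round stands in for one block of data moving through the pipeline.
type Round struct {
	Number int
	Data   string
}

// stage wraps one plugin step: it reads rounds from in, applies fn, and
// forwards the result on out. Because each stage runs in its own
// goroutine, stage N can process round R while stage N+1 processes R-1.
func stage(in <-chan Round, fn func(Round) Round) <-chan Round {
	out := make(chan Round, 1) // small buffer keeps the upstream stage from stalling
	go func() {
		defer close(out)
		for r := range in {
			out <- fn(r)
		}
	}()
	return out
}

func main() {
	const rounds = 5

	// Importer: fetches blocks and feeds them into the pipeline.
	source := make(chan Round, 1)
	go func() {
		defer close(source)
		for i := 0; i < rounds; i++ {
			source <- Round{Number: i, Data: "block"}
		}
	}()

	// Processor and exporter stages, chained with channels.
	processed := stage(source, func(r Round) Round {
		r.Data += "+filtered"
		return r
	})
	exported := stage(processed, func(r Round) Round {
		r.Data += "+exported"
		return r
	})

	for r := range exported {
		fmt.Printf("round %d: %s\n", r.Number, r.Data)
	}
}
```

With buffered channels between stages, the importer can already be fetching round R+1 while the processor filters R and the exporter writes R-1, which is what removes the per-round downtime described above.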
