Support encoding a single parquet file using multiple threads #1718

@alamb

Description

Is your feature request related to a problem or challenge? Please describe what you are trying to do.
Encoding / compression is most often the bottleneck when trying to increase the throughput of writing parquet files. Even though the bytes must ultimately be written to the file serially, the encoding itself could be done in parallel (into in-memory buffers) before the actual write.

Describe the solution you'd like
I would like a way (either an explicit API or an example) to use multiple cores to write Arrow `RecordBatch`es to a file.

Note that trying to parallelize writes today results in corrupted parquet files, see #1717
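To illustrate the pattern described above (this is a sketch of the general fan-out / serial-write idea only, not the parquet crate's API; `encode_column` is a hypothetical stand-in for the real per-column encoding/compression work):

```rust
use std::thread;

// Hypothetical stand-in for parquet column encoding + compression:
// any CPU-bound transformation of column data into bytes.
fn encode_column(data: &[i32]) -> Vec<u8> {
    data.iter().flat_map(|v| v.to_le_bytes()).collect()
}

// Encode each column on its own thread into an in-memory buffer,
// then concatenate the buffers serially, preserving column order.
fn parallel_encode(columns: Vec<Vec<i32>>) -> Vec<u8> {
    let handles: Vec<_> = columns
        .into_iter()
        .map(|col| thread::spawn(move || encode_column(&col)))
        .collect();

    // The serial "write" step: only this part must run on one thread.
    let mut out = Vec::new();
    for handle in handles {
        out.extend(handle.join().expect("encoder thread panicked"));
    }
    out
}
```

The key property is that the expensive encoding runs concurrently while the ordering-sensitive byte emission stays single-threaded, which is what avoids the corruption seen when the writer itself is shared across threads (#1717).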

Describe alternatives you've considered
There is a high-level description of parallel decoding in @jorgecarleitao's parquet2: https://github.com/jorgecarleitao/parquet2#higher-parallelism (focused on reading).

Additional context
Mailing list https://lists.apache.org/thread/rbhfwcpd6qfk52rtzm2t6mo3fhvdpc91

Also, #1711 is possibly related

Labels: enhancement