Respect Page Size Limits in ArrowWriter (#2853) #2890
Conversation
```diff
 fn write_gather(&mut self, values: &Self::Values, indices: &[usize]) -> Result<()> {
+    self.num_values += indices.len();
```

```diff
 fn num_values(&self) -> usize {
-    self.num_values
+    match &self.dict_encoder {
```
Without this, we would only ever have a single dictionary encoded page per column chunk. This is the other half of the fix in #2854
The confusion was that we tracked `num_values` in two places; that duplication is now gone, so I'm not sure what to write in a comment...
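For context, here is a minimal sketch of the flush loop this value count feeds into: the column writer consults the encoder's value count and byte estimate after each batch to decide when to close a page. The `PageWriter` type, its fields, and the 4-byte plain encoding below are all assumptions for illustration, not the parquet crate's actual code.

```rust
// Hypothetical sketch, not parquet crate internals.
struct PageWriter {
    encoded_bytes: usize,   // estimated size of the page under construction
    num_values: usize,      // tracked in exactly one place
    page_size_limit: usize, // cf. WriterProperties' data page size limit
    pages_flushed: usize,
}

impl PageWriter {
    fn write_batch(&mut self, values: &[i32]) {
        self.num_values += values.len();
        self.encoded_bytes += values.len() * 4; // assume plain 4-byte encoding
        // If the encoder under-reports its value count or size, this check
        // never fires and the whole column chunk becomes a single page.
        if self.num_values > 0 && self.encoded_bytes >= self.page_size_limit {
            self.flush_page();
        }
    }

    fn flush_page(&mut self) {
        self.pages_flushed += 1;
        self.encoded_bytes = 0;
        self.num_values = 0;
    }
}

fn main() {
    let mut w = PageWriter {
        encoded_bytes: 0,
        num_values: 0,
        page_size_limit: 64,
        pages_flushed: 0,
    };
    let values: Vec<i32> = (0..64).collect();
    for chunk in values.chunks(16) {
        w.write_batch(chunk); // 16 values * 4 bytes = 64 bytes per batch
    }
    println!("pages flushed: {}", w.pages_flushed); // prints 4
}
```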
#2853 is already marked as closed -- it isn't clear to me if that ticket is really still open, or if this PR just fixes another issue with a similar outward symptom?
It is really still open; #2854 only partially fixed it, for the case of dictionary fallback pages.
alamb left a comment:
I had a bunch of questions, but based on the tests alone I am approving this PR. Nice work @tustvold
I think we should get at least one more pair of eyes on this PR prior to merging it
cc @thinkharderdev and @Ted-Jiang @sunchao
```diff
 let props = Arc::new(
     WriterProperties::builder()
-        .set_data_pagesize_limit(15) // actually each page will have size 15-18 bytes
+        .set_data_pagesize_limit(10)
```
I don't understand this change
The RLE size estimation has changed, so this test needed updating.
```diff
 // NOTE: The final size is almost the same because the dictionary entries are
 // preserved after encoded values have been written.
-run_test::<Int32Type>(Encoding::RLE_DICTIONARY, -1, &[123, 1024], 11, 68, 66);
+run_test::<Int32Type>(Encoding::RLE_DICTIONARY, -1, &[123, 1024], 0, 2, 0);
```
can you explain these changes?
The RLE size estimation was updated as part of #2889
```diff
 Encoding::BIT_PACKED => {
     ceil((num_buffered_values * bit_width as usize) as i64, 8) as usize
 }
+Encoding::RLE => RleEncoder::max_buffer_size(bit_width, num_buffered_values),
```
why is this different than the other estimated sizes?
What other estimated sizes? I think I updated them all.
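To illustrate the two estimates being discussed, here is a self-contained sketch comparing the BIT_PACKED estimate from the diff with a worst-case RLE buffer estimate. The RLE model below (nothing repeats, so every value falls into bit-packed runs of 8 with a one-byte indicator per run) is an assumption for illustration; the parquet crate's actual `RleEncoder::max_buffer_size` may use different constants.

```rust
// Assumed model for illustration, not the parquet crate's exact formulas.

/// Ceiling division, mirroring the `ceil` helper used in the diff above.
fn ceil(value: i64, divisor: i64) -> i64 {
    (value + divisor - 1) / divisor
}

/// BIT_PACKED: every value packed at `bit_width` bits, rounded up to bytes.
fn bit_packed_size(bit_width: u8, num_buffered_values: usize) -> usize {
    ceil((num_buffered_values * bit_width as usize) as i64, 8) as usize
}

/// Worst case for RLE: nothing repeats, so everything lands in bit-packed
/// runs of 8 values, each run prefixed by a one-byte indicator.
fn rle_max_buffer_size(bit_width: u8, num_buffered_values: usize) -> usize {
    let runs = ceil(num_buffered_values as i64, 8);
    (runs * (1 + bit_width as i64)) as usize
}

fn main() {
    // 1024 boolean-like values at bit width 1
    println!("BIT_PACKED estimate: {}", bit_packed_size(1, 1024)); // 128
    println!("RLE worst-case estimate: {}", rle_max_buffer_size(1, 1024)); // 256
}
```

Under this model the worst-case RLE estimate is strictly larger than the bit-packed size, which shows why an estimate based purely on a maximum buffer size can be overly pessimistic when the data actually repeats.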
I'm going to get this in so that I can base some changes off it; I'll address any further review comments in follow-up PRs.
Benchmark runs are scheduled for baseline = e859f30 and contender = 7e5d4a1. 7e5d4a1 is a master commit associated with this PR. Results will be available as each benchmark for each run completes.
Which issue does this PR close?
Closes #2853
Closes #2889
Rationale for this change
See tickets
What changes are included in this PR?
Fixes an issue where size estimation was broken for primitive columns and dictionary encoded pages, which prevented the page size limit from being respected. Similar to #2854
It also tweaks the RLE size estimation, which was overly pessimistic.
Are there any user-facing changes?
No