
S3 table function can't upload file while aws s3 CLI can #34244

@yonesko

Description


I tried to insert a 6-billion-row table into S3:

INSERT INTO FUNCTION
s3('https://s3.us-east-1.amazonaws.com/my-bucket/data.csv', 'DFGHJ', 'FGHJIJH', 'CSVWithNames', 'time DateTime, exchangeId UInt16, pairId UInt16, id String, price Decimal(38, 18), volume Decimal(38, 18)')
SELECT * FROM trade;

but it fails when trying to upload part 10001, because S3 multipart upload allows at most 10,000 parts per object.
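For context, the 10,000-part cap means the minimum viable part size follows directly from the object size. A small sketch of that arithmetic (the ~600 GB figure comes from the export mentioned below; `min_part_size_bytes` is just an illustrative helper, not a ClickHouse or AWS API):

```python
MAX_PARTS = 10_000  # S3 multipart upload limit (documented by AWS)

def min_part_size_bytes(object_size_bytes: int, max_parts: int = MAX_PARTS) -> int:
    """Smallest part size (rounded up) that fits the object in max_parts parts."""
    return -(-object_size_bytes // max_parts)  # ceiling division

size_600_gib = 600 * 1024**3

# Capacity with 16 MiB parts: 16 MiB * 10,000 = 156.25 GiB, far short of 600 GiB,
# so an upload using small fixed-size parts runs out of part numbers.
capacity_gib = 16 * 1024**2 * MAX_PARTS / 1024**3
print(capacity_gib)                                     # 156.25

# Parts would need to be at least ~61.44 MiB each to fit 600 GiB in 10,000 parts.
print(min_part_size_bytes(size_600_gib) / 1024**2)
```

So any uploader that keeps its part size fixed below roughly 64 MiB cannot finish a 600 GB object.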

I had to export this SELECT into a 600 GB CSV file on disk and then upload it successfully with aws s3 cp data.csv s3://my-bucket/data.csv

I expected ClickHouse to do this on its own.

┌─version()─┐
│ 22.1.3.7 │
└───────────┘
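Not part of the original report, but as a possible workaround sketch: ClickHouse exposes the S3 upload part size via settings, and raising the minimum part size keeps a ~600 GB upload under the 10,000-part limit. The setting name `s3_min_upload_part_size` is an assumption here; whether it is honored by the `s3` table function in version 22.1 should be verified against the documentation:

```sql
-- Hedged workaround sketch (assumes s3_min_upload_part_size applies here):
-- 64 MiB * 10,000 parts ≈ 640 GiB of headroom for a ~600 GB object.
INSERT INTO FUNCTION
s3('https://s3.us-east-1.amazonaws.com/my-bucket/data.csv', 'DFGHJ', 'FGHJIJH', 'CSVWithNames', 'time DateTime, exchangeId UInt16, pairId UInt16, id String, price Decimal(38, 18), volume Decimal(38, 18)')
SELECT * FROM trade
SETTINGS s3_min_upload_part_size = 67108864;  -- 64 MiB
```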


Labels

st-accepted: The issue is in our backlog, ready to take
unexpected behaviour: Result is unexpected, but not entirely wrong at the same time.
