This repository was archived by the owner on Feb 7, 2026. It is now read-only.

Allow to stream compressed data into BigQuery #75

Description

@stephenplusplus
Copied from original issue: googleapis/google-cloud-node#2811

@xgalen
March 20, 2018 3:19 PM

Hi all,

In order to save data transfer (egress) costs, we would like to stream the data gzip-compressed.

I have experimented a bit and it works when requesting the API directly. Example (values omitted for brevity):

#!/bin/bash

...

# JSON requires double quotes; escape them inside the shell string.
OBJECT="{\"kind\": \"bigquery#tableDataInsertAllRequest\", \"skipInvalidRows\": true, \"ignoreUnknownValues\": true, \"rows\": $ROWS}"

# Quote the variable so the payload is not word-split by the shell.
echo "$OBJECT" | gzip -cf > compressed.gz

curl -v -H "Authorization: Bearer $ACCESS_TOKEN" \
     -H "Content-Type: application/json" \
     -H "Content-Encoding: gzip" \
     --data-binary @compressed.gz \
"https://www.googleapis.com/bigquery/v2/projects/$GOOGLE_CLOUD_PROJECT/datasets/$DATASET_ID/tables/$TABLE_ID/insertAll"

But I couldn't find where to set this header in the client library so that the request body is sent gzip-compressed. I know gzip is an option for responses and for storing files (`@param {boolean} options.gzip - Specify if you would like the file compressed`), but not for requests.

Would it be possible to add a new setting to enable or disable compression? With such an option, decompression becomes the server's responsibility rather than this module's. I think it would be worthwhile :)

Of course, I could help if needed.

Thanks!

Alfredo

Metadata

Labels

api: bigquery - Issues related to the googleapis/nodejs-bigquery API.
type: feature request - 'Nice-to-have' improvement, new feature or different behavior or design.
