Background: we have some tens of thousands of tables in BigQuery and want to figure out which ones are the most expensive (using the most storage), so I wrote a little script to enumerate all table metadata. It runs slowly because each API call retrieves the full metadata for a table; I want to use the `fields` parameter to fetch only the needed information. If anyone knows a workaround, let me know.
(The `http` command here is HTTPie, a more convenient alternative to curl; from https://httpie.org/)
$ access_token=$(gcloud auth application-default print-access-token)
$ http https://www.googleapis.com/bigquery/v2/projects/<projectId>/datasets/<datasetId>/tables/<tableId> \
fields==id,numBytes,numLongTermBytes,numRows,creationTime,expirationTime,lastModifiedTime,type,location \
access_token==$access_token
[...]
{
  "id": "<projectId>:<datasetId>.<tableId>",
  "numBytes": "112728076",
  "numLongTermBytes": "0",
  "numRows": "28431",
  "creationTime": "1505962477264",
  "expirationTime": "1513738477264",
  "lastModifiedTime": "1505962477264",
  "type": "TABLE",
  "location": "US"
}
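Scripted across many tables, the same request could look like the sketch below in Python. This is only an illustration of the `fields` query parameter, assuming an access token from `gcloud auth application-default print-access-token`; the `table_url`/`get_table` helpers are made-up names, not part of any official client library.

```python
# Sketch: fetch only selected metadata fields per table via the BigQuery
# REST API's `fields` parameter, instead of the full table resource.
import json
import urllib.parse
import urllib.request

API_ROOT = "https://www.googleapis.com/bigquery/v2"
FIELDS = ("id,numBytes,numLongTermBytes,numRows,"
          "creationTime,expirationTime,lastModifiedTime,type,location")

def table_url(project_id, dataset_id, table_id, fields=FIELDS):
    """Build the tables.get URL, restricted to the given response fields."""
    path = "/projects/{}/datasets/{}/tables/{}".format(
        urllib.parse.quote(project_id),
        urllib.parse.quote(dataset_id),
        urllib.parse.quote(table_id),
    )
    return API_ROOT + path + "?" + urllib.parse.urlencode({"fields": fields})

def get_table(access_token, project_id, dataset_id, table_id):
    """Fetch the trimmed table resource as a dict (requires a valid token)."""
    req = urllib.request.Request(
        table_url(project_id, dataset_id, table_id),
        headers={"Authorization": "Bearer " + access_token},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Passing the token in an `Authorization: Bearer` header is equivalent to the `access_token==` query parameter used in the HTTPie call above.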
From @c0b on October 20, 2017 5:49
The REST API reference mentions `selectedFields` [2], and all Google Cloud services support the `fields` parameter [3], but from [1] I'm not seeing how I can pass these parameters.
Copied from original issue: googleapis/google-cloud-node#2684