This repository was archived by the owner on Mar 5, 2026. It is now read-only.

Adapt: Serializing DATE field breaks #344

@solamanhuq

Description

At the moment, if I use adapt to serialize JSON with a DATE field given as a string, e.g. 2023-01-01, the serialization converts it to the INT value 0. Converting it to an epoch timestamp allows it to serialize, but then the write API rejects it with Could not convert value to date.

I'm not sure what the expectation is, but as of now there is an incompatibility between the client and the storage write API.

EDIT: It appears that if I pass in the number of days since 1970-01-01, it will successfully serialize and save via the storage write API.

It's a bit unfortunate that the type is restricted; since strings work for streaming inserts, it would be nice if they also worked for the storage write API. But all the same, this is something we can live with.
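The workaround from the edit above can be sketched as a small helper (the function name is mine, not part of the client library): convert a 'YYYY-MM-DD' string into a count of days since the Unix epoch before encoding.

```javascript
// Hypothetical helper: convert a 'YYYY-MM-DD' date string to the number of
// days since 1970-01-01, which appears to be what the storage write API
// accepts for DATE fields.
function dateToEpochDays(dateString) {
  // Parse as UTC midnight so the result is timezone-independent.
  const ms = Date.parse(dateString + 'T00:00:00Z');
  return Math.floor(ms / (24 * 60 * 60 * 1000));
}

console.log(dateToEpochDays('2023-01-01')); // 19358
console.log(dateToEpochDays('1970-01-01')); // 0
```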

Environment details

  • OS: Mac
  • Node.js version: 18.X
  • npm version: 9.5.1
  • @google-cloud/bigquery-storage version: adapt branch

Steps to reproduce

  1. Have a table with a DATE field:
await bigquery.dataset('mydataset').createTable('mytable', {
  "schema": {
    "fields": [
      {
        "name": "coolthing",
        "type": "STRING"
      },
      {
        "name": "date",
        "type": "DATE"
      },
      {
        "name": "ts",
        "type": "DATETIME"
      }
    ]
  },
  "timePartitioning": {
    "type": "DAY",
    "expirationMs": "7776000000",
    "field": "date"
  }
})
  2. Use adapt and protobuf.js to write JSON events to the table:
const dataset = bigquery.dataset('mydataset')
const table = await dataset.table('mytable')
const [metadata] = await table.getMetadata()
const storageSchema = adapt.convertBigQuerySchemaToStorageTableSchema(
  metadata.schema
)
const protoDescriptor = adapt.convertStorageSchemaToProto2Descriptor(
  storageSchema,
  tableId
)
const protoType = Type.fromDescriptor(protoDescriptor)

const events = [{
  coolthing: 'test close goal',
  date: '2023-01-01',
  ts: '2023-01-01T06:24:56',
}]
const serializedRows = events.map(event =>
  protoType.encode(event).finish()
)

// error: date field is now 0
const deserializedRows = serializedRows.map(event =>
  protoType.decode(event)
)

const stream = bigQueryWriteClient.appendRows(
  // open connection to stream
)

// throws 'error' event
stream.write({
  // something something serialized rows
})
  3. This fails. It appears that the adapt and protobuf.js libraries expect DATE to be formatted as an int. However, if we convert it to an epoch timestamp, the API still throws an error: Could not convert value to date.
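Assuming the days-since-1970 interpretation from the edit above holds, a minimal sketch of the workaround is to pre-convert the DATE field on each event before handing it to protoType.encode (the events array mirrors the one in step 2; the conversion itself is my own, not part of adapt):

```javascript
// Sketch only: pre-convert 'YYYY-MM-DD' DATE strings to integer days since
// 1970-01-01 before proto encoding, since the generated DATE field is an int.
const events = [{
  coolthing: 'test close goal',
  date: '2023-01-01',
  ts: '2023-01-01T06:24:56',
}];

const prepared = events.map(e => ({
  ...e,
  // 86400000 ms per day; parse as UTC midnight to avoid timezone skew.
  date: Math.floor(Date.parse(`${e.date}T00:00:00Z`) / 86400000),
}));

console.log(prepared[0].date); // 19358
```

The prepared rows would then be encoded and written exactly as in step 2.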

Making sure to follow these steps will guarantee the quickest resolution possible.

Thanks!

Metadata

Labels

  • api: bigquerystorage - Issues related to the googleapis/nodejs-bigquery-storage API.
  • priority: p2 - Moderately-important priority. Fix may not be included in next release.
  • type: bug - Error or flaw in code with unintended results or allowing sub-optimal usage patterns.
