This repository was archived by the owner on Nov 12, 2025. It is now read-only.
tests.system.reader.test_reader_dataframe: test_read_rows_to_dataframe[v1-AVRO-avro_schema] failed #559
Closed
Labels
- api: bigquerystorage - Issues related to the googleapis/python-bigquery-storage API.
- flakybot: issue - An issue filed by the Flaky Bot. Should not be added manually.
- priority: p1 - Important issue which blocks shipping the next release. Will be fixed prior to next release.
- type: bug - Error or flaw in code with unintended results or allowing sub-optimal usage patterns.
Description
Note: #373 was also filed for this test, but it was closed more than 10 days ago, so I didn't mark this one flaky.
commit: 3b86c7e
buildURL: Build Status, Sponge
status: failed
Test output
```
client_and_types = (, )
project_id = 'precise-truck-742', data_format = 'AVRO', expected_schema_type = 'avro_schema'

@pytest.mark.parametrize(
    "data_format,expected_schema_type",
    (("AVRO", "avro_schema"), ("ARROW", "arrow_schema")),
)
def test_read_rows_to_dataframe(
    client_and_types, project_id, data_format, expected_schema_type
):
    client, types = client_and_types
    read_session = types.ReadSession()
    read_session.table = "projects/{}/datasets/{}/tables/{}".format(
        "bigquery-public-data", "new_york_citibike", "citibike_stations"
    )
    read_session.data_format = data_format
    session = client.create_read_session(
        request={
            "parent": "projects/{}".format(project_id),
            "read_session": read_session,
            "max_stream_count": 1,
        }
    )
    schema_type = session._pb.WhichOneof("schema")
    assert schema_type == expected_schema_type

    stream = session.streams[0].name
    frame = client.read_rows(stream).to_dataframe(
        session, dtypes={"latitude": numpy.float16}
    )
    # Station ID is a required field (no nulls), so the datatype should always
    # be integer.
    assert frame.station_id.dtype.name == "int64"
E   AssertionError: assert 'object' == 'int64'
E     - int64
E     + object

tests/system/reader/test_reader_dataframe.py:92: AssertionError
```
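The failure means pandas inferred `object` rather than the expected `int64` for the `station_id` column. The traceback does not show the offending values, but pandas' dtype inference makes the `object` fallback easy to reproduce in isolation. A minimal sketch (the example values are hypothetical, not taken from the `citibike_stations` table):

```python
import pandas as pd

# A column containing only Python ints infers the expected int64 dtype.
clean = pd.DataFrame({"station_id": [72, 79, 82]})
print(clean.station_id.dtype.name)  # int64

# A null among ints falls back to float64 (None becomes NaN)...
nullable = pd.Series([72, None, 82])
print(nullable.dtype.name)  # float64

# ...while mixed types (e.g. an integer decoded as a string) fall back to
# object dtype, which is what trips the assertion in the failing test.
mixed = pd.Series([72, "79", 82])
print(mixed.dtype.name)  # object
```

So any row batch that surfaces nulls or non-integer Python values for `station_id` would change the inferred dtype and make the test fail intermittently.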