Integration test test_database_glue/test.py::test_hide_sensitive_info is flaky #79719

@pamarcos

Description

https://s3.amazonaws.com/clickhouse-test-reports/json.html?PR=77153&sha=c5133d61bfb8dae2abec987fb9662a69d24ba720&name_0=PR&name_1=Integration+tests+%28release%2C+1%2F4%29

=================================== FAILURES ===================================
___________________________ test_hide_sensitive_info ___________________________
[gw4] linux -- Python 3.10.12 /usr/bin/python3

started_cluster = <helpers.cluster.ClickHouseCluster object at 0x7fea8dc6b130>

    def test_hide_sensitive_info(started_cluster):
        node = started_cluster.instances["node1"]
    
        test_ref = f"test_hide_sensitive_info_{uuid.uuid4()}"
        table_name = f"{test_ref}_table"
        root_namespace = f"{test_ref}_namespace"
    
        namespace = f"{root_namespace}_A"
        catalog = load_catalog_impl(started_cluster)
        catalog.create_namespace(namespace)
    
>       table = create_table(catalog, namespace, table_name)

test_database_glue/test.py:288: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test_database_glue/test.py:100: in create_table
    return catalog.create_table(
/usr/local/lib/python3.10/dist-packages/pyiceberg/catalog/glue.py:417: in create_table
    self._write_metadata(staged_table.metadata, staged_table.io, staged_table.metadata_location)
/usr/local/lib/python3.10/dist-packages/pyiceberg/catalog/__init__.py:946: in _write_metadata
    ToOutputFile.table_metadata(metadata, io.new_output(metadata_path))
/usr/local/lib/python3.10/dist-packages/pyiceberg/serializers.py:130: in table_metadata
    with output_file.create(overwrite=overwrite) as output_stream:
/usr/local/lib/python3.10/dist-packages/pyiceberg/io/pyarrow.py:315: in create
    output_file = self._filesystem.open_output_stream(self._path, buffer_size=self._buffer_size)
pyarrow/_fs.pyx:887: in pyarrow._fs.FileSystem.open_output_stream
    ???
pyarrow/error.pxi:155: in pyarrow.lib.pyarrow_internal_check_status
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   ???
E   OSError: When initiating multiple part upload for key 'data/metadata/00000-c60a2644-2848-42bc-8882-af0a9d4b3025.metadata.json' in bucket 'warehouse': AWS Error NO_SUCH_BUCKET during CreateMultipartUpload operation: The specified bucket does not exist

pyarrow/error.pxi:92: OSError
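The failure happens before the test exercises anything ClickHouse-specific: pyiceberg's Glue catalog writes the initial metadata file to the `warehouse` bucket, and the multipart upload fails with NO_SUCH_BUCKET, which suggests the bucket is transiently missing at that moment (e.g. a race with storage setup in parallel CI runs). One possible hardening is to retry the table creation on this transient storage error. The helper below is an illustrative sketch only; the function name, retry policy, and error matching are assumptions, not part of the actual test suite:

```python
import time


def retry_on_missing_bucket(fn, attempts=5, delay=1.0, is_transient=None):
    """Call fn(); retry when it raises a transient storage OSError,
    such as the 'The specified bucket does not exist' error above.

    All names and defaults here are illustrative assumptions,
    not code from the ClickHouse test suite.
    """
    if is_transient is None:
        # Match the NO_SUCH_BUCKET message from the traceback.
        is_transient = lambda exc: "bucket does not exist" in str(exc)
    last_exc = None
    for attempt in range(attempts):
        try:
            return fn()
        except OSError as exc:
            if not is_transient(exc):
                raise  # unrelated error: fail immediately
            last_exc = exc
            if attempt < attempts - 1:
                time.sleep(delay)
    raise last_exc
```

In the failing test, the call on line 288 could then be wrapped as `retry_on_missing_bucket(lambda: create_table(catalog, namespace, table_name))`, turning a one-off NO_SUCH_BUCKET race into a bounded retry instead of a flaky failure.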

Labels

flaky test (found by CI)
