KeyError: 0 error with common-sql version 1.3.0 #27978

@park-peter

Description

Apache Airflow Provider(s)

common-sql

Versions of Apache Airflow Providers

apache-airflow-providers-amazon==6.0.0
apache-airflow-providers-apache-hive==4.0.1
apache-airflow-providers-apache-livy==3.1.0
apache-airflow-providers-celery==3.0.0
apache-airflow-providers-cncf-kubernetes==4.4.0
apache-airflow-providers-common-sql==1.3.0
apache-airflow-providers-databricks==3.3.0
apache-airflow-providers-dbt-cloud==2.2.0
apache-airflow-providers-elasticsearch==4.2.1
apache-airflow-providers-ftp==3.1.0
apache-airflow-providers-google==8.4.0
apache-airflow-providers-http==4.0.0
apache-airflow-providers-imap==3.0.0
apache-airflow-providers-microsoft-azure==4.3.0
apache-airflow-providers-postgres==5.2.2
apache-airflow-providers-redis==3.0.0
apache-airflow-providers-sftp==4.1.0
apache-airflow-providers-snowflake==3.3.0
apache-airflow-providers-sqlite==3.2.1
apache-airflow-providers-ssh==3.2.0

Apache Airflow version

2.4.3

Operating System

Debian Bullseye

Deployment

Astronomer

Deployment details

No response

What happened

With the latest version of the common-sql provider, the result of get_records from the hook is now an ordinary dictionary, causing this KeyError with SqlSensor:

[2022-11-29, 00:39:18 UTC] {taskinstance.py:1851} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airflow/sensors/base.py", line 189, in execute
    poke_return = self.poke(context)
  File "/usr/local/lib/python3.9/site-packages/airflow/providers/common/sql/sensors/sql.py", line 98, in poke
    first_cell = records[0][0]
KeyError: 0

I have only tested this with Snowflake; I haven't tested it with other databases. Reverting back to 1.2.0 solves the issue.
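
For illustration only (these literal values are made up, not taken from the provider), the sensor's first_cell = records[0][0] lookup only works when records is a positionally indexable list of rows; a plain dict has no integer key 0 and raises the error shown above:

# Illustration of the failure mode, not provider code; values are hypothetical.
records_as_rows = [(0,)]      # the indexable list-of-rows shape the sensor expects
records_as_dict = {"0": [0]}  # a plain dict instead (key name purely illustrative)

print(records_as_rows[0][0])  # 0 -- positional access works

try:
    print(records_as_dict[0][0])
except KeyError as err:
    print(f"KeyError: {err}")  # KeyError: 0, matching the traceback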

What you think should happen instead

It should return an indexable list of rows for the query, as it did in previous versions.

How to reproduce

from datetime import datetime

from airflow import DAG
from airflow.providers.common.sql.sensors.sql import SqlSensor

with DAG(
    dag_id="sql_provider_snowflake_test",
    schedule=None,
    start_date=datetime(2022, 1, 1),
    catchup=False,
):
    t1 = SqlSensor(
        task_id="snowflake_test",
        conn_id="snowflake",
        sql="select 0",
        fail_on_empty=False,
        poke_interval=20,
        mode="poke",
        timeout=60 * 5,
    )
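
The same behaviour can also be checked without a DAG by calling the hook's get_records directly (a minimal sketch, assuming a working "snowflake" connection and the snowflake provider listed above; not part of the original report):

from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

hook = SnowflakeHook(snowflake_conn_id="snowflake")
records = hook.get_records("select 0")
print(type(records), records)  # expected: a list of rows; reported to come back as a plain dict on 1.3.0
print(records[0][0])           # raises KeyError: 0 on common-sql 1.3.0, per the traceback above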

Anything else

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct

Labels

area:providers, duplicate, kind:bug, priority:medium
