Conversation

@mobuchowski (Contributor)

Expose snowflake query id to snowflake hook and snowflake operator.
This will allow us to collect data lineage for snowflake operator in Marquez project.



Read the Pull Request Guidelines for more information.
In case of fundamental code change, Airflow Improvement Proposal (AIP) is needed.
In case of a new dependency, check compliance with the ASF 3rd Party License Policy.
In case of backwards incompatible changes please leave a note in UPDATING.md.

@boring-cyborg boring-cyborg bot added area:providers provider:snowflake Issues related to Snowflake provider labels Apr 26, 2021

boring-cyborg bot commented Apr 26, 2021

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
Here are some useful points:

  • Pay attention to the quality of your code (flake8, pylint and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide, and consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally. It's a heavy Docker setup, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow ASF Code of Conduct for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
  • Be sure to read the Airflow Coding style.
    Apache Airflow is a community-driven project and together we are making it better 🚀.
    In case of doubts contact the developers at:
    Mailing List: [email protected]
    Slack: https://s.apache.org/airflow-slack

@potiuk (Member) left a comment

Interesting, and while I see why you want to copy the run method, it lacks support for multiple queries

Member

There are two things I do not like here:

  1. Copying the logic of the DbApi run() method. I do not think we can do much about it though, because we do not want to modify the DbApi internals or make the provider depend on a future version of Airflow. So there is not much we can do here.

  2. We only store the LAST query id even if we execute a sequence of queries. I think we should keep an array of those rather than overwrite the query id with the last one.
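Point 2 could be addressed with per-statement bookkeeping along these lines. A minimal, self-contained sketch: only the `sfqid` attribute mirrors the real snowflake-connector cursor; the hook class and the fake cursor are illustrative, not the merged implementation.

```python
import itertools


class _FakeCursor:
    """Stand-in for a snowflake-connector cursor; the real cursor exposes
    the id of the last executed query as the `sfqid` attribute."""

    _counter = itertools.count(1)

    def execute(self, statement):
        self.sfqid = f"query-{next(self._counter)}"


class SnowflakeHookSketch:
    """Illustrative hook that keeps every query id, not just the last."""

    def __init__(self):
        self.query_ids = []

    def run(self, sql):
        statements = sql if isinstance(sql, list) else [sql]
        cursor = _FakeCursor()
        for statement in statements:
            cursor.execute(statement)
            self.query_ids.append(cursor.sfqid)  # append, never overwrite


hook = SnowflakeHookSketch()
hook.run(["CREATE TABLE t (a INT)", "INSERT INTO t VALUES (1)", "SELECT * FROM t"])
print(hook.query_ids)  # ['query-1', 'query-2', 'query-3']
```

Keeping a list instead of a scalar also means a single-statement call degrades gracefully: callers can always read `query_ids[-1]` if they only care about the last id.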

@eladkal (Contributor) Apr 26, 2021

Related to point 1 - this problem (allowing multiple queries) is not unique to Snowflake. It also applies to Presto and Trino, so whatever solution is accepted here will most likely be copied to the other two providers as well.

@mobuchowski (Contributor, Author)

Thanks for the feedback, I agree with both points and will adjust.
I did not find a better way than copying without changing the DbApi semantics. Also, when looking for a similar solution, I saw that this is what some connectors do already - like the Exasol one.

@potiuk (Member) Apr 26, 2021

We could potentially add some "callbacks" in the DbHook implementation, but this would mean Airflow 2.1+ compatibility, and I am not sure it's worth it. Maybe we could figure out some way of doing backwards compatibility and implement it "properly" (like copying the method, but using the original run() with callbacks when available)? But I am not sure if it is worth it either.
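The callback idea could look roughly like this. This is purely hypothetical - DbApiHook.run() took no such parameter at the time - and the cursor and `_execute` below are stubs so the sketch runs standalone.

```python
class _Cursor:
    """Stub cursor carrying only the query id attribute."""

    def __init__(self, sfqid):
        self.sfqid = sfqid


class DbApiHookSketch:
    """Hypothetical base hook whose run() accepts a per-statement callback,
    letting subclasses (or lineage collectors) observe each cursor without
    copying the whole run() body."""

    def _execute(self, statement):
        # real code would execute against the database; return a stub cursor
        return _Cursor(sfqid=f"id-{abs(hash(statement)) % 1000}")

    def run(self, sql, handler=None):
        statements = sql if isinstance(sql, list) else [sql]
        results = []
        for statement in statements:
            cursor = self._execute(statement)
            if handler is not None:
                results.append(handler(cursor))  # e.g. lambda cur: cur.sfqid
        return results


ids = DbApiHookSketch().run(["SELECT 1", "SELECT 2"], handler=lambda cur: cur.sfqid)
print(len(ids))  # 2
```

With such a hook in the base class, a provider would only pass `handler=lambda cur: cur.sfqid` instead of re-implementing run(), which is the backwards-compatibility trade-off being weighed above.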

@mobuchowski (Contributor, Author)

That could possibly allow a native OpenLineage integration in Airflow, but that's a separate topic 🙂

Maybe we could figure out some way of doing backwards compatibility and implement it "properly" (like copying the method, but using the original run() with callbacks when available)?

Yes, that makes sense - there's a lot of metadata that some databases expose in a non-standard way. One example is BigQueryExecuteQueryOperator, which puts its job id into XCom.
For now, I don't think exposing the query id in this particular way in this particular provider will get in the way of a more general solution in the future.
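The BigQuery-style approach mentioned above could be sketched like this. All names are hypothetical; `TaskInstanceStub` only mimics the push/pull shape of Airflow's XCom interface.

```python
class TaskInstanceStub:
    """Minimal stand-in for the XCom interface of Airflow's TaskInstance."""

    def __init__(self):
        self._xcom = {}

    def xcom_push(self, key, value):
        self._xcom[key] = value

    def xcom_pull(self, key):
        return self._xcom.get(key)


def execute_and_expose(task_instance, query_ids):
    """Operator side: after running the SQL, publish the collected ids so
    downstream tasks (or a lineage collector) can read them."""
    task_instance.xcom_push(key="query_ids", value=query_ids)


ti = TaskInstanceStub()
execute_and_expose(ti, ["qid-1", "qid-2"])
print(ti.xcom_pull(key="query_ids"))  # ['qid-1', 'qid-2']
```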

Member

OpenLineage integration is a cool idea. @bolkedebruin might be interested :) How about starting a discussion about it on the devlist, @mobuchowski? Sounds like a very good idea.

@mobuchowski (Contributor, Author)

We're discussing it internally and will come back with some ideas surely 🙂

Contributor

Is it possible to save the query_ids as the inlets/outlets defined for the LineageBackend? Does anyone have an idea of how sending this lineage data to the LineageBackend could be accomplished?

@potiuk (Member) left a comment

Could you please add support (and tests) for multiple queries?

@eladkal (Contributor) left a comment

This PR seems to incorporate also #15128 (and #11350 )

A side note: I'm not sure why my suggestion in the other PR to explore using execute_string
was overlooked (#11350 (comment)), but I'm mentioning it here as a possible alternative to explore if you find it useful.

@mobuchowski mobuchowski changed the title Expose snowflake query_id in snowflake hook and operator Expose snowflake query_id in snowflake hook and operator, support multiple statements in sql string Apr 27, 2021
@mobuchowski (Contributor, Author)

@potiuk @eladkal added the requested changes.

@potiuk (Member) left a comment

LGTM. @eladkal - are you OK with it in the context of other DBs? I think any DbHook change will have to be done as part of 2.1, but maybe this is a good start: implement it "per operator" and then generalise the approach for 2.1 (and then we could release versions of the providers that are 2.1+ compatible only once we have the generalisation in 2.1, as we discussed we will do at some point in time).

@github-actions github-actions bot added the okay to merge It's ok to merge this PR as it does not require more tests label Apr 27, 2021
@github-actions

The PR is likely OK to be merged with just a subset of tests for the default Python and database versions, without running the full matrix of tests, because it does not modify the core of Airflow. If the committers decide that the full test matrix is needed, they will add the label 'full tests needed'. Then you should rebase to the latest master or amend the last commit of the PR, and push it with --force-with-lease.

@github-actions

The Workflow run is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*.

@mobuchowski (Contributor, Author)

@potiuk one more CI run? 🙂

@mobuchowski mobuchowski force-pushed the expose-snowflake-query-id branch from 5fc8d29 to 907611f Compare April 30, 2021 13:47
@mobuchowski (Contributor, Author)

@potiuk @eladkal rebased on master. Can we merge it if CI passes?

@potiuk (Member) commented Apr 30, 2021

🤞

@eladkal (Contributor) commented Apr 30, 2021

LGTM. @eladkal - are you OK with it in the context of other DBs? I think any DbHook change will have to be done as part of 2.1, but maybe this is a good start: implement it "per operator" and then generalise the approach for 2.1 (and then we could release versions of the providers that are 2.1+ compatible only once we have the generalisation in 2.1, as we discussed we will do at some point in time).

It's OK to do it per operator as well

@potiuk potiuk merged commit c6be8b1 into apache:master Apr 30, 2021
@boring-cyborg

boring-cyborg bot commented Apr 30, 2021

Awesome work, congrats on your first merged pull request!
