Improve example docs around SQLExecuteQueryOperator in all SQL-related providers  #44807

@eladkal


Context: #44707 (comment)

In the Sqlite provider we have an operators.rst which explains how to leverage the SQLExecuteQueryOperator from the common.sql provider with Sqlite. We should have similar docs and examples for all SQL-related operators that were removed as part of #44559.

The Task:
Verify that we have an operators.rst in each of the providers listed below that explains how to use SQLExecuteQueryOperator with the specific provider. See the Sqlite provider's operators.rst as an example template:

.. _howto/operator:SqliteOperator:

SQLExecuteQueryOperator to connect to Sqlite
============================================

Use the :class:`SQLExecuteQueryOperator<airflow.providers.common.sql.operators.sql>` to execute
Sqlite commands in a `Sqlite <https://sqlite.org/lang.html>`__ database.

.. note::
    Previously, ``SqliteOperator`` was used to perform this kind of operation. After deprecation this has been removed. Please use ``SQLExecuteQueryOperator`` instead.

Using the Operator
^^^^^^^^^^^^^^^^^^

Use the ``conn_id`` argument to connect to your Sqlite instance where
the connection metadata is structured as follows:

.. list-table:: Sqlite Airflow Connection Metadata
   :widths: 25 25
   :header-rows: 1

   * - Parameter
     - Input
   * - Host: string
     - Sqlite database file
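
A minimal sketch of how such a connection could be defined (the connection id, database path, and the JSON connection format used here are illustrative assumptions, not part of the existing template):

.. code-block:: python

    import json
    import os

    # Illustrative only: register a Sqlite connection via the
    # AIRFLOW_CONN_<CONN_ID> environment variable using the JSON connection
    # format supported by newer Airflow versions. "host" is the path to the
    # Sqlite database file, matching the table above.
    os.environ["AIRFLOW_CONN_MY_SQLITE_CONN"] = json.dumps(
        {"conn_type": "sqlite", "host": "/tmp/my_sqlite.db"}
    )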

An example usage of the SQLExecuteQueryOperator to connect to Sqlite is as follows:

.. exampleinclude:: /../../providers/tests/system/sqlite/example_sqlite.py
    :language: python
    :start-after: [START howto_operator_sqlite]
    :end-before: [END howto_operator_sqlite]
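
The included snippet is not reproduced in this issue; as a rough sketch, such a task could look like the following (the dag_id, conn_id, and SQL are placeholders, not the contents of the referenced example file):

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

    # Illustrative sketch of a SQLExecuteQueryOperator task against Sqlite;
    # dag_id, conn_id, and the SQL statement are placeholders.
    with DAG(dag_id="example_sqlite_sketch", start_date=datetime(2024, 1, 1), schedule=None):
        create_table = SQLExecuteQueryOperator(
            task_id="create_table_sqlite",
            conn_id="my_sqlite_conn",
            sql="CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT);",
        )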

Furthermore, you can use an external file to execute the SQL commands. The script folder must be at the same level as the DAG file.

.. exampleinclude:: /../../providers/tests/system/sqlite/example_sqlite.py
    :language: python
    :start-after: [START howto_operator_sqlite_external_file]
    :end-before: [END howto_operator_sqlite_external_file]
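
Again as a rough sketch, the external-file variant could look like this (the file name and conn_id are placeholders; the .sql script is assumed to live in a folder next to the DAG file, as noted above):

.. code-block:: python

    # Illustrative sketch: "scripts/create_tables.sql" is a placeholder path
    # to a SQL script stored alongside the DAG file; SQLExecuteQueryOperator
    # templates files with a .sql extension and runs their contents.
    populate_from_file = SQLExecuteQueryOperator(
        task_id="populate_from_file",
        conn_id="my_sqlite_conn",
        sql="scripts/create_tables.sql",
    )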

If possible, accommodate the specific parameters that can be passed from the provider to the hook.
For example, in Postgres we can pass hook_params to enable logging, so we should explain this explicitly with an example (as sketched below).

(Screenshot: hook_params example from the Postgres provider docs.)
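
As a rough sketch, such a Postgres example could look like the following (the conn_id is a placeholder, and the log_sql flag is an assumption based on the common.sql DbApiHook; each provider's docs should list the parameters its hook actually accepts):

.. code-block:: python

    # Illustrative sketch: provider/hook specific parameters are forwarded
    # to the hook through hook_params. The log_sql flag is an assumption
    # based on the common.sql DbApiHook; check the provider for the exact
    # parameters it supports.
    query_with_logging = SQLExecuteQueryOperator(
        task_id="query_with_logging",
        conn_id="my_postgres_conn",
        sql="SELECT * FROM pet;",
        hook_params={"log_sql": True},
    )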

The items:

  • Exasol (Previously had ExasolOperator)
  • apache.drill (Previously had DrillOperator)
  • jdbc (Previously had JdbcOperator)
  • microsoft.mssql (Previously had MsSqlOperator)
  • mysql (Previously had MySqlOperator)
  • oracle (Previously had OracleOperator)
  • presto (Previously had PrestoOperator)
  • snowflake (Previously had SnowflakeOperator)
  • trino (Previously had TrinoOperator)
  • vertica (Previously had VerticaOperator)
  • postgres (Previously had PostgresOperator). Docs already exist but require tweaks to align with the style above.

Committer

  • I acknowledge that I am a maintainer/committer of the Apache Airflow project.
