@@ -410,20 +410,20 @@ The PROD image entrypoint works as follows:
 This is in order to accommodate the
 `OpenShift Guidelines <https://docs.openshift.com/enterprise/3.0/creating_images/guidelines.html>`_

-* If ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable is passed to the container and it is either mysql or postgres
-  SQL alchemy connection, then the connection is checked and the script waits until the database is reachable.
-
-* If no ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable is set or if it is set to sqlite SQL alchemy connection
-  then db reset is executed.
-
-* If ``AIRFLOW__CELERY__BROKER_URL`` variable is passed and scheduler, worker of flower command is used then
-  the connection is checked and the script waits until the Celery broker database is reachable.
-
 * The ``AIRFLOW_HOME`` is set by default to ``/opt/airflow/`` - this means that DAGs
   are by default in the ``/opt/airflow/dags`` folder and logs are in the ``/opt/airflow/logs`` folder.

 * The working directory is ``/opt/airflow`` by default.

+* If the ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable is passed to the container and it is either a mysql
+  or postgres SQL Alchemy connection, then the connection is checked and the script waits until the
+  database is reachable.
+  If the ``AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD`` variable is passed to the container, it is evaluated as a
+  command to execute and the result of this evaluation is used as ``AIRFLOW__CORE__SQL_ALCHEMY_CONN``. The
+  ``_CMD`` variable takes precedence over the ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable.
+
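The ``_CMD`` evaluation described above can be sketched in plain bash. This is illustrative only: capturing the command's stdout via ``eval`` is an assumption about the mechanism, not the actual entrypoint code, and the temporary secret file is made up for the demo.

```shell
#!/usr/bin/env bash
# Sketch of the _CMD convention (assumption: the entrypoint evaluates the
# command and uses its stdout as the connection string).
secret_file="$(mktemp)"
printf 'postgresql://user:pass@db:5432/airflow' > "${secret_file}"

# Keep credentials out of the environment; pass a command that prints them:
AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD="cat ${secret_file}"

# What the entrypoint conceptually does with the _CMD variable:
AIRFLOW__CORE__SQL_ALCHEMY_CONN="$(eval "${AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD}")"
echo "${AIRFLOW__CORE__SQL_ALCHEMY_CONN}"

rm -f "${secret_file}"
```

The practical benefit of the ``_CMD`` form is that the secret itself never has to appear in the container's environment, e.g. in ``docker inspect`` output.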
+* If no ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable is set, then an SQLite database is created in
+  ``${AIRFLOW_HOME}/airflow.db`` and ``db reset`` is executed.
+
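Assuming the default ``AIRFLOW_HOME`` of ``/opt/airflow``, the fallback connection would be the SQLite URL shown below. The exact URL form is an assumption based on SQLAlchemy's convention for absolute SQLite paths, not taken from the entrypoint itself.

```shell
# With no AIRFLOW__CORE__SQL_ALCHEMY_CONN set, the fallback database lives
# under AIRFLOW_HOME (assumption: standard SQLAlchemy sqlite:/// URL form).
AIRFLOW_HOME="/opt/airflow"
SQL_ALCHEMY_CONN="sqlite:///${AIRFLOW_HOME}/airflow.db"
echo "${SQL_ALCHEMY_CONN}"   # sqlite:////opt/airflow/airflow.db
```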
 * If the first argument equals "bash" - you are dropped into a bash shell, or a bash command is executed
   if you specify extra arguments. For example:

@@ -436,7 +436,6 @@ The PROD image entrypoint works as follows:
     drwxr-xr-x 2 airflow root 4096 Jun 5 18:12 dags
     drwxr-xr-x 2 airflow root 4096 Jun 5 18:12 logs

-
 * If the first argument is equal to "python" - you are dropped into a python shell, or python commands
   are executed if you pass extra parameters. For example:

@@ -445,13 +444,27 @@ The PROD image entrypoint works as follows:
     > docker run -it apache/airflow:master-python3.6 python -c "print('test')"
     test

-* If there are any other arguments - they are passed to "airflow" command
+* If the first argument equals "airflow" - the rest of the arguments are treated as an airflow command
+  to execute. For example:

 .. code-block:: bash

-    > docker run -it apache/airflow:master-python3.6
+    docker run -it apache/airflow:master-python3.6 airflow webserver
+
+* If there are any other arguments - they are simply passed to the "airflow" command. For example:
+
+.. code-block:: bash
+
+    > docker run -it apache/airflow:master-python3.6 version
     2.0.0.dev0

+* If the ``AIRFLOW__CELERY__BROKER_URL`` variable is passed and an airflow command with the
+  scheduler, worker or flower subcommand is used, then the script checks the broker connection
+  and waits until the Celery broker database is reachable.
+  If the ``AIRFLOW__CELERY__BROKER_URL_CMD`` variable is passed to the container, it is evaluated as a
+  command to execute and the result of this evaluation is used as ``AIRFLOW__CELERY__BROKER_URL``. The
+  ``_CMD`` variable takes precedence over the ``AIRFLOW__CELERY__BROKER_URL`` variable.
+
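A reachability check like the one above first has to extract a host and port from the broker URL. The snippet below is a hypothetical sketch of that parsing step using plain parameter expansion; the variable names and the example ``redis://`` URL are made up for illustration and are not the entrypoint's actual code.

```shell
#!/usr/bin/env bash
# Hypothetical parsing step before a "wait until reachable" loop.
BROKER_URL="redis://redis-broker:6379/0"

hostport="${BROKER_URL#*://}"   # drop the scheme -> redis-broker:6379/0
hostport="${hostport%%/*}"      # drop the path   -> redis-broker:6379
host="${hostport%%:*}"
port="${hostport##*:}"

echo "waiting for ${host}:${port}"   # waiting for redis-broker:6379
```

An actual wait loop would then poll that host and port (for example with ``nc -z "${host}" "${port}"``) in a retry loop until the connection succeeds.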
 Production image build arguments
 --------------------------------
