Apache Airflow version
2.3.0 (latest released)
What happened
The standalone DagProcessor, when run in the Apache Airflow production Docker image, fails with the following error:
airflow-dag-processor_1 |
airflow-dag-processor_1 | Traceback (most recent call last):
airflow-dag-processor_1 | File "/home/airflow/.local/bin/airflow", line 8, in <module>
airflow-dag-processor_1 | sys.exit(main())
airflow-dag-processor_1 | File "/home/airflow/.local/lib/python3.7/site-packages/airflow/__main__.py", line 38, in main
airflow-dag-processor_1 | args.func(args)
airflow-dag-processor_1 | File "/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 51, in command
airflow-dag-processor_1 | return func(*args, **kwargs)
airflow-dag-processor_1 | File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/cli.py", line 99, in wrapper
airflow-dag-processor_1 | return f(*args, **kwargs)
airflow-dag-processor_1 | File "/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/commands/dag_processor_command.py", line 80, in dag_processor
airflow-dag-processor_1 | manager.start()
airflow-dag-processor_1 | File "/home/airflow/.local/lib/python3.7/site-packages/airflow/dag_processing/manager.py", line 475, in start
airflow-dag-processor_1 | os.setpgid(0, 0)
airflow-dag-processor_1 | PermissionError: [Errno 1] Operation not permitted
This error does not happen when airflow dag-processor is run directly on the host system.
The issue seems to occur because, when running in the Apache Airflow production Docker image, the airflow process is the session leader, and according to man setpgid:
ERRORS
setpgid() will fail and the process group will not be altered if:
[EPERM] The process indicated by the pid argument is a session leader.
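Below is a minimal sketch (illustrative only, not Airflow code) that reproduces the same failure mode: once a process has made itself a session leader, as PID 1 in a container effectively is, the same os.setpgid(0, 0) call seen in the traceback raises EPERM.

import os

pid = os.fork()
if pid == 0:
    os.setsid()               # the child becomes a session leader
    try:
        os.setpgid(0, 0)      # same call that fails in the traceback above
    except PermissionError as err:
        print("setpgid() on a session leader:", err)
    os._exit(0)
os.waitpid(pid, 0)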
What you think should happen instead
dag-processor should start in Docker without error.
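One way this could be handled (a sketch under assumptions, not necessarily the fix the project adopted): skip the setpgid call when the process is already a session leader, since the call can never succeed in that case. The helper name below is hypothetical.

import os

def maybe_set_new_process_group() -> None:  # hypothetical helper name
    # A session leader (e.g. PID 1 in a container) cannot change its
    # process group, so only call setpgid() when we are not one.
    if os.getpid() == os.getsid(0):
        return
    os.setpgid(0, 0)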
How to reproduce
- Use a simple docker-compose file that uses the official Airflow 2.3.0 image:
# docker-compose-dag-processor.yaml
version: '3'
volumes:
  postgres-db-volume:
  airflow-logs-volume:

x-airflow-common:
  &airflow-common
  image: apache/airflow:2.3.0-python3.7
  environment:
    AIRFLOW__SCHEDULER__STANDALONE_DAG_PROCESSOR: 'True'
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:insecurepassword@postgres/airflow
  volumes:
    - airflow-logs-volume:/opt/airflow/logs
    - ${PWD}/dags:/opt/airflow/dags
  user: "${AIRFLOW_UID:-50000}:0"
  extra_hosts:
    - "host.docker.internal:host-gateway"

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: insecurepassword
      POSTGRES_DB: airflow
    ports:
      - 55432:5432
    volumes:
      - postgres-db-volume:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 5s
      retries: 5
    restart: unless-stopped
  airflow-upgrade-db:
    <<: *airflow-common
    command: ["db", "upgrade"]
    depends_on:
      postgres:
        condition: service_healthy
  airflow-dag-processor:
    <<: *airflow-common
    command: dag-processor
    restart: unless-stopped
    depends_on:
      airflow-upgrade-db:
        condition: service_completed_successfully
- Run docker-compose -f docker-compose-dag-processor.yaml up
Operating System
macOS Monterey 12.3.1
Versions of Apache Airflow Providers
No response
Deployment
Docker-Compose
Deployment details
Docker: 20.10.12
docker-compose: 1.29.2
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct