This project provides Apache Airflow environments using Docker Compose with support for both Airflow 2.x and Airflow 3.x versions.
- The `docker-compose.yaml` files are based on the content from https://chalchichi.tistory.com/114.
- The original technical reference is https://airflow.apache.org/docs/apache-airflow/stable/howto/docker-compose/index.html.
This repository supports two separate Airflow environments:
Airflow 2.x:
- Directory: `airflow-2.x/`
- Default Image: `apache/airflow:2.10.2`
- Web UI Port: `38080` (configurable via `AIRFLOW_WEBSERVER_PORT`)
- Recommended for: production environments requiring stability
Airflow 3.x:
- Directory: `airflow-3.x/`
- Default Image: `apache/airflow:3.0.6`
- API Server Port: `48080` (configurable via `AIRFLOW_APISERVER_PORT`)
- Recommended for: development and testing of new features
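The host-side ports above come from each environment's compose file. As an illustrative sketch only (the exact service layout of the bundled files may differ), the 2.x mapping would look like:

```yaml
# Hypothetical excerpt from airflow-2.x/docker-compose.yaml
services:
  airflow-webserver:
    image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.10.2}
    ports:
      - "${AIRFLOW_WEBSERVER_PORT:-38080}:8080"
```

The container always listens on 8080 internally; only the host-side port differs between the two environments, which is what allows them to run side by side.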
```
├── airflow-2.x/                   # Airflow 2.x environment
│   ├── docker-compose.yaml        # Docker Compose configuration for 2.x
│   ├── .env                       # Environment variables for 2.x
│   ├── .env.template              # Template for environment variables
│   ├── run-airflow-cluster.sh
│   ├── cleanup-airflow-cluster.sh
│   ├── dags/                      # Airflow DAG files
│   ├── plugins/                   # Airflow plugins
│   ├── logs/                      # Log files
│   └── config/                    # Configuration files
├── airflow-3.x/                   # Airflow 3.x environment
│   ├── docker-compose.yaml        # Docker Compose configuration for 3.x
│   ├── .env                       # Environment variables for 3.x
│   ├── .env.template              # Template for environment variables
│   ├── run-airflow-cluster.sh
│   ├── cleanup-airflow-cluster.sh
│   ├── dags/                      # Airflow DAG files
│   ├── plugins/                   # Airflow plugins
│   ├── logs/                      # Log files
│   └── config/                    # Configuration files
└── docker-compose.yaml            # Legacy configuration (deprecated)
```
- Docker and Docker Compose installed
- At least 4GB of RAM and 2 CPU cores recommended
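As a convenience, the resource recommendation above can be checked before starting anything. This is a minimal sketch for Linux hosts (it reads `/proc/meminfo` and uses `nproc`, so it will not work on macOS); the thresholds simply mirror the recommendation and are not enforced by the compose files:

```shell
#!/bin/sh
# Preflight sketch: warn if the host is below the recommended
# 4 GB RAM / 2 CPU cores. Linux-only (uses /proc/meminfo and nproc).

check_resources() {
  min_cpus=2
  min_mem_kb=$((4 * 1024 * 1024))   # 4 GB expressed in kB

  cpus=$(nproc)
  mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)

  [ "$cpus" -ge "$min_cpus" ] || echo "WARN: $cpus CPU core(s); 2+ recommended"
  [ "$mem_kb" -ge "$min_mem_kb" ] || echo "WARN: $((mem_kb / 1024)) MB RAM; 4 GB+ recommended"
  echo "checked: ${cpus} cores, $((mem_kb / 1024)) MB RAM"
}

check_resources
```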
1. Navigate to the Airflow 2.x directory:

   ```shell
   cd airflow-2.x
   ```

2. (Optional) Copy and customize environment variables:

   ```shell
   cp .env.template .env
   # Edit the .env file as needed
   ```

3. Start the Airflow cluster:

   ```shell
   ./run-airflow-cluster.sh
   ```

   Or manually:

   ```shell
   docker-compose up airflow-init
   docker-compose up -d
   ```

4. Access the Airflow Web UI:
   - URL: http://localhost:38080 (or your configured port)
   - Username: `airflow`
   - Password: `airflow`
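After `docker-compose up -d` returns, the webserver can take a minute or two to become healthy, so the UI may not respond immediately. A small polling helper can bridge that gap; this is a sketch, and the URL and retry count in the example comment are assumptions to adjust for your setup:

```shell
#!/bin/sh
# Poll a URL until it responds, or give up after a number of attempts.

wait_for_url() {
  url=$1
  attempts=${2:-30}   # default: 30 tries, 2 s apart (~1 minute)
  i=0
  while [ "$i" -lt "$attempts" ]; do
    if curl -fsS -o /dev/null "$url"; then
      echo "ready: $url"
      return 0
    fi
    i=$((i + 1))
    sleep 2
  done
  echo "timed out waiting for $url" >&2
  return 1
}

# Example: wait_for_url http://localhost:38080
```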
1. Navigate to the Airflow 3.x directory:

   ```shell
   cd airflow-3.x
   ```

2. (Optional) Copy and customize environment variables:

   ```shell
   cp .env.template .env
   # Edit the .env file as needed
   ```

3. Start the Airflow cluster:

   ```shell
   ./run-airflow-cluster.sh
   ```

4. Access the Airflow API Server:
   - URL: http://localhost:48080 (or your configured port)
   - Username: `airflow`
   - Password: `airflow`
You can run both Airflow 2.x and 3.x environments simultaneously on different ports:

1. Start Airflow 2.x (port 38080):

   ```shell
   cd airflow-2.x && ./run-airflow-cluster.sh
   ```

2. Start Airflow 3.x (port 48080):

   ```shell
   cd airflow-3.x && ./run-airflow-cluster.sh
   ```
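Concurrent operation only requires that the two host ports differ. With the defaults described in this README, the relevant lines in each environment's `.env` would be:

```
# airflow-2.x/.env
AIRFLOW_WEBSERVER_PORT=38080

# airflow-3.x/.env
AIRFLOW_APISERVER_PORT=48080
```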
Each version has its own `.env` file with version-specific configuration.

Common variables:
- `AIRFLOW_UID`: User ID for Airflow containers (default: 50000)
- `AIRFLOW_PROJ_DIR`: Base directory for volume mounts (default: `.`)
- `_AIRFLOW_WWW_USER_USERNAME`: Admin username (default: airflow)
- `_AIRFLOW_WWW_USER_PASSWORD`: Admin password (default: airflow)
- `_PIP_ADDITIONAL_REQUIREMENTS`: Additional Python packages to install

Airflow 2.x:
- `AIRFLOW_IMAGE_NAME`: `apache/airflow:2.10.2`
- `AIRFLOW_WEBSERVER_PORT`: Web UI port (default: 8080, example: 38080)

Airflow 3.x:
- `AIRFLOW_IMAGE_NAME`: `apache/airflow:3.0.6`
- `AIRFLOW_APISERVER_PORT`: API server port (default: 48080)
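Putting the variables together, a complete `.env` for the 3.x environment might look like the following. These are illustrative values assembled from the defaults above, not the contents of the bundled `.env.template`; always start from the template:

```
# airflow-3.x/.env — illustrative values only
AIRFLOW_UID=50000
AIRFLOW_PROJ_DIR=.
AIRFLOW_IMAGE_NAME=apache/airflow:3.0.6
AIRFLOW_APISERVER_PORT=48080
_AIRFLOW_WWW_USER_USERNAME=airflow
_AIRFLOW_WWW_USER_PASSWORD=airflow
_PIP_ADDITIONAL_REQUIREMENTS=
```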
| Feature | Airflow 2.x | Airflow 3.x |
|---|---|---|
| Authentication | Basic Auth | FabAuthManager |
| Image Version | 2.10.2 | 3.0.6 |
| Default Port | 38080 | 48080 |
| Provider Packages | providers | distributions |
| Stability | Production Ready | Beta/Development |
Both environments use Docker volumes and bind mounts to persist data:

- `postgres-db-volume`: PostgreSQL database files
- `dags/` → `/opt/airflow/dags` (DAG files)
- `logs/` → `/opt/airflow/logs` (log files)
- `plugins/` → `/opt/airflow/plugins` (plugin files)
- `config/` → `/opt/airflow/config` (configuration files)
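In the official Airflow compose layout these mounts live in a shared `x-airflow-common` anchor; an illustrative excerpt of how the mapping above is typically expressed (the bundled files may differ in detail):

```yaml
# Illustrative excerpt, following the upstream docker-compose.yaml layout
x-airflow-common:
  volumes:
    - ${AIRFLOW_PROJ_DIR:-.}/dags:/opt/airflow/dags
    - ${AIRFLOW_PROJ_DIR:-.}/logs:/opt/airflow/logs
    - ${AIRFLOW_PROJ_DIR:-.}/plugins:/opt/airflow/plugins
    - ${AIRFLOW_PROJ_DIR:-.}/config:/opt/airflow/config

volumes:
  postgres-db-volume:
```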
To stop and clean up an environment:

```shell
# For Airflow 2.x
cd airflow-2.x && ./cleanup-airflow-cluster.sh

# For Airflow 3.x
cd airflow-3.x && ./cleanup-airflow-cluster.sh
```

Or manually:

```shell
docker-compose down -v
docker volume prune
```

After starting the cluster, you can create additional admin users:
```shell
# Access the webserver container
docker-compose exec airflow-webserver /bin/bash

# Create a new admin user
airflow users create \
    --username admin \
    --firstname Admin \
    --lastname User \
    --role Admin \
    --email [email protected] \
    --password admin
```
```shell
# Check running containers
docker-compose ps

# View logs
docker-compose logs airflow-webserver
docker-compose logs airflow-scheduler

# Access container shell
docker-compose exec airflow-webserver /bin/bash

# List Docker volumes
docker volume ls
```

Feel free to suggest improvements or ask questions by opening an issue or pull request.
Note: The root-level `docker-compose.yaml` file is deprecated and is maintained only for backward compatibility. Please use the version-specific directories for new deployments.