This project provides a minimal single-instance Apache Airflow setup using Docker Compose.
It’s ideal for learning, experimentation, and small-scale DAG testing: it runs both the webserver and the scheduler in the same container and loads DAGs from a local `dags/` folder.
```
airflow-ref/
├── docker-compose.yml
└── dags/
```
- `docker-compose.yml` – Defines a single Airflow container (webserver + scheduler).
- `dags/` – Place your DAG Python files here.
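For reference, a single-container setup like this is often wired up roughly as in the sketch below. The image tag, database credentials, and `standalone` command are illustrative assumptions, not the exact contents of this repository’s `docker-compose.yml`:

```yaml
# Hypothetical sketch of a minimal single-container Airflow setup.
# Image tags, credentials, and paths are illustrative only.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow

  airflow:
    image: apache/airflow:2.9.2
    command: standalone            # runs webserver + scheduler in one process
    environment:
      AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    ports:
      - "8080:8080"                # Airflow UI
    volumes:
      - ./dags:/opt/airflow/dags   # local DAGs mounted into the container
    depends_on:
      - postgres
```

Running `docker compose up -d` starts it in the background, and `docker compose down` stops and removes the containers.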
Make sure you have the following installed:

- Docker
- Docker Compose
From the project root directory, run:
```
docker compose up
```

This will:
- Initialize the Airflow metadata database (Postgres)
- Start both the webserver and scheduler
Once started, open your browser and visit:
http://localhost:8080
Login credentials:
- Username: `admin`
- Password: `admin`
- This setup is intended only for local development and testing.
- For production or multi-user environments, use the official Airflow `docker-compose.yaml` with a PostgreSQL and Redis setup.