Airflow is a Python-based workflow management tool for data engineering pipelines. It simplifies the development, deployment, and monitoring of data pipelines by modeling them as DAGs (directed acyclic graphs). Visit the Airflow documentation for more information.
This is a template project for using Apache Airflow to schedule and monitor workflows. It covers how DAGs and operators are defined, scheduled, and run with Airflow. Deployment is handled via Docker Compose, which also creates a separate container for the PostgreSQL database.
Run the following command to build/re-build the project using Docker:
docker compose up -d --build
Run the following command to start the project when it's already built:
docker compose up
After deployment, visit the following link in your browser to access the Airflow UI:
http://localhost:8080