Airflow Docker with pip installation using requirements.txt and DAGs from a cloned repository
Updated Jul 3, 2023 · Dockerfile
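The setup described in the title, a custom Airflow image that pip-installs extra dependencies from a requirements.txt and pulls DAGs in from a cloned Git repository, can be sketched as a Dockerfile. The base-image tag, the repository URL, and the target paths below are assumptions for illustration, not taken from the source:

```dockerfile
# Sketch only: base tag, repo URL, and paths are illustrative assumptions.
FROM apache/airflow:2.6.2

# Bake extra Python dependencies into the image from requirements.txt
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt

# git is needed to clone the DAG repository; install it as root, then drop back
USER root
RUN apt-get update \
    && apt-get install -y --no-install-recommends git \
    && rm -rf /var/lib/apt/lists/*
USER airflow

# Clone the DAG repository into Airflow's DAG folder (placeholder URL)
RUN git clone https://github.com/example/airflow-dags.git /opt/airflow/dags/repo
```

One trade-off of cloning at build time is that DAG updates require rebuilding the image; mounting the repository as a volume, or syncing it with a sidecar, are common alternatives when DAGs change frequently.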
Airflow, Spark and Kafka example
BigData Pipeline is a local testing environment for experimenting with various storage solutions (RDB, HDFS), query engines (Trino), schedulers (Airflow), and ETL/ELT tools (DBT). It supports MySQL, Hadoop, Hive, Kudu, and more.