GraphtyLove/airflow_with_docker


Airflow with Docker

Airflow

Apache Airflow is a workflow orchestration platform: it schedules and manages jobs, and is mainly used to build data pipelines.

Airflow is used a lot in industry because of how it structures code execution. Many orchestration tools share its DAG-based concepts, so mastering Airflow will help you with those as well; Kubeflow Pipelines is a great example.
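To make the idea concrete, here is a minimal sketch of an Airflow DAG that chains two containerized tasks. This is an illustration only, assuming Airflow 2.x with the `apache-airflow-providers-docker` package installed; the task names, images, and commands are placeholders, not this repo's actual DAG:

```python
# dags/demo_pipeline.py -- minimal sketch, assumes Airflow 2.4+ and the Docker provider.
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

with DAG(
    dag_id="demo_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule=None,   # no schedule: trigger manually from the web UI
    catchup=False,
) as dag:
    # Each task runs inside its own Docker container.
    extract = DockerOperator(
        task_id="extract",
        image="my-extract-task:latest",   # hypothetical image built beforehand
        command="python extract.py",
        docker_url="unix://var/run/docker.sock",
    )
    transform = DockerOperator(
        task_id="transform",
        image="my-transform-task:latest",  # hypothetical
        command="python transform.py",
        docker_url="unix://var/run/docker.sock",
    )
    # Dependency: transform runs only after extract succeeds.
    extract >> transform
```

The `>>` operator is how Airflow expresses ordering between tasks; the scheduler uses these edges to decide what can run and when.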

We will use Docker to run Airflow. This is a great way to run it because Docker works in any environment: on your local machine, on a server, or with a cloud provider.
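As a rough idea of what running Airflow under Docker looks like, here is a minimal Compose sketch for a local, single-container setup. The image tag and the `standalone` command are assumptions for illustration; this repo's own Compose setup (driven by the start script below) may differ:

```yaml
# docker-compose.yaml -- minimal local sketch, NOT a production setup.
services:
  airflow:
    image: apache/airflow:2.9.3    # assumed tag; pin whichever version you target
    command: standalone            # dev-only: initializes the DB and starts all components
    ports:
      - "8080:8080"                # Airflow web UI
    volumes:
      - ./dags:/opt/airflow/dags   # mount your DAG files into the container
```

For production-like setups, the official Airflow documentation ships a fuller `docker-compose.yaml` with a separate metadata database and workers.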

Let's see how we can do that.

Requirements

  • Docker installed and running
  • Docker Compose installed

Usage

Execute the start script, which will:

  1. Build the Docker image for each task
  2. Start Airflow

```bash
./scripts/start.sh
```

Resources

About

A small demo of how to use Airflow with Docker and the DockerOperator.
