
Getting started with Apache Airflow

Examples in Apache Airflow

Hello World:

The code defines a DAG named hello_world, scheduled to run once. It declares three tasks: test_start, test_python, and test_bash. The test_start task is an empty (dummy) task that does nothing; test_python is a Python task that calls the hello_world_loop function; and test_bash is a Bash task that runs the command echo "Hello World!". The tasks are linked together with the >> operator, which sets task dependencies rather than passing data between tasks: test_python and test_bash will only run after test_start has finished successfully.
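
A minimal sketch of what that DAG might look like in Airflow 2.6 follows. Only the task names, operators, and command come from the description above; the start date, the exact schedule value, and the body of hello_world_loop are assumptions:

```python
# Sketch of the hello_world DAG described above (Airflow 2.6-style imports).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import PythonOperator


def hello_world_loop():
    # Hypothetical body; the original function's contents are not shown.
    for word in ("Hello", "World!"):
        print(word)


with DAG(
    dag_id="hello_world",
    start_date=datetime(2023, 1, 1),  # assumed
    schedule="@once",                 # "scheduled to run once"
    catchup=False,
) as dag:
    # Empty placeholder task that does nothing.
    test_start = EmptyOperator(task_id="test_start")
    test_python = PythonOperator(task_id="test_python", python_callable=hello_world_loop)
    test_bash = BashOperator(task_id="test_bash", bash_command='echo "Hello World!"')

    # >> declares dependencies: both downstream tasks wait for test_start.
    test_start >> [test_python, test_bash]
```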

My Dag Task:

The code defines three tasks: task1, task2, and task3, which run the commands echo "This is Task 1", echo "This is Task 2", and echo "This is Task 3", respectively. The tasks are linked together with the >> operator, which sets dependencies: task2 and task3 will only run after task1 has finished successfully.
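
A sketch under similar assumptions (the DAG id, start date, and schedule are hypothetical; since the description only says task2 and task3 run after task1, a fan-out is shown rather than a sequential chain):

```python
# Sketch of the second example: three Bash tasks fanning out from task1.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="my_dag_task",             # assumed id
    start_date=datetime(2023, 1, 1),  # assumed
    schedule="@daily",                # assumed
    catchup=False,
) as dag:
    task1 = BashOperator(task_id="task1", bash_command='echo "This is Task 1"')
    task2 = BashOperator(task_id="task2", bash_command='echo "This is Task 2"')
    task3 = BashOperator(task_id="task3", bash_command='echo "This is Task 3"')

    # task1 runs first; task2 and task3 run only after it succeeds.
    task1 >> [task2, task3]
```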

Docker and Apache Airflow 2.6.3

The docker-compose.yaml file for Apache Airflow contains the necessary configuration to set up the Airflow services and dependencies, making it easier to get Airflow up and running in a Dockerized environment.

You can launch Apache Airflow by running the docker-compose up command in the same directory where the file is located. Docker Compose will then handle the creation and configuration of the necessary containers, networks, and volumes based on the specified configuration.
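
For reference, a typical first launch with the official compose file looks like the following; the airflow-init step comes from the official Airflow quick-start and bootstraps the metadata database and a default user before the services start:

```bash
# Fetch the official compose file for Airflow 2.6.3.
curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.6.3/docker-compose.yaml'

# One-time initialization of the metadata database and default user.
docker-compose up airflow-init

# Start the Airflow services (webserver, scheduler, workers, etc.).
docker-compose up
```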
