
Data-Pipelines-Airflow

Create and automate a set of data pipelines with Apache Airflow.

This project introduces the core concepts of Apache Airflow. The source data resides in S3 and needs to be processed into Sparkify's data warehouse in Amazon Redshift. The source datasets consist of CSV logs that record user activity in the application and JSON metadata about the songs the users listen to.
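A minimal sketch of how such a pipeline can be wired up as an Airflow DAG is shown below. The DAG name, connection ID, S3 bucket, IAM role, and table names are illustrative assumptions, not the project's actual code, and an Airflow 2.4+ installation with the Postgres provider is assumed.

```python
# Minimal sketch: stage S3 activity logs into a Redshift table via Airflow.
# All identifiers (dag_id, connection IDs, bucket, IAM role, table names)
# are assumptions for illustration only.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook

default_args = {
    "owner": "sparkify",
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "depends_on_past": False,
}


def stage_events_to_redshift(**context):
    """Copy the CSV activity logs from S3 into a Redshift staging table."""
    # "redshift" is an assumed Airflow connection pointing at the cluster.
    redshift = PostgresHook(postgres_conn_id="redshift")
    redshift.run(
        """
        COPY staging_events
        FROM 's3://example-sparkify-bucket/log_data'              -- assumed bucket/prefix
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-s3-read' -- assumed IAM role
        CSV IGNOREHEADER 1;
        """
    )


with DAG(
    dag_id="sparkify_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    default_args=default_args,
    catchup=False,
) as dag:
    stage_events = PythonOperator(
        task_id="stage_events",
        python_callable=stage_events_to_redshift,
    )
```

In a fuller pipeline, similar tasks would stage the JSON song metadata, load the fact and dimension tables, and run data-quality checks, with dependencies chained between the tasks so Airflow schedules and retries them automatically.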
