
AnMol12499/Reddit-Analytics-Integration-Platform


Reddit ETL Pipeline

A data pipeline to extract Reddit data from r/dataengineering.

The output is a Google Data Studio report providing insight into the official Data Engineering subreddit.

Motivation

The project was based on an interest in Data Engineering and the types of Q&A found on the official subreddit.

It also provided a good opportunity to develop skills and experience with a range of tools. As such, the project is more complex than strictly required, utilising dbt, Airflow, Docker and cloud-based storage.

Architecture

  1. Extract data using Reddit API
  2. Load into AWS S3
  3. Copy into AWS Redshift
  4. Transform using dbt
  5. Create PowerBI or Google Data Studio Dashboard
  6. Orchestrate with Airflow in Docker
  7. Create AWS resources with Terraform
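The steps above form a simple dependency chain. In the repo this ordering is expressed as an Airflow DAG running in Docker; as a minimal plain-Python sketch of that flow (the function names and stub bodies below are illustrative placeholders, not the repo's actual task IDs):

```python
# Sketch of the pipeline's task ordering. In the actual project this chain is
# an Airflow DAG; the functions below are illustrative stubs, not real tasks.

def extract_reddit_data():
    """Step 1: pull posts from r/dataengineering via the Reddit API (stubbed)."""
    return [{"id": "abc123", "title": "Example post", "score": 42}]

def load_to_s3(rows):
    """Step 2: write the extracted rows to an S3 bucket (stubbed)."""
    return f"s3://example-bucket/reddit/{len(rows)}_rows.csv"

def copy_to_redshift(s3_key):
    """Step 3: issue a COPY from S3 into Redshift (stubbed)."""
    return f"COPY reddit_posts FROM '{s3_key}'"

def transform_with_dbt():
    """Step 4: run dbt models against the warehouse (stubbed)."""
    return "dbt run"

def run_pipeline():
    """Run the steps in dependency order, returning the order for inspection."""
    rows = extract_reddit_data()
    s3_key = load_to_s3(rows)
    copy_to_redshift(s3_key)
    transform_with_dbt()
    return ["extract", "load", "copy", "transform"]

if __name__ == "__main__":
    print(run_pipeline())
```

In Airflow, the same ordering would be wired with task dependencies (e.g. `extract >> load >> copy >> transform`) rather than direct function calls.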

Output

Setup

Follow the steps below to set up the pipeline. I've tried to explain each step where I can. Feel free to make improvements or changes.

NOTE: This was developed using an M1 Macbook Pro. If you're on Windows or Linux, you may need to amend certain components if you encounter issues.

As AWS offers a free tier, this shouldn't cost you anything unless you amend the pipeline to extract large amounts of data, or keep the infrastructure running for 2+ months. However, please check the AWS free tier limits, as these may change.

First clone the repository into your home directory and follow the steps.

```
git clone https://github.com/AnMol12499/Reddit-Analytics-Integration-Platform.git
```

Getting Started

To begin using the project, follow these steps:

  1. Overview
  2. Reddit API Configuration
  3. AWS Account
  4. Infrastructure with Terraform
  5. Configuration Details
  6. Docker & Airflow
  7. dbt
  8. Dashboard
  9. Final Notes & Termination
  10. Improvements

More Details

Project Structure: The project's structure includes directories for infrastructure (Terraform), configuration (AWS and Airflow), data extraction (Python scripts), and optional steps like dbt and BI tools integration.
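The extraction scripts work against the Reddit API's listing responses, which nest each post under `data -> children -> [].data`. A hedged sketch of flattening such a payload into rows ready for CSV/S3 (the specific fields kept here are illustrative, not necessarily the columns the repo's scripts extract):

```python
# Sketch: flatten a Reddit listing response into one dict per post.
# Reddit's listing JSON nests posts under data -> children -> [].data;
# the fields selected below are illustrative, not the repo's exact schema.

def parse_posts(listing: dict) -> list[dict]:
    """Flatten a Reddit API listing payload into a list of post rows."""
    rows = []
    for child in listing.get("data", {}).get("children", []):
        post = child.get("data", {})
        rows.append({
            "id": post.get("id"),
            "title": post.get("title"),
            "score": post.get("score"),
            "num_comments": post.get("num_comments"),
            "created_utc": post.get("created_utc"),
        })
    return rows

# Example with a trimmed-down payload in the shape the API returns:
sample = {
    "data": {
        "children": [
            {"data": {"id": "abc", "title": "ETL question", "score": 10,
                      "num_comments": 3, "created_utc": 1700000000}}
        ]
    }
}
print(parse_posts(sample))
```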

Customization: Feel free to customize the project by modifying configurations, adding new data sources, or integrating additional tools as needed.

