From day 1 of any Airflow project, you can spin up a local desktop Kubernetes Airflow environment AND one in Google Cloud Composer with tested data pipelines (DAGs) 🖥️ >> [ 🚀, 🚢 ]
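At its core, Airflow runs a DAG's tasks in dependency order via a topological sort. A minimal stdlib sketch of that idea (task names are illustrative, not from this project):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical task graph: each key runs after its listed dependencies,
# mirroring how Airflow resolves task order inside a DAG.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

def execution_order(graph):
    """Return one valid run order for the task graph (raises on cycles)."""
    return list(TopologicalSorter(graph).static_order())

print(execution_order(dag))
```

A test asserting on `execution_order` is also the simplest kind of "tested DAG": the pipeline's shape is checked before anything is deployed.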
Updated Sep 21, 2023 · HCL
...an automated data pipeline that retrieves cryptocurrency data from the CoinCap API, processes and transforms it for analysis, and presents key metrics on a near-real-time dashboard
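The transform step of such a pipeline can be sketched as a pure function over the JSON shape returned by CoinCap's `/v2/assets` endpoint (the sample payload, field names, and function below are assumptions based on that API, not the project's code):

```python
def top_movers(payload, n=3):
    """Pick the n assets with the largest absolute 24h % change."""
    assets = payload["data"]
    ranked = sorted(
        assets,
        key=lambda a: abs(float(a["changePercent24Hr"])),
        reverse=True,
    )
    return [
        {"id": a["id"], "price_usd": round(float(a["priceUsd"]), 2)}
        for a in ranked[:n]
    ]

# Shape modeled on CoinCap /v2/assets responses; the values are made up.
sample = {
    "data": [
        {"id": "bitcoin", "priceUsd": "27123.4567", "changePercent24Hr": "1.2"},
        {"id": "ethereum", "priceUsd": "1650.12", "changePercent24Hr": "-3.4"},
        {"id": "dogecoin", "priceUsd": "0.061", "changePercent24Hr": "0.5"},
    ],
}
print(top_movers(sample, n=2))
```

Keeping the transform a pure function makes it easy to unit-test with canned payloads before wiring it to the live API and dashboard.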
A completely serverless ELT platform that can be used for any integration. We just happen to focus on threat intelligence right now. :)
Deploy an open-source, modern data stack in no time with Terraform
The ELT pipeline we’ve developed leverages several Google Cloud services, including Google Cloud Storage (GCS), BigQuery, Pub/Sub, Cloud Workflows, Cloud Run, and Cloud Build. We also use dbt for data transformation and Terraform for infrastructure as code.
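The load-then-transform pattern this stack implies can be sketched locally, with SQLite standing in for BigQuery and a JSON string standing in for the GCS object (all names and records below are illustrative, not the project's code):

```python
import json
import sqlite3

# Raw "object" as it might land in GCS (made-up records).
raw = json.dumps([
    {"event": "login", "user": "a"},
    {"event": "login", "user": "b"},
    {"event": "purchase", "user": "a"},
])

# Load first: land the raw rows in the warehouse untouched.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_events (event TEXT, user TEXT)")
con.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [(r["event"], r["user"]) for r in json.loads(raw)],
)

# Then transform in SQL, as dbt would do inside the warehouse.
rows = con.execute(
    "SELECT event, COUNT(*) FROM raw_events GROUP BY event ORDER BY event"
).fetchall()
print(rows)
```

The point of ELT over ETL is visible even at this scale: the raw data is preserved as loaded, and every transformation is just a SQL query that can be versioned and re-run.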
Final project for DataTalks.Club Data Engineering bootcamp
Refera Challenge (Part 2): dbt transformations for the data warehouse built on AWS Athena. This project models the raw data from the data lake into a star schema with fact and dimension tables. The transformations are deployed via AWS Lambda and scheduled with CloudWatch. Infrastructure is managed with Terraform.
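A star-schema split of the kind described can be sketched with SQLite standing in for Athena (the table and column names are invented for illustration; the project's dbt models will differ):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Raw, denormalized rows as they might sit in the data lake.
    CREATE TABLE raw_orders (order_id INT, customer TEXT, city TEXT, amount REAL);
    INSERT INTO raw_orders VALUES
        (1, 'ana', 'sp', 10.0),
        (2, 'bob', 'rj', 20.0),
        (3, 'ana', 'sp', 5.0);

    -- Dimension table: one row per customer, descriptive attributes only.
    CREATE TABLE dim_customer AS
        SELECT DISTINCT customer, city FROM raw_orders;

    -- Fact table: measures keyed to the dimension.
    CREATE TABLE fct_orders AS
        SELECT order_id, customer, amount FROM raw_orders;
""")

# Analytical queries join facts to dimensions -- the payoff of the star schema.
total = con.execute(
    """SELECT d.city, SUM(f.amount)
       FROM fct_orders f JOIN dim_customer d USING (customer)
       GROUP BY d.city ORDER BY d.city"""
).fetchall()
print(total)
```

In dbt each `CREATE TABLE ... AS SELECT` would be its own model file, with the scheduling handled by Lambda and CloudWatch as the description notes.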
GenAI data pipeline that performs data preparation, management, and performance-evaluation tasks for RAG systems, using SQL as the primary development language. Please feel free to use this as a starting point for your own projects.
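Using SQL as the working language for RAG data preparation can be sketched with SQLite: chunk documents into rows, then retrieve by keyword match. The chunking rule and query here are naive stand-ins, not the project's actual logic:

```python
import sqlite3

def chunk(text):
    """Naive chunking: one chunk per clause, split on '; '."""
    return [c.strip() for c in text.split("; ")]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE chunks (doc TEXT, chunk TEXT)")

# A made-up source document, chunked and stored for retrieval.
doc = "dbt models transform warehouse data; Airflow schedules the runs nightly."
con.executemany(
    "INSERT INTO chunks VALUES (?, ?)",
    [("readme", c) for c in chunk(doc)],
)

# Keyword retrieval in plain SQL; a real RAG pipeline would rank chunks
# by embedding similarity instead of a LIKE filter.
hits = con.execute(
    "SELECT chunk FROM chunks WHERE chunk LIKE '%Airflow%'"
).fetchall()
print(hits)
```

Keeping chunks in a table also makes evaluation a query: precision and recall of a retriever reduce to counting matching rows.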