From day 1 of any Airflow project, you can spin up a local desktop Kubernetes Airflow environment AND one in Google Cloud Composer, with tested data pipelines (DAGs) 🖥️ >> [ 🚀, 🚢 ]
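The same DAG code can target both the local Kubernetes environment and Cloud Composer. Below is a minimal sketch of what such a pipeline might look like, assuming Airflow 2.4+ and the TaskFlow API; the DAG and task names are placeholders, not the repository's actual pipelines.

```python
# Hedged sketch of a DAG that runs unchanged on a local Kubernetes Airflow
# deployment and on Google Cloud Composer (assumes Airflow 2.4+).
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=datetime(2023, 1, 1),
    catchup=False,
    tags=["example"],
)
def example_pipeline():
    @task
    def extract():
        # Placeholder extract step; a real DAG would pull from a source system.
        return [1, 2, 3]

    @task
    def load(rows):
        # Placeholder load step; a real DAG would write to a warehouse.
        print(f"Loaded {len(rows)} rows")

    load(extract())


example_pipeline()
```

A common way such DAGs are kept "tested" is a pytest that builds an Airflow `DagBag` and asserts there are no import errors before deploying to either environment.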
Completely Serverless ELT platform that can be used for any integration. We just so happen to focus on threat intelligence right now :)
Deploy an open source and modern data stack in no time with Terraform
...an automated data pipeline that retrieves cryptocurrency data from the CoinCap API, processes and transforms it for analysis, and presents key metrics on a near-real-time dashboard
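A minimal sketch of the extract step such a pipeline might use, assuming CoinCap's public `/v2/assets` REST endpoint; the `fetch_top_assets` helper and the selected fields are illustrative only, not the project's actual code.

```python
# Hedged sketch: pull top crypto assets from the CoinCap API and keep only
# the metrics a near-real-time dashboard is likely to chart.
import requests


def fetch_top_assets(limit=10):
    """Fetch the top crypto assets by market cap from the CoinCap API."""
    resp = requests.get(
        "https://api.coincap.io/v2/assets",
        params={"limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    assets = resp.json()["data"]
    return [
        {
            "symbol": a["symbol"],
            "price_usd": float(a["priceUsd"]),
            "market_cap_usd": float(a["marketCapUsd"]),
            "change_24h_pct": float(a["changePercent24Hr"] or 0),
        }
        for a in assets
    ]


if __name__ == "__main__":
    for row in fetch_top_assets(limit=5):
        print(row)
```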
Final project for DataTalks.Club Data Engineering bootcamp
The ELT pipeline we’ve developed leverages several Google Cloud Services including Google Cloud Storage (GCS), BigQuery, Pub/Sub, Cloud Workflows, Cloud Run, and Cloud Build. We also use dbt for data transformation and Terraform for infrastructure as code.
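As one hedged illustration of the loading stage in a pipeline like this, the snippet below loads a newline-delimited JSON file from GCS into BigQuery with the official Python client; the bucket, dataset, and table names are placeholders, and dbt would transform the raw table downstream.

```python
# Hedged sketch: load raw data from GCS into BigQuery; resource names below
# are placeholders, not the project's real buckets or tables.
from google.cloud import bigquery


def load_raw_events(uri="gs://example-bucket/raw/events.json",
                    table_id="my-project.raw.events"):
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,  # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # block until the load job finishes
    print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```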