This project implements a modern data pipeline using Azure Data Lake Storage Gen2, Azure Data Factory, and Databricks. The pipeline follows the Medallion Architecture (Bronze, Silver, Gold layers) to ingest, clean, and transform raw data into business-ready datasets for analytics and BI consumption.
- Ingestion: Data is ingested from external sources (e.g., GitHub) into the Bronze layer via Azure Data Factory (ADF).
- Bronze layer: Stores raw, unprocessed data in Azure Data Lake Storage Gen2.
- Silver layer: Data is cleaned, standardized, and transformed in Databricks (PySpark/Delta).
- Gold layer: Final, business-ready data stored in Delta format with external tables for fast querying.
- Analytics/BI: The Gold layer is consumed by BI tools (e.g., Power BI, Synapse, Databricks SQL).
- Build a scalable, secure, and optimized data pipeline.
- Apply the Medallion Architecture to improve data quality step by step.
- Enable data analysts and BI tools to easily query business-ready data without dealing with raw/complex formats.
- Ensure data governance, performance, and interoperability through Delta Lake and external tables.
- Azure subscription with:
  - Azure Data Lake Storage Gen2
  - Azure Data Factory
  - Azure Databricks workspace
- Service Principal for authentication (client ID, tenant ID, client secret).
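The service principal credentials can then be surfaced to the Spark session so Databricks can read and write the lake directly. Below is a minimal sketch assuming the secrets are stored in a Databricks secret scope named `azure-sp`; the scope, key names, and storage account are placeholders:

```python
# Minimal sketch: authenticate Spark against ADLS Gen2 with a service principal.
# The secret scope ("azure-sp"), key names, and storage account are placeholders.
storage_account = "<storage_account>"

client_id = dbutils.secrets.get(scope="azure-sp", key="client-id")
tenant_id = dbutils.secrets.get(scope="azure-sp", key="tenant-id")
client_secret = dbutils.secrets.get(scope="azure-sp", key="client-secret")

spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net", client_secret)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
)
```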
- Use Azure Data Factory (ADF) to ingest data from GitHub (or other sources).
- Store raw files in the Bronze container (abfss://bronze@<storage_account>.dfs.core.windows.net/).
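After the ADF copy activity completes, a quick notebook check (path is a placeholder) confirms the raw files landed in the Bronze container:

```python
# Sanity check from a Databricks notebook that the ADF copy activity
# delivered the raw files into the Bronze container (placeholder path).
for f in dbutils.fs.ls("abfss://bronze@<storage_account>.dfs.core.windows.net/"):
    print(f.path, f.size)
```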
- Use Databricks (PySpark) to (as sketched after this list):
  - Remove duplicates
  - Handle missing values
  - Standardize column names and formats
  - Save as Delta files in the Silver container
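A minimal PySpark sketch of this Bronze-to-Silver cleaning pass follows; the file name, key column (`product_id`), and `category` column are illustrative assumptions, not the actual schema:

```python
from pyspark.sql import functions as F

# Illustrative paths -- storage account, file name, and columns are assumptions.
bronze_path = "abfss://bronze@<storage_account>.dfs.core.windows.net/products.csv"
silver_path = "abfss://silver@<storage_account>.dfs.core.windows.net/products"

raw_df = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv(bronze_path)
)

clean_df = (
    raw_df.dropDuplicates()              # remove exact duplicate rows
    .na.drop(subset=["product_id"])      # drop rows missing the (assumed) key column
    .na.fill({"category": "unknown"})    # fill remaining gaps with a default
)

# Standardize column names to snake_case and normalize string formatting.
for old in clean_df.columns:
    clean_df = clean_df.withColumnRenamed(old, old.strip().lower().replace(" ", "_"))
clean_df = clean_df.withColumn("category", F.trim(F.lower(F.col("category"))))

# Persist the cleaned data as Delta in the Silver container.
clean_df.write.format("delta").mode("overwrite").save(silver_path)
```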
- Apply business rules (e.g., extract categories, format dates, derive columns).
- Save transformed data in Delta format in the Gold container.
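A hedged sketch of this Silver-to-Gold step is below; the specific business rules and column names (`category`, `order_date`) are assumptions for illustration:

```python
from pyspark.sql import functions as F

silver_path = "abfss://silver@<storage_account>.dfs.core.windows.net/products"
gold_path = "abfss://gold@<storage_account>.dfs.core.windows.net/products"

silver_df = spark.read.format("delta").load(silver_path)

gold_df = (
    silver_df
    # Assumed rule: extract a top-level category from a path-like code, e.g. "electronics/audio".
    .withColumn("main_category", F.split(F.col("category"), "/").getItem(0))
    # Assumed rule: parse the raw date string into a proper DateType column.
    .withColumn("order_date", F.to_date(F.col("order_date"), "yyyy-MM-dd"))
)

# Persist the business-ready data as Delta in the Gold container.
gold_df.write.format("delta").mode("overwrite").save(gold_path)
```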
- Create an external table in Databricks for BI consumption:
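For example, a Delta folder in the Gold container can be registered as an external table via Spark SQL; the schema, table name, and path below are placeholders:

```python
# Register the Gold-layer Delta folder as an external table for BI tools.
# Schema/table names and the location are placeholders.
spark.sql("CREATE SCHEMA IF NOT EXISTS gold")
spark.sql("""
    CREATE TABLE IF NOT EXISTS gold.products
    USING DELTA
    LOCATION 'abfss://gold@<storage_account>.dfs.core.windows.net/products'
""")
```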
- Connect Power BI, Synapse, or Databricks SQL to query data directly from the Gold layer tables.
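Once the table is registered, any of these tools can issue ordinary SQL against it; for example, from a notebook or a Databricks SQL endpoint (table and columns follow the placeholder sketch above):

```python
# Example of the kind of query a BI tool or SQL endpoint would issue.
spark.sql("""
    SELECT main_category, COUNT(*) AS product_count
    FROM gold.products
    GROUP BY main_category
""").show()
```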
- A robust pipeline that ingests, cleans, and transforms data.
- High-quality, business-ready datasets stored in Delta format.
- Fast and secure access for BI tools through external tables.
Hi! I'm Jordi Dangoh. I'm a Data Engineer ready for new challenges. My goal is to keep improving, to build projects both simple and complex, and to apply best practices in the data engineering role. I don't have many years of experience, but my motivation and curiosity drive me to learn fast, adapt quickly, and contribute impactful solutions.