End-to-end Azure Data Engineering pipeline using ADF, Databricks (PySpark), ADLS Gen2, Azure SQL, and Power BI for COVID-19 analytics
Updated Dec 26, 2025 - Python
End-to-end Azure Data Engineering project using ADF for incremental ingestion, Databricks (DLT) for Medallion Architecture, and Delta Lake for CDC (SCD Type 1). Managed via Databricks Asset Bundles (DABs) for professional CI/CD. Focuses on real-time streaming, scalability, and Star Schema modeling.
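The CDC step above applies SCD Type 1 semantics: incoming change records overwrite matched dimension rows in place, and unmatched records are inserted, with no history kept. In the repo this would be a Delta Lake MERGE; the plain-Python sketch below (with a hypothetical `customer` dimension) only illustrates the merge semantics.

```python
# Plain-Python illustration of SCD Type 1 (overwrite-in-place) merge semantics.
# In Delta Lake this logic is expressed as a MERGE INTO statement; the table,
# key column, and records here are hypothetical examples.

def scd_type1_merge(target, changes, key="customer_id"):
    """Apply SCD Type 1: update matched rows in place, insert new ones.

    target  -- list of dicts representing the current dimension table
    changes -- list of dicts holding incoming CDC records
    """
    by_key = {row[key]: dict(row) for row in target}
    for change in changes:
        # Type 1 keeps no history: the latest values simply overwrite.
        by_key.setdefault(change[key], {}).update(change)
    return list(by_key.values())

dim = [{"customer_id": 1, "city": "Oslo"}]
cdc = [{"customer_id": 1, "city": "Bergen"},   # update in place
       {"customer_id": 2, "city": "Tromso"}]   # new row, inserted
merged = scd_type1_merge(dim, cdc)
```

An SCD Type 2 variant would instead close out the old row and append a new versioned one; Type 1 trades history for a smaller, simpler dimension table.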
Real-time streaming data pipeline using Apache Kafka, Spark Structured Streaming, and Delta Lake on Azure. Secure SSL Kafka integration, ADLS storage with OAuth2, and ML-driven anomaly detection with automated email alerts. Modular, scalable, and configurable for IoT and log analytics pipelines.
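The anomaly-detection stage of a pipeline like this can be as simple as a statistical outlier test on each micro-batch. The sketch below uses a z-score rule on a hypothetical list of IoT sensor readings; the repo's "ML-driven" detector is presumably a trained model, so treat this as a stand-in for that step, not its implementation.

```python
# Minimal z-score anomaly detector, a stand-in for the ML-driven detection
# step. Readings and the threshold value are hypothetical.
from statistics import mean, stdev

def detect_anomalies(values, threshold=2.0):
    """Return indices of points more than `threshold` standard
    deviations from the sample mean."""
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

readings = [20.1, 20.3, 19.8, 20.0, 20.2, 55.0, 20.1]  # hypothetical temps
anomalies = detect_anomalies(readings)  # flags the 55.0 spike at index 5
```

In the described architecture, each flagged index would trigger the automated email alert path rather than just being returned.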
Predictive maintenance alert pipeline (C-MAPSS): ingest → preprocess to Parquet/Delta → train & score failure risk in Databricks → write alerts.json → Logic App notifies Teams/Email → Power Automate creates Planner tasks.
Sepsis prediction ML platform — XGBoost model trained on 20M ICU records using Azure ML, Fabric, and Power BI
A cloud-native data engineering pipeline built on Microsoft Azure to ingest, transform, and visualize COVID-19 data for reporting and analysis.
Python automation scripts for Microsoft Azure — ADF reporting, ADLS Gen2 ACL auditing, and trigger performance analysis
Azure-first medallion lakehouse for NYC 311 service request analytics using ADF, ADLS Gen2, Databricks, PySpark, Delta Lake, data quality checks, dimensional modeling, and reporting marts.
End-to-end Azure Data Engineering pipeline using Databricks, Delta Lake and ADLS Gen2. Medallion architecture (Bronze/Silver/Gold) with automated scheduling, monitoring and cost-optimized orchestration.
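The Bronze/Silver/Gold flow these medallion projects describe can be sketched in plain Python: land raw records untouched, validate and type-cast into a clean layer, then aggregate into a reporting layer. In the actual repos these would be PySpark DataFrames backed by Delta tables in ADLS Gen2; the records and fields below are hypothetical.

```python
# Plain-Python sketch of a Bronze/Silver/Gold medallion flow. In production
# each layer would be a Delta table in ADLS Gen2 written via PySpark.

RAW_EVENTS = [
    {"id": "1", "amount": "10.5", "country": "NO"},
    {"id": "2", "amount": "bad",  "country": "NO"},   # malformed amount
    {"id": "3", "amount": "4.5",  "country": "SE"},
]

def bronze(raw):
    """Bronze: land raw records as-is (append-only, no cleansing)."""
    return list(raw)

def silver(bronze_rows):
    """Silver: validate and type-cast, dropping malformed rows."""
    out = []
    for r in bronze_rows:
        try:
            out.append({"id": r["id"], "amount": float(r["amount"]),
                        "country": r["country"]})
        except ValueError:
            continue  # in production this row would go to a quarantine table
    return out

def gold(silver_rows):
    """Gold: business-level aggregate (total amount per country)."""
    totals = {}
    for r in silver_rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

report = gold(silver(bronze(RAW_EVENTS)))  # {"NO": 10.5, "SE": 4.5}
```

Keeping the raw Bronze copy immutable is what lets the Silver and Gold layers be rebuilt when cleansing rules or aggregations change.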