This is a guided project in which I used Azure Data Factory to build a data pipeline and connected the resulting Azure SQL Database to Azure Data Studio for data processing and visualization. The following tools and services were used:
- Azure Data Factory
- Azure Data Studio
- Azure Blob Storage
- Azure SQL Database
- SQL
In this project, I built a data pipeline in Azure Data Factory to automate the process of extracting data, transforming it, and loading it into an Azure SQL Database that can then be queried from Azure Data Studio. The pipeline processes data from multiple sources and loads it into a single destination for further analysis. The main steps were:
- Set up the Azure Data Factory environment.
- Create a pipeline to extract data from multiple sources using linked services.
- Use data flows to transform the extracted data.
- Load the transformed data into an Azure SQL Database (a sketch of a possible destination table follows this list).
- Connect the Azure SQL Database to Azure Data Studio.
- Schedule the pipeline to run automatically (optional).
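As an illustration of the load step, the sink of the pipeline could be a staging table in the Azure SQL Database. This is a minimal sketch only; the table and column names below are hypothetical placeholders, not the actual schema used in the project:

```sql
-- Hypothetical destination (sink) table for the transformed data.
-- Table and column names are illustrative placeholders, not the
-- actual schema used in this project.
CREATE TABLE dbo.SalesStaging (
    SaleId        INT            NOT NULL PRIMARY KEY,
    SaleDate      DATE           NOT NULL,
    ProductName   NVARCHAR(100)  NOT NULL,
    Quantity      INT            NOT NULL,
    UnitPrice     DECIMAL(10, 2) NOT NULL,
    SourceSystem  NVARCHAR(50)   NOT NULL,  -- which source the row came from
    LoadedAtUtc   DATETIME2      NOT NULL DEFAULT SYSUTCDATETIME()
);
```

In Azure Data Factory, a table like this would be referenced by a dataset attached to the Azure SQL Database linked service and used as the sink of the copy activity or mapping data flow.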
- To replicate this project, you will need an Azure subscription and an Azure Data Factory instance set up.
- You can use the provided ARM template to import the pipeline structure into your own Azure Data Factory instance.
This project demonstrates how to use Azure Data Factory to automate extract, transform, and load (ETL) pipelines that move data from multiple sources into an Azure SQL Database, and how to connect that database to Azure Data Studio for analysis.
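Once the database is connected in Azure Data Studio, the loaded data can be queried directly. The query below is a hypothetical example against the placeholder table sketched earlier, shown only to illustrate the kind of analysis the connection enables:

```sql
-- Hypothetical check of the loaded data, run from Azure Data Studio.
-- Assumes the placeholder dbo.SalesStaging table from the sketch above.
SELECT
    SourceSystem,
    COUNT(*)                  AS RowsLoaded,
    SUM(Quantity * UnitPrice) AS TotalRevenue,
    MAX(LoadedAtUtc)          AS LastLoadTimeUtc
FROM dbo.SalesStaging
GROUP BY SourceSystem
ORDER BY TotalRevenue DESC;
```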