A Model Context Protocol (MCP) server for Microsoft Fabric resource discovery and information retrieval. This server provides tools for authentication and accessing Microsoft Fabric resources through a standardized MCP interface.
- Azure AD Authentication: Interactive and service principal authentication
- Gateway Management: List, retrieve, and create Microsoft Fabric gateways (including VNet gateways)
- Connection Management: List, retrieve, and create Microsoft Fabric connections (cloud, on-premises, and VNet)
- Workspace Management: List and retrieve Microsoft Fabric workspaces
- Dataflow Management: List, create, and retrieve Microsoft Fabric dataflows
- Pipeline Management: List, create, update, run, monitor, and schedule Microsoft Fabric pipelines
- Copy Job Management: List, create, update, run, monitor, and schedule Microsoft Fabric copy jobs
- Capacity Management: List and retrieve Microsoft Fabric capacities
- Microsoft Fabric Integration: Support for on-premises, personal, and virtual network gateways
- 📦 NuGet Distribution: Available as a NuGet package for easy integration
- 🔧 MCP Protocol: Built using the official MCP C# SDK
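For context on what "built using the official MCP C# SDK" means in practice, a stdio MCP server bootstrap typically looks like the sketch below. This is an assumption based on the SDK's documented quickstart pattern (the `ModelContextProtocol` NuGet package), not this project's actual `Program.cs`:

```csharp
// Minimal MCP server bootstrap sketch (assumed SDK API, not this repo's code).
// Requires the ModelContextProtocol and Microsoft.Extensions.Hosting packages.
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = Host.CreateApplicationBuilder(args);

builder.Services
    .AddMcpServer()              // register MCP server services
    .WithStdioServerTransport()  // speak MCP over stdin/stdout ("type": "stdio")
    .WithToolsFromAssembly();    // discover attribute-annotated tool methods

await builder.Build().RunAsync();
```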
- Authentication: `authenticate_interactive`, `authenticate_service_principal`, `get_authentication_status`, `get_access_token`, `sign_out`
- Gateway Management: `list_gateways`, `get_gateway`, `create_virtualnetwork_gateway`
- Connection Management: `list_supported_connection_types`, `list_connections`, `get_connection`, `create_connection`
- Workspace Management: `list_workspaces`
- Dataflow Management: `list_dataflows`, `create_dataflow`, `get_dataflow_definition`, `add_connection_to_dataflow`, `add_or_update_query_in_dataflow`, `save_dataflow_definition`
- Dataflow Refresh: `refresh_dataflow_background`, `refresh_dataflow_status`
- Dataflow Query Execution: `execute_query` (Preview)
- Capacity Management: `list_capacities`
- Pipeline Management: `list_pipelines`, `create_pipeline`, `get_pipeline`, `update_pipeline`, `get_pipeline_definition`, `update_pipeline_definition`, `run_pipeline`, `get_pipeline_run_status`, `create_pipeline_schedule`, `list_pipeline_schedules` (Preview)
- Copy Job Management: `list_copy_jobs`, `create_copy_job`, `get_copy_job`, `update_copy_job`, `get_copy_job_definition`, `update_copy_job_definition`, `run_copy_job`, `get_copy_job_run_status`, `create_copy_job_schedule`, `list_copy_job_schedules` (Preview)
The server exposes interactive UI forms as MCP App resources, rendered directly inside VS Code chat:
| Resource URI | Description |
|---|---|
| `ui://datafactory/create-connection` | Guided wizard for creating a new data source connection (details) |
- Configure your IDE: Create an MCP configuration file in your workspace:
  - VS Code: create `.vscode/mcp.json`
  - Visual Studio: create `.mcp.json` in the solution directory

  ```json
  {
    "servers": {
      "DataFactory.MCP": {
        "type": "stdio",
        "command": "dnx",
        "args": [
          "Microsoft.DataFactory.MCP",
          "--version",
          "#{VERSION}#",
          "--yes"
        ]
      }
    }
  }
  ```

- Start using: The server will be downloaded automatically and made available in your IDE's MCP-enabled chat interface.
To run the server locally during development:
```json
{
  "servers": {
    "DataFactory.MCP": {
      "type": "stdio",
      "command": "dotnet",
      "args": [
        "run",
        "--project",
        "path/to/DataFactory.MCP"
      ]
    }
  }
}
```

- .NET 10.0 or later
- Azure AD tenant and application registration with appropriate permissions
- Environment variables for authentication (see Authentication Guide for setup details)
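As an illustration, service principal setup via environment variables commonly follows the Azure Identity convention shown below. The variable names here are assumptions; consult the Authentication Guide for the names this server actually reads:

```shell
# Hypothetical variable names following the common Azure Identity convention;
# verify against the Authentication Guide before relying on them.
export AZURE_TENANT_ID="<your-tenant-id>"
export AZURE_CLIENT_ID="<your-app-registration-client-id>"
export AZURE_CLIENT_SECRET="<your-client-secret>"
```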
See the detailed guides for comprehensive usage instructions:
- Authentication: See Authentication Guide
- Gateway Management: See Gateway Management Guide
- Connection Management: See Connection Management Guide
- Workspace Management: See Workspace Management Guide
- Dataflow Management: See Dataflow Management Guide
- Capacity Management: See Capacity Management Guide
- Pipeline Management: See Pipeline Management Guide
- Copy Job Management: See Copy Job Management Guide
```sh
# Restore dependencies
dotnet restore

# Build the project
dotnet build

# Create NuGet package
dotnet pack -c Release
```

- Configure your IDE with the development configuration shown above
- Run the project: `dotnet run`
- Test the tools through your MCP-enabled chat interface
Enhance your Claude experience with pre-built Data Factory skills that provide operational tips and best practices.
Upload the skill files from the claude-skills/ folder to your Claude Project:
- Go to your Claude Project settings
- Add these files to Project Knowledge:
  - `datafactory-SKILL.md` - Index file (always loaded)
  - `datafactory-core.md` - M basics, MCP tools overview
  - `datafactory-performance.md` - Query optimization, timeouts, chunking
  - `datafactory-destinations.md` - Output configuration, programmatic setup
  - `datafactory-advanced.md` - Fast Copy, Action.Sequence, Modern Evaluator
| Skill | Topics |
|---|---|
| Core | M (Power Query) fundamentals, Dataflow Gen2 overview, MCP tool reference, connection management |
| Performance | Query timeouts, chunking strategies, filter optimization, connector selection |
| Destinations | Lakehouse architecture, schema settings, programmatic destination configuration |
| Advanced | Action.Sequence for writes, Fast Copy, Modern Evaluator |
Once installed, Claude will automatically reference these skills based on your questions:
- "My query is timing out" → loads performance tips
- "How do I set the output destination programmatically?" → loads destination guide
- "What's Fast Copy?" → loads advanced features
Create a Custom GPT or use ChatGPT Projects with pre-built Data Factory knowledge.
- Go to ChatGPT → Explore GPTs → Create
- Copy instructions from `chatgpt-skills/gpt-instructions.md`
- Upload knowledge files from `chatgpt-skills/`:
  - `knowledge-core.md` - M basics, Dataflow Gen2 overview
  - `knowledge-performance.md` - Query optimization, timeouts
  - `knowledge-destinations.md` - Output configuration
  - `knowledge-advanced.md` - Fast Copy, Action.Sequence
See chatgpt-skills/README.md for detailed setup options.
For complete documentation, see our Documentation Index.
- Authentication Guide - Complete authentication setup and usage
- Gateway Management Guide - Gateway operations and examples
- Connection Management Guide - Connection operations and examples
- Workspace Management Guide - Workspace operations and examples
- Dataflow Management Guide - Dataflow operations and examples
- Capacity Management Guide - Capacity operations and examples
- Pipeline Management Guide - Pipeline operations and examples
- Copy Job Management Guide - Copy job operations and examples
- Architecture Guide - Technical architecture and design details
We welcome contributions! To get started:
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
Please follow standard .NET coding conventions and ensure all tests pass before submitting.
The server is designed for extensibility. For detailed information on extending functionality, see the Extension Points section in our architecture documentation, which covers:
- Adding New Tools: Create custom MCP tools for additional operations
- Adding New Services: Implement new services following our patterns
- Service Registration: Proper dependency injection setup
This modular architecture makes it easy to add support for additional Azure services or custom business logic.
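To make the "Adding New Tools" extension point concrete, a custom tool in the MCP C# SDK's attribute-based style might look like the following sketch. The attribute names follow the SDK's documented pattern, while the class, tool, and parameter names are hypothetical and not part of this project:

```csharp
// Hypothetical custom tool sketch using the MCP C# SDK attribute pattern.
// LakehouseTools, ListLakehouseItems, and workspaceId are illustrative names.
using System.ComponentModel;
using ModelContextProtocol.Server;

[McpServerToolType]
public static class LakehouseTools
{
    [McpServerTool, Description("Lists items in a Microsoft Fabric lakehouse (illustrative).")]
    public static string ListLakehouseItems(
        [Description("The workspace ID to query.")] string workspaceId)
    {
        // A real implementation would call the Fabric REST API through an
        // injected service, following the service patterns described above.
        return $"No items found in workspace {workspaceId} (stub).";
    }
}
```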
This project is licensed under the MIT License - see the LICENSE file for details.
For issues and questions:
- Create an issue in this repository
- Review the MCP documentation