A Model Context Protocol (MCP) server providing comprehensive file operations, parsing capabilities, and Scriban template processing. Built on .NET 10 with support for HTTP and Stdio transports.
This project is currently in active development and may undergo significant changes. Features and APIs are subject to change, and breaking changes may occur in future releases. Use at your own discretion.
FileScrubberMCP is a powerful, enterprise-ready MCP server that extends AI assistants with comprehensive file system operations, data parsing, and template rendering capabilities. It enables AI models to interact with local and remote files, query structured data across multiple formats, transform content, and generate documentation through Scriban templates.
- 📁 File System Operations: Read, write, and list files with full metadata including timestamps, sizes, and attributes. Supports recursive directory scanning and flexible search patterns.
- 🔍 Multi-Format Parsing: Query and extract data from JSON, XML, YAML, CSV, and Excel files using industry-standard query languages (JSONPath for JSON/YAML/CSV/Excel, XPath for XML).
- 🔄 Data Transformation: Transform XML documents using XSLT stylesheets with optional output to files.
- 📝 Template Processing: Render Scriban (.sbn) templates with JSON data to generate reports, documentation, or any text-based output. Process templates to files or return as strings.
- 🌐 HTTP Client Operations: Execute HTTP requests (GET, POST, PUT, PATCH, DELETE, HEAD, OPTIONS) with custom headers and JSON payloads.
- ⚙️ Workflow Automation: Execute complex multi-step workflows defined in JSON files. Chain operations together, pass data between steps, and automate entire data processing pipelines.
- 🤖 AI Integration: Built-in GitHub Copilot integration for AI-powered analysis and content generation within workflows.
- 🔌 Flexible Transport: Supports both HTTP and Stdio transports for seamless integration with various MCP clients and AI platforms.
- Data Analysis: Query and extract information from structured data files (JSON, XML, YAML, CSV, Excel)
- Report Generation: Create formatted reports from file listings, data queries, or any JSON data using Scriban templates
- File Management: Automate file operations, directory scanning, and content manipulation
- Data Transformation: Convert between formats, transform XML with XSLT, and process structured data
- API Integration: Make HTTP requests to REST APIs and process responses
- Documentation: Generate documentation from code, data, or file metadata
- Workflow Automation: Chain multiple operations together in declarative JSON workflows
- AI-Powered Analysis: Leverage GitHub Copilot for intelligent data analysis and content generation
Built on .NET 10 with a clean, modular architecture:
- Service Layer: Business logic for file, parser, template, and URI operations
- Tools Layer: MCP tool implementations exposing services to AI models
- Dependency Injection: Full DI support for testability and maintainability
- Structured Logging: Comprehensive logging with Serilog for monitoring and debugging
- Comprehensive Testing: 84 unit tests covering services and tools with edge cases
- Read Files - Read file contents from any path
- Write Files - Write content to files with automatic directory creation
- List Files - Comprehensive file listing with metadata (size, dates, attributes)
- Search patterns and recursive directory scanning
Support for multiple file formats with powerful query capabilities:
- JSON - JSONPath queries with key path preservation
- XML - XPath queries with namespace support
- YAML - JSONPath queries on YAML data
- CSV - JSONPath queries with header support
- Excel (.xlsx) - Multi-worksheet support with JSONPath queries
- XSLT - XML transformation capabilities (transform XML documents using XSLT stylesheets with optional file output)
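To illustrate what a JSONPath query such as `$.users[*].email` selects, here is a plain-Python equivalent over invented sample data (the server itself evaluates the query for you; this sketch only shows the semantics):

```python
import json

# Invented sample document for illustration
doc = json.loads('{"users": [{"email": "ann@example.com"}, {"email": "bob@example.com"}]}')

# Equivalent of the JSONPath query $.users[*].email:
# descend into "users", take every element, project its "email" field
emails = [user["email"] for user in doc["users"]]
print(emails)
```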
- Process Templates - Render .sbn templates with JSON data and save to file
- Render Templates - Render templates and return output as string
- Supports loops, conditionals, filters, and custom functions
- Example templates for file listing reports included
Full HTTP client support for REST API integration:
- GET, POST, PUT, PATCH, DELETE - Standard HTTP methods
- HEAD, OPTIONS - HTTP metadata operations
- Custom headers and JSON payloads
- Parse and validate URIs
- Extract URI components (scheme, host, path, query, fragment)
Execute complex multi-step workflows with data flow between steps:
- Sequential Execution - Steps run in order with context sharing
- Data Transformation Pipelines - Chain file operations, HTTP requests, parsing, and templates
- Placeholder References - Use `{StepName.OutputName}` to reference previous step outputs
- Conditional Execution - Enable/disable steps dynamically
- GitHub Copilot Integration - AI-powered analysis and content generation within workflows
- See `Documents/WORKFLOW_SERVICE_README.md` for details
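The placeholder mechanism can be pictured as string substitution over a context of prior step outputs. This is a sketch of the idea, not the server's actual implementation; the step and output names are hypothetical:

```python
import re

def resolve_placeholders(text, context):
    """Replace {StepName.OutputName} tokens with values from earlier steps."""
    def lookup(match):
        step, output = match.group(1), match.group(2)
        return str(context[step][output])
    return re.sub(r"\{(\w+)\.(\w+)\}", lookup, text)

# Hypothetical context accumulated from earlier workflow steps
context = {"ReadData": {"Content": "42 rows"}}
prompt = resolve_placeholders("Summarize: {ReadData.Content}", context)
print(prompt)
```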
- .NET 10.0 SDK or later
- Windows, Linux, or macOS
Note: Upgrading from .NET 9? See the Migration Guide for detailed upgrade instructions.
```bash
# Clone the repository
git clone <repository-url>
cd filescrubberMCP

# Restore dependencies
dotnet restore

# Build the project
dotnet build

# Run tests
dotnet test
```

To start the server in HTTP mode (the default):

```bash
dotnet run
```

The server will start on http://localhost:5000 (or the configured port).
To run in Stdio mode, set the transport environment variable first:

```powershell
$env:FILESCRUBBER_MCP_TRANSPORT="Stdio"
dotnet run
```

Or use the provided PowerShell scripts:

```powershell
# HTTP mode
.\Scripts\Start-Http.ps1

# Stdio mode
.\Scripts\Start-Stdio.ps1
```

Example `appsettings.json`:

```json
{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "ConnectionStrings": {
    "DefaultConnection": "Data Source=app.db"
  },
  "Cors": {
    "AllowedOrigins": ["http://localhost:3000"],
    "AllowedMethods": ["GET", "POST", "PUT", "DELETE"],
    "AllowedHeaders": ["*"],
    "ExposedHeaders": [],
    "AllowCredentials": true
  }
}
```

Environment variables:

- `FILESCRUBBER_MCP_TRANSPORT` - Set to `"Http"` or `"Stdio"` (default: `"Http"`)
- `FILESCRUBBER_MCP_LOG_DIR` - Custom directory for log files (default: `Logs` in the application directory)
- `FILESCRUBBER_MCP_ROOT_DIR` - Root directory for file operations. When set, all relative file paths are resolved relative to this directory. Useful for Docker deployments to set a working directory. (default: current working directory)
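The effect of `FILESCRUBBER_MCP_ROOT_DIR` can be sketched as follows. The resolution rule here is inferred from the description above, not taken from the server's source:

```python
import os

def resolve_path(path, root_dir=None):
    """Resolve a relative path against the configured root directory, if any."""
    if os.path.isabs(path):
        return path                      # absolute paths are used as-is
    base = root_dir or os.getcwd()       # default: current working directory
    return os.path.normpath(os.path.join(base, path))

print(resolve_path("data/input.json", "/workspace"))
```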
Reads the content of a file.

Parameters:

- `filePath` (string) - Path to the file to read

Example:

```json
{
  "filePath": "C:\\Projects\\data.txt"
}
```

Writes content to a file.
Parameters:

- `filePath` (string) - Path to the file to write
- `content` (string) - Content to write to the file

Example:

```json
{
  "filePath": "C:\\Projects\\output.txt",
  "content": "Hello, World!"
}
```

Lists files in a directory with metadata.
Parameters:

- `directoryPath` (string) - Directory to search
- `searchPattern` (string, optional) - File pattern (default: "*")
- `recursive` (bool, optional) - Search subdirectories (default: true)

Example:

```json
{
  "directoryPath": "C:\\Projects",
  "searchPattern": "*.cs",
  "recursive": true
}
```

Search JSON files using JSONPath.
Parameters:
- `jsonFilePath` (string) - Path to JSON file
- `jsonPath` (string) - JSONPath query (e.g., "$.users[*].email")
- `indented` (bool, optional) - Format output (default: true)
- `showKeyPaths` (bool, optional) - Include paths in results (default: false)
Search XML files using XPath.
Parameters:
- `xmlFilePath` (string) - Path to XML file
- `xPath` (string) - XPath query (e.g., "//user/@email")
- `indented` (bool, optional) - Format output (default: true)
- `showKeyPaths` (bool, optional) - Include paths in results (default: false)
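Roughly what the example query `//user/@email` selects, shown with Python's ElementTree (which supports only a subset of XPath, so the attribute values are read off the matched elements here; the sample document is invented):

```python
import xml.etree.ElementTree as ET

# Invented sample document for illustration
root = ET.fromstring(
    "<users><user email='ann@example.com'/><user email='bob@example.com'/></users>"
)

# //user selects every <user> element; @email reads its email attribute
emails = [u.get("email") for u in root.findall(".//user")]
print(emails)
```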
Search YAML files using JSONPath.
Parameters:
- `yamlFilePath` (string) - Path to YAML file
- `jsonPath` (string) - JSONPath query
- `indented` (bool, optional) - Format output (default: true)
- `showKeyPaths` (bool, optional) - Include paths in results (default: false)
Search CSV files using JSONPath.
Parameters:
- `csvFilePath` (string) - Path to CSV file
- `jsonPath` (string) - JSONPath query
- `hasHeaderRecord` (bool, optional) - First row is header (default: true)
- `ignoreBlankLines` (bool, optional) - Ignore blank lines (default: true)
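With `hasHeaderRecord` enabled, the CSV is presumably treated as a JSON array of row objects keyed by the header, so JSONPath queries like `$[*].name` apply. A stdlib sketch of that interpretation (the sample data and column names are invented):

```python
import csv, io, json

raw = "name,age\nAnn,34\nBob,29\n"   # invented sample CSV

# The first row supplies the keys; each later row becomes one JSON object
rows = list(csv.DictReader(io.StringIO(raw)))
print(json.dumps(rows))

# Equivalent of the JSONPath query $[*].name
names = [row["name"] for row in rows]
```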
Search Excel files using JSONPath.
Parameters:
- `excelFilePath` (string) - Path to Excel file (.xlsx)
- `jsonPath` (string) - JSONPath query (e.g., "$.Sheet1[*].ColumnName")
Transform XML using XSLT stylesheet.
Parameters:
- `xmlFilePath` (string) - Path to XML file
- `xsltFilePath` (string) - Path to XSLT stylesheet
- `destinationFilePath` (string, optional) - Output file path
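For reference, a minimal stylesheet of the kind this tool consumes — the classic XSLT identity transform, which copies the input document unchanged. It is a generic example, not one shipped with the project:

```xml
<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Identity transform: copy every node and attribute unchanged -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>
</xsl:stylesheet>
```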
Process a Scriban template with JSON data and save to file.
Parameters:
- `templateFilePath` (string) - Path to .sbn template file
- `jsonData` (string) - JSON data for template
- `outputFilePath` (string) - Output file path

Example:

```json
{
  "templateFilePath": "Examples/file_list_report.sbn",
  "jsonData": "{\"title\":\"Report\",\"items\":[{\"name\":\"Item1\"}]}",
  "outputFilePath": "output/report.md"
}
```

Render a Scriban template and return the output.
Parameters:
- `templateFilePath` (string) - Path to .sbn template file
- `jsonData` (string) - JSON data for template
Send a GET request to the specified URI.
Parameters:
- `uri` (string) - URI to send GET request to
- `headersJson` (string, optional) - JSON object of headers
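The `headersJson` value (here and in the other HTTP tools) is presumably a flat JSON object mapping header names to values. An illustrative example — the token is a placeholder, not a real credential:

```json
{
  "Authorization": "Bearer <token>",
  "Accept": "application/json"
}
```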
Send a POST request to the specified URI.
Parameters:
- `uri` (string) - URI to send POST request to
- `jsonBody` (string, optional) - JSON body for the request
- `headersJson` (string, optional) - JSON object of headers
Send a PUT request to the specified URI.
Parameters:
- `uri` (string) - URI to send PUT request to
- `jsonBody` (string, optional) - JSON body for the request
- `headersJson` (string, optional) - JSON object of headers
Send a PATCH request to the specified URI.
Parameters:
- `uri` (string) - URI to send PATCH request to
- `jsonBody` (string, optional) - JSON body for the request
- `headersJson` (string, optional) - JSON object of headers
Send a DELETE request to the specified URI.
Parameters:
- `uri` (string) - URI to send DELETE request to
- `headersJson` (string, optional) - JSON object of headers
Send a HEAD request to the specified URI.
Parameters:
- `uri` (string) - URI to send HEAD request to
- `headersJson` (string, optional) - JSON object of headers
Send an OPTIONS request to the specified URI.
Parameters:
- `uri` (string) - URI to send OPTIONS request to
- `headersJson` (string, optional) - JSON object of headers
Execute a multi-step workflow defined in a JSON file.
Parameters:
- `workflowFilePath` (string) - Path to the workflow JSON file

Example:

```json
{
  "workflowFilePath": ".fscrub/workflows/data-pipeline.json"
}
```

Workflow Features:
- Sequential step execution with data passing
- Reference previous step outputs using `{StepName.OutputName}`
- Support for all file, parser, template, and URI operations
- GitHub Copilot integration for AI-powered analysis
- Enable/disable individual steps
- Detailed execution metrics and error handling
See Documents/WORKFLOW_SERVICE_README.md for workflow definition format and examples.
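To give a feel for the shape of a workflow file, here is a hypothetical two-step pipeline. The field names (`name`, `steps`, `tool`, `parameters`) are illustrative guesses only — the authoritative schema is in Documents/WORKFLOW_SERVICE_README.md:

```json
{
  "name": "data-pipeline",
  "steps": [
    {
      "name": "ListSources",
      "tool": "fscrub_file_list",
      "parameters": { "directoryPath": "C:\\Projects", "searchPattern": "*.cs", "recursive": true }
    },
    {
      "name": "Report",
      "tool": "fscrub_scriban_process_template",
      "parameters": {
        "templateFilePath": "Examples/file_list_report.sbn",
        "jsonData": "{ListSources.Output}",
        "outputFilePath": "report.md"
      }
    }
  ]
}
```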
Send a prompt to GitHub Copilot for AI-powered analysis or content generation.
Parameters:
- `prompt` (string) - The prompt to send to GitHub Copilot

Example:

```json
{
  "prompt": "Analyze the following employee data and provide insights:\n\n{PreviousStep.Content}"
}
```

Note: This tool is primarily used within workflows but can be called independently.
The Examples/ directory contains sample Scriban templates:

```
# File Listing Report

**Directory:** {{ directoryPath }}
**Total Files:** {{ fileCount }}

{{ for file in files }}
- {{ file.file_name }} ({{ file.size_in_bytes }} bytes)
{{ end }}
```

Example usage:

1. List files: `fscrub_file_list("C:\\Projects", "*.cs", true)`
2. Process the template with the output:

```
fscrub_scriban_process_template(
  "Examples/file_list_report.sbn",
  <json_from_step_1>,
  "report.md"
)
```

See Examples/SCRIBAN_TEMPLATES_README.md for more details.
```
filescrubberMCP/
├── Configuration/        # Configuration providers
├── Examples/             # Sample files and templates
│   ├── *.json            # Sample JSON/XML/YAML files
│   ├── *.sbn             # Scriban template examples
│   └── SCRIBAN_TEMPLATES_README.md
├── Extensions/           # Service collection extensions
├── Interfaces/           # Service and tool interfaces
├── Logs/                 # Application logs
├── Models/               # Data models
├── Scripts/              # PowerShell startup scripts
├── Services/             # Business logic services
│   ├── FileService.cs
│   ├── ParserService.cs
│   ├── TemplateService.cs
│   └── UriService.cs
├── Tests/                # Unit tests
│   ├── Services/
│   └── Tools/
├── Tools/                # MCP tool implementations
│   ├── FileTools.cs
│   ├── ParserTools.cs
│   ├── TemplateTools.cs
│   └── UriTools.cs
├── Program.cs            # Application entry point
├── appsettings.json      # Configuration
└── README.md
```
```bash
# Run all tests
dotnet test

# Run specific test suite
dotnet test --filter "FullyQualifiedName~FileServiceTests"

# Run with coverage
dotnet test /p:CollectCoverage=true
```

To add a new tool:

- Create interface in `Interfaces/`
- Implement service in `Services/`
- Implement MCP tool in `Tools/`
- Register in `Program.cs` and `ServiceCollectionExtensions.cs`
- Add tests in `Tests/`
- Use async/await for I/O operations
- Follow dependency injection patterns
- Comprehensive logging with Serilog
- Structured error handling
- XML documentation comments
The project includes comprehensive unit tests:
- 84 total tests
- Service layer tests with mocked dependencies
- Tool layer tests with mocked services
- Edge cases and error scenarios
- Integration scenarios
See Tests/TEMPLATE_TESTS_SUMMARY.md for detailed test coverage.
- .NET 10.0
- ModelContextProtocol.AspNetCore (0.1.0-preview.13)
- Serilog.AspNetCore (9.0.0)
- ClosedXML (0.105.0) - Excel files
- CsvHelper (33.1.0) - CSV files
- YamlDotNet (16.3.0) - YAML files
- Newtonsoft.Json (13.0.3) - JSON processing
- Scriban (5.10.0) - Template engine
- xUnit (2.9.3)
- Moq (4.20.72)
- Microsoft.NET.Test.Sdk (18.0.1)
Logs are written to the Logs/ directory (or custom directory specified by FILESCRUBBER_MCP_LOG_DIR environment variable):
- HTTP mode: `Logs/filescrubber-mcp-http-YYYYMMDD.log`
- Stdio mode: `Logs/filescrubber-mcp-stdio-YYYYMMDD.log`
Log levels can be configured in appsettings.json.
All MCP tools return structured JSON responses:
Success:

```json
{
  "success": true,
  "data": { ... }
}
```

Error:

```json
{
  "success": false,
  "error": "Error message"
}
```

- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Ensure all tests pass
- Submit a pull request
[Your License Here]
For issues, questions, or contributions, please open an issue.
- Built with Model Context Protocol
- Powered by Scriban template engine
- Uses ClosedXML for Excel processing