This project is a template for building custom Model Context Protocol (MCP)-based agents that integrate with large language models (LLMs) via an OpenAI-compatible API. It provides a foundation for creating intelligent agents capable of processing natural language queries and executing tasks through predefined tools. For demonstration purposes, this template implements file system operations with robust safety mechanisms to prevent unintended file modifications or deletions. The MCP server handles JSON-RPC requests, while the client communicates with the LLM (e.g., DeepSeek, OpenAI, or Ollama) to determine which tools to invoke based on user input.
- Intelligent File Management: Leverages an LLM to interpret natural language queries and select appropriate file operations.
- MCP Server: Implements a JSON-RPC 2.0 server for handling tool calls, supporting batch requests and initialization.
- File System Tools:
  - `analyze_logs`: Searches log files for patterns (e.g., errors) using regular expressions.
  - `search_files`: Finds files by name, content, or metadata (e.g., modification date).
  - `organize_files`: Groups files by extension or creation date into subdirectories.
  - `replace_text`: Performs bulk text replacement in files (e.g., changing "http" to "https").
  - `delete_file`: Deletes files, with configurable restrictions to prevent unauthorized deletions.
- Multi-LLM Support: Compatible with any OpenAI-style API (DeepSeek, OpenAI, Ollama, etc.) via a configurable provider system.
- Security:
  - Restricts file operations to a designated `WORKING_DIR` to prevent unauthorized access.
  - Requires `MCP_SERVER_AUTH_TOKEN` for MCP server authentication.
  - Disables file deletion by default (`ALLOW_DELETE=false`) for safety.
- Modular Design: Tools are defined in a separate module (`tools.js`), enabling easy addition of new functionality.
- Configurable: Settings (API provider, keys, directory, etc.) are managed in `config.json`.
- Node.js (v18 or higher recommended).
- NPM packages: `axios`, `readline-sync`, `glob`.
- An API key for your chosen LLM provider (e.g., DeepSeek from platform.deepseek.com).
- Clone the repository or copy the project files.
- Install dependencies:
  ```bash
  npm install axios readline-sync glob
  ```
- Edit `config.json` in the project root to match your environment (a sample configuration is shown after this list):
  - `API_PROVIDER`: Options include `deepseek`, `openai`, `ollama`, or custom providers.
  - `LLM_API_KEY`: API key for the LLM provider (not required for Ollama).
  - `MODEL`: Model name (e.g., `gpt-4` for OpenAI, `llama3` for Ollama).
  - `MCP_SERVER_AUTH_TOKEN`: Token for MCP server authentication.
  - `WORKING_DIR`: Directory where file operations are allowed.
  - `ALLOW_DELETE`: Set to `true` to enable file deletion (default: `false`).
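For reference, a minimal `config.json` might look like the following. The key names are the ones documented above; the values (model name, token, directory) are placeholders to replace with your own:

```json
{
  "API_PROVIDER": "deepseek",
  "LLM_API_KEY": "sk-your-api-key",
  "MODEL": "deepseek-chat",
  "MCP_SERVER_AUTH_TOKEN": "choose-a-secret-token",
  "WORKING_DIR": "./sandbox",
  "ALLOW_DELETE": false
}
```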
- Start the MCP server:
  ```bash
  node mcp-server.js
  ```
  The server will run on `http://localhost:3000/mcp`.
- Start the client:
  ```bash
  node llm-client.js
  ```
- Enter queries in the client console, e.g.:
- "Find all files containing 'error' and replace 'http' with 'https'."
- "Organize files by extension."
- "Analyze access.log for '404' errors."
- To exit the client, type `exit`.
- Query: "Find files with 'error' and replace 'http' with 'https'."
- Process:
- The client sends the query to the LLM (e.g., DeepSeek).
- The LLM decides to call `search_files` with `query: "error"` and `by: "content"`.
- The MCP server executes the tool and returns matching file paths.
- The LLM then calls `replace_text` for each file.
- The client displays the final response.
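To make the exchange concrete, here is a hedged sketch of what the client's JSON-RPC 2.0 call for step 2 could look like. The method name `tools/call`, the parameter shape, and the bearer-token auth scheme are illustrative assumptions; the actual contract is defined by `mcp-server.js`:

```javascript
const axios = require('axios');
const config = require('./config.json');

// Hypothetical helper: posts a JSON-RPC 2.0 request to the MCP endpoint.
// Method name and params shape are assumptions, not the template's exact API.
async function callTool(name, args) {
  const response = await axios.post(
    'http://localhost:3000/mcp',
    { jsonrpc: '2.0', id: 1, method: 'tools/call', params: { name, arguments: args } },
    { headers: { Authorization: `Bearer ${config.MCP_SERVER_AUTH_TOKEN}` } } // auth scheme assumed
  );
  if (response.data.error) throw new Error(response.data.error.message);
  return response.data.result; // JSON-RPC success responses carry a `result` field
}

// e.g., step 2 of the workflow above:
// await callTool('search_files', { query: 'error', by: 'content' });
```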
This template is designed for easy customization. Below are key areas for adding new functionality:
- Define the Tool:
- Open `tools.js` and add a new function to `module.exports`. For example (a fuller sketch of the compression body appears after these steps):
  ```javascript
  // EXTENSION POINT: Example tool for compressing files
  compress_files: async ({ filename }) => {
    // Implement compression logic (e.g., using 'archiver' package)
    const filePath = ensureSafePath(filename);
    // ...
    return `File ${filePath} compressed`;
  },
  ```
- Update Client Tool Definitions:
- In `llm-client.js`, add the tool to the `tools` array with its schema:
  ```javascript
  {
    name: 'compress_files',
    description: 'Compress a file into a zip archive',
    parameters: {
      type: 'object',
      properties: { filename: { type: 'string' } },
      required: ['filename'],
    },
  },
  ```
- Test the new tool with a query like "Compress file.txt into a zip."
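The placeholder body above can be filled in with a real implementation. Below is a minimal sketch assuming the third-party `archiver` package (`npm install archiver`); the `require` lines belong at the top of `tools.js`, and writing the zip next to the source file is an illustrative choice, not part of the template:

```javascript
// At the top of tools.js:
const fs = require('fs');
const path = require('path');
const archiver = require('archiver'); // assumed dependency, not part of the template

// Inside module.exports:
compress_files: async ({ filename }) => {
  const filePath = ensureSafePath(filename);
  const zipPath = ensureSafePath(`${filename}.zip`); // keep output inside WORKING_DIR
  await new Promise((resolve, reject) => {
    const output = fs.createWriteStream(zipPath);
    const archive = archiver('zip', { zlib: { level: 9 } });
    output.on('close', resolve); // fires once the archive is fully written
    archive.on('error', reject);
    archive.pipe(output);
    archive.file(filePath, { name: path.basename(filePath) });
    archive.finalize();
  });
  return `File ${filePath} compressed to ${zipPath}`;
},
```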
- Add a tool to interact with external APIs, such as the public JSONPlaceholder API (no token required):
  ```javascript
  // EXAMPLE: REST API CALL - Fetch data from public JSONPlaceholder API
  fetch_external_data: async ({ endpoint = 'posts' }) => {
    try {
      const response = await axios.get(`https://jsonplaceholder.typicode.com/${endpoint}`);
      return {
        data: response.data,
        count: Array.isArray(response.data) ? response.data.length : 1,
        message: `Fetched ${endpoint} data successfully`,
      };
    } catch (error) {
      throw new Error(`API call failed: ${error.message}`);
    }
  },
  ```
- Add the corresponding schema to `llm-client.js`:
  ```javascript
  {
    name: 'fetch_external_data',
    description: 'Fetch posts, users, or comments from the JSONPlaceholder API',
    parameters: {
      type: 'object',
      properties: { endpoint: { type: 'string', enum: ['posts', 'users', 'comments'] } },
      required: ['endpoint'],
    },
  },
  ```
- Test with queries like "Fetch posts from JSONPlaceholder and save to posts.json."
- Update Providers:
- In `llm-client.js`, extend the `providers` object:
  ```javascript
  // EXTENSION POINT: Add new providers
  anthropic: {
    url: 'https://your-anthropic-adapter-url/v1/chat/completions',
    model: 'claude-3-opus',
    apiKey: config.LLM_API_KEY,
  },
  ```
- Update `config.json` with the new provider:
  ```json
  {
    "API_PROVIDER": "anthropic",
    "LLM_API_KEY": "your_anthropic_key",
    "MODEL": "claude-3-opus"
  }
  ```
- Real-Time Monitoring: Add a tool in `tools.js` using `fs.watch` to monitor file changes and notify the LLM (see the sketch after this list).
- Persistent History: Save message history to a file or database in `llm-client.js`.
- Enhanced Security: Add rate limiting or logging in `mcp-server.js` for the MCP server.
- Streaming Responses: Modify `callLLM` in `llm-client.js` to handle streaming responses if supported by the LLM provider.
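As a sketch of the first idea, a watch tool in `tools.js` could collect `fs.watch` events over a short window and return them as the tool result. The tool name, the window length, and the return shape are all illustrative assumptions:

```javascript
const fs = require('fs'); // at the top of tools.js

// Hypothetical watch_directory tool; name and behavior are illustrative.
watch_directory: async ({ dirname = '.', durationMs = 5000 }) => {
  const dirPath = ensureSafePath(dirname); // keep the watcher inside WORKING_DIR
  const events = [];
  const watcher = fs.watch(dirPath, (eventType, filename) => {
    events.push({ eventType, filename, at: new Date().toISOString() });
  });
  // Collect change events for a fixed window, then report them back to the LLM.
  await new Promise((resolve) => setTimeout(resolve, durationMs));
  watcher.close();
  return events.length ? events : `No changes detected within ${durationMs} ms`;
},
```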
- Ensure new tools respect `WORKING_DIR` restrictions using `ensureSafePath` (a minimal sketch of such a guard follows this list).
- Test thoroughly when enabling `ALLOW_DELETE` to avoid accidental file loss.
- Check LLM provider documentation for specific `tool_calls` or model requirements.
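The actual `ensureSafePath` lives in `tools.js`; for readers writing their own tools, a guard of this kind typically resolves the requested path and rejects anything that escapes `WORKING_DIR`. A minimal sketch, which may differ from the template's real implementation:

```javascript
const path = require('path');
const config = require('./config.json');

// Sketch of a WORKING_DIR containment check; the template's own
// ensureSafePath in tools.js may differ in details.
function ensureSafePath(filename) {
  const root = path.resolve(config.WORKING_DIR);
  const resolved = path.resolve(root, filename);
  // Allow the root itself or anything strictly inside it; reject traversal like '../..'.
  if (resolved !== root && !resolved.startsWith(root + path.sep)) {
    throw new Error(`Path escapes WORKING_DIR: ${filename}`);
  }
  return resolved;
}
```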