A client for connecting to MCP (Model Context Protocol) servers and interacting with them using OpenAI language models.
This MCP client allows you to:
- Connect to an MCP server (Python or JavaScript)
- List and use available tools from the server
- Interact with the server using natural language queries
- Process responses with OpenAI's language models
- Plan and execute complex tasks automatically
Requirements:
- Python 3.12 or higher
- An OpenAI API key
- An MCP server to connect to
UV is a fast, reliable Python package installer and resolver, and is recommended for this project:

1. Install UV:

   ```shell
   # Unix (macOS, Linux)
   curl -LsSf https://astral.sh/uv/install.sh | bash

   # Windows (PowerShell)
   irm https://astral.sh/uv/install.ps1 | iex
   ```

2. Verify the installation:

   ```shell
   uv --version
   ```

For more installation options, see UV's official documentation.
1. Clone this repository

2. Create a virtual environment:

   ```shell
   python -m venv .venv
   source .venv/bin/activate  # On Windows: .venv\Scripts\activate
   ```

3. Install dependencies:

   ```shell
   pip install -e .
   ```

   Or using UV (recommended):

   ```shell
   uv venv
   uv pip install -e .
   ```

4. Create a `.env` file with your API keys and configuration:

   ```shell
   # OpenAI API Configuration
   OPENAI_API_KEY=your_openai_api_key
   BASE_URL=https://api.openai.com/v1  # Optional: use your own OpenAI API proxy
   MODEL=gpt-3.5-turbo                 # Optional: specify the model to use
   ```
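The client presumably reads this file with `python-dotenv`'s `load_dotenv()`. For illustration, here is a stdlib-only sketch of roughly what that loading step does (the real library handles more edge cases, such as quoting and export prefixes):

```python
import os

def load_env(path=".env"):
    """Minimal .env loader: read KEY=VALUE pairs, skip blank lines and
    comments, and never override variables already set in the environment
    (roughly what python-dotenv's load_dotenv() does by default)."""
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

if os.path.exists(".env"):
    load_env()
    model = os.environ.get("MODEL", "gpt-3.5-turbo")
```

Because existing environment variables are never overridden, values exported in your shell take precedence over the `.env` file.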
Install required dependencies:

Using UV (recommended):

```shell
# Install core dependencies
uv pip install -e .

# Or install individual packages
uv add openai         # OpenAI API client
uv add python-dotenv  # For loading environment variables
uv add mcp            # Model Context Protocol library
```

Using standard pip:

```shell
pip install openai python-dotenv mcp
```
Each server type may require additional dependencies. For example, JavaScript MCP servers need a Node.js runtime on the host. Note that `asyncio` and `pathlib` ship with the Python standard library, so they do not need to be installed separately.
Get up and running quickly with UV:

```shell
# Clone the repository
git clone https://github.com/yourusername/deepin-mcp-python.git
cd deepin-mcp-python

# Install UV
curl -LsSf https://astral.sh/uv/install.sh | bash

# Create a virtual environment and install dependencies
uv venv
uv pip install -e .

# Create .env file (replace with your actual API key)
echo "OPENAI_API_KEY=your_openai_api_key" > .env
echo "MODEL=gpt-3.5-turbo" >> .env

# Run the client with the bash server
uv run python client.py servers/bash_server.py

# Or run the task planning system
uv run python main.py
```

Run the client by specifying the path to your MCP server script:

```shell
python client.py path/to/your/server_script.py
```

Or using UV:

```shell
uv run python client.py path/to/your/server_script.py
```

For JavaScript servers:

```shell
python client.py path/to/your/server_script.js
```

Or using UV:

```shell
uv run python client.py path/to/your/server_script.js
```

The project includes a powerful task planning system that can:
- Break down complex user requests into sequential tasks
- Execute tasks automatically using the MCP server
- Provide detailed execution summaries
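For a sense of how such a breakdown can work, here is an illustrative helper (not the project's actual `TaskPlanner` code) that parses a numbered plan returned by the language model into an ordered task list:

```python
def split_into_tasks(plan_text: str) -> list[str]:
    """Parse an LLM-produced numbered plan ('1. ...', '2. ...') into an
    ordered list of task descriptions; non-numbered lines are ignored."""
    tasks = []
    for line in plan_text.splitlines():
        line = line.strip()
        if line and line[0].isdigit() and "." in line:
            # Drop the '1.' prefix and keep the task description
            tasks.append(line.split(".", 1)[1].strip())
    return tasks
```

Each parsed task can then be sent through the MCP client one at a time, with the results collected into an execution summary.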
To use the task planning system:

```shell
python main.py
```

Or using UV (recommended):

```shell
uv run python main.py
```

You can also specify a custom server path:

```shell
python main.py --server path/to/your/server_script.py
```

Or check the version information:

```shell
python main.py --version
```

`main.py` also serves as the centralized entry point for the whole application. The main entry point:

- Initializes the task planning system
- Provides version information and a system description
- Handles exceptions gracefully
- Makes the application more user-friendly
- Allows specifying custom server paths with the `--server` option
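The `--server` and `--version` options can be sketched with `argparse`; the flag names come from the examples in this README, while the default server path is illustrative:

```python
import argparse

def parse_args(argv=None):
    """Illustrative CLI for main.py: --server and --version, as described above."""
    parser = argparse.ArgumentParser(description="Deepin MCP task planning system")
    parser.add_argument("--server", default="servers/bash_server.py",
                        help="path to the MCP server script to connect to")
    parser.add_argument("--version", action="store_true",
                        help="print version information and exit")
    return parser.parse_args(argv)
```

With `store_true`, `--version` defaults to `False` and flips to `True` only when the flag is passed.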
Example usage:
```
$ python main.py
====== Deepin MCP Task Planning System ======
Version: 1.0.0
Description: A task planning and execution system based on the MCP protocol
======================================
Welcome to the task planning and execution system
This system will break your request down into multiple tasks and execute them in order
Successfully connected to server: /home/user/deepin-mcp-python/servers/bash_server.py
Enter your request (type 'quit' to exit):
```
Example interaction:

```
Enter your request (type 'quit' to exit): Create a new directory called 'projects' and copy all .txt files from 'documents' to it
Analyzing your request...
Your request has been broken down into 3 tasks:
1. Create a new directory called 'projects'
2. List all .txt files in the 'documents' directory
3. Copy all .txt files from 'documents' to 'projects'
Execute these tasks? (y/n): y
[Task execution progress will be shown here]
Execution summary:
[Detailed summary of the executed tasks will be shown here]
```
The task planning system is particularly useful for:
- Complex file operations
- Multi-step system configurations
- Automated workflow execution
- Batch processing tasks
Once connected, you can enter queries in the interactive prompt. The client will:
- Send your query to the language model
- Process any tool calls requested by the model
- Return the final response
Type `quit` to exit the interactive mode.
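A sketch of that loop, with the LLM call and the MCP tool call injected as plain functions so the control flow is visible (the real `process_query` method wires these to the OpenAI API and the MCP session, and its message format differs):

```python
def process_query(query, call_llm, call_tool, max_rounds=5):
    """Keep asking the model until it answers without requesting a tool.
    call_llm(messages) returns {'content': str, 'tool_call': (name, args) or None};
    call_tool(name, args) executes the named tool on the MCP server."""
    messages = [{"role": "user", "content": query}]
    for _ in range(max_rounds):
        reply = call_llm(messages)
        if not reply.get("tool_call"):
            return reply["content"]  # final answer, no tool requested
        name, args = reply["tool_call"]
        result = call_tool(name, args)  # executed on the MCP server
        messages.append({"role": "assistant", "content": reply.get("content", "")})
        messages.append({"role": "tool", "content": str(result)})
    return "Stopped after reaching the tool-call limit."
```

The `max_rounds` cap prevents a model that keeps requesting tools from looping forever.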
The `weather_server.py` script provides an MCP tool to query weather information for a specified city.

```shell
python client.py weather_server.py
```

Or using UV:

```shell
uv run python client.py weather_server.py
```

Example query: "查询北京的天气" ("Query the weather in Beijing")

Dependencies:

```shell
uv add requests  # For HTTP requests to weather APIs
```

The `file_server.py` script provides MCP tools for file operations such as open, copy, move, rename, delete, and create.

```shell
python client.py file_server.py
```

Or using UV:

```shell
uv run python client.py file_server.py
```

Example query: "创建一个名为test.txt的文件" ("Create a file named test.txt")

Dependencies: none beyond the Python standard library (`pathlib` is built in).

The `bash_server.py` script provides MCP tools for executing Linux bash commands.

```shell
python client.py bash_server.py
```

Or using UV:

```shell
uv run python client.py bash_server.py
```

Example query: "列出当前目录下的所有文件" ("List all files in the current directory")

Dependencies: none beyond the Python standard library (`shlex` is built in).

To extend or modify this client:
- The `MCPClient` class handles the main functionality
- The `process_query` method processes queries using the LLM
- The `TaskPlanner` class breaks down complex requests into tasks
- The main entry point (`main.py`) provides a user-friendly interface
- Error handling is implemented throughout for robustness
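For example, a new tool in the style of `bash_server.py` might wrap command execution like this (an illustrative sketch, not the project's actual implementation):

```python
import shlex
import subprocess

def run_command(command: str, timeout: int = 30) -> str:
    """Parse a command with shlex and run it without a shell, returning
    stdout on success or stderr on failure. An MCP server would expose a
    function like this as a tool for the model to call."""
    args = shlex.split(command)
    result = subprocess.run(args, capture_output=True, text=True, timeout=timeout)
    return result.stdout if result.returncode == 0 else result.stderr
```

Running without `shell=True` and bounding execution with a timeout limits what a model-issued command can do, though a real server should add further sandboxing.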
If you encounter issues:

- Check your API keys in `.env`
- Verify that the server script path is correct
- Ensure the server implements the MCP protocol correctly
- Check the console output for error messages
- Make sure you have the required dependencies installed
This project is available under the MIT License.
You can build a standalone binary executable that can be distributed without requiring Python installation:
The project includes a build script that simplifies the process:
```shell
# Make the build script executable
chmod +x build.sh

# Run the build script
./build.sh
```

The executable will be created in the `dist` directory as `deepin-mcp`.
If you prefer to build manually:
1. Install PyInstaller using UV:

   ```shell
   uv pip install pyinstaller
   ```

2. Build the executable:

   ```shell
   pyinstaller --clean deepin-mcp.spec
   ```

3. The executable will be available at `dist/deepin-mcp`.
Once built, you can run the executable directly:
```shell
# Run with default settings
./dist/deepin-mcp

# Check version
./dist/deepin-mcp --version

# Specify a custom server
./dist/deepin-mcp --server path/to/your/server_script.py
```

The executable includes all necessary dependencies and can be distributed to systems without Python installed.
The application automatically discovers and loads available server scripts:
1. At startup, it searches for server scripts in these locations:

   - The current working directory
   - The `servers` subdirectory
   - The application's executable directory
   - The `servers` subdirectory within the executable directory

2. Available servers can be listed with:

   ```shell
   ./deepin-mcp --list-servers
   ```

3. A specific server can be selected at startup:

   ```shell
   ./deepin-mcp --server bash
   # or by path
   ./deepin-mcp --server /path/to/custom_server.py
   ```

4. Servers can be switched during runtime by typing `switch` at the prompt.
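The search order above can be sketched as follows (illustrative; the application's actual discovery code may differ in details such as ordering and filename matching):

```python
import os
import sys

def discover_servers():
    """Collect *server*.py scripts from the documented search locations:
    the working directory, its servers/ subdirectory, the executable's
    directory, and the executable's servers/ subdirectory."""
    exe_dir = os.path.dirname(os.path.abspath(sys.argv[0]))
    locations = [
        os.getcwd(),
        os.path.join(os.getcwd(), "servers"),
        exe_dir,
        os.path.join(exe_dir, "servers"),
    ]
    found = {}
    for directory in locations:
        if not os.path.isdir(directory):
            continue
        for name in sorted(os.listdir(directory)):
            if "server" in name and name.endswith(".py"):
                # setdefault means earlier search locations take priority
                found.setdefault(name, os.path.join(directory, name))
    return found
```

This also explains why custom servers must contain "server" in their filename: the discovery filter matches on that substring.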
When the application is packaged as a binary executable:
- Server scripts are automatically deployed to the `servers` directory next to the executable
- The application automatically finds and loads these servers
- Each server is accompanied by a wrapper script (e.g., `bash_server.wrapper.py`) that ensures proper dependency loading
- A dedicated Python environment with all necessary dependencies (including the `mcp` module) is created in the `server_env` directory
- Custom servers can be added by placing them in the `servers` directory with a filename containing "server" (e.g., `custom_server.py`)
- The `run_server.sh` helper script can be used to run any server directly
You can run servers directly using the provided helper script:
```shell
# Run a server by name
./dist/deepin-mcp/run_server.sh bash

# Run a server by file name
./dist/deepin-mcp/run_server.sh bash_server.py

# Run a server by full path
./dist/deepin-mcp/run_server.sh /path/to/your/custom_server.py
```

This is particularly useful for:
- Testing server functionality independently
- Running servers in separate terminals
- Using custom server scripts with the application
The packaged application includes a complete environment for the servers:
```
dist/deepin-mcp/
├── deepin-mcp                  # Main executable
├── run_server.sh               # Server helper script
├── server_env/                 # Python environment for servers
│   ├── mcp/                    # MCP module and dependencies
│   ├── openai/                 # OpenAI module
│   └── ...                     # Other dependencies
└── servers/
    ├── bash_server.py          # Original server script
    ├── bash_server.wrapper.py  # Wrapper that configures the path
    ├── file_server.py          # Original server script
    ├── file_server.wrapper.py  # Wrapper that configures the path
    └── ...
```
To add custom servers to a packaged application:
1. Create your server script following the MCP protocol

2. Place it in the `servers` directory of the packaged application

3. Create a wrapper script that follows the same pattern as the included wrappers, or run the following command inside the `dist/deepin-mcp` directory:

   ```shell
   cat > servers/myserver_server.wrapper.py << 'EOF'
   #!/usr/bin/env python
   # Wrapper script for myserver_server.py
   import os
   import sys

   # Add the server_env to the Python path
   server_env_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'server_env'))
   sys.path.insert(0, server_env_path)

   # Import the actual server code
   server_file = os.path.abspath(os.path.join(os.path.dirname(__file__), 'myserver_server.py'))
   with open(server_file, 'r') as f:
       server_code = f.read()

   # Execute the server code with the modified path
   exec(server_code)
   EOF
   chmod +x servers/myserver_server.wrapper.py
   ```

4. Run your custom server with the helper script or use it directly from the main application