diff --git a/README.md b/README.md
index 817a094..9081d59 100644
--- a/README.md
+++ b/README.md
@@ -1,59 +1,302 @@
# MCPHub
-A hub for Model Context Protocol (MCP) servers that enables you to manage and run MCP servers locally.
+MCPHub is an embeddable Model Context Protocol (MCP) solution for AI services. It integrates MCP servers into any AI framework, letting developers configure, set up, and manage those servers from within their applications. Whether you use OpenAI Agents, LangChain, or Autogen, MCPHub provides a unified way to connect your AI services to MCP tools and resources.
-## Installation
+## Quick Start
+### Prerequisites
+
+Ensure you have the following tools installed:
```bash
+# Install uv (Python package manager)
+curl -LsSf https://astral.sh/uv/install.sh | sh
+
+# Install git (for repository cloning)
+sudo apt-get install git # Ubuntu/Debian
+brew install git # macOS
+
+# npx ships with Node.js (npm 5.2+); verify it is available
+npx --version
+
+# Install MCPHub
pip install mcphub
```
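+
+To confirm the installation, import the package (a quick sanity check):
+
+```bash
+python -c "import mcphub; print('mcphub imported OK')"
+```
+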
-## Usage
+### Configuration
-### Command Line Interface
+Create a `.mcphub.json` file in your project root:
-MCPHub comes with a command-line interface for common operations:
+```json
+{
+ "mcpServers": {
+ "sequential-thinking-mcp": {
+ "package_name": "smithery-ai/server-sequential-thinking",
+ "command": "npx",
+ "args": [
+ "-y",
+ "@smithery/cli@latest",
+ "run",
+ "@smithery-ai/server-sequential-thinking"
+ ]
+ }
+ }
+}
+```
-```bash
-# Set up all configured MCP servers
-mcphub setup
+### Usage with OpenAI Agents
-# List available MCP servers
-mcphub list-servers
+```python
+import asyncio
+import json
+from agents import Agent, Runner
+from mcphub import MCPHub
-# List tools from all MCP servers
-mcphub list-tools
+async def main():
+ """
+ Example of using MCPHub to integrate MCP servers with OpenAI Agents.
+
+ This example demonstrates:
+ 1. Initializing MCPHub
+ 2. Fetching and using an MCP server
+ 3. Listing available tools
+ 4. Creating and running an agent with MCP tools
+ """
+
+ # Step 1: Initialize MCPHub
+ # MCPHub will automatically:
+ # - Find .mcphub.json in your project
+ # - Load server configurations
+ # - Set up servers (clone repos, run setup scripts if needed)
+ hub = MCPHub()
+
+ # Step 2: Create an MCP server instance using async context manager
+ # Parameters:
+ # - mcp_name: The name of the server from your .mcphub.json
+ # - cache_tools_list: Cache the tools list for better performance
+ async with hub.fetch_openai_mcp_server(
+ mcp_name="sequential-thinking-mcp",
+ cache_tools_list=True
+ ) as server:
+ # Step 3: List available tools from the MCP server
+ # This shows what capabilities are available to your agent
+ tools = await server.list_tools()
+
+ # Pretty print the tools for better readability
+ tools_dict = [
+ dict(tool) if hasattr(tool, "__dict__") else tool for tool in tools
+ ]
+ print("Available MCP Tools:")
+ print(json.dumps(tools_dict, indent=2))
-# List tools from a specific server
-mcphub list-tools --server azure-devops
+ # Step 4: Create an OpenAI Agent with MCP server
+ # The agent can now use all tools provided by the MCP server
+ agent = Agent(
+ name="Assistant",
+ instructions="Use the available tools to accomplish the given task",
+ mcp_servers=[server] # Provide the MCP server to the agent
+ )
+
+ # Step 5: Run your agent with a complex task
+ # The agent will automatically have access to all MCP tools
+ complex_task = """Please help me analyze the following complex problem:
+ We need to design a new feature for our product that balances user privacy
+ with data collection for improving the service. Consider the ethical implications,
+ technical feasibility, and business impact. Break down your thinking process
+ step by step, and provide a detailed recommendation with clear justification
+ for each decision point."""
+
+ # Execute the task and get the result
+ result = await Runner.run(agent, complex_task)
+ print("\nAgent Response:")
+        print(result.final_output)  # final_output holds the agent's final answer
-# Use the MCPHubAdapter
-mcphub adapter --config mcp_config.json --server azure-devops-mcp
+if __name__ == "__main__":
+ # Run the async main function
+ asyncio.run(main())
```
-### Using in code
+## Features and Guidelines
+
+### Server Configuration
+
+- **JSON-based Configuration**: Simple `.mcphub.json` configuration file
+- **Environment Variable Support**: Use environment variables in configuration
+- **Predefined Servers**: Access to a growing list of pre-configured MCP servers
+- **Custom Server Support**: Easy integration of custom MCP servers
+
+Configure your MCP servers in `.mcphub.json` (the comments below are annotations only; strict JSON does not allow comments, so omit them in your real file):
+
+```jsonc
+{
+ "mcpServers": {
+ // TypeScript-based MCP server using NPX
+ "sequential-thinking-mcp": {
+ "package_name": "smithery-ai/server-sequential-thinking", // NPM package name
+ "command": "npx", // Command to run server
+ "args": [ // Command arguments
+ "-y",
+ "@smithery/cli@latest",
+ "run",
+ "@smithery-ai/server-sequential-thinking"
+ ]
+ },
+ // Python-based MCP server from GitHub
+ "azure-storage-mcp": {
+ "package_name": "mashriram/azure_mcp_server", // Package identifier
+ "repo_url": "https://github.com/mashriram/azure_mcp_server", // GitHub repository
+ "command": "uv", // Python package manager
+ "args": ["run", "mcp_server_azure_cmd"], // Run command
+ "setup_script": "uv pip install -e .", // Installation script
+ "env": { // Environment variables
+ "AZURE_STORAGE_CONNECTION_STRING": "${AZURE_STORAGE_CONNECTION_STRING}",
+ "AZURE_STORAGE_CONTAINER_NAME": "${AZURE_STORAGE_CONTAINER_NAME}",
+ "AZURE_STORAGE_BLOB_NAME": "${AZURE_STORAGE_BLOB_NAME}"
+ }
+ }
+ }
+}
+```
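+
+The `${...}` placeholders above pull values from your environment; export the referenced variables before starting the server (placeholder values shown):
+
+```bash
+export AZURE_STORAGE_CONNECTION_STRING="<your-connection-string>"
+export AZURE_STORAGE_CONTAINER_NAME="<your-container>"
+export AZURE_STORAGE_BLOB_NAME="<your-blob>"
+```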
+
+### MCP Server Installation and Management
+
+- **Flexible Server Setup**: Supports both TypeScript and Python-based MCP servers
+- **Multiple Installation Sources**:
+ - NPM packages via `npx`
+ - Python packages via GitHub repository URLs
+  - Local development servers (see the sketch below)
+- **Automatic Setup**: Handles repository cloning, dependency installation, and server initialization
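+
+For a local development server, a hypothetical entry can reuse the same fields and point `command`/`args` at your working tree (the server name and run target below are made up for illustration):
+
+```jsonc
+{
+  "mcpServers": {
+    "my-local-mcp": {
+      "package_name": "my-local-mcp",         // hypothetical local package name
+      "command": "uv",                        // run with your local environment
+      "args": ["run", "my_local_mcp_server"]  // entry point in your checkout
+    }
+  }
+}
+```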
+
+### Transport Support
+
+- **stdio Transport**: Run MCP servers as local subprocesses (see the sketch below)
+- **Automatic Path Management**: Manages server paths and working directories
+- **Environment Variable Handling**: Configurable environment variables per server
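+
+Here, "stdio transport" means each server runs as a subprocess and exchanges MCP messages over stdin/stdout. A minimal sketch of that handshake using the `mcp` SDK directly (MCPHub performs the equivalent wiring for you):
+
+```python
+import asyncio
+from mcp import ClientSession, StdioServerParameters
+from mcp.client.stdio import stdio_client
+
+# Same command/args as the sequential-thinking entry in .mcphub.json
+params = StdioServerParameters(
+    command="npx",
+    args=["-y", "@smithery/cli@latest", "run", "@smithery-ai/server-sequential-thinking"],
+)
+
+async def main():
+    async with stdio_client(params) as (read, write):      # spawn the subprocess
+        async with ClientSession(read, write) as session:  # MCP session over stdio
+            await session.initialize()
+            tools = await session.list_tools()
+            print([tool.name for tool in tools.tools])
+
+asyncio.run(main())
+```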
+
+### Framework Integration
+
+Provides adapters for popular AI frameworks:
+- OpenAI Agents
+- LangChain
+- Autogen
+
```python
-import asyncio
-from mcphub import MCPHubAdapter, setup_all_servers, store_mcp, list_tools
-from dataclasses import asdict
+from agents import Agent
+from mcphub import MCPHub
+
+async def framework_examples():
+ hub = MCPHub()
+
+ # 1. OpenAI Agents Integration
+ async with hub.fetch_openai_mcp_server(
+ mcp_name="sequential-thinking-mcp",
+ cache_tools_list=True
+ ) as server:
+ # Use server with OpenAI agents
+ agent = Agent(
+ name="Assistant",
+ mcp_servers=[server]
+ )
+
+ # 2. LangChain Tools Integration
+ langchain_tools = await hub.fetch_langchain_mcp_tools(
+ mcp_name="sequential-thinking-mcp",
+ cache_tools_list=True
+ )
+ # Use tools with LangChain
+
+ # 3. Autogen Adapters Integration
+ autogen_adapters = await hub.fetch_autogen_mcp_adapters(
+ mcp_name="sequential-thinking-mcp"
+ )
+ # Use adapters with Autogen
+```
+
+### Tool Management
-# Initialize and set up servers
-async def init():
- await setup_all_servers()
- await store_mcp()
+- **Tool Discovery**: Automatically list and manage available tools from MCP servers
+- **Tool Caching**: Optional caching of tool lists for improved performance
+- **Framework-specific Adapters**: Convert MCP tools to framework-specific formats
+
+Discover and manage MCP server tools:
+
+```python
+from mcphub import MCPHub
+
+async def tool_management():
+ hub = MCPHub()
- # List all available tools
- tools = await list_tools()
- print(f"Available tools: {tools}")
+ # List all tools from a specific MCP server
+ tools = await hub.list_tools(mcp_name="sequential-thinking-mcp")
- # Use the adapter to get a specific server
- adapter = MCPHubAdapter().from_config("mcp_config.json", cache_path="cache")
- server = adapter.get_server("azure-devops-mcp")
+ # Print tool information
+ for tool in tools:
+ print(f"Tool Name: {tool.name}")
+ print(f"Description: {tool.description}")
+        print(f"Input Schema: {tool.inputSchema}")
+ print("---")
- if server:
- print(f"Server config: {server}")
+ # Tools can be:
+ # - Cached for better performance using cache_tools_list=True
+ # - Converted to framework-specific formats automatically
+ # - Used directly with AI frameworks through adapters
+```
+
+## MCPHub: High-Level Overview
+
+MCPHub simplifies the integration of Model Context Protocol (MCP) servers into AI applications through four main components:
+
+![MCPHub architecture overview](docs/simple_mcphub_work.png)
+
+### Core Components
+
+1. **Params Hub**
+ - Manages configurations from `.mcphub.json`
+ - Defines which MCP servers to use and how to set them up
+ - Stores server parameters like commands, arguments, and environment variables
+
+2. **MCP Servers Manager**
+ - Handles server installation and setup
+ - Supports two types of servers:
+ * TypeScript-based servers (installed via npx)
+ * Python-based servers (installed via uv from GitHub)
+ - Manages server lifecycle and environment
+
+3. **MCP Client**
+ - Establishes communication with MCP servers
+ - Uses stdio transport for server interaction
+ - Handles two main operations:
+ * `list_tools`: Discovers available server tools
+     * `call_tool`: Executes server tools (see the sketch after this list)
+
+4. **Framework Adapters**
+ - Converts MCP tools to framework-specific formats
+ - Supports multiple AI frameworks:
+ * OpenAI Agents
+ * LangChain
+ * Autogen
+
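+The MCP Client's two operations (`list_tools`, `call_tool`) are exposed on the server handle MCPHub returns; a minimal sketch (the tool name and arguments below are illustrative, not taken from a real server):
+
+```python
+from mcphub import MCPHub
+
+async def client_operations():
+    hub = MCPHub()
+    async with hub.fetch_openai_mcp_server(
+        mcp_name="sequential-thinking-mcp",
+        cache_tools_list=True
+    ) as server:
+        tools = await server.list_tools()  # discover the server's tools
+        result = await server.call_tool(   # execute one of them
+            "sequentialthinking",          # illustrative tool name
+            {"thought": "Frame the problem", "thoughtNumber": 1,
+             "totalThoughts": 3, "nextThoughtNeeded": True},  # illustrative args
+        )
+```
+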
+### Workflow
+
+1. **Configuration & Setup**
+ - Params Hub reads configuration
+ - Servers Manager sets up required servers
+ - Servers start and become available
+
+2. **Communication**
+ - MCP Client connects to servers via stdio
+ - Tools are discovered and made available
+ - Requests and responses flow between client and server
+
+3. **Integration**
+ - Framework adapters convert MCP tools
+ - AI applications use adapted tools through their preferred framework
+ - Tools are executed through the established communication channel
+
+This architecture provides a seamless way to integrate MCP capabilities into any AI application while maintaining clean separation of concerns and framework flexibility.
+
+## Contributing
+
+We welcome contributions! See the [Contributing Guide](CONTRIBUTING.md) for how to get started.
+
+## License
-# Run the async function
-asyncio.run(init())
-```
\ No newline at end of file
+This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
\ No newline at end of file
diff --git a/docs/mcphub_work.mmd b/docs/mcphub_work.mmd
new file mode 100644
index 0000000..2118991
--- /dev/null
+++ b/docs/mcphub_work.mmd
@@ -0,0 +1,79 @@
+graph TB
+ subgraph "MCPHub Components"
+ PH[Params Hub]
+ MS[MCP Servers]
+ MC[MCP Clients]
+ FA[Framework Adapters]
+ end
+
+ subgraph "Configuration"
+ CF[".mcphub.json"]
+        PS["Predefined Servers<br/>mcphub_preconfigured_servers.json"]
+ end
+
+ subgraph "Server Sources"
+        NPM["TypeScript-based<br/>NPM Packages<br/>(via npx)"]
+        GH["Python-based<br/>GitHub Repos<br/>(via uv)"]
+ end
+
+ subgraph "Framework Integration"
+ OA["OpenAI Agents"]
+ LC["LangChain"]
+ AG["Autogen"]
+ end
+
+ subgraph "MCP Server Runtime"
+ direction LR
+ ST["Server Tools"]
+ SE["Server Environment"]
+
+ subgraph "Client-Server Communication"
+ direction TB
+            REQ["Request<br/>(list_tools/call_tool)"]
+            RES["Response<br/>(tools/results)"]
+ REQ --> |"stdio"| RES
+ end
+ end
+
+ %% Flow for configuration and setup
+ CF --> PH
+ PS --> PH
+ PH --> |"Load Config"| MS
+ MS --> |"Install/Setup"| NPM
+ MS --> |"Clone & Install"| GH
+
+ %% Flow for client and tools
+ MS --> |"Start Server Process"| ST
+ MC --> |"Send Request"| REQ
+ RES --> |"Return Tools/Results"| MC
+ SE --> ST
+
+ %% Flow for framework integration
+ MC --> FA
+ FA --> |"Adapt"| OA
+ FA --> |"Adapt"| LC
+ FA --> |"Adapt"| AG
+
+ %% Application usage
+ APP["AI Application"] --> |"Use"| OA
+ APP --> |"Use"| LC
+ APP --> |"Use"| AG
+
+ %% Tool execution flow
+ OA --> |"Execute Tools"| MC
+ LC --> |"Execute Tools"| MC
+ AG --> |"Execute Tools"| MC
+
+ classDef config fill:#f9f,stroke:#333,stroke-width:2px
+ classDef source fill:#bbf,stroke:#333,stroke-width:2px
+ classDef framework fill:#bfb,stroke:#333,stroke-width:2px
+ classDef runtime fill:#fbb,stroke:#333,stroke-width:2px
+ classDef component fill:#fff,stroke:#333,stroke-width:4px
+ classDef communication fill:#ff9,stroke:#333,stroke-width:2px
+
+ class CF,PS config
+ class NPM,GH source
+ class OA,LC,AG framework
+ class ST,SE runtime
+ class PH,MS,MC,FA component
+ class REQ,RES communication
\ No newline at end of file
diff --git a/docs/simple_mcphub_work.mmd b/docs/simple_mcphub_work.mmd
new file mode 100644
index 0000000..92f3f33
--- /dev/null
+++ b/docs/simple_mcphub_work.mmd
@@ -0,0 +1,45 @@
+graph TB
+ subgraph "MCPHub"
+        PH["Params Hub<br/>.mcphub.json"]
+        MS["MCP Servers<br/>Manager"]
+ MC["MCP Client"]
+        FA["Framework<br/>Adapters"]
+ end
+
+ subgraph "Server Sources"
+        NPM["TypeScript NPM<br/>(npx)"]
+        GH["Python GitHub<br/>(uv)"]
+ end
+
+ subgraph "MCP Server"
+ ST["Server Tools"]
+        COM["stdio Transport<br/>list_tools/call_tool"]
+ end
+
+ subgraph "AI Application"
+ APP["Agent Application"]
+        FW["OpenAI/LangChain/<br/>Autogen"]
+ end
+
+ %% Main flows
+ PH --> |"Configure"| MS
+ MS --> |"Setup"| NPM
+ MS --> |"Setup"| GH
+ NPM & GH --> |"Start"| ST
+
+ MC <--> |"Request/Response"| COM
+ COM <--> ST
+
+ MC --> FA
+ FA --> FW
+ APP --> FW
+
+ classDef mcphub fill:#bbf,stroke:#333,stroke-width:4px
+ classDef server fill:#bfb,stroke:#333,stroke-width:2px
+ classDef app fill:#fbb,stroke:#333,stroke-width:2px
+ classDef source fill:#fff,stroke:#333,stroke-width:2px
+
+ class PH,MS,MC,FA mcphub
+ class ST,COM server
+ class APP,FW app
+ class NPM,GH source
\ No newline at end of file
diff --git a/docs/simple_mcphub_work.png b/docs/simple_mcphub_work.png
new file mode 100644
index 0000000..13ac4e4
Binary files /dev/null and b/docs/simple_mcphub_work.png differ
diff --git a/src/mcphub/mcp_servers/servers.py b/src/mcphub/mcp_servers/servers.py
index 6f9a146..8f454cb 100644
--- a/src/mcphub/mcp_servers/servers.py
+++ b/src/mcphub/mcp_servers/servers.py
@@ -192,3 +192,16 @@ async def make_autogen_mcp_adapters(self, mcp_name: str) -> List[StdioMcpToolAda
adapter = await StdioMcpToolAdapter.from_server_params(server_params, tool.name)
adapters.append(adapter)
return adapters
+
+ async def list_tools(self, mcp_name: str) -> List[BaseTool]:
+ """
+ List all tools from an MCP server.
+
+ Args:
+ mcp_name: The name of the MCP server configuration to use
+
+ Returns:
+ List[BaseTool]: List of tools provided by the MCP server
+ """
+ async with self.make_openai_mcp_server(mcp_name, cache_tools_list=True) as server:
+ return await server.list_tools()
diff --git a/src/mcphub/mcphub.py b/src/mcphub/mcphub.py
index 0a55125..2f0c5b6 100644
--- a/src/mcphub/mcphub.py
+++ b/src/mcphub/mcphub.py
@@ -68,4 +68,16 @@ async def fetch_autogen_mcp_adapters(self, mcp_name: str) -> List[StdioMcpToolAd
Returns:
StdioMcpToolAdapter: The configured MCP adapter
"""
- return await self.servers.make_autogen_mcp_adapters(mcp_name)
\ No newline at end of file
+ return await self.servers.make_autogen_mcp_adapters(mcp_name)
+
+ async def list_tools(self, mcp_name: str) -> List[BaseTool]:
+ """
+ List all tools from an MCP server.
+
+ Args:
+ mcp_name: The name of the MCP server configuration to use
+
+ Returns:
+ List[BaseTool]: List of tools provided by the MCP server
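+
+        Example:
+            hub = MCPHub()
+            tools = await hub.list_tools("sequential-thinking-mcp")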
+ """
+ return await self.servers.list_tools(mcp_name)
\ No newline at end of file
diff --git a/src/mcphub/test.py b/src/mcphub/test.py
index 17d863f..5ca75c9 100644
--- a/src/mcphub/test.py
+++ b/src/mcphub/test.py
@@ -1,26 +1,67 @@
import asyncio
-from dataclasses import asdict
-from mcphub import MCPHub # Import MCPHub from mcphub.py
-from agents.mcp import MCPServerStdio
import json
-
-# Create an instance of MCPHub, which automatically loads the configuration
-mcphub = MCPHub()
-sequential_thinking_server = mcphub.fetch_server_params("sequential-thinking-mcp")
-
-print(f"Using MCP server: {sequential_thinking_server}")
+from agents import Agent, Runner
+from mcphub import MCPHub
async def main():
- async with MCPServerStdio(
- cache_tools_list=True, # Cache the tools list, for demonstration
- params=asdict(sequential_thinking_server), # Use the MCP server configuration
+ """
+ Example of using MCPHub to integrate MCP servers with OpenAI Agents.
+
+ This example demonstrates:
+ 1. Initializing MCPHub
+ 2. Fetching and using an MCP server
+ 3. Listing available tools
+ 4. Creating and running an agent with MCP tools
+ """
+
+ # Step 1: Initialize MCPHub
+ # MCPHub will automatically:
+ # - Find .mcphub.json in your project
+ # - Load server configurations
+ # - Set up servers (clone repos, run setup scripts if needed)
+ hub = MCPHub()
+
+ # Step 2: Create an MCP server instance using async context manager
+ # Parameters:
+ # - mcp_name: The name of the server from your .mcphub.json
+ # - cache_tools_list: Cache the tools list for better performance
+ async with hub.fetch_openai_mcp_server(
+ mcp_name="sequential-thinking-mcp",
+ cache_tools_list=True
) as server:
+ # Step 3: List available tools from the MCP server
+ # This shows what capabilities are available to your agent
tools = await server.list_tools()
+
+ # Pretty print the tools for better readability
tools_dict = [
dict(tool) if hasattr(tool, "__dict__") else tool for tool in tools
]
- print("Tools available:")
+ print("Available MCP Tools:")
print(json.dumps(tools_dict, indent=2))
+ # Step 4: Create an OpenAI Agent with MCP server
+ # The agent can now use all tools provided by the MCP server
+ agent = Agent(
+ name="Assistant",
+ instructions="Use the available tools to accomplish the given task",
+ mcp_servers=[server] # Provide the MCP server to the agent
+ )
+
+ # Step 5: Run your agent with a complex task
+ # The agent will automatically have access to all MCP tools
+ complex_task = """Please help me analyze the following complex problem:
+ We need to design a new feature for our product that balances user privacy
+ with data collection for improving the service. Consider the ethical implications,
+ technical feasibility, and business impact. Break down your thinking process
+ step by step, and provide a detailed recommendation with clear justification
+ for each decision point."""
+
+ # Execute the task and get the result
+ result = await Runner.run(agent, complex_task)
+ print("\nAgent Response:")
+        print(result.final_output)  # final_output holds the agent's final answer
+
if __name__ == "__main__":
+ # Run the async main function
asyncio.run(main())
\ No newline at end of file