
Commit 515e008

docs mcp

committed · 1 parent 45a7d1a · commit 515e008

1 file changed: +179 −2 lines changed


docs/my-website/docs/mcp.md

Lines changed: 179 additions & 2 deletions
@@ -25,6 +25,185 @@ LiteLLM Proxy provides an MCP Gateway that allows you to use a fixed endpoint fo

## Using your MCP

<Tabs>
<TabItem value="openai" label="OpenAI API">

#### Connect via OpenAI Responses API

Use the OpenAI Responses API to connect to your LiteLLM MCP server:

```bash title="cURL Example" showLineNumbers
curl --location 'https://api.openai.com/v1/responses' \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer $OPENAI_API_KEY" \
--data '{
    "model": "gpt-4o",
    "tools": [
        {
            "type": "mcp",
            "server_label": "litellm",
            "server_url": "<your-litellm-proxy-base-url>/mcp",
            "require_approval": "never",
            "headers": {
                "x-litellm-api-key": "YOUR_LITELLM_API_KEY"
            }
        }
    ],
    "input": "Run available tools",
    "tool_choice": "required"
}'
```
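
The same request can be made from Python with the OpenAI SDK's Responses API. A minimal sketch, assuming the `openai` package is installed and `OPENAI_API_KEY` is exported; the proxy URL and LiteLLM key placeholders are the same values used in the cURL example above:

```python title="OpenAI Python SDK Example (sketch)" showLineNumbers
from openai import OpenAI

# Reads OPENAI_API_KEY from the environment
client = OpenAI()

# Same MCP tool definition as the cURL example above
response = client.responses.create(
    model="gpt-4o",
    tools=[
        {
            "type": "mcp",
            "server_label": "litellm",
            "server_url": "<your-litellm-proxy-base-url>/mcp",
            "require_approval": "never",
            "headers": {
                "x-litellm-api-key": "YOUR_LITELLM_API_KEY"
            }
        }
    ],
    input="Run available tools",
    tool_choice="required",
)

print(response.output_text)
```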

</TabItem>

<TabItem value="litellm" label="LiteLLM Proxy">

#### Connect via LiteLLM Proxy Responses API

Use this when calling LiteLLM Proxy for LLM API requests to the `/v1/responses` endpoint.

```bash title="cURL Example" showLineNumbers
curl --location '<your-litellm-proxy-base-url>/v1/responses' \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer $LITELLM_API_KEY" \
--data '{
    "model": "gpt-4o",
    "tools": [
        {
            "type": "mcp",
            "server_label": "litellm",
            "server_url": "<your-litellm-proxy-base-url>/mcp",
            "require_approval": "never",
            "headers": {
                "x-litellm-api-key": "YOUR_LITELLM_API_KEY"
            }
        }
    ],
    "input": "Run available tools",
    "tool_choice": "required"
}'
```
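
The same call works from the OpenAI Python SDK by pointing its base URL at your LiteLLM Proxy. A minimal sketch; the base URL and key placeholders are assumptions to replace with your own values:

```python title="OpenAI SDK via LiteLLM Proxy (sketch)" showLineNumbers
from openai import OpenAI

# Point the OpenAI SDK at your LiteLLM Proxy instead of api.openai.com
# (depending on your deployment, you may need to append /v1 to the base URL)
client = OpenAI(
    base_url="<your-litellm-proxy-base-url>",
    api_key="YOUR_LITELLM_API_KEY",
)

response = client.responses.create(
    model="gpt-4o",
    tools=[
        {
            "type": "mcp",
            "server_label": "litellm",
            "server_url": "<your-litellm-proxy-base-url>/mcp",
            "require_approval": "never",
            "headers": {
                "x-litellm-api-key": "YOUR_LITELLM_API_KEY"
            }
        }
    ],
    input="Run available tools",
    tool_choice="required",
)

print(response.output_text)
```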

</TabItem>

<TabItem value="cursor" label="Cursor IDE">

#### Connect via Cursor IDE

Use tools directly from Cursor IDE with LiteLLM MCP:

**Setup Instructions:**

1. **Open Cursor Settings**: Use `⇧+⌘+J` (Mac) or `Ctrl+Shift+J` (Windows/Linux)
2. **Navigate to MCP Tools**: Go to the "MCP Tools" tab and click "New MCP Server"
3. **Add Configuration**: Copy and paste the JSON configuration below, then save with `Cmd+S` or `Ctrl+S`

```json title="Cursor MCP Configuration" showLineNumbers
{
  "mcpServers": {
    "LiteLLM": {
      "url": "<your-litellm-proxy-base-url>/mcp",
      "headers": {
        "x-litellm-api-key": "$LITELLM_API_KEY"
      }
    }
  }
}
```

</TabItem>

<TabItem value="http" label="Streamable HTTP">

#### Connect via Streamable HTTP Transport

Connect to LiteLLM MCP using HTTP transport. Compatible with any MCP client that supports HTTP streaming:

**Server URL:**

```text showLineNumbers
<your-litellm-proxy-base-url>/mcp
```

**Headers:**

```text showLineNumbers
x-litellm-api-key: YOUR_LITELLM_API_KEY
```

This URL can be used with any MCP client that supports HTTP transport. Refer to your client documentation to determine the appropriate transport method.
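
For example, the official MCP Python SDK (the `mcp` package on PyPI) can connect over streamable HTTP as sketched below. This is a minimal sketch based on that SDK's client API; the import path and the `headers` argument are assumptions about the `mcp` package, not part of LiteLLM itself:

```python title="MCP Python SDK Example (sketch)" showLineNumbers
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main():
    # Open a streamable HTTP connection to the LiteLLM MCP endpoint,
    # passing the LiteLLM key in the x-litellm-api-key header
    async with streamablehttp_client(
        "<your-litellm-proxy-base-url>/mcp",
        headers={"x-litellm-api-key": "YOUR_LITELLM_API_KEY"},
    ) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


if __name__ == "__main__":
    asyncio.run(main())
```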

</TabItem>

<TabItem value="fastmcp" label="Python FastMCP">

#### Connect via Python FastMCP Client

Use the Python FastMCP client to connect to your LiteLLM MCP server:

**Installation:**

```bash title="Install FastMCP" showLineNumbers
pip install fastmcp
```

or with uv:

```bash title="Install with uv" showLineNumbers
uv pip install fastmcp
```

**Usage:**

```python title="Python FastMCP Example" showLineNumbers
import asyncio
import json

from fastmcp import Client
from fastmcp.client.transports import StreamableHttpTransport

# Create the transport with your LiteLLM MCP server URL
server_url = "<your-litellm-proxy-base-url>/mcp"
transport = StreamableHttpTransport(
    server_url,
    headers={
        "x-litellm-api-key": "YOUR_LITELLM_API_KEY"
    }
)

# Initialize the client with the transport
client = Client(transport=transport)


async def main():
    # Connection is established here
    print("Connecting to LiteLLM MCP server...")
    async with client:
        print(f"Client connected: {client.is_connected()}")

        # Make MCP calls within the context
        print("Fetching available tools...")
        tools = await client.list_tools()

        print(f"Available tools: {json.dumps([t.name for t in tools], indent=2)}")

        # Example: Call a tool (replace 'tool_name' with an actual tool name)
        if tools:
            tool_name = tools[0].name
            print(f"Calling tool: {tool_name}")

            # Call the tool with appropriate arguments
            result = await client.call_tool(tool_name, arguments={})
            print(f"Tool result: {result}")


# Run the example
if __name__ == "__main__":
    asyncio.run(main())
```

</TabItem>
</Tabs>

## MCP Permission Management

@@ -49,8 +228,6 @@ When MCP clients connect to LiteLLM's MCP Gateway they can run the following MCP
2. Call Tools: Call a specific MCP tool with the provided arguments

-#### Usage

#### 1. Define your tools under `mcp_servers` in your config.yaml file.

LiteLLM allows you to define your tools in the `mcp_servers` section of your config.yaml file. All tools listed here will be available to MCP clients (when they connect to LiteLLM and call `list_tools`).
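
A minimal sketch of what that section might look like; the server names and URLs below are hypothetical placeholders, and the exact set of supported fields depends on your LiteLLM version:

```yaml title="config.yaml (sketch)" showLineNumbers
mcp_servers:
  # Hypothetical entries - replace the names and URLs with your own MCP servers
  zapier_mcp:
    url: "https://actions.zapier.com/mcp/<your-key>/sse"
  internal_tools_mcp:
    url: "https://my-mcp-server.example.com/mcp"
```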
