Merged
1 change: 1 addition & 0 deletions .gitignore
@@ -9,3 +9,4 @@ test/
build/
scripts/
dist/
contribute/output/
15 changes: 13 additions & 2 deletions README.md
@@ -17,10 +17,12 @@ This architecture allows language models to:
Currently supports:
- Claude 3.5 Sonnet (claude-3-5-sonnet-20240620)
- Any Ollama-compatible model with function calling support
- Any OpenAI-compatible local or online model with function calling support


## Features ✨

- Interactive conversations with either Claude 3.5 Sonnet or Ollama models
- Interactive conversations with supported models
- Support for multiple concurrent MCP servers
- Dynamic tool discovery and integration
- Tool calling capabilities for both model types
@@ -53,6 +55,10 @@ ollama pull mistral
ollama serve
```

3. OpenAI-Compatible Setup
- Obtain your API server base URL, API key, and model name


## Installation 📦

```bash
@@ -99,7 +105,7 @@ MCPHost is a CLI tool that allows you to interact with various AI models through
### Available Models
Models can be specified using the `--model` (`-m`) flag:
- Anthropic Claude (default): `anthropic:claude-3-5-sonnet-latest`
- OpenAI: `openai:gpt-4`
- OpenAI or OpenAI-compatible: `openai:gpt-4`
- Ollama models: `ollama:modelname`

### Examples
@@ -109,6 +115,11 @@ mcphost -m ollama:qwen2.5:3b

# Use OpenAI's GPT-4
mcphost -m openai:gpt-4

# Use OpenAI-compatible model
mcphost --model openai:<your-model-name> \
--openai-url <your-base-url> \
--openai-api-key <your-api-key>
```

### Flags
4 changes: 4 additions & 0 deletions contribute/boost.sh
@@ -0,0 +1,4 @@
./output/mcphost --model openai:<your-model-name> \
--openai-url <your-base-url> \
--openai-api-key <your-api-key> \
--config ./conf/demo.json --debug
6 changes: 6 additions & 0 deletions contribute/build.sh
@@ -0,0 +1,6 @@
#!/bin/bash

RUN_NAME="mcphost"

mkdir -p output
go build -o output/${RUN_NAME}
22 changes: 22 additions & 0 deletions contribute/conf/demo.json
@@ -0,0 +1,22 @@
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "./"
      ]
    },
    "weather": {
      "command": "uv",
      "args": [
        "--directory",
        "/Users/bytedance/code/MCP/my_host/weather",
        "run",
        "weather.py"
      ]
    }
  }
}

31 changes: 31 additions & 0 deletions contribute/contribute.md
@@ -0,0 +1,31 @@
# Contribute README
Thanks for your contribution! You can follow these steps to run this repo and debug it.
## Run demo
1. Clone this repo to your working directory.
```bash
git clone https://github.com/mark3labs/mcphost.git
```

2. Enter the `contribute` directory.
```bash
cd mcphost/contribute
```

3. Run `build.sh` to build the binary.
```bash
./build.sh
```

4. Open `boost.sh` and fill in your model info.
```bash
cat boost.sh
vi boost.sh
```

5. Run `boost.sh` to start mcphost. If you don't want to run in debug mode, delete the `--debug` flag in `boost.sh`.
```bash
./boost.sh
```

## Contribute your code
Just write your code, push it, and open a pull request.
3 changes: 0 additions & 3 deletions pkg/llm/openai/client.go
@@ -6,7 +6,6 @@ import (
	"encoding/json"
	"fmt"
	"net/http"
	"strings"
)

type Client struct {
@@ -18,8 +17,6 @@ type Client struct {
func NewClient(apiKey string, baseURL string) *Client {
	if baseURL == "" {
		baseURL = "https://api.openai.com/v1"
	} else if !strings.HasSuffix(baseURL, "/v1") {
		baseURL = strings.TrimSuffix(baseURL, "/") + "/v1"
	}
	return &Client{
		apiKey: apiKey,
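With the `/v1` normalization removed, the base URL passed to `NewClient` is used verbatim; only an empty string falls back to the OpenAI default. A minimal sketch of the resulting behavior (`newBaseURL` is a hypothetical helper mirroring just the defaulting logic, not a function in the repo):

```go
package main

import "fmt"

// newBaseURL mirrors the defaulting logic after this change:
// only an empty baseURL is rewritten; everything else is kept as-is,
// so an OpenAI-compatible endpoint must include its full path (e.g. /v1).
func newBaseURL(baseURL string) string {
	if baseURL == "" {
		return "https://api.openai.com/v1"
	}
	return baseURL
}

func main() {
	fmt.Println(newBaseURL(""))                         // default OpenAI endpoint
	fmt.Println(newBaseURL("http://localhost:8000/v1")) // used verbatim
	fmt.Println(newBaseURL("http://localhost:8000"))    // no /v1 appended anymore
}
```

In practice this means a user of an OpenAI-compatible server now has to pass the complete base URL, including any `/v1` prefix, via `--openai-url`.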
13 changes: 7 additions & 6 deletions pkg/llm/openai/types.go
@@ -9,12 +9,13 @@ type CreateRequest struct {
}

type MessageParam struct {
	Role         string        `json:"role"`
	Content      *string       `json:"content"`
	FunctionCall *FunctionCall `json:"function_call,omitempty"`
	ToolCalls    []ToolCall    `json:"tool_calls,omitempty"`
	Name         string        `json:"name,omitempty"`
	ToolCallID   string        `json:"tool_call_id,omitempty"`
	Role             string        `json:"role"`
	Content          *string       `json:"content"`
	ReasoningContent *string       `json:"reasoning_content"`
	FunctionCall     *FunctionCall `json:"function_call,omitempty"`
	ToolCalls        []ToolCall    `json:"tool_calls,omitempty"`
	Name             string        `json:"name,omitempty"`
	ToolCallID       string        `json:"tool_call_id,omitempty"`
}

type ToolCall struct {