Merged
8 changes: 5 additions & 3 deletions .env.template
@@ -1,8 +1,10 @@

# https://console.anthropic.com/settings/keys
ANTHROPIC_API_KEY=sk-ant-...
# https://platform.openai.com/api-keys
OPENAI_API_KEY=sk-proj-...
# https://aistudio.google.com/apikey
GOOGLE_API_KEY=AI...
# GROQ_API_KEY=gsk_...

# GITHUB_PERSONAL_ACCESS_TOKEN=github_pat_...
# BRAVE_API_KEY=BSA...
# GITHUB_PERSONAL_ACCESS_TOKEN=github_pat_...
# NOTION_INTEGRATION_SECRET=ntn_...
31 changes: 15 additions & 16 deletions README.md
@@ -4,10 +4,17 @@
**Quickly test and explore MCP servers from the command line!**

A simple, text-based CLI client for [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers built with LangChain and TypeScript.
It performs automatic schema adjustments for LLM compatibility, and is suitable for testing MCP servers, exploring their capabilities, and prototyping integrations.

Internally it uses [LangChain ReAct Agent](https://github.com/langchain-ai/react-agent-js) and
a utility function `convertMcpToLangchainTools()` from [`@h1deya/langchain-mcp-tools`](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).
a utility function `convertMcpToLangchainTools()` from
[`@h1deya/langchain-mcp-tools`](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).
This function performs the aforementioned MCP tool schema transformations for LLM compatibility.
See [this page](https://github.com/hideya/langchain-mcp-tools-ts/blob/main/README.md#llm-provider-schema-compatibility)
for details.
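
To illustrate the kind of schema transformation described above, here is a hypothetical sketch (NOT the actual `convertMcpToLangchainTools()` implementation): some providers reject certain JSON Schema keywords in tool definitions, so a converter can strip them recursively before handing the tools to the LLM. The specific keys shown (`$schema`, `additionalProperties`) are assumed examples only.

```typescript
// Hypothetical sketch of an MCP tool schema adjustment for LLM
// compatibility. The set of stripped keys is illustrative; the real
// library's transformations differ per provider.

type Json = string | number | boolean | null | Json[] | { [key: string]: Json };

// Assumed example keys that some providers reject in tool schemas
const UNSUPPORTED_KEYS = new Set(["$schema", "additionalProperties"]);

function sanitizeSchema(schema: Json): Json {
  if (Array.isArray(schema)) {
    return schema.map(sanitizeSchema);
  }
  if (schema !== null && typeof schema === "object") {
    const out: { [key: string]: Json } = {};
    for (const [key, value] of Object.entries(schema)) {
      if (UNSUPPORTED_KEYS.has(key)) continue; // drop keys a provider rejects
      out[key] = sanitizeSchema(value); // recurse into nested schemas
    }
    return out;
  }
  return schema; // primitives pass through unchanged
}
```

A converter like this would be applied to each MCP tool's `inputSchema` before registering it with the chat model.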

A Python equivalent of this utility is available [here](https://pypi.org/project/mcp-chat/).

## Prerequisites

@@ -18,7 +25,7 @@ a utility function `convertMcpToLangchainTools()` from [`@h1deya/langchain-mcp-t
[OpenAI](https://platform.openai.com/api-keys),
[Anthropic](https://console.anthropic.com/settings/keys),
and/or
[Google GenAI](https://aistudio.google.com/apikey)
[Google AI Studio (for GenAI/Gemini)](https://aistudio.google.com/apikey)
as needed

## Quick Start
@@ -43,7 +50,7 @@ a utility function `convertMcpToLangchainTools()` from [`@h1deya/langchain-mcp-t
// "model_provider": "anthropic",
// "model": "claude-3-5-haiku-latest",
// "model_provider": "google_genai",
// "model": "gemini-2.0-flash",
// "model": "gemini-2.5-flash",
},

"mcp_servers": {
@@ -128,9 +135,9 @@ mcp-try-cli --help

## Supported LLM Providers

- **OpenAI**: `gpt-4o`, `gpt-4o-mini`, etc.
- **OpenAI**: `o4-mini`, `gpt-4o-mini`, etc.
- **Anthropic**: `claude-sonnet-4-0`, `claude-3-5-haiku-latest`, etc.
- **Google (GenAI)**: `gemini-2.0-flash`, `gemini-1.5-pro`, etc.
- **Google (GenAI)**: `gemini-2.5-pro`, `gemini-2.5-flash`, etc.

## Configuration

@@ -153,7 +160,7 @@ Create a `llm_mcp_config.json5` file:
{
"llm": {
"model_provider": "openai",
"model": "gpt-4o-mini",
"model": "gpt-4.1-nano",
// model: "o4-mini",
},

@@ -165,8 +172,8 @@ Create a `llm_mcp_config.json5` file:

// "llm": {
// "model_provider": "google_genai",
// "model": "gemini-2.0-flash",
// // "model": "gemini-2.5-pro-preview-06-05",
// "model": "gemini-2.5-flash",
// // "model": "gemini-2.5-pro",
// }

"example_queries": [
@@ -246,14 +253,6 @@ There are quite a few useful MCP servers already available:
- Use `--verbose` flag for detailed output
- Refer to [MCP documentation](https://modelcontextprotocol.io/)

## Development

This tool is built with:
- [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
- [LangChain](https://langchain.com/) for LLM integration
- [TypeScript](https://www.typescriptlang.org/) for type safety
- [Yargs](https://yargs.js.org/) for CLI parsing

## License

MIT License - see [LICENSE](LICENSE) file for details.
2 changes: 1 addition & 1 deletion README_DEV.md
@@ -60,7 +60,7 @@ For the convenience of debugging MCP servers, this client prints local (stdio) M

LLMs from Anthropic, OpenAI and Google (GenAI) are currently supported.

A python version of this MCP client is available
A Python version of this MCP client is available
[here](https://github.com/hideya/mcp-client-langchain-py)

## Prerequisites
52 changes: 32 additions & 20 deletions llm_mcp_config.json5
@@ -15,32 +15,33 @@
// // "max_tokens": 10000,
// },

"llm": {
// https://platform.openai.com/docs/pricing
// https://platform.openai.com/settings/organization/billing/overview
"model_provider": "openai",
"model": "gpt-4o-mini",
// "model": "o4-mini",
// "temperature": 0.0, // 'temperature' is not supported with "o4-mini"
// "max_completion_tokens": 10000, // Use 'max_completion_tokens' instead of 'max_tokens'
},

// "llm": {
// // https://ai.google.dev/gemini-api/docs/pricing
// // https://console.cloud.google.com/billing
// "model_provider": "google_genai",
// "model": "gemini-2.0-flash",
// // "model": "gemini-1.5-pro",
// // "temperature": 0.0,
// // "max_tokens": 10000,
// // https://platform.openai.com/docs/pricing
// // https://platform.openai.com/settings/organization/billing/overview
// "model_provider": "openai",
// "model": "gpt-4.1-nano",
// // "model": "o4-mini",
// // "temperature": 0.0, // 'temperature' is not supported with "o4-mini"
// // "max_completion_tokens": 10000, // Use 'max_completion_tokens' instead of 'max_tokens'
// },

"llm": {
// https://ai.google.dev/gemini-api/docs/pricing
// https://console.cloud.google.com/billing
"model_provider": "google_genai",
"model": "gemini-2.5-flash",
// "model": "gemini-2.5-pro",
// "temperature": 0.0,
// "max_tokens": 10000,
},

"example_queries": [
"Are there any weather alerts in California?",
"Read the news headlines on bbc.com",
"Read and briefly summarize the LICENSE file",
"Are there any weather alerts in California?",
// "What's the news from Tokyo today?",
// "Open the webpage at bbc.com",
// "Search the web and get today's news related to tokyo",
// "Tell me about my Notion account",
],

"mcp_servers": {
@@ -82,7 +83,7 @@
// },

// // Test SSE connection with the auto fallback
// // See the comments at the top of index.ts
// // See the comments at the top of src/index.ts
// weather: {
// "url": "http://localhost:${SSE_SERVER_PORT}/sse"
// },
@@ -112,5 +113,16 @@
// "args": [ "-y", "@modelcontextprotocol/server-brave-search"],
// "env": { "BRAVE_API_KEY": "${BRAVE_API_KEY}" }
// },

// notion: {
// "command": "npx",
// "args": ["-y", "@notionhq/notion-mcp-server"],
// "env": {
// // Although the following implies that this MCP server is designed for
// // OpenAI LLMs, it works fine with other models.
// // Tested with Claude and Gemini (with schema adjustments).
// "OPENAPI_MCP_HEADERS": '{"Authorization": "Bearer ${NOTION_INTEGRATION_SECRET}", "Notion-Version": "2022-06-28"}'
// },
// },
}
}
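
Several config entries above embed `${VAR}` placeholders (e.g. `${BRAVE_API_KEY}`, `${NOTION_INTEGRATION_SECRET}`) that are resolved from environment variables loaded via `.env`. A minimal sketch of such expansion logic — illustrative only, not the client's actual implementation:

```typescript
// Illustrative sketch: expand "${VAR}" placeholders in a config value
// from a map of environment variables. The client's real resolution
// logic may differ (this is an assumption for illustration).

function expandEnvVars(
  value: string,
  env: Record<string, string | undefined>
): string {
  return value.replace(/\$\{([A-Za-z_][A-Za-z0-9_]*)\}/g, (_match, name) => {
    const resolved = env[name];
    if (resolved === undefined) {
      throw new Error(`Environment variable ${name} is not set`);
    }
    return resolved;
  });
}

// Example: expanding the Notion auth header template from the config
const header = expandEnvVars(
  "Bearer ${NOTION_INTEGRATION_SECRET}",
  { NOTION_INTEGRATION_SECRET: "ntn_example" }
);
console.log(header); // Bearer ntn_example
```

Failing fast on an unset variable surfaces a missing `.env` entry immediately instead of sending a literal `${...}` string to the server.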