manishiitg/mcpagent

MCP Agent - Go Library


A production-ready Go library for building MCP (Model Context Protocol) agents that connect to multiple MCP servers and execute tools using LLMs. This is a fully independent package that can be used in any Go application.

🎯 What is MCP Agent?

MCP Agent is a Go library that provides a complete framework for building AI agents that interact with MCP servers. It handles:

  • Multi-Server MCP Connections: Connect to multiple MCP servers simultaneously (HTTP, SSE, stdio protocols)
  • LLM Integration: Works with OpenAI, AWS Bedrock, Google Vertex AI, and other LLM providers
  • Tool Execution: Automatic tool discovery, execution, and result handling
  • Code Execution Mode: Execute Go code instead of JSON tool calls for complex workflows
  • Smart Routing: Dynamically filter tools based on conversation context
  • Large Output Handling: Automatically handle tool outputs that exceed context limits
  • Observability: Built-in tracing with Langfuse support
  • Caching: Intelligent caching of MCP server metadata and tool definitions

πŸš€ Quick Start

Installation

# Add to your go.mod
go get mcpagent

# Or use replace directive for local development
replace mcpagent => ../mcpagent

Basic Usage

package main

import (
    "context"
    "fmt"
    "log"
    "os"
    "time"

    mcpagent "mcpagent/agent"
    "mcpagent/llm"
)

func main() {
    // Initialize LLM (API key read from the environment)
    openAIKey := os.Getenv("OPENAI_API_KEY")
    llmModel, err := llm.InitializeLLM(llm.Config{
        Provider: llm.ProviderOpenAI,
        ModelID:  "gpt-4.1",
        APIKeys: &llm.ProviderAPIKeys{
            OpenAI: &openAIKey,
        },
    })
    if err != nil {
        log.Fatal(err)
    }

    // Create agent
    ctx, cancel := context.WithTimeout(context.Background(), 5*time.Minute)
    defer cancel()

    agent, err := mcpagent.NewAgent(
        ctx,
        llmModel,
        "",                 // server name (empty = all servers)
        "mcp_servers.json", // MCP config path
        "gpt-4.1",          // model ID
        nil,                // tracer (optional)
        "",                 // trace ID
        nil,                // logger (optional)
    )
    if err != nil {
        log.Fatal(err)
    }

    // Ask a question
    response, err := agent.Ask(ctx, "What tools are available?")
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(response)
}

See examples/ for complete working examples.

πŸ“š Core Features

1. Standard Tool-Use Agent

The default mode where the LLM invokes tools directly through native tool calling:

agent, err := mcpagent.NewAgent(
    ctx, llmModel, "", "config.json", "model-id",
    nil, "", nil,
    mcpagent.WithMode(mcpagent.SimpleAgent),
)

2. Code Execution Mode

Execute Go code instead of JSON tool calls for complex logic:

agent, err := mcpagent.NewAgent(
    ctx, llmModel, "", "config.json", "model-id",
    nil, "", nil,
    mcpagent.WithCodeExecutionMode(true),
    mcpagent.SetFolderGuardPaths([]string{"/workspace"}, []string{"/workspace"}),
)

The LLM can write Go programs that import and use MCP tools as native functions.

3. Smart Routing

Dynamically filter tools based on conversation context to reduce token usage:

agent, err := mcpagent.NewAgent(
    ctx, llmModel, "", "config.json", "model-id",
    nil, "", nil,
    mcpagent.WithSmartRouting(true),
    mcpagent.WithSmartRoutingThresholds(20, 3), // max tools, max servers
)

4. Large Output Handling

Automatically handle tool outputs that exceed context limits:

agent, err := mcpagent.NewAgent(
    ctx, llmModel, "", "config.json", "model-id",
    nil, "", nil,
    mcpagent.WithLargeOutputVirtualTools(true),
    mcpagent.WithLargeOutputThreshold(20000), // characters
)

5. MCP Server Caching

Intelligent caching reduces connection times by 60-85%:

// Caching is enabled by default
// Configure via environment variables:
// MCP_CACHE_DIR=/path/to/cache
// MCP_CACHE_TTL_MINUTES=10080 (7 days)

6. Observability

Built-in tracing with Langfuse support:

tracer := observability.NewLangfuseTracer(...)
agent, err := mcpagent.NewAgent(
    ctx, llmModel, "", "config.json", "model-id",
    tracer, "trace-id", logger,
)

πŸ“– Documentation

Comprehensive documentation is available in the docs/ directory.

πŸ”§ Configuration

MCP Server Configuration

Create a JSON file with your MCP servers:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "./demo"]
    },
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    }
  }
}

Agent Options

The agent supports extensive configuration via functional options:

agent, err := mcpagent.NewAgent(
    ctx, llmModel, "", "config.json", "model-id",
    tracer, traceID, logger,
    // Agent mode
    mcpagent.WithMode(mcpagent.SimpleAgent),
    
    // Conversation settings
    mcpagent.WithMaxTurns(30),
    mcpagent.WithTemperature(0.7),
    mcpagent.WithToolChoice("auto"),
    
    // Code execution
    mcpagent.WithCodeExecutionMode(true),
    mcpagent.SetFolderGuardPaths(allowedRead, allowedWrite),
    
    // Smart routing
    mcpagent.WithSmartRouting(true),
    mcpagent.WithSmartRoutingThresholds(20, 3),
    
    // Large output handling
    mcpagent.WithLargeOutputVirtualTools(true),
    mcpagent.WithLargeOutputThreshold(20000),
    
    // Custom tools
    mcpagent.WithCustomTools(customTools),
    
    // Tool selection
    mcpagent.WithSelectedTools([]string{"server1:tool1", "server2:*"}),
    mcpagent.WithSelectedServers([]string{"server1", "server2"}),
)

πŸ§ͺ Testing

The package includes comprehensive testing utilities:

# Run all tests
cd cmd/testing
go test ./...

# Run specific test
go run testing.go agent-mcp --log-file logs/test.log
go run testing.go code-exec --log-file logs/test.log
go run testing.go smart-routing --log-file logs/test.log

See cmd/testing/README.md for details.

πŸ“ Package Structure

mcpagent/
β”œβ”€β”€ agent/              # Core agent implementation
β”‚   β”œβ”€β”€ agent.go       # Main Agent struct and NewAgent()
β”‚   β”œβ”€β”€ conversation.go # Conversation loop and tool execution
β”‚   β”œβ”€β”€ connection.go   # MCP server connection management
β”‚   └── ...
β”œβ”€β”€ mcpclient/         # MCP client implementations
β”‚   β”œβ”€β”€ client.go       # Client interface and implementations
β”‚   β”œβ”€β”€ stdio_manager.go # stdio protocol
β”‚   β”œβ”€β”€ sse_manager.go  # SSE protocol
β”‚   └── http_manager.go # HTTP protocol
β”œβ”€β”€ mcpcache/          # Caching system
β”‚   β”œβ”€β”€ manager.go     # Cache manager
β”‚   └── codegen/       # Code generation for tools
β”œβ”€β”€ llm/               # LLM provider integration
β”‚   β”œβ”€β”€ providers.go   # Provider implementations
β”‚   └── types.go       # LLM types
β”œβ”€β”€ events/            # Event system
β”‚   β”œβ”€β”€ data.go        # Event data structures
β”‚   └── types.go       # Event types
β”œβ”€β”€ logger/             # Logging
β”‚   └── v2/            # Logger v2 interface
β”œβ”€β”€ observability/     # Tracing and observability
β”‚   β”œβ”€β”€ tracer.go      # Tracer interface
β”‚   └── langfuse_tracer.go # Langfuse implementation
β”œβ”€β”€ executor/          # Tool execution handlers
β”œβ”€β”€ examples/          # Example applications
└── docs/              # Documentation

πŸ”Œ Supported LLM Providers

  • OpenAI: GPT-4, GPT-3.5, and other models
  • AWS Bedrock: Claude Sonnet, Claude Haiku, and other models
  • Google Vertex AI: Gemini, PaLM, and other models
  • Custom Providers: Extensible provider interface

πŸ”Œ Supported MCP Protocols

  • stdio: Standard input/output (most common)
  • SSE: Server-Sent Events
  • HTTP: REST API
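For SSE and HTTP servers, MCP client configs typically point at a URL rather than a local command. A sketch of what a mixed config might look like (the exact key names for remote servers depend on this library's config schema and are an assumption here):

```json
{
  "mcpServers": {
    "local-tools": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "./demo"]
    },
    "remote-sse": {
      "url": "https://example.com/mcp/sse"
    }
  }
}
```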

🀝 Contributing

Contributions are welcome! Please see the Documentation Writing Guide for standards.

πŸ“„ License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments

  • MCP Protocol: Built on the Model Context Protocol
  • multi-llm-provider-go: LLM provider abstraction layer
  • mcp-go: MCP protocol implementation

Made with ❀️ for the AI community