Welcome to Coreon MCP Execution Engine
2025-09-19 – The new alpha release adds official support for the Claude-style MCP protocol (stdio mode).
Coreon-MCP-Execution-Engine provides a unified runtime for structured ToolCall
chains. It allows LLM agents or users to:
- Dynamically execute multiple tools in sequence
- Interact via terminal, HTTP API, or Telegram
- Plug in custom tools via the modular `tools/` system
- Use anywhere via Docker; no manual setup required
This project is inspired by the idea of decoupling agent planning from tool execution, making it perfect for backend execution engines, plugin-based AI systems, or on-chain/off-chain hybrid AI workflows.
- Built-in modes: CLI / API Server / Telegram Bot
- Docker-native, zero local dependency
- Supports future extensibility with user-defined tools
```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart LR
subgraph Input[Input Layer]
A[User / Bot / CLI]
end
subgraph Planning[Planning Layer]
B["Planner\nIntent Recognition & Generate Plan(JSON)"]
end
subgraph Execution[Execution Layer]
C["Executor\nSequentially Execute ToolChain"]
end
subgraph Registry[Tool Registry Layer]
D["Tool Registry\nDeclaration: name/module/function/schema"]
end
subgraph Toolset[Toolset]
E{{Tools}}
E1[Market Data\nDexScreener / Binance]
E2[News Fetcher\nCryptoPanic / Feeds]
E3[Social Metrics\nTwitter / Telegram]
E4[On-chain APIs\nToken Metadata / Holders]
E5[Custom Utils\nFormatters / Indicators]
end
subgraph Output[Output Layer]
F["Response Formatter\nHuman-readable & Structured Output"]
G["Outputs\nCLI Charts / Telegram Messages / API JSON"]
end
A --> B
B --> C
C --> D
D --> E
E --> E1
E --> E2
E --> E3
E --> E4
E --> E5
E1 --> C
E2 --> C
E3 --> C
E4 --> C
E5 --> C
C --> F
F --> G
```
The Planner acts as the “brain” of the MCP Engine. It takes natural language input from the CLI, Telegram Bot, or API and:
- Recognizes user intent using an LLM.
- Generates a structured execution plan (ToolCall Chain) in JSON format.
- Breaks down complex tasks into ordered, executable steps.
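As a concrete illustration, a plan for a request like "summarize the latest BNB market activity" might resemble the JSON below (shown loaded in Python). The field names `step`, `tool`, `args`, and `depends_on` are illustrative assumptions rather than the engine's documented schema; the tool names mirror the Toolset in the diagram above.

```python
import json

# Hypothetical ToolCall Chain as the Planner might emit it. Field names
# (step, tool, args, depends_on) are assumptions for illustration, not
# the engine's documented schema.
plan = json.loads("""
[
  {"step": 1, "tool": "market_data.get_price", "args": {"symbol": "BNBUSDT"}},
  {"step": 2, "tool": "news.fetch_latest", "args": {"topic": "BNB", "limit": 5}},
  {"step": 3, "tool": "utils.format_report",
   "args": {"sources": ["step_1", "step_2"]}, "depends_on": [1, 2]}
]
""")

print(plan[2]["tool"])  # -> utils.format_report
```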
The Executor is the “execution core” responsible for carrying out the plan generated by the Planner:
- Executes tools step-by-step or in parallel when possible.
- Handles retries, error recovery, and logging.
- Ensures the correct order of execution across dependent tasks.
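A minimal sketch of such an execution loop is shown below, assuming the plan shape from the previous example and a simple name-to-callable tool mapping. The retry policy, backoff, and logging are illustrative rather than the engine's actual implementation, and parallel execution and dependency resolution are omitted for brevity.

```python
import logging
import time

def run_plan(plan, tools, max_retries=2):
    """Sequentially execute a ToolCall Chain with simple retry handling.

    `plan` is a list of {"step", "tool", "args"} dicts and `tools` maps
    tool names to callables -- both shapes are assumptions for this sketch.
    """
    results = {}
    for call in plan:
        func = tools[call["tool"]]
        for attempt in range(1, max_retries + 2):
            try:
                results[call["step"]] = func(**call["args"])
                break  # step succeeded, move on to the next ToolCall
            except Exception as exc:
                logging.warning("step %s failed on attempt %d: %s",
                                call["step"], attempt, exc)
                if attempt > max_retries:
                    raise  # give up after exhausting retries
                time.sleep(1)  # naive fixed backoff before retrying
    return results
```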
The Tool Registry provides a unified declaration of all tools used by the MCP Engine:
- Declares each tool’s name, module path, function signature, and parameter schema.
- Stores tool metadata such as version and description.
- Allows new tools to be plugged in without changing the execution logic.
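A registry entry might look like the sketch below; the key names follow the declaration fields listed above, but the concrete dictionary layout is an assumption for illustration, not the engine's actual format.

```python
# Hypothetical entry in the tool registry. The keys mirror the fields
# described above (module, function, schema, version, description);
# the dictionary layout itself is an assumption for illustration.
TOOL_REGISTRY = {
    "bsc.get_balance": {
        "module": "tools.bsc_balance",   # module path under the tools/ directory
        "function": "get_bnb_balance",   # function to invoke inside that module
        "version": "0.1.0",
        "description": "Query the native BNB balance of an address on BSC.",
        "schema": {                      # parameter schema (JSON-Schema style)
            "type": "object",
            "properties": {"address": {"type": "string"}},
            "required": ["address"],
        },
    },
}
```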
The engine exposes entry points for different user interaction modes:
- CLI – Developer-friendly command-line interface for direct execution and debugging.
- API Server – HTTP interface for programmatic integration with external services.
- Telegram Bot – Chat-based interface for instant, on-the-go interactions.
- Python 3.11+
- Docker
Download Docker from the official Docker site and follow the installation steps.
After installation, run:
```bash
docker --version
```
If you see version output, Docker is installed.
Follow these steps to get the MCP Engine running in minutes.
```bash
mkdir mcp-execution-env
cd mcp-execution-env
```
Generate the environment file with required variables:
```bash
cat <<EOF > .env
MCP_LANG=EN
OPENAI_API_KEY=sk-xxxxxxxxxx
EOF
```
| Variable | Description | Required |
|---|---|---|
| `MCP_LANG` | Language: `EN` or `ZH` | Yes |
| `OPENAI_API_KEY` | OpenAI API key | Yes |
Replace `sk-xxxxxxxxxx` with your actual OpenAI API key.
```bash
# Pull the image
docker pull coreonmcp/coreon-mcp-execution-engine

# Run in CLI mode
docker run --rm -it --env-file .env coreonmcp/coreon-mcp-execution-engine start cli

# Run as an API server (listening on port 8080)
docker run --rm -it --env-file .env -p 8080:8080 coreonmcp/coreon-mcp-execution-engine start server

# Run as a Telegram bot
docker run --rm -it --env-file .env coreonmcp/coreon-mcp-execution-engine start telegram-bot
```
Coreon MCP Execution Engine is designed as the AI Execution Layer for Web3, with a strong focus on the BNB Chain ecosystem (BSC / BNB Smart Chain).
- Current support: Query balances, token metadata, DeFi data, and contract calls on BNB Smart Chain (BSC); a minimal tool sketch follows this list.
- Mid-term roadmap: Natural-language swaps on PancakeSwap, AI wallet assistants, and on-chain security monitoring for BNB users.
- Future vision: Expand to opBNB (for low-cost L2 execution) and Greenfield (for decentralized data storage), making Coreon MCP a full-stack AI interface for the entire BNB ecosystem.
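As a sketch of what such a tool could look like inside the modular `tools/` directory, the example below queries a native BNB balance over a public BSC RPC endpoint using web3.py (v6 API). The module name, function signature, and RPC URL are assumptions for illustration, not part of the shipped engine.

```python
# tools/bsc_balance.py -- hypothetical custom tool; the module name, function
# signature, and the public RPC endpoint are assumptions for illustration.
from web3 import Web3

BSC_RPC = "https://bsc-dataseed.binance.org"  # public BNB Smart Chain JSON-RPC

def get_bnb_balance(address: str) -> float:
    """Return the native BNB balance of `address` on BNB Smart Chain."""
    w3 = Web3(Web3.HTTPProvider(BSC_RPC))
    wei = w3.eth.get_balance(Web3.to_checksum_address(address))
    return float(Web3.from_wei(wei, "ether"))

if __name__ == "__main__":
    # Burn address used purely as a placeholder input.
    print(get_bnb_balance("0x000000000000000000000000000000000000dEaD"))
```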
By bridging natural language with on-chain execution, Coreon MCP lowers entry barriers and positions BNB as the first AI-Ready blockchain.
Our story is just beginning. The MCP Execution Engine will keep evolving — becoming smarter and more powerful with every iteration. We’ll continue to refine features, explore new possibilities, and work hand in hand with developers to shape the future of Web3.