all Base chain data at your agent's fingertips 🔥
the SQL API from Coinbase Developer Platform (CDP) indexes all Base chain data across blocks, events, transfers, transactions, and encoded logs, and lets you (or your agent) submit SQL queries against that rich dataset
this is powerful because you don't need blockchain data infrastructure like indexers anymore; you simply call the SQL API and get hyper-fresh data (<500ms from tip of chain) at blazing-fast latencies (<200ms)
BUT agents need some help to use this powerful tool
the BaseQL MCP Server gives an agent all the info and tools it needs to call SQL API independently, enabling the user to fetch relevant Base chain data using natural language 🎉
agents using BaseQL get:
- common Base contract addresses (USDC, WETH, AERO, other common ERC20s, popular NFTs, etc.)
- accurate SQL API schema context (correct column names, common mistakes)
- pre-built query templates (whale tracking, NFT analytics, gas analysis)
- ENS/Basename resolution (forward lookup: name → address; reverse for .eth only)
- direct SQL execution via CDP SQL API
- built-in best practices (time filtering, performance optimization)
here's an example of what this unlocks for agents:
user: "whow me USDC whale transfers in the last hour"
↓
agent set up with BaseQL MCP:
1. get_contract("USDC") → 0x833589fcd...
2. get_query_template("whale_transfers") → SQL template
3. fills in parameters
4. run_sql_query(sql) → calls CDP SQL API, which returns Base data to the agent
5. agent formats the data into natural language, and responds to the user
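the same workflow, expressed as raw requests against the server's HTTP /call endpoint (shown later in this doc); run_sql_query's sql parameter matches the curl example below, but the other parameter names are illustrative guesses, so check GET /tools for the real schemas:

```json
{ "tool": "get_contract", "params": { "symbol": "USDC" } }
{ "tool": "get_query_template", "params": { "template": "whale_transfers" } }
{ "tool": "run_sql_query", "params": { "sql": "...filled-in template SQL..." } }
```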
npm install -g baseql-mcp

edit your Claude Desktop config:
Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
{
"mcpServers": {
"baseql": {
"command": "npx",
"args": ["baseql-mcp"],
"env": {
"CDP_API_KEY_ID": "your-key-id",
"CDP_API_KEY_SECRET": "your-secret",
"CDP_WALLET_SECRET": "your-wallet-secret"
}
}
}
}

restart Claude Desktop and ask: "what's the USDC contract address on Base?"
note: with ChatGPT you need to input your MCP Server's URL, so you'll need to deploy it first (e.g. to Vercel)
-
enable developer mode in ChatGPT:
- open ChatGPT settings → apps & connectors → advanced settings
- enable developer mode (you should also have this enabled in your soul)
-
add MCP server:
- now go back to apps & connectors
- click create top-right
- fill in the MCP server details:
- name: BaseQL
- URL: your deployed URL e.g. https://your-baseql-mcp.vercel.app
- description: Base data powerrrrr
- click save
-
test:
- start new chat in ChatGPT
- give it context like "you have the BaseQL MCP server - use this automatically for anything related to Base chain data and ENS/Basename lookups"
- ask something like: "how many USDC transfers happened on Base in the last hour?"
- ChatGPT uses your BaseQL MCP to fetch the data & respond
# clone this repo
git clone https://github.com/jnix2007/baseql-mcp
cd baseql-mcp
# install dependencies
npm install
# configure
cp .env.example .env
# edit .env with your CDP credentials you get from portal.cdp.coinbase.com
# run it
npm run dev

test via HTTP:
# list available tools
curl http://localhost:4000/tools
# query USDC transfers
curl -X POST http://localhost:4000/call \
-H "Content-Type: application/json" \
-d '{"tool":"run_sql_query","params":{"sql":"SELECT COUNT(*) as count FROM base.transfers WHERE token_address = '\''0x833589fcd6edb6e08f4c7c32d4f71b54bda02913'\'' AND block_timestamp > NOW() - INTERVAL 1 HOUR"}}'get complete Base SQL table schemas with correct column names and best practices
example:
{
"tool": "get_schema"
}

returns:
- all table schemas (base.transfers, base.transactions, base.events, base.blocks)
- critical rules (always filter by time!)
- common mistakes (column name corrections)
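one concrete gotcha the schema context documents (per the column-name notes at the bottom of this doc): base.transactions uses timestamp, not block_timestamp, so a naive query fails:

```sql
-- wrong: block_timestamp does not exist on base.transactions
-- SELECT COUNT(*) FROM base.transactions WHERE block_timestamp > NOW() - INTERVAL 1 HOUR;

-- right: base.transactions uses timestamp
SELECT COUNT(*) AS tx_count
FROM base.transactions
WHERE timestamp > NOW() - INTERVAL 1 HOUR;
```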
get Base contract addresses by symbol
example:
{
"tool": "get_contract",
"params": {
"symbol": "USDC",
"network": "mainnet" // or "sepolia"
}
}

supports:
- bunch of common tokens (USDC, WETH, AERO, DEGEN, TYBG, JESSE, etc.)
- a few NFT collections (Basenames, Based Fellas, Base Punks, Base Gods, etc.)
- infra stuff (Bridges, EAS, Coinbase Verifications)
- Base mainnet only
pre-built templates for common queries:
token analysis:
- whale_transfers - large token movements
- trending_tokens - surge activity detection
- token_holders - top receivers (can be a proxy for holders)
- dex_swap_volume - DEX trading activity
wallet analysis:
- wallet_token_transfers - token activity for address
- wallet_transactions - all transactions including ETH
- gas_analysis - gas spending patterns
NFT tracking:
- nft_mints - mint activity
- nft_whale_collectors - top NFT holders
- basename_registrations - new .base.eth names
verification:
- coinbase_verified_wallets_simple - Coinbase-KYC'd addresses (via Coinbase Verifications EAS)
- check_verifications - check specific address
- us_verified_wallets - US-verified users
and more - check get_query_template for full list
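fetching a template follows the same request shape as the get_contract example above (the "template" parameter name is a guess here; check the tool's schema for the exact field):

```json
{
  "tool": "get_query_template",
  "params": {
    "template": "whale_transfers"
  }
}
```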
realtime data - <500ms from tip of chain
accurate schemas - correct column names, common mistakes documented
smart workflows - get_token_age prevents 100GB scans
honest limitations - get_capabilities tells agents what won't work
common contracts - broad Base ecosystem coverage
bunch of templates - pre-built queries that actually work
ENS resolution - forward & reverse lookups
note on reverse lookup: Currently supports reverse lookup for .eth names only. Basenames (.base.eth) can be resolved forward (name → address) but not reverse (address → name) due to L2 resolver limitations.
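name resolution is exposed as a tool as well; the tool name below is hypothetical (list the real names via GET /tools), but per the note above, a forward lookup works for both .eth and .base.eth names:

```json
{ "tool": "resolve_name", "params": { "name": "jnix.base.eth" } }
```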
"how many USDC transfers in the last hour?"→ Works perfectly!
"show me USDC whale transfers today"→ Fast and accurate
"what tokens are trending in last 24h?"→ Great for discovery
"recent Basename registrations this week"→ All mints tracked
"what's the TYBG token address?"
"find new token launches today"
"trending tokens last 24h by transfer count"
"AERO trading volume last 6 hours"
"recent Coinbase verifications"
"new NFT mints in last hour"
"bridge activity last 24h"
use BaseQL for: recent activity, trends, discovery, monitoring
use external tools for: historical balances, complete wallet history, complex analytics
"show me jnix.base.eth activity for the last 7 days"
"what tokens is wallet X accumulating?"
"which top AERO holders have Basenames?"
"what are the top trending tokens on Base right now?"
"new tokens launched in last 24 hours"
"show me TYBG's hourly volume for last 24h"
"who bought the most JESSE today?"
"top trending NFT collections today"
"which wallet minted most NFTs this week?"
"recent Basename registrations"
"what tokens had highest swap volume today?"
"find sudden whale inflows to new coins"
"most active Uniswap pools last 6 hours"
"how many USDC transfers in last hour?" (answer: 277k!)
"which contract consumed most gas today?"
"what times does Base see activity spikes?"
"why was there a spike in token X yesterday?"
"find tokens with sudden holder growth"
"what's unusual about Base activity today?"
AI agent 🤖
↓
BaseQL MCP Server 💻
↓
CDP SQL API ⚡ → Base chain 🟦
BaseQL provides:
- context (contracts, schemas, templates)
- execution (runs queries via CDP)
- intelligence (best practices, optimization)
# required for SQL queries
CDP_API_KEY_ID=your-api-key-id
CDP_API_KEY_SECRET=your-api-key-secret
# optional for local dev
PORT=4000
# optional for monetization via x402
CDP_WALLET_SECRET=your-wallet-secret
ENABLE_PAYMENTS=false
QUERY_PRICE=0.001
get CDP API credentials from CDP Portal
if ENABLE_PAYMENTS is set to true, the code automatically provisions a CDP Server Wallet for your MCP server to receive x402 payments
stdio mode (for Claude Desktop):
- uncomment stdio transport in server.ts
- uses stdin/stdout
HTTP mode (for testing/custom agents):
- comment out stdio transport
- runs on PORT (default 4000)
learned the hard way that Base SQL tables have inconsistent column names:
- base.transfers → token_address, block_timestamp
- base.events → address, block_timestamp
- base.transactions → from_address, timestamp (NOT block_timestamp)
BaseQL documents all these gotchas
queries without time filters can scan 100GB+ and fail; BaseQL teaches agents to always filter by time first
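for example, the USDC count from the curl test earlier is safe because it bounds the scan window; the same query without the filter would scan the full table:

```sql
-- good: bounded scan over the last hour only
SELECT COUNT(*) AS count
FROM base.transfers
WHERE token_address = '0x833589fcd6edb6e08f4c7c32d4f71b54bda02913'
  AND block_timestamp > NOW() - INTERVAL 1 HOUR;

-- bad: no time filter, so this scans the entire transfers table
-- SELECT COUNT(*) AS count FROM base.transfers
-- WHERE token_address = '0x833589fcd6edb6e08f4c7c32d4f71b54bda02913';
```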
CDP SQL API is < 500ms from tip of chain, so BaseQL queries get near-instant Base data
MIT
