A Model Context Protocol (MCP) server that indexes PostgreSQL database schemas and enables LLMs to query schema information and execute read-only SQL queries through natural language.
This MCP server provides:
- Schema metadata indexing for PostgreSQL databases
- Support for multiple PostgreSQL databases simultaneously
- Background schema indexing with configurable intervals
- Seamless integration with MCP-compatible clients (Kilo Code, Cursor, Claude Desktop)
- Read-only query execution with built-in security controls
- Natural language interface for database exploration
Why build on the Model Context Protocol:
- No custom UI/interface required
- Direct integration with AI development tools
- Standardized protocol for LLM tool calls
- Native support for asynchronous operations
┌─────────────────────────────────────────────────────────────┐
│                 LLM Client (MCP Compatible)                 │
│            "Show me all tables in the database"             │
└─────────────────────────┬───────────────────────────────────┘
                          │ MCP Protocol
┌─────────────────────────▼───────────────────────────────────┐
│                         MCP Server                          │
│  ┌──────────────────────────────────────────────────────┐   │
│  │ Tools: list_databases, get_schema, execute_query...  │   │
│  └──────────────────────────────────────────────────────┘   │
└─────────────────────────┬───────────────────────────────────┘
                          │
        ┌─────────────────┼─────────────────┐
        │                 │                 │
┌───────▼────────┐  ┌─────▼──────┐ ┌────────▼─────────┐
│ Schema Indexer │  │Schema Cache│ │  Metadata Store  │
│    (Worker)    │  │  (Memory)  │ │     (SQLite)     │
└───────┬────────┘  └────────────┘ └──────────────────┘
        │
        │ Read-Only Schema Queries
        │
┌───────▼─────────────────────────────────────────────────┐
│    PostgreSQL Databases (information_schema access)     │
│   ┌──────────┐   ┌──────────┐   ┌──────────┐            │
│   │   DB 1   │   │   DB 2   │   │   DB N   │            │
│   └──────────┘   └──────────┘   └──────────┘            │
└─────────────────────────────────────────────────────────┘
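Each capability in the middle box is exposed as an MCP tool. As a minimal sketch of that wiring using the official TypeScript SDK (the real implementation lives in src/mcp/server.ts and src/mcp/tools/; the handler body below is illustrative):

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

const server = new McpServer({ name: "postgres-schema", version: "1.0.0" });

// Illustrative handler: the real tool reads from the SQLite metadata store.
server.tool("list_databases", async () => ({
  content: [
    {
      type: "text",
      text: JSON.stringify([{ id: "production-db", name: "Production Database" }]),
    },
  ],
}));

// stdio transport: the MCP client launches this process and talks over stdin/stdout.
await server.connect(new StdioServerTransport());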
Prerequisites:
- Node.js 20+
- PostgreSQL database(s)
- MCP-compatible client (Kilo Code, Cursor, or Claude Desktop)
# Clone the repository
git clone <repository-url>
cd postgres-schema-mcp
# Install dependencies
npm install
# Configure environment variables
cp .env.example .env
# Edit .env and add your PostgreSQL credentials
# Create database configuration
cp config/databases.example.json config/databases.json
# Edit config/databases.json with your database details
# Build the project
npm run build

For security, create a dedicated read-only user:
-- Create read-only user
CREATE USER readonly_user WITH PASSWORD 'secure_password';
-- Grant connection to database
GRANT CONNECT ON DATABASE your_database TO readonly_user;
-- Grant schema access
GRANT USAGE ON SCHEMA public TO readonly_user;
-- Grant access to information_schema and pg_catalog only
GRANT SELECT ON ALL TABLES IN SCHEMA information_schema TO readonly_user;
GRANT SELECT ON ALL TABLES IN SCHEMA pg_catalog TO readonly_user;
-- No direct data table access required for schema indexing
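To confirm the new role can read metadata but nothing else, a quick check with the pg driver (an assumption here; any PostgreSQL client works) might look like:

import { Client } from "pg";

// Connect as the read-only user and list tables via information_schema.
const client = new Client({
  host: "localhost",
  database: "your_database",
  user: "readonly_user",
  password: process.env.PG_SCHEMA_PASSWORD,
});
await client.connect();
const { rows } = await client.query(
  "SELECT table_name FROM information_schema.tables WHERE table_schema = 'public'"
);
console.log(rows.map((r) => r.table_name));
// A SELECT on an actual data table should now fail with "permission denied".
await client.end();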
Add the server to your MCP client configuration:

Kilo Code / Cursor:
Location:
- macOS/Linux: ~/.config/Code/User/globalStorage/kilocode.kilo-code/settings/mcp_settings.json
- Windows: %APPDATA%\Code\User\globalStorage\kilocode.kilo-code\settings\mcp_settings.json
Configuration:
{
"mcpServers": {
"postgres-schema": {
"command": "node",
"args": ["/absolute/path/to/postgres-schema-mcp/dist/index.js"],
"env": {
"PG_SCHEMA_PASSWORD": "your_schema_password",
"PG_QUERY_PASSWORD": "your_query_password"
}
}
}
}

Run the server:

# Development mode with auto-reload
npm run dev
# Production mode
npm start

Create config/databases.json from the example:
{
"databases": [
{
"id": "production-db",
"name": "Production Database",
"connection": {
"host": "localhost",
"port": 5432,
"database": "myapp",
"schemaUser": "readonly_schema_user",
"schemaPassword": "${PG_SCHEMA_PASSWORD}",
"queryUser": "readonly_query_user",
"queryPassword": "${PG_QUERY_PASSWORD}",
"ssl": false
},
"indexing": {
"enabled": true,
"interval": "1h",
"schemas": ["public"],
"excludeTables": ["migrations", "schema_version"]
},
"queryExecution": {
"enabled": true,
"maxLimit": 1000,
"defaultLimit": 100,
"timeout": 30000,
"rateLimit": {
"maxQueries": 100,
"windowMs": 60000,
"maxConcurrent": 5
}
}
}
],
"worker": {
"parallelism": 3,
"retryAttempts": 3
}
}

Create .env from .env.example:
# PostgreSQL Credentials
PG_SCHEMA_PASSWORD=your_schema_readonly_password
PG_QUERY_PASSWORD=your_query_readonly_password
# Metadata DB Path
METADATA_DB_PATH=./data/metadata.db
# Logging
LOG_LEVEL=info
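The ${PG_SCHEMA_PASSWORD}-style placeholders in config/databases.json are resolved from these environment variables at startup. A sketch of how such substitution can be implemented (the function below is illustrative, not the repo's actual loader):

import { readFileSync } from "node:fs";

// Replace ${VAR} placeholders with values from process.env before parsing.
function loadDatabaseConfig(path: string) {
  const raw = readFileSync(path, "utf8");
  const expanded = raw.replace(/\$\{(\w+)\}/g, (_, name) => {
    const value = process.env[name];
    if (value === undefined) throw new Error(`Missing environment variable: ${name}`);
    return value;
  });
  return JSON.parse(expanded);
}

const config = loadDatabaseConfig("config/databases.json");
console.log(config.databases[0].id); // "production-db"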
Monkey Agent: a natural language database assistant powered by LLM reasoning.

Requirements:
- Requires a running LM Proxy instance with GitHub Copilot
- Configure in .env:

AGENT_ENABLED=true
AGENT_LM_PROXY_URL=http://localhost:4000
AGENT_MODEL=gpt-4o
Input:
{
"question": "Show me all tables in the database",
"databaseId": "production-db",
"detailed": false
}

Ask in natural language; Monkey builds and executes the SQL query for you.
Note: The Monkey agent is an autonomous database assistant that:
- Analyzes your natural language questions
- Reads database schemas automatically
- Builds and executes optimized SQL queries
- Returns formatted results with reasoning steps
For more details on LM Proxy setup, see the vscode-lm-proxy documentation.
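For a quick smoke test outside of an editor client, the agent can also be invoked directly with the MCP TypeScript SDK. The sketch below assumes the agent is exposed as a tool named ask_monkey; check the server's tool listing for the actual name, and substitute your own install path:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio, exactly as an MCP client would.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/absolute/path/to/postgres-schema-mcp/dist/index.js"],
});
const client = new Client({ name: "smoke-test", version: "1.0.0" });
await client.connect(transport);

const result = await client.callTool({
  name: "ask_monkey", // assumed tool name
  arguments: {
    question: "Show me all tables in the database",
    databaseId: "production-db",
    detailed: false,
  },
});
console.log(result.content);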
npm run add-db

Interactive wizard to add new databases:
- Connection details
- SSL configuration
- Schema selection
- Query limits
- Connection testing
npm run discover

Explore schemas in configured databases.
npm run logs

View server logs for debugging and monitoring.
postgres-schema-mcp/
├── src/
│   ├── index.ts              # Entry point
│   ├── mcp/
│   │   ├── server.ts         # MCP server implementation
│   │   └── tools/            # Tool implementations
│   ├── agent/
│   │   └── monkey.ts         # Natural language query agent
│   ├── db/
│   │   └── metadata.ts       # SQLite metadata store
│   ├── services/
│   │   ├── schemaExtractor.ts
│   │   ├── queryExecutor.ts
│   │   └── queryValidator.ts
│   └── types/
│       └── index.ts
├── config/
│   └── databases.json        # Database configurations
├── data/
│   └── metadata.db           # Schema cache
└── bin/
    └── monkey.cjs            # CLI wrapper
# Development
npm run dev        # Start with auto-reload
npm run build      # Build TypeScript
npm start          # Start production server

# Database Management
npm run add-db     # Add new database
npm run discover   # Discover schemas

# Utilities
npm run logs       # View logs
npm run help       # Show all commands

# Run all tests
npm test
# Run specific test suites
npm run test:unit
npm run test:integration

Security best practices:

- Read-Only Users: Always use dedicated read-only PostgreSQL users
- Environment Variables: Never commit credentials to version control
- SSL/TLS: Use SSL for production databases
- Least Privilege: Grant only necessary schema access permissions
- Rate Limiting: Configure appropriate query rate limits
What schema indexing accesses:

- No row data from tables
- No column contents
- No sensitive information
- Only schema metadata (structure)
Query execution safeguards:

- Only SELECT queries allowed
- Automatic LIMIT enforcement
- Query timeout protection
- Rate limiting per database
- Read-only transaction mode
- Query validation and sanitization
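A condensed sketch of how several of these safeguards compose (the real logic lives in src/services/queryExecutor.ts and src/services/queryValidator.ts; this version is deliberately simplified):

import { Pool } from "pg";

// Simplified executor: SELECT-only guard, LIMIT enforcement,
// statement timeout, and a read-only transaction.
async function runReadOnly(pool: Pool, sql: string, limit = 100) {
  if (!/^\s*select\b/i.test(sql)) throw new Error("Only SELECT queries are allowed");
  // Naive LIMIT check for illustration; the real validator parses the query.
  const capped = /\blimit\s+\d+/i.test(sql)
    ? sql
    : `${sql.replace(/;\s*$/, "")} LIMIT ${limit}`;
  const client = await pool.connect();
  try {
    await client.query("BEGIN TRANSACTION READ ONLY");
    await client.query("SET LOCAL statement_timeout = 30000");
    const result = await client.query(capped);
    await client.query("COMMIT");
    return result.rows;
  } catch (err) {
    await client.query("ROLLBACK");
    throw err;
  } finally {
    client.release();
  }
}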
Performance optimizations:

- In-memory schema caching
- Incremental schema updates
- Parallel database indexing
- Connection pooling
- SQLite indexes for fast metadata queries
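As an illustration of the SQLite-backed metadata store (assuming the better-sqlite3 driver; the actual schema in src/db/metadata.ts may differ):

import Database from "better-sqlite3";

// Open the metadata cache and ensure an indexed columns table exists.
const db = new Database("data/metadata.db");
db.exec(`
  CREATE TABLE IF NOT EXISTS columns (
    database_id TEXT NOT NULL,
    table_name  TEXT NOT NULL,
    column_name TEXT NOT NULL,
    data_type   TEXT NOT NULL
  );
  CREATE INDEX IF NOT EXISTS idx_columns_lookup
    ON columns (database_id, table_name);
`);

// The index above makes per-table lookups fast even for large schemas.
const byTable = db.prepare(
  "SELECT column_name, data_type FROM columns WHERE database_id = ? AND table_name = ?"
);
console.log(byTable.all("production-db", "users"));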
Typical indexing times:

- Small database (< 50 tables): 5-10 seconds
- Medium database (50-200 tables): 30-60 seconds
- Large database (200+ tables): 2-5 minutes
If the server won't start:

# Check Node.js version
node --version # Should be >= 20
# Reinstall dependencies
npm install
# Check logs
npm run logs

If a database connection fails:

# Test connection manually
psql -h localhost -U readonly_user -d your_database
# Check user permissions
\du readonly_user

If the MCP client can't find the server:

- Verify the absolute path in the MCP configuration
- Rebuild the project: npm run build
- Restart the MCP client completely
- Check client logs for errors
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
MIT License - see LICENSE file for details
- Model Context Protocol by Anthropic
- PostgreSQL - The world's most advanced open source database
- Node.js runtime
For issues or questions:
- Open an issue on GitHub
- Check existing issues and documentation
- Review the troubleshooting section
Built for better LLM-database integration through the Model Context Protocol.