This project implements a Model Context Protocol (MCP) server that provides a standardized interface for AI assistants to interact with PostgreSQL databases. MCP enables secure, structured communication between AI models and external data sources through well-defined tools, resources, and prompts.
- Database Accessibility: Enable AI assistants to safely query PostgreSQL databases without direct database access
- Structured Interaction: Provide standardized tools for database operations through MCP protocol
- Security: Implement read-only operations with strict query validation to prevent data manipulation
- Context Awareness: Supply database schema information and context for intelligent query generation
- FastMCP: Lightweight MCP server implementation providing transport layers (stdio/SSE)
- Transport Layer: Dual transport support for local development (stdio) and network deployment (SSE via HTTP)
- Registration System: Decorators for automatic registration of tools, resources, and prompts
- YAML-based Configuration: Centralized settings for database connections and server parameters
- Environment Variables: Runtime configuration override capabilities
- Validation Layer: Pydantic models ensuring data integrity and type safety
- Connection Pooling: Async PostgreSQL connections with automatic lifecycle management
- Query Execution Engine: Isolated read-only operations with comprehensive error handling
- Result Serialization: Consistent data format conversion for MCP protocol compatibility
- Tools: Executable database operations (schema listing, table inspection, query execution)
- Resources: Static/contextual data endpoints providing database metadata
- Prompts: Dynamic instruction templates guiding AI query generation
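The decorator-based registration described above can be sketched with a minimal stand-in registry. The real server uses FastMCP's decorators; this dependency-free sketch only illustrates the pattern, and the function bodies are placeholders, not the project's actual implementations:

```python
# Minimal stand-in for decorator-based registration of the three MCP
# primitives. Decorating a function files it under tools/resources/prompts.
registry = {"tools": {}, "resources": {}, "prompts": {}}

def register(kind: str):
    """Return a decorator that records the function in the registry."""
    def decorator(func):
        registry[kind][func.__name__] = func
        return func
    return decorator

@register("tools")
def ping() -> str:
    # Placeholder health check; the real tool verifies connectivity.
    return "ok"

@register("resources")
def db_context() -> dict:
    # Placeholder for the db://context resource endpoint.
    return {"uri": "db://context"}

@register("prompts")
def get_table_data_prompt(table: str) -> str:
    # Placeholder instruction template for query generation.
    return f"Write a read-only SQL query against table {table}."
```

With this pattern, the server can enumerate everything registered under each primitive without maintaining manual lists.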
The main application serves as a transport-aware launcher that initializes the MCP server with appropriate communication protocols. It supports both local development through standard I/O streams and production deployment via HTTP streaming.
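The transport-selection logic can be sketched as a small helper. This is a hypothetical function, not the project's actual `main.py`; the defaults mirror the environment variables documented below (`sse`, `127.0.0.1`, `8000`):

```python
import os

def resolve_transport() -> dict:
    """Read transport settings from the environment with documented defaults."""
    transport = os.environ.get("MCP_TRANSPORT", "sse").lower()
    if transport not in ("sse", "stdio"):
        raise ValueError(f"Unsupported MCP_TRANSPORT: {transport!r}")
    return {
        "transport": transport,
        "host": os.environ.get("MCP_HOST", "127.0.0.1"),
        "port": int(os.environ.get("MCP_PORT", "8000")),
    }
```

The launcher would then pass the resolved settings to the FastMCP server's run method, using stdio for local development and SSE for network deployment.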
Externalized settings management loads database credentials and server parameters from structured configuration files. The system provides fallback mechanisms and environment variable overrides for flexible deployment across different environments.
Asynchronous connection management establishes secure PostgreSQL connections using connection pooling. The abstraction layer handles connection lifecycle, error recovery, and resource cleanup while maintaining connection isolation for concurrent operations.
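The pool lifecycle can be illustrated with a simplified, self-contained sketch; the real server relies on an async PostgreSQL driver's built-in pool rather than this toy class. A bounded queue hands out connections and always returns them, which is what keeps concurrent operations isolated and resources cleaned up:

```python
import asyncio
from contextlib import asynccontextmanager

class MiniPool:
    """Toy connection pool: a bounded FIFO queue of pre-opened connections."""

    def __init__(self, connect, size: int = 5):
        self._connect = connect              # factory producing one connection
        self._queue = asyncio.Queue(maxsize=size)
        self._size = size

    async def start(self):
        # Eagerly open a fixed number of connections.
        for _ in range(self._size):
            await self._queue.put(await self._connect())

    @asynccontextmanager
    async def acquire(self):
        conn = await self._queue.get()       # block until a connection is free
        try:
            yield conn
        finally:
            await self._queue.put(conn)      # always return it, even on error

async def demo():
    counter = {"n": 0}

    async def fake_connect():
        # Stand-in for opening a real PostgreSQL connection.
        counter["n"] += 1
        return {"id": counter["n"]}

    pool = MiniPool(fake_connect, size=2)
    await pool.start()
    async with pool.acquire() as conn:
        return conn["id"], counter["n"]      # pool opened exactly 2 connections
```

The `finally` block is the essential part: a failed query still returns its connection to the queue, so the pool never leaks capacity.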
Six specialized tools provide comprehensive database interaction capabilities:
- Health Monitoring: Basic connectivity verification returning server status
- Schema Discovery: Enumerates available database schemas for navigation
- Table Enumeration: Lists tables within specified schemas with metadata
- Schema Inspection: Retrieves detailed column information and constraints
- Query Execution: Safe SQL execution with forbidden operation filtering
- Performance Analysis: Query optimization insights through EXPLAIN plan generation
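The forbidden-operation filtering mentioned above can be sketched as a deny-list validator. This is an assumption about the approach, not the server's actual filter, which may be stricter:

```python
import re

# Mutating keywords a read-only server would reject; list is illustrative.
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|truncate|create|grant|revoke)\b",
    re.IGNORECASE,
)

def validate_read_only(sql: str) -> str:
    """Reject statements that could modify data or schema; return the cleaned SQL."""
    stripped = sql.strip().rstrip(";")
    # Allow only statement types that read: SELECT, CTEs, and EXPLAIN.
    if not re.match(r"(?is)^\s*(select|with|explain)\b", stripped):
        raise ValueError("Only SELECT/WITH/EXPLAIN statements are allowed")
    # Catch mutating keywords smuggled in later in the statement.
    if FORBIDDEN.search(stripped):
        raise ValueError("Query contains a forbidden keyword")
    return stripped
```

Word boundaries keep column names such as `created_at` from tripping the filter, while a trailing `; DELETE FROM ...` is still caught.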
Contextual data endpoints serve static database information and schema-specific guidance. These provide AI assistants with domain knowledge about table relationships, data types, and common query patterns without requiring direct database inspection.
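The resource endpoints (listed later as `db://context` and `db://schema/{schema_name}`) resolve URIs to context payloads. A minimal, hypothetical resolver might look like this; the real endpoints are registered via MCP resource decorators and return richer metadata:

```python
def resolve_resource(uri: str) -> dict:
    """Map a resource URI to its context payload (illustrative content only)."""
    if uri == "db://context":
        return {"description": "Database context information"}
    prefix = "db://schema/"
    if uri.startswith(prefix) and len(uri) > len(prefix):
        # The path segment after the prefix is the schema name.
        return {
            "schema": uri[len(prefix):],
            "description": "Schema-specific context",
        }
    raise KeyError(f"Unknown resource URI: {uri}")
```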
Dynamic instruction templates guide AI assistants in generating appropriate database queries. The system provides structured workflows for safe query construction, emphasizing read-only operations and performance considerations.
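A prompt template of this kind might be structured as below. The wording and parameters are hypothetical; the project's actual `get_table_data_prompt` may differ:

```python
def table_data_prompt(schema: str, table: str, limit: int = 100) -> str:
    """Build an instruction steering the assistant toward safe, bounded queries."""
    return (
        f"Generate a read-only SELECT query for {schema}.{table}. "
        f"Do not modify data, and cap the result set with LIMIT {limit}. "
        "Prefer indexed columns in WHERE clauses for performance."
    )
```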
Connection validation utilities enable developers to verify database connectivity and explore schema structures. The testing module provides diagnostic capabilities for troubleshooting deployment issues and validating configuration correctness.
- Install dependencies:

  ```bash
  uv sync
  ```

- Configure the database connection in `config.yaml`
- Install Cloudflare Tunnel (for exposing the server):

  ```bash
  # macOS
  brew install cloudflared
  # Or download from: https://developers.cloudflare.com/cloudflare-one/connections/connect-apps/install-and-setup/installation/
  ```

- Run with stdio transport (local development):

  ```bash
  MCP_TRANSPORT=stdio uv run python main.py
  ```

- Start the server (defaults to SSE transport):

  ```bash
  uv run python main.py
  ```

  The server will start on http://127.0.0.1:8000 by default.

- In another terminal, start the Cloudflare tunnel:

  ```bash
  cloudflared tunnel --url http://127.0.0.1:8000
  ```

  Cloudflare will provide a public URL (e.g., https://xxxxx.trycloudflare.com) that you can use to access your MCP server.
- `MCP_TRANSPORT`: Transport type, `sse` (default) or `stdio`
- `MCP_HOST`: Host address (default: `127.0.0.1`)
- `MCP_PORT`: Port number (default: `8000`)
Example:
```bash
MCP_PORT=3000 MCP_HOST=0.0.0.0 uv run python main.py
```

- `ping`: Health check
- `list_schemas`: List all database schemas
- `list_tables`: List tables in a schema
- `get_table_info`: Get table schema information
- `run_sql_query`: Execute read-only SQL queries
- `run_explain_query`: Get query performance metrics
- `db://context`: Database context information
- `db://schema/{schema_name}`: Schema-specific context
- `get_table_data_prompt`: Prompt to generate queries for table data
- The server uses `main.py` as the entry point
- SSE transport is used for HTTP/network access (Cloudflare tunnel)
- stdio transport is used for local development