MCP + FastAPI + PostgreSQL + Ollama Example

This project demonstrates how to build a Model Context Protocol (MCP) server with FastAPI, integrate it with a PostgreSQL database, and use a local Ollama LLM to generate safe SQL queries that answer natural language questions.


🚀 Features

  • MCP-compatible FastAPI server

  • Tool query_db that:

    • Generates SQL queries with Ollama
    • Executes them against PostgreSQL
    • Returns JSON results
  • Custom /ask API endpoint + HTML UI for direct queries

  • Example schema: users, products, orders, order_items, etc.

  • Safe SQL: only SELECT queries are executed (a validator sketch follows this list)
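
A minimal sketch of such a SELECT-only guard in Python. The helper name is_safe_select is illustrative, not taken from the repository, and the project's actual validation may differ:

import re

FORBIDDEN = ("insert", "update", "delete", "drop", "alter", "truncate",
             "create", "grant", "revoke", "describe", "show")

def is_safe_select(sql: str) -> bool:
    """Accept a single read-only SELECT statement and reject everything else."""
    stmt = sql.strip().rstrip(";").lower()
    if not stmt.startswith("select"):
        return False
    if ";" in stmt:  # more than one statement
        return False
    # Reject write/DDL/introspection keywords anywhere in the statement.
    return not any(re.search(rf"\b{kw}\b", stmt) for kw in FORBIDDEN)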


🚀 Architecture

Architecture Diagram

🔄 Flow

  1. User Question → sent from the HTML UI (/ask) to FastAPI.
  2. FastAPI + MCP Server → passes the question to Ollama.
  3. Ollama LLM → generates a read-only SQL SELECT query.
  4. SQL Validator → ensures query is safe (no DESCRIBE, SHOW, or modifications).
  5. PostgreSQL DB → executes the safe SQL query and returns results.
  6. FastAPI → formats results into JSON.
  7. Response → returned to the user via the HTML UI.
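
A compressed sketch of how steps 1–7 above might be wired together in main.py. The DB URL, prompt wording, and error handling below are illustrative assumptions; the repository's actual code may differ.

import ollama
from fastapi import FastAPI, Form  # Form fields require the python-multipart package
from fastapi.encoders import jsonable_encoder
from sqlalchemy import create_engine, text

app = FastAPI()
# Assumed connection string; replace user/password with your local credentials.
engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/mcpexample")

PROMPT = ("You write PostgreSQL for the tables users, products, inventories, orders, order_items. "
          "Return only one SELECT statement that answers: {q}")

@app.post("/ask")
async def ask(question: str = Form(...)):
    # Steps 2-3: pass the question to Ollama and take the generated SQL.
    reply = ollama.chat(model="llama3",
                        messages=[{"role": "user", "content": PROMPT.format(q=question)}])
    sql = reply["message"]["content"].strip()
    # Step 4: reject anything that is not a plain SELECT (see the validator sketch above).
    if not sql.lower().startswith("select"):
        return {"ok": False, "error": "Only SELECT queries are allowed"}
    # Step 5: execute the safe query against PostgreSQL.
    with engine.connect() as conn:
        rows = [dict(r) for r in conn.execute(text(sql)).mappings()]
    # Steps 6-7: encode Decimal/timestamp values and return JSON to the UI.
    return jsonable_encoder({"ok": True, "sql": sql, "rows": rows})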

🛠️ Prerequisites

  • Python 3.11+
  • PostgreSQL running locally on port 5432
  • Ollama installed with a SQL-friendly model (e.g. llama3 or codellama)

📦 Setup Instructions

1. Clone the repository

git clone https://github.com/nutanlade/mcp-llm-db.git
cd mcp-llm-db

2. Create Python virtual environment

# Ensure Python 3.11 is installed
python3.11 -m venv .venv
source .venv/bin/activate

3. Install dependencies

pip install --upgrade pip
pip install fastapi uvicorn psycopg2-binary sqlalchemy fastmcp ollama

🗄️ Database Setup

1. Create the database

createdb mcpexample

2. Connect to Postgres

psql -U <your_username> -d mcpexample

3. Create schema

CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    email VARCHAR(100) UNIQUE NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE products (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    description TEXT,
    price NUMERIC(10,2) NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE inventories (
    product_id INT PRIMARY KEY REFERENCES products(id) ON DELETE CASCADE,
    quantity INT NOT NULL DEFAULT 0,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE orders (
    id SERIAL PRIMARY KEY,
    user_id INT REFERENCES users(id) ON DELETE CASCADE,
    order_date TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    status VARCHAR(20) DEFAULT 'pending'
);

CREATE TABLE order_items (
    id SERIAL PRIMARY KEY,
    order_id INT REFERENCES orders(id) ON DELETE CASCADE,
    product_id INT REFERENCES products(id) ON DELETE CASCADE,
    quantity INT NOT NULL,
    price NUMERIC(10,2) NOT NULL,
    UNIQUE(order_id, product_id)
);

4. Insert sample data

INSERT INTO users (name, email) VALUES
('Alice', 'alice@example.com'),
('Bob', 'bob@example.com');

INSERT INTO products (name, description, price) VALUES
('Laptop', 'High performance laptop', 1500.00),
('Phone', 'Smartphone with OLED screen', 900.00),
('Headphones', 'Noise cancelling headphones', 250.00);
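
Optionally, confirm the data is reachable from Python using the psycopg2 driver installed in step 3 (replace the username with your own):

import psycopg2

conn = psycopg2.connect(dbname="mcpexample", user="<your_username>",
                        host="localhost", port=5432)
with conn, conn.cursor() as cur:
    cur.execute("SELECT count(*) FROM users;")
    print("users:", cur.fetchone()[0])  # expected: 2 after the inserts above
conn.close()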

🤖 Running Ollama

Pull a model that works well with SQL generation (e.g. llama3):

ollama pull llama3
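
To sanity-check the model from Python before wiring it into the server, you can ask it for SQL directly. The schema summary and prompt below are illustrative, not the exact prompt used by the project:

import ollama

schema = ("users(id, name, email, created_at), products(id, name, description, price, created_at), "
          "orders(id, user_id, order_date, status), order_items(id, order_id, product_id, quantity, price)")

reply = ollama.chat(model="llama3", messages=[{
    "role": "user",
    "content": f"Schema: {schema}. Write one PostgreSQL SELECT for: the 3 cheapest products. Return only SQL.",
}])
print(reply["message"]["content"])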

🖥️ Running the Server

Start FastAPI app

uvicorn main:app --reload

The server will run at: 👉 http://localhost:8000


📡 Endpoints

/mcp

  • MCP-compatible tool interface
  • Exposes query_db as a tool
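
A sketch of how query_db might be registered, assuming the fastmcp package installed earlier; the tool body mirrors the /ask pipeline from the Flow section, and the names, prompt, and DB URL are illustrative:

import ollama
from fastmcp import FastMCP
from sqlalchemy import create_engine, text

mcp = FastMCP("mcp-llm-db")
engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/mcpexample")  # adjust credentials

@mcp.tool()
def query_db(question: str) -> dict:
    """Answer a natural-language question with a read-only SQL query."""
    reply = ollama.chat(model="llama3", messages=[{"role": "user",
            "content": f"Write one PostgreSQL SELECT answering: {question}. Return only SQL."}])
    sql = reply["message"]["content"].strip()
    if not sql.lower().startswith("select"):
        return {"ok": False, "error": "Only SELECT queries are allowed"}
    with engine.connect() as conn:
        return {"ok": True, "sql": sql,
                "rows": [dict(r) for r in conn.execute(text(sql)).mappings()]}

if __name__ == "__main__":
    mcp.run()  # serve the tool over MCP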

/ask

  • Accepts a question (string) and returns:

    • SQL generated by Ollama
    • Query results from Postgres

Example via curl:

curl -X POST http://localhost:8000/ask \
     -F "question=show me top 5 costly products"

Response:

{
  "ok": true,
  "sql": "SELECT name, price FROM products ORDER BY price DESC LIMIT 5;",
  "rows": [
    {"name": "Laptop", "price": 1500.0},
    {"name": "Phone", "price": 900.0},
    {"name": "Headphones", "price": 250.0}
  ]
}

/

  • Simple HTML form UI for manual queries

📝 Notes

  • If you see a Decimal is not JSON serializable error, pass the query results through FastAPI’s jsonable_encoder before returning them (see the sketch after these notes).
  • If you get role "postgres" does not exist, create the role or update your DB URL.
  • For ambiguous queries (“costly products”), improve prompt examples to clarify unit price vs sales revenue.
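
A minimal illustration of the encoder note above, assuming results come back as dicts containing Decimal and datetime values:

from datetime import datetime
from decimal import Decimal
from fastapi.encoders import jsonable_encoder

rows = [{"name": "Laptop", "price": Decimal("1500.00"), "created_at": datetime.now()}]
print(jsonable_encoder(rows))  # Decimal -> float, datetime -> ISO 8601 string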

✅ Example Questions

  • Show me the top 5 costly products by price
  • Which products generated the highest sales revenue?
  • Who placed the most orders?
  • List all users and their total order count
