A simple, end-to-end demo that converts natural language into SQL queries using an LLM (Deepseek R1 8B via Ollama) and executes them on a sample SQLite Bookstore database, all through an interactive Streamlit interface.
Modern data systems are full of valuable insights, but querying databases still requires technical knowledge of SQL.
This project bridges that gap by allowing **anyone**, even non-technical users, to interact with structured data using **plain English**.
The core idea is simple but powerful:
Using Large Language Models (LLMs) through Ollama and LangChain, this project translates user questions into executable SQL queries that run against a real SQLite database.
- Converts natural language questions (like “Show all customers from Austin”) into SQL queries.
- Executes those SQL queries safely on a local Bookstore database.
- Displays the results instantly in a clean Streamlit web interface.
- Provides a transparent, explainable way to explore relational data without knowing SQL syntax.
- User Input: The user types a question into the Streamlit interface.
- Schema Extraction: The app uses SQLAlchemy to read the database’s schema (tables, columns, and relationships).
- LLM Conversion: The schema and user question are passed to the Deepseek R1 8B model via LangChain and Ollama.
- SQL Generation: The model returns a syntactically valid SQL query based on the schema.
- Query Execution: The SQL is executed on a local SQLite database, and results are fetched.
- Result Display: The data is displayed in the Streamlit frontend in real-time.
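The steps above can be sketched in a few lines of Python. This is a minimal sketch, not the project's actual `App.py`: the function names (`get_schema`, `generate_sql`, `run_query`) and the prompt wording are illustrative assumptions.

```python
from sqlalchemy import create_engine, inspect, text

def get_schema(engine) -> str:
    """Schema Extraction: render each table as name(col type, ...) for the prompt."""
    insp = inspect(engine)
    lines = []
    for table in insp.get_table_names():
        cols = ", ".join(f"{col['name']} {col['type']}" for col in insp.get_columns(table))
        lines.append(f"{table}({cols})")
    return "\n".join(lines)

def generate_sql(question: str, schema: str) -> str:
    """LLM Conversion: ask the local Deepseek model for one SQL statement."""
    # Deferred import so the rest of the sketch runs without an Ollama server.
    from langchain_ollama import ChatOllama
    llm = ChatOllama(model="deepseek-r1:8b", temperature=0)
    prompt = (
        f"Given this SQLite schema:\n{schema}\n\n"
        f"Write a single SQL query that answers: {question}\n"
        "Reply with SQL only."
    )
    return llm.invoke(prompt).content.strip()

def run_query(engine, sql: str) -> list:
    """Query Execution: run the generated SQL and fetch all rows."""
    with engine.connect() as conn:
        return conn.execute(text(sql)).fetchall()
```

A Streamlit frontend then only needs to chain these three calls and render the rows.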
“Show all customers from Austin”
“What is the total revenue from all orders?”
“List products in the Fiction category costing more than 15 dollars”
and the system will:
- Generate an SQL query using the Deepseek model.
- Execute it on a local SQLite database.
- Display the results in an elegant Streamlit app.
User Query:
“Show me all Fiction books priced above 15 dollars.”
Generated SQL:
SELECT name, price FROM products WHERE category = 'Fiction' AND price > 15;

📂 text-to-sql-deepseek
├── Database.py # Creates and populates the sample SQLite database
├── App.py # Core logic: schema extraction, LLM prompt, SQL execution
├── frontend.py # Streamlit interface for user interaction
├── requirements.txt # Dependency list
└── README.md # Project documentation
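For context, `Database.py` might look roughly like the sketch below. The table layout (customers with a city, products with a category and price, orders linking the two) and all sample rows are assumptions inferred from the example questions above, not copied from the real script.

```python
import sqlite3

def create_bookstore(path: str = "bookstore.db") -> None:
    """Create and populate a small sample Bookstore database."""
    conn = sqlite3.connect(path)
    cur = conn.cursor()
    # Hypothetical schema matching the example queries in this README.
    cur.executescript("""
        CREATE TABLE IF NOT EXISTS customers (
            id INTEGER PRIMARY KEY, name TEXT, city TEXT);
        CREATE TABLE IF NOT EXISTS products (
            id INTEGER PRIMARY KEY, name TEXT, category TEXT, price REAL);
        CREATE TABLE IF NOT EXISTS orders (
            id INTEGER PRIMARY KEY,
            customer_id INTEGER REFERENCES customers(id),
            product_id INTEGER REFERENCES products(id),
            quantity INTEGER);
    """)
    cur.executemany("INSERT INTO customers (name, city) VALUES (?, ?)",
                    [("Alice", "Austin"), ("Bob", "Dallas")])
    cur.executemany("INSERT INTO products (name, category, price) VALUES (?, ?, ?)",
                    [("Dune", "Fiction", 18.99), ("1984", "Fiction", 9.99)])
    cur.executemany("INSERT INTO orders (customer_id, product_id, quantity) VALUES (?, ?, ?)",
                    [(1, 1, 2)])
    conn.commit()
    conn.close()
```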
uv init .
uv pip install -r requirements.txt
requirements.txt
streamlit
langchain
langchain_community
langchain_ollama
langchain_core
sqlalchemy
This project uses Ollama. To run the Deepseek model locally:
- Download and install Ollama
- Pull the Deepseek model
ollama pull deepseek-r1:8b
- Verify the model is available
ollama list