This is a demonstration of how to use reason-and-act (ReAct) prompting with llama.cpp and an LLM to pose plain-English queries to a SQLite database, using one of two strategies:
- Mimic interaction with a frontend like Datasette. Actions: list tables, list table columns, facet, filter.
- Let the LLM write SQL queries directly. Actions: list tables, list table schema, execute SQL.
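The actions in the second strategy map naturally onto Python's built-in `sqlite3` module. A minimal sketch of what such action functions might look like (an illustration against an in-memory database, not the project's actual implementation):

```python
import sqlite3

def list_tables(conn):
    # Query sqlite_master for the names of all user tables
    rows = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    return [r[0] for r in rows]

def table_schema(conn, table):
    # Return the CREATE TABLE statement SQLite stored for this table
    row = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table' AND name = ?",
        (table,),
    ).fetchone()
    return row[0] if row else None

def execute_sql(conn, query):
    # Run an arbitrary query from the model and return the rows
    return conn.execute(query).fetchall()

# Demo against a throwaway in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('Ada'), ('Grace')")
print(list_tables(conn))                                   # ['users']
print(execute_sql(conn, "SELECT name FROM users ORDER BY name"))
```

Each function returns plain Python values that can be serialized back into the prompt as the observation for the model's next reasoning step.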
The things you'll need to do are:
- Provide a SQLite database (named `example.db`, or change the name in the Python files).
- Change the prompts in both Python scripts (the `prompt` string inside the `execute` functions) to be specific to your data and problems. You'll also want to update the `DATA_HELP` table and column descriptions in `run-sql-queries.py`.
- Download a GGUF model. By default the scripts look for `dolphin-2.2.1-mistral-7b.Q5_K_M.gguf` in the current directory. To use a different model, edit the script you're running.
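To give a feel for the `DATA_HELP` customization step, here is one hypothetical shape such a structure could take — table and column names here are made up; check `run-sql-queries.py` for the real structure:

```python
# Hypothetical DATA_HELP: human-written descriptions of tables and
# columns that get folded into the prompt so the model understands
# what the data means. The table/column names below are examples only.
DATA_HELP = {
    "users": {
        "_description": "One row per registered user.",
        "name": "The user's display name.",
        "signup_date": "ISO-8601 date the account was created.",
    },
}

def describe_tables(data_help):
    # Flatten the nested descriptions into prompt-ready text
    lines = []
    for table, cols in data_help.items():
        lines.append(f"Table {table}: {cols.get('_description', '')}")
        for col, desc in cols.items():
            if col != "_description":
                lines.append(f"  - {col}: {desc}")
    return "\n".join(lines)

print(describe_tables(DATA_HELP))
```

The more concrete these descriptions are for your own data, the less the model has to guess about column semantics when it writes queries.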
First, install the project's dependencies using pip:
    pip install -r requirements.txt
Once you have everything installed and configured, you can kick off a session by coming up with a question and asking it on the command line:
    python run_interface.py "What kind of data do I have available?"
    python llm_sql_queries.py "What are some interesting records in the database?"
The model output will be printed to stdout.
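Internally, a ReAct-style session alternates model output with tool calls: the script parses an "action" out of the model's text, runs it, and feeds the result back in as an observation. A minimal, model-free sketch of that dispatch step — the `Action: name(args)` format and the stub tools here are assumptions for illustration, not the scripts' actual protocol:

```python
import re

def dispatch(output, actions):
    # Look for a line like: Action: execute_sql(SELECT ...)
    match = re.search(r"Action:\s*(\w+)\((.*)\)", output)
    if not match:
        return None  # no action found: treat the text as a final answer
    name, arg = match.group(1), match.group(2)
    if name not in actions:
        return f"Unknown action: {name}"
    # Call the tool, with or without an argument
    return actions[name](arg) if arg else actions[name]()

# Stubbed tools standing in for real database actions
actions = {
    "list_tables": lambda: ["users", "orders"],
    "execute_sql": lambda q: f"ran: {q}",
}
print(dispatch("Action: list_tables()", actions))           # ['users', 'orders']
print(dispatch("Action: execute_sql(SELECT 1)", actions))   # 'ran: SELECT 1'
```

In the real scripts the loop continues — appending each observation to the prompt and re-invoking the model — until the model produces an answer instead of another action.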