This guide will help you set up the environment, run the databases, ingest data, and start the backend and frontend applications.
Before you begin, make sure you have the following installed:
- Python 3.12
- pip
- Docker
- Docker Compose
Install the required Python dependencies:
```bash
pip install uv
uv pip install -r requirements.txt
```

Create a .env file inside both the src/ and the notebook/ folders:
```
src/
└── .env
notebook/
└── .env
```
Add the following variables to each .env file:
```
GROQ_API_KEY=<your_api_key>
GROQ_API_URL=https://api.groq.com/openai/v1
POSTGRES_URL=postgresql://myuser:mypassword@localhost:5432/postgres  # Local PostgreSQL
```

🔑 Replace <your_api_key> and the database credentials with your actual values.
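The applications and notebooks read these values at runtime. As a quick check that your .env is being picked up, here is a minimal sketch that assumes the python-dotenv package is installed (it is not listed in the prerequisites above, so treat this as purely illustrative):

```python
# Minimal sketch: loading the .env values in Python (assumes python-dotenv is installed).
import os
from dotenv import load_dotenv

# Reads the .env file in the current working directory (e.g. src/ or notebook/).
load_dotenv()

groq_api_key = os.environ["GROQ_API_KEY"]
groq_api_url = os.environ["GROQ_API_URL"]
postgres_url = os.environ["POSTGRES_URL"]

# Fail early with a clear message if the API key is still the placeholder value.
if groq_api_key.startswith("<"):
    raise ValueError("GROQ_API_KEY still contains the placeholder value")
```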
Start the databases using Docker Compose:
```bash
cd docker
docker compose up
```

This will spin up both the Qdrant and PostgreSQL services.
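Before ingesting data, you can optionally confirm that both services are reachable. The sketch below assumes the psycopg2 and qdrant-client packages are installed and that Qdrant is exposed on its default port 6333; adjust the URLs if your docker-compose.yml maps different ports.

```python
# Optional sanity check that PostgreSQL and Qdrant are up
# (assumes psycopg2 and qdrant-client are installed).
import os
import psycopg2
from qdrant_client import QdrantClient

# Connection strings; POSTGRES_URL comes from the .env file,
# the Qdrant URL assumes the default port mapping in docker-compose.yml.
postgres_url = os.environ["POSTGRES_URL"]
qdrant_url = "http://localhost:6333"

# PostgreSQL: a trivial query proves the connection works.
with psycopg2.connect(postgres_url) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print("PostgreSQL OK:", cur.fetchone())

# Qdrant: listing collections proves the HTTP API is reachable.
client = QdrantClient(url=qdrant_url)
print("Qdrant OK:", client.get_collections())
```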
- Navigate to the notebook folder.
- Run the following Jupyter notebooks in order:
  1. sql_data_ingestion.ipynb → for relational database ingestion (an illustrative sketch of this step follows the list).
  2. train_test_to_sql.ipynb → for vector database ingestion.

  ⚠️ Make sure to uncomment the vanna train cell before running.
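As referenced above, the following sketch only illustrates what the relational ingestion step amounts to; it is not the notebook's actual code. It assumes pandas and SQLAlchemy are available, and the file name data.csv and table name my_table are hypothetical.

```python
# Illustrative only: a common pattern for loading tabular data into PostgreSQL
# (not the notebook's exact code).
import os
import pandas as pd
from sqlalchemy import create_engine

# POSTGRES_URL comes from the .env file; "data.csv" and "my_table" are hypothetical names.
engine = create_engine(os.environ["POSTGRES_URL"])
df = pd.read_csv("data.csv")

# Write the DataFrame into PostgreSQL, replacing the table if it already exists.
df.to_sql("my_table", engine, if_exists="replace", index=False)
print(f"Loaded {len(df)} rows into my_table")
```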
- Open the 3_indexing.ipynb notebook.
- Run the cells through to the last one to start the Text to SQL system.
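If you want to experiment with the same idea outside the notebook, the sketch below shows the general shape of the final step: sending a natural-language question to the Groq endpoint (which is OpenAI-compatible) and getting SQL back. The openai client package and the model name are assumptions, not project requirements, and this is a simplified illustration rather than the project's actual pipeline.

```python
# Simplified illustration of the Text to SQL flow (not the project's actual pipeline).
import os
from openai import OpenAI

# GROQ_API_URL is OpenAI-compatible, so the standard OpenAI client works against it.
client = OpenAI(api_key=os.environ["GROQ_API_KEY"], base_url=os.environ["GROQ_API_URL"])

question = "How many rows are in my_table?"  # hypothetical table from the ingestion sketch

# Ask the model to translate the question into SQL; the model name is an assumption.
response = client.chat.completions.create(
    model="llama-3.1-8b-instant",
    messages=[
        {"role": "system", "content": "Translate the user's question into a single PostgreSQL query. Return only SQL."},
        {"role": "user", "content": question},
    ],
)
print(response.choices[0].message.content)
```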