This project is a chatbot application that integrates Google Sheets data with Google's Gemini AI to provide conversational responses based on spreadsheet content. It consists of a Flask backend API and a PHP frontend interface.
## Prerequisites

- Python 3.7 or higher
- PHP 7.4 or higher
- Google Cloud service account with access to Google Sheets API
- Gemini API key from Google AI Generative Language
## Installation

1. Clone the repository or download the project files.

2. Create and activate a virtual environment:

   ```bash
   # macOS/Linux
   python3 -m venv venv
   source venv/bin/activate

   # Windows
   python -m venv venv
   .\venv\Scripts\Activate
   ```

   💡 You should now see `(venv)` at the start of your terminal prompt.

3. Install Python dependencies inside the activated environment:

   ```bash
   pip install -r requirements.txt
   ```
4. Obtain a Google Cloud service account JSON key with permissions to read Google Sheets.

5. Get your Gemini API key from Google AI Generative Language.

6. Prepare your Google Spreadsheet and note its Spreadsheet ID.
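The Spreadsheet ID is the long token in the sheet's URL between `/d/` and the next slash. A small helper (illustrative, not part of the project) can pull it out:

```python
import re
from typing import Optional

def spreadsheet_id_from_url(url: str) -> Optional[str]:
    """Extract the Spreadsheet ID from a Google Sheets URL."""
    match = re.search(r"/spreadsheets/d/([a-zA-Z0-9_-]+)", url)
    return match.group(1) if match else None

print(spreadsheet_id_from_url(
    "https://docs.google.com/spreadsheets/d/1AbC-dEf_456/edit#gid=0"
))  # → 1AbC-dEf_456
```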
## Running the Application

Run the Flask backend server on port 5001:

```bash
python app.py
```

Run the PHP built-in server on port 8000:

```bash
php -S localhost:8000 index.php
```
## Usage

1. Open your browser and navigate to `http://localhost:8000`.

2. Sign up or log in with a username and password.

3. Fill in the chatbot configuration form:
   - Chatbot Name
   - Generate or enter a Chatbot ID
   - Gemini API Key
   - Gemini Model (default: `gemini-2.0-flash`)
   - Google Spreadsheet ID
   - Paste the Service Account JSON key

4. Click Connect to authorize and list available sheets.

5. Select one or more sheets to load.

6. Click Load to Chat to start chatting with the bot based on your spreadsheet data.

7. Save your chatbot configuration for later use if desired.
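Behind the form, the PHP frontend talks to the Flask backend over HTTP. The endpoint path and field names below are assumptions for illustration only; check `app.py` for the backend's actual API contract. A sketch of assembling a chat request payload:

```python
import json

def build_chat_request(chatbot_id, message):
    """Assemble a JSON chat request body.

    The field names here are hypothetical; the real contract is
    defined by the Flask routes in app.py.
    """
    payload = {"chatbot_id": chatbot_id, "message": message}
    return json.dumps(payload)

print(build_chat_request("demo-bot", "Which rows mention Q3 sales?"))
```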
The project uses a SQLite database (`chatbots.db`) to store user credentials and chatbot configurations.
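To peek at what the backend has stored, you can open the database with Python's built-in `sqlite3` module. The table names depend on `app.py`, so list them rather than assuming them:

```python
import sqlite3

# Connect to the backend's database; sqlite3 creates the file if it is
# missing, so run this from the project root where chatbots.db lives.
conn = sqlite3.connect("chatbots.db")
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'"
)]
print("tables:", tables)
conn.close()
```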
- Ensure your Google service account has read access to the specified spreadsheet.
- The Flask backend runs on port 5001 and the PHP frontend on port 8000—make sure these ports are free.
- The chatbot uses Gemini AI to generate responses based on spreadsheet data.
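A quick way to confirm both ports are free before starting the servers (a generic socket check, not project code):

```python
import socket

def port_free(port):
    """Return True if nothing is accepting connections on localhost:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex(("127.0.0.1", port)) != 0

for port in (5001, 8000):
    print(f"port {port} free: {port_free(port)}")
```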
## Docker Support

This project includes comprehensive Docker support for easy deployment and development.
1. Clone and navigate to the project:

   ```bash
   git clone <repository-url>
   cd ChatBot
   ```

2. Start the application:

   ```bash
   # Using docker-compose (recommended)
   docker-compose up -d

   # Or using make
   make up
   ```

3. Access the application:
   - Frontend: http://localhost:8000
   - Backend API: http://localhost:5001
```bash
# Start development environment
make dev

# View logs
make logs

# Open shell in container
make shell

# Stop development environment
make down

# Build production image
make build-prod

# Start production environment
make prod

# Scale application
make scale

# Clean up containers and images
make clean

# Backup database
make backup

# Check status
make status
```
```bash
# Build the image
docker build -t chatbot:latest .

# Run the container
docker run -d \
  --name chatbot \
  -p 8000:8000 \
  -p 5001:5001 \
  -v $(pwd)/data:/app/data \
  chatbot:latest
```
```bash
# Check running containers
docker ps

# Check all containers (including stopped)
docker ps -a

# View container logs
docker logs -f chatbot

# Execute commands in running container
docker exec -it chatbot bash

# Stop container
docker stop chatbot

# Start stopped container
docker start chatbot

# Remove container
docker rm chatbot
```
1. Build for production:

   ```bash
   docker buildx build --platform linux/amd64 -t shivamnishad/chatbot:latest .
   ```

2. Push to registry:

   ```bash
   docker push shivamnishad/chatbot:latest
   ```

3. Run in production:

   ```bash
   docker-compose -f docker-compose.prod.yml up -d
   ```
Create a `.env` file based on `.env.example`:

```bash
cp .env.example .env
```
Key variables:

- `GEMINI_API_KEY`: Your Google Gemini API key
- `GEMINI_MODEL`: AI model to use (default: `gemini-2.0-flash`)
- `FLASK_ENV`: Set to `production` for production deployment
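On the Python side, such variables are typically read with `os.getenv`, falling back to the documented defaults. A minimal sketch (the backend's actual variable handling lives in `app.py`):

```python
import os

# Fall back to the documented default model when GEMINI_MODEL is unset.
gemini_api_key = os.getenv("GEMINI_API_KEY")  # required; no safe default
gemini_model = os.getenv("GEMINI_MODEL", "gemini-2.0-flash")
flask_env = os.getenv("FLASK_ENV", "development")

print("model:", gemini_model, "| env:", flask_env)
```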
The SQLite database is automatically persisted in the `data/` directory. For production, consider using PostgreSQL:

```bash
# Uncomment the PostgreSQL service in docker-compose.prod.yml
# Update DATABASE_URL in the .env file
```
- Port conflicts: Change ports in docker-compose.yml if needed
- Permission issues: Ensure data directory has proper permissions
- Database errors: Check database file permissions and location
- Memory issues: Adjust resource limits in docker-compose.prod.yml
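For the permission-related issues above, a quick check that the persisted data directory exists and is writable (run it from the project root on the host, or via `docker exec` inside the container, where the path is `/app/data`):

```python
import os

# Verify the persisted-data directory before digging into container logs.
data_dir = "data"  # host-side path; /app/data inside the container
exists = os.path.isdir(data_dir)
writable = exists and os.access(data_dir, os.W_OK)
print(f"{data_dir}/ exists: {exists}, writable: {writable}")
```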