This project is a web-based dashboard that displays real-time data from a Microsoft SQL Server database. It uses a Flask backend to serve a web interface and provide an API, with the SQL Server instance running in a Docker container for easy setup. It also now includes an integrated AI assistant, powered by a local Ollama model, to provide analysis and answer questions about the cooling system data.
- User Authentication: Secure signup and login functionality.
- Real-Time Data Display: The dashboard fetches and displays the latest entry from the `CoolingData` table every second.
- Containerized SQL Server: Uses Microsoft SQL Server in a Docker container for portability.
- Flask Backend: Lightweight Python-based web server to handle API requests and serve the frontend.
- AI-Powered Chat: An integrated chat interface (powered by Ollama and gemma3) that can answer questions about the cooling system and retrieve real-time data.
- Backend: Flask (Python), Ollama (for local LLM inference)
- Database: Microsoft SQL Server (Docker)
- Python Libraries: `pyodbc`, `SQLAlchemy`, `pandas`, `Flask-Cors`, `python-dotenv`, `werkzeug`, `ollama`
- Frontend: HTML, CSS, JavaScript (served by Flask)
 
Ensure the following software is installed:
- Python 3.8+: Download Python
- Docker Desktop: Download Docker
- Git: Download Git
- SQL Database GUI Tool (choose one):
  - Azure Data Studio (recommended, cross-platform)
  - Beekeeper Studio (cross-platform)
  - SQL Server Management Studio (SSMS) (Windows-only)
- Ollama: Download Ollama (for running the local AI model)
 
Clone the repository and navigate to the project folder:

```
git clone <your-repository-url>
cd <project-folder>
```

Create a `requirements.txt` file in the project root with the following:
```
Flask
pyodbc
python-dotenv
Flask-Cors
werkzeug
pandas
SQLAlchemy
ollama
```
Use a virtual environment to manage dependencies:

```
python3 -m venv venv
```

Activate the virtual environment:

- macOS/Linux: `source venv/bin/activate`
- Windows: `.\venv\Scripts\activate`
 
Install dependencies:

```
pip install -r requirements.txt
```

Create a `.env` file in the project root with the following content. Do not commit this file to version control:

```
SERVER=localhost
DATABASE=CoolingSystemDB
USERNAME=sa
PASSWORD=YourStrongP@ssw0rd!
```
PASSWORD must meet SQL Server's complexity requirements (uppercase, lowercase, numbers, symbols).
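The backend can read these values with python-dotenv and assemble a pyodbc connection string from them. A minimal sketch of that pattern (the exact handling in `app.py` may differ; `build_conn_str` is a hypothetical helper, and the driver name assumes ODBC Driver 17 from the setup steps below):

```python
import os


def build_conn_str(server: str, database: str, username: str, password: str) -> str:
    """Assemble a pyodbc connection string from the .env values."""
    return (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        f"SERVER={server},1433;DATABASE={database};"
        f"UID={username};PWD={password};"
    )


# In the app, the values would come from the .env file, e.g. via
# python-dotenv: load_dotenv(), then os.getenv("SERVER"), etc.
conn_str = build_conn_str(
    os.getenv("SERVER", "localhost"),
    os.getenv("DATABASE", "CoolingSystemDB"),
    os.getenv("USERNAME", "sa"),
    os.getenv("PASSWORD", "YourStrongP@ssw0rd!"),
)
```

The resulting string is what `pyodbc.connect(conn_str)` would receive.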
Run the following command to start the SQL Server container:

```
docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=YourStrongP@ssw0rd!" -p 1433:1433 --name sql_server_dashboard -d mcr.microsoft.com/mssql/server:2022-latest
```

- `-p 1433:1433`: Maps the container's SQL Server port to your local machine.
- `--name sql_server_dashboard`: Names the container.
The pyodbc library requires an ODBC driver to connect to SQL Server.

For Windows:

- Download and install ODBC Driver 17 for SQL Server.
- Installing SSMS often includes the necessary drivers.

For macOS (using Homebrew):

```
brew install unixodbc
brew tap microsoft/mssql-release https://github.com/Microsoft/homebrew-mssql-release
brew install msodbcsql17
```

For Linux, follow the Microsoft ODBC Driver installation instructions.
Open your SQL GUI tool (Azure Data Studio, Beekeeper Studio, or SSMS) and create a connection:

- Server/Hostname: `localhost`
- Port: `1433`
- Authentication Type: SQL Login
- Username: `sa`
- Password: `YourStrongP@ssw0rd!` (or as set in `.env`)
Run these scripts in your SQL GUI tool:

Script 1: Create the Database

```
CREATE DATABASE CoolingSystemDB;
```

Script 2: Create the users Table

Ensure you are using the CoolingSystemDB database:

```
USE CoolingSystemDB;
GO
CREATE TABLE users (
    username NVARCHAR(50) PRIMARY KEY,
    password NVARCHAR(255) NOT NULL,
    email NVARCHAR(255) NOT NULL,
    firstName NVARCHAR(100),
    lastName NVARCHAR(100),
    age INTEGER
);
```

- Download the Cooling Tower Optimization Dataset from Kaggle.
- Save `cooling_tower_data.csv` in the project root.
- Run the import script (with the virtual environment active):

```
python import_kaggle_data.py
```

This script creates the `CoolingData` table and uploads the CSV data.
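The core of such an import can be sketched with pandas and SQLAlchemy (an illustration of the pattern, not the project's actual `import_kaggle_data.py`; the MSSQL URL in the comment is an assumption based on the `.env` values):

```python
import pandas as pd
from sqlalchemy import create_engine


def import_csv(csv_path: str, db_url: str, table: str = "CoolingData") -> int:
    """Read the Kaggle CSV and bulk-insert it into the database.

    Returns the number of rows written.
    """
    df = pd.read_csv(csv_path)
    engine = create_engine(db_url)
    # if_exists="replace" lets the import be re-run from scratch
    df.to_sql(table, engine, if_exists="replace", index=False)
    return len(df)


# Against the Dockerized SQL Server, db_url would look roughly like:
#   "mssql+pyodbc://sa:<password>@localhost:1433/CoolingSystemDB"
#   "?driver=ODBC+Driver+17+for+SQL+Server"
# (special characters in the password must be URL-encoded).
```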
This project uses a locally-run Ollama instance to power the AI chat feature.

- Install Ollama: Ensure you have installed Ollama (from the Prerequisites).
- Run the Ollama Service: Launch the Ollama application. It must be running in the background for the chat feature to work.
- Pull the AI Model: The application is configured to use the `gemma3` model. Open your terminal and run:

```
ollama pull gemma3
```
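To give a sense of how the chat feature can ground its answers in live readings, here is a sketch: `build_messages` is a hypothetical helper (not necessarily how the app formats its prompts) that pairs the user's question with the latest `CoolingData` row before handing both to the `ollama` Python client:

```python
def build_messages(question: str, latest_row: dict) -> list:
    """Combine the user's question with the newest CoolingData reading
    so the model can ground its answer (hypothetical prompt format)."""
    context = ", ".join(f"{k}={v}" for k, v in latest_row.items())
    return [
        {"role": "system", "content": f"Latest cooling data: {context}"},
        {"role": "user", "content": question},
    ]


# With the Ollama service running, the app could then call:
#   import ollama
#   reply = ollama.chat(model="gemma3", messages=build_messages(q, row))
```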
 
Before starting the web server, ensure your two background services (SQL Server and Ollama) are running.
If the container is stopped, restart it:
```
docker start sql_server_dashboard
```

Ensure the Ollama application (which you set up in Step 7) is running in the background.
Ensure your virtual environment is active:
- macOS/Linux: `source venv/bin/activate`
- Windows: `.\venv\Scripts\activate`
 
Run the Flask app:

```
python app.py
```

Open your browser and navigate to `http://127.0.0.1:5001`. Use the signup page to create an account, then log in to view the real-time dashboard.
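Since werkzeug is among the dependencies, the signup flow presumably stores password hashes rather than plain text in the `users` table (the 255-character `password` column fits a werkzeug hash). A minimal sketch of that pattern, assuming the app uses werkzeug's security helpers:

```python
from werkzeug.security import check_password_hash, generate_password_hash

# On signup: hash the password before INSERTing into the users table.
hashed = generate_password_hash("s3cret-example")

# On login: compare the submitted password against the stored hash.
assert check_password_hash(hashed, "s3cret-example")
assert not check_password_hash(hashed, "wrong-password")
```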
This project uses the Cooling Tower Optimization Dataset by Ziya on Kaggle.
- Author: Ziya
- Source: Kaggle