A full-stack AI chat platform with intelligent model routing, built with Flask, Tailwind CSS, and open-source AI models.
- DeepSeek Coder: Specialized for coding, debugging, and programming tasks
- Llama.cpp: Optimized for document processing, PDFs, and large files
- Vicuna: Multimodal AI for images, videos, and rich media content
- GPT4All: General-purpose conversational AI for everyday tasks
- Auto-Select: Automatically routes queries to the best model based on content type
- User authentication with registration, login, and password reset
- Rate limiting by tier (Free: 10/hour, Premium: 100/hour, Admin: 1000/hour)
- Chat history with message persistence
- Admin panel for user management and analytics
- Payment integration with Midtrans
- Email notifications via Postfix SMTP
- Dark/light mode toggle
- Markdown rendering with code syntax highlighting
- RESTful API for programmatic access
- CSRF protection
- Secure password hashing (Werkzeug)
- Session management with secure cookies
- Rate limiting (Redis-backed)
- Input sanitization
- Security headers (CSP, X-Frame-Options, etc.)
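The tiered rate limits above can be illustrated with a simple fixed-window counter. This is a minimal sketch only: the platform's actual limiter is Flask-Limiter backed by Redis, and the class and tier names below are hypothetical.

```python
import time

# Hypothetical per-tier limits mirroring the tiers above (requests per hour)
TIER_LIMITS = {"free": 10, "premium": 100, "admin": 1000}

class FixedWindowLimiter:
    """Counts requests per user within a one-hour window."""

    def __init__(self, window_seconds=3600):
        self.window = window_seconds
        self.counters = {}  # user_id -> (window_start, count)

    def allow(self, user_id, tier, now=None):
        now = time.time() if now is None else now
        start, count = self.counters.get(user_id, (now, 0))
        if now - start >= self.window:  # window expired: reset the counter
            start, count = now, 0
        if count >= TIER_LIMITS[tier]:
            return False  # over the tier's hourly quota
        self.counters[user_id] = (start, count + 1)
        return True
```

In production, a Redis-backed limiter (as Flask-Limiter provides) is preferred so counts survive restarts and are shared across workers.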
# Clone repository
git clone https://github.com/Bucin404/AI-platform.git
cd AI-platform
# Configure environment
cp .env.example .env
# Edit .env with your configuration
# Download & integrate AI models (AUTOMATIC - recommended!)
python download_models.py
# Downloads models + installs dependencies + configures platform
# Use --lite for smaller models (~1.6GB vs ~16GB)
# Start services
docker-compose up -d
# Initialize database
docker-compose exec web flask db upgrade
# Create admin user
docker-compose exec web flask create-admin
# Access at http://localhost:5000

# Use M4-optimized configuration
docker-compose -f docker-compose.m4.yml up -d
# Initialize database
docker-compose -f docker-compose.m4.yml exec web flask db upgrade
# Create admin user
docker-compose -f docker-compose.m4.yml exec web flask create-admin
# See full M4 setup guide: docs/SETUP_M4.md

Important: For local development, use .env.local.example instead of .env.example:
# Clone and setup
git clone https://github.com/Bucin404/AI-platform.git
cd AI-platform
# Create virtual environment
python3 -m venv venv
source venv/bin/activate
# Install dependencies
pip install -r requirements.txt
# Configure for LOCAL development
cp .env.local.example .env
# Edit .env - use localhost for database, or SQLite for simplicity
# Option 1: Use SQLite (no PostgreSQL needed)
# In .env: DATABASE_URL=sqlite:///app.db
# Option 2: Use local PostgreSQL
# Install PostgreSQL, create database, then:
# In .env: DATABASE_URL=postgresql://user:pass@localhost:5432/aiplatform
# Initialize database
export FLASK_APP=run.py
flask db init
flask db migrate -m "Initial migration"
flask db upgrade
# Download AI models (automatic integration!)
python download_models.py
# Run application
python run.py
# Access at http://localhost:5000

Troubleshooting? See docs/TROUBLESHOOTING.md for common issues and solutions.
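Both DATABASE_URL options above (SQLite or local PostgreSQL) are read the same way at startup. A minimal sketch of how a config module might resolve the URI; the helper name and the SQLite fallback default are assumptions, not the project's actual code:

```python
import os

def database_uri(env=os.environ):
    """Resolve the SQLAlchemy database URI from the environment.

    Falls back to SQLite when DATABASE_URL is unset (hypothetical default).
    """
    return env.get("DATABASE_URL", "sqlite:///app.db")
```

With this pattern, switching between Option 1 and Option 2 only requires editing .env, never the application code.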
# Create virtual environment
python3 -m venv venv
source venv/bin/activate
# Install dependencies
pip install -r requirements.txt
# Set up environment
cp .env.example .env
# Edit .env with your configuration
# Download & integrate AI models (AUTOMATIC!)
python download_models.py --lite # Lite version for local dev (~1.6GB)
# Or use: python download_models.py # Full version (~16GB)
# This automatically installs deps and configures the platform!
# Initialize database
export FLASK_APP=run.py
flask db init
flask db migrate -m "Initial migration"
flask db upgrade
# Create admin user
flask create-admin
# Run application
python run.py

- Installation Guide - Detailed installation instructions
- MacBook M4 Setup - Optimized setup for Apple Silicon (NEW)
- Model Download Guide - How to download and configure AI models
- Deployment Guide - Production deployment on Ubuntu
- API Documentation - REST API reference
Backend:
- Flask (Python web framework)
- PostgreSQL (Database)
- Redis (Caching & rate limiting)
- SQLAlchemy (ORM)
- Flask-Login (Authentication)
- Flask-WTF (Forms & CSRF)
- Flask-Mail (Email)
- Flask-Limiter (Rate limiting)
- Flask-Migrate (Database migrations)
Frontend:
- Tailwind CSS (Styling)
- Vanilla JavaScript
- Marked.js (Markdown rendering)
- Highlight.js (Code syntax highlighting)
AI Models:
- DeepSeek Coder
- Llama.cpp
- Vicuna
- GPT4All
Infrastructure:
- Docker & Docker Compose
- Nginx (Reverse proxy)
- Postfix (Email)
- Midtrans (Payments)
AI-platform/
├── app/
│   ├── blueprints/            # Flask blueprints
│   │   ├── auth/              # Authentication routes
│   │   ├── chat/              # Chat routes
│   │   ├── admin/             # Admin panel routes
│   │   ├── payments/          # Payment routes
│   │   └── api/               # API routes
│   ├── models/                # Database models
│   ├── services/              # Business logic services
│   │   ├── model_service.py   # AI model adapters
│   │   ├── email_service.py   # Email service
│   │   └── payment_service.py # Payment service
│   ├── templates/             # HTML templates
│   ├── static/                # Static files (CSS, JS)
│   └── utils/                 # Utility functions
├── tests/                     # Unit tests
├── docs/                      # Documentation
├── config.py                  # Configuration
├── run.py                     # Application entry point
├── requirements.txt           # Python dependencies
├── Dockerfile                 # Docker configuration
└── docker-compose.yml         # Docker Compose configuration
See .env.example for all required environment variables. Key variables:
# Flask
SECRET_KEY=your-secret-key
DEBUG=False
# Database
DATABASE_URL=postgresql://user:pass@localhost:5432/aiplatform
# Redis
REDIS_URL=redis://localhost:6379/0
# Email (Postfix)
MAIL_SERVER=localhost
MAIL_PORT=587
MAIL_USERNAME=
MAIL_PASSWORD=
# Midtrans Payment
MIDTRANS_SERVER_KEY=your-server-key
MIDTRANS_CLIENT_KEY=your-client-key
MIDTRANS_IS_PRODUCTION=False
# Admin
ADMIN_EMAIL=admin@aiplatform.com
ADMIN_PASSWORD=changeme

# Run all tests
pytest
# Run with coverage
pytest --cov=app tests/
# Run specific test file
pytest tests/test_auth.py

import requests
# Login
session = requests.Session()
session.post('http://localhost:5000/auth/login', data={
    'email': 'user@example.com',
    'password': 'password'
})

# Send chat message with auto model selection
response = session.post('http://localhost:5000/api/chat', json={
    'message': 'Write a Python function to sort a list',
    'model': 'auto'  # Will automatically use DeepSeek for coding
})
print(response.json())

The platform intelligently routes queries based on content detection:
| Query Type | Model | Example Keywords |
|---|---|---|
| Coding | DeepSeek Coder | code, function, debug, python, javascript |
| Documents | Llama.cpp | pdf, document, file, csv |
| Images/Videos | Vicuna | image, photo, video, analyze |
| General | GPT4All | weather, explain, tell me |
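The routing table above can be sketched as a simple keyword matcher. This is illustrative only: the model names follow the table, but the function, keyword sets, and first-match precedence are assumptions, not the platform's actual implementation.

```python
# Hypothetical keyword lists mirroring the routing table above
ROUTES = [
    ("deepseek-coder", {"code", "function", "debug", "python", "javascript"}),
    ("llama.cpp",      {"pdf", "document", "file", "csv"}),
    ("vicuna",         {"image", "photo", "video", "analyze"}),
]

def select_model(message):
    """Return the first model whose keywords appear in the message.

    GPT4All is the general-purpose fallback when nothing matches.
    """
    words = set(message.lower().split())
    for model, keywords in ROUTES:
        if words & keywords:
            return model
    return "gpt4all"
```

Ordering matters in a first-match scheme like this; a production router would likely score all candidates (or use a classifier) rather than stop at the first keyword hit.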
Contributions are welcome! Please feel free to submit a Pull Request.
MIT License - see LICENSE file for details
- GitHub Issues: https://github.com/Bucin404/AI-platform/issues
- Email: support@aiplatform.com
Built with ❤️ using open-source AI models and modern web technologies.