If OpenChat is useful to you, consider leaving a star – it helps others find the project.
Your AI assistant. Your rules. Your experience.
Today you can access models that answer your questions, but the experience is always someone else's. Fixed UI. Fixed prompts. Fixed personality. OpenChat is different: it gives you full ownership of the chat experience, running on whatever model you choose, deployed wherever you want, shaped exactly how you want it.
OpenChat is a free, open source chat UI with persistent session management, multi-provider LLM support, and cited sources built in. Bring your own model (Groq, OpenAI, Anthropic, Google Gemini, or a local Ollama instance) and get a production-quality research assistant that you fully control. Configure the personality, swap the model, deploy to your own infrastructure. No locked-in subscriptions, no data sent to third parties you didn't choose.
Screenshots: overview, chat with cited sources, and settings.
Most chat products give you a great experience – until they change it. You have no say in the model, the personality, or where your conversations go. OpenChat flips that:
- You pick the model. OpenAI, Anthropic, Gemini, Ollama, Groq: swap with one env var.
- You own the experience. Override the system prompt to tune the AI's personality, tone, and focus.
- You own your data. Conversations live in your own PostgreSQL database. Nothing leaves your infrastructure.
- You control the deployment. Run locally, on Docker, or in the cloud. SAP BTP Cloud Foundry guide included.
- Multi-provider LLM support: Groq, OpenAI, Anthropic, Google Gemini, and Ollama, with zero code changes to switch
- Persistent sessions: full chat history stored in PostgreSQL; survives restarts and reloads
- Session sidebar: browse, switch, and create conversations from the sidebar
- Real-time streaming: responses stream live with a typing cursor indicator
- Cited sources: every AI response includes a collapsible sources banner; no unverified claims
- Configurable personality: override the default system prompt via `LLM_SYSTEM_PROMPT`
- Dark-mode UI: clean, minimal interface built with Next.js 16 and Tailwind CSS v4
- Cloud Foundry ready: manifest and deployment guide for SAP BTP
- Node.js β₯ 20
- PostgreSQL β₯ 14 (running locally or via Docker)
- An API key for your chosen LLM provider
```bash
git clone https://github.com/SentorLabs/openchat.git
cd openchat
npm install
```

Copy the example and fill in your values:

```bash
cp .env.example .env.local
```

```bash
# Choose your provider: groq | openai | anthropic | gemini | ollama
LLM_PROVIDER=groq

# Model name for your chosen provider
LLM_MODEL=llama-3.3-70b-versatile

# API key (not required for Ollama)
LLM_API_KEY=your_api_key_here

# PostgreSQL connection string
DATABASE_URL=postgresql://postgres:password@localhost:5432/openchat

# Optional: override the default system prompt
# LLM_SYSTEM_PROMPT="You are a helpful assistant focused on software engineering."
```

Create the database:

```bash
createdb openchat
```

OpenChat automatically creates the required tables on first run; no manual migrations needed.

```bash
npm run dev
```

Open http://localhost:3000.
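node-postgres consumes the `DATABASE_URL` string directly, so nothing in OpenChat requires you to parse it yourself. Still, when wiring the connection string into a new environment it can help to sanity-check its parts; Node's built-in WHATWG `URL` parser handles the `postgresql://` scheme. This helper is purely illustrative and is not part of the project:

```typescript
// Hypothetical helper: break a PostgreSQL connection string into parts.
// Useful only for logging a redacted form or cross-checking deploy config;
// node-postgres accepts the raw string as-is.
interface DbConfig {
  host: string;
  port: number;
  user: string;
  database: string;
}

function describeDatabaseUrl(raw: string): DbConfig {
  const url = new URL(raw); // WHATWG URL handles postgresql:// in Node
  return {
    host: url.hostname,
    port: url.port ? Number(url.port) : 5432, // Postgres default port
    user: decodeURIComponent(url.username),
    database: url.pathname.replace(/^\//, ""),
  };
}
```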
| Provider | `LLM_PROVIDER` | Example `LLM_MODEL` | Key required |
|---|---|---|---|
| Groq | `groq` | `llama-3.3-70b-versatile` | Yes |
| OpenAI | `openai` | `gpt-4o` | Yes |
| Anthropic | `anthropic` | `claude-opus-4-6` | Yes |
| Google Gemini | `gemini` | `gemini-2.0-flash` | Yes |
| Ollama (local) | `ollama` | `llama3.2` | No |
For Ollama, also set `OLLAMA_BASE_URL` if your instance isn't on `http://localhost:11434`.
| Variable | Required | Default | Description |
|---|---|---|---|
| `LLM_PROVIDER` | Yes | `groq` | LLM backend to use |
| `LLM_MODEL` | Yes | – | Model name for your provider |
| `LLM_API_KEY` | Yes* | – | API key (*not needed for Ollama) |
| `DATABASE_URL` | Yes | – | PostgreSQL connection string |
| `LLM_SYSTEM_PROMPT` | No | Built-in | Override the AI's system prompt |
| `OLLAMA_BASE_URL` | No | `http://localhost:11434` | Ollama base URL |
The most powerful feature of OpenChat is the ability to define exactly what kind of assistant you want. Set `LLM_SYSTEM_PROMPT` to anything:

```bash
# A focused coding assistant
LLM_SYSTEM_PROMPT="You are a senior software engineer. Answer only technical questions with code examples. Be concise."

# A research assistant with strict citation rules
LLM_SYSTEM_PROMPT="You are a research analyst. Always cite primary sources. Never state unverified facts."

# A customer support agent for your product
LLM_SYSTEM_PROMPT="You are a support agent for Acme Corp. Only answer questions about our product. Be friendly and brief."
```

No redeploy needed: just update the env var and restart.
```bash
docker build -t openchat .

docker run -p 3000:3000 \
  -e LLM_PROVIDER=groq \
  -e LLM_MODEL=llama-3.3-70b-versatile \
  -e LLM_API_KEY=your_key \
  -e DATABASE_URL=postgresql://... \
  openchat
```

See DEPLOY.md for the complete step-by-step guide, including PostgreSQL deployment as a CF app and network policy configuration.
Quick summary:
```bash
# 1. Push Postgres as an internal CF app
cf push -f deploy-postgres.yml

# 2. Create the service binding
cf create-user-provided-service openchat-db -p '{"uri":"postgresql://..."}'

# 3. Push OpenChat
cf push -f manifest.yml

# 4. Network policy + env vars
cf add-network-policy openchat --destination-app postgres-db --port 5432 --protocol tcp
cf set-env openchat LLM_PROVIDER groq
cf set-env openchat LLM_MODEL llama-3.3-70b-versatile
cf set-env openchat LLM_API_KEY your_key
cf restage openchat
```

```text
src/
├── app/
│   ├── api/
│   │   ├── chat/route.ts            # Streaming chat endpoint
│   │   ├── sessions/route.ts        # List & create sessions
│   │   └── sessions/[id]/
│   │       └── messages/route.ts    # Load session messages
│   ├── about/page.tsx               # About page
│   ├── releases/page.tsx            # Changelog
│   ├── page.tsx                     # Main chat UI
│   ├── layout.tsx
│   └── globals.css
├── components/
│   ├── Nav.tsx                      # Top navigation bar
│   ├── Sidebar.tsx                  # Session list panel
│   └── SourcesBanner.tsx            # Collapsible sources banner
└── lib/
    ├── db.ts                        # PostgreSQL pool + schema init
    ├── llm.ts                       # Unified multi-provider LLM adapter
    ├── sources.ts                   # Parse & strip [SOURCES] blocks
    └── types.ts                     # Shared TypeScript interfaces
```
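To make the "parse & strip [SOURCES] blocks" role of `lib/sources.ts` concrete, here is a hypothetical sketch. The actual block format is internal to OpenChat; this version assumes a trailing `[SOURCES] ... [/SOURCES]` section with one URL per line, which is an assumption, not the project's real format:

```typescript
// Hypothetical sketch of what lib/sources.ts might do: separate the
// assistant's answer text from a trailing sources block so the UI can
// render the collapsible sources banner.
interface ParsedResponse {
  content: string;   // response text with the sources block stripped
  sources: string[]; // URLs extracted from the block
}

function parseSources(raw: string): ParsedResponse {
  const match = raw.match(/\[SOURCES\]([\s\S]*?)\[\/SOURCES\]\s*$/);
  if (!match) return { content: raw.trim(), sources: [] };
  const sources = match[1]
    .split("\n")
    .map((line) => line.replace(/^[-*]\s*/, "").trim()) // drop list bullets
    .filter((line) => line.length > 0);
  return { content: raw.slice(0, match.index).trim(), sources };
}
```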
Contributions are welcome and appreciated. OpenChat is built for the community: if you run your own AI assistant, you know what's missing.
- Fork the repository
- Create a feature branch: `git checkout -b feat/your-feature`
- Make your changes and add tests where relevant
- Ensure the build passes: `npm run build`
- Open a pull request with a clear description of the change
- New LLM providers: add support for Mistral, Cohere, Together AI, etc.
- UI improvements: message formatting, markdown rendering, code highlighting
- Export features: download chat history as PDF or Markdown
- Auth support: multi-user mode with authentication
- Docker Compose: a ready-to-run compose file with Postgres included
- Bug fixes: check the issues tab
- TypeScript strict mode: no `any` without a comment explaining why
- Keep components focused: one responsibility per file
- Environment-driven configuration: no hardcoded credentials or model names
- Test your provider integration locally before submitting
Please open an issue with:
- Steps to reproduce
- Expected vs actual behaviour
- Your `LLM_PROVIDER` and Node.js version
- Any relevant log output
OpenChat auto-creates these tables on first run:
```sql
CREATE TABLE sessions (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  title TEXT NOT NULL DEFAULT 'New Chat',
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE TABLE messages (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  session_id UUID NOT NULL REFERENCES sessions(id) ON DELETE CASCADE,
  role TEXT NOT NULL CHECK (role IN ('user','assistant')),
  content TEXT NOT NULL,
  sources JSONB,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
```

| Layer | Technology |
|---|---|
| Framework | Next.js 16 (App Router) |
| Language | TypeScript 5 |
| Styling | Tailwind CSS v4 |
| Database | PostgreSQL (via node-postgres) |
| LLM providers | Groq, OpenAI, Anthropic, Google Gemini, Ollama |
| Runtime | Node.js β₯ 20 |
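The `messages` table shown earlier maps naturally onto the shared interfaces in `lib/types.ts`. A hypothetical sketch of that row-to-transcript conversion (the `Message` shape here is an assumption, not the project's actual types):

```typescript
// Hypothetical sketch: convert raw `messages` rows (as returned by
// node-postgres, which parses the JSONB column for you) into a typed
// transcript. Field names follow the schema; the Message shape is assumed.
interface MessageRow {
  id: string;
  session_id: string;
  role: string;
  content: string;
  sources: string[] | null; // JSONB column; null when no sources were cited
  created_at: Date;
}

interface Message {
  id: string;
  role: "user" | "assistant";
  content: string;
  sources: string[];
  createdAt: Date;
}

function toTranscript(rows: MessageRow[]): Message[] {
  return rows.map((row) => {
    if (row.role !== "user" && row.role !== "assistant") {
      // Mirrors the CHECK constraint on the role column.
      throw new Error(`unexpected role: ${row.role}`);
    }
    return {
      id: row.id,
      role: row.role,
      content: row.content,
      sources: row.sources ?? [],
      createdAt: row.created_at,
    };
  });
}
```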
Apache 2.0 – see LICENSE for details.
You are free to use, modify, and distribute OpenChat for any purpose, including commercial use. If you distribute derivative works, include a copy of the license and preserve attribution notices. If you build something great on top of it, we'd love to hear about it.
OpenChat is a community project. Thank you to everyone who has contributed ideas, bug reports, and code. Built with Next.js, Tailwind CSS, and the open source AI ecosystem.
Take control of your AI experience.



