Open-source AI chatbot: an open-source take on ChatGPT that runs any local or hosted LLM, with web search and custom modes to tailor responses.
- Hybrid AI control in one UI: Switch between local and hosted models without leaving the chat flow.
- OpenRouter = model superhub: Tap into OpenAI, Anthropic, Google, Qwen, MiniMax, and dozens more through a single key and use their free tiers to prototype fast.
- Offline-first with LM Studio: Run everything locally by loading compatible models in LM Studio and keeping keys private.
- Persona-driven customization: Drop new prompt files to craft modes tailored to any workflow or teaching style.
- LM Studio: Download the installer for your OS from the official Downloads page.
- Node.js + npm: Install the latest LTS release for your OS from the official Node.js downloads page.
- Clone the repo: `git clone https://github.com/Dom-ML/open_chat.git && cd open_chat`
- Install dependencies: `npm install`
- Create a `.env` with `AI_GATEWAY_API_KEY`, `OPENROUTER_API_KEY`, and `LMSTUDIO_BASE_URL=http://localhost:1234/v1`.
- Start LM Studio with the models you want available (match the names listed under each provider).
- Run the dev server: `npm run dev`
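Given the variables listed above, a minimal `.env` might look like this (the values below are placeholders — substitute your own keys):

```
# Gateway key for hosted models — placeholder, use your own
AI_GATEWAY_API_KEY=your-ai-gateway-key
# OpenRouter key for the multi-provider hub — placeholder
OPENROUTER_API_KEY=your-openrouter-key
# Local LM Studio server (its default Serve address)
LMSTUDIO_BASE_URL=http://localhost:1234/v1
```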
- Default local model: `mlx-community/LFM2-2.6B-4bit`. In LM Studio, search for this model, download it, then hit Serve so it runs on `http://localhost:1234/v1` (see the LM Studio docs for details).
- To swap models later, update the first entry in `modelsByProvider.lmstudio` inside `src/app/api/config/route.ts` so its `value` matches the Serve ID you expose from LM Studio.
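The exact shape of `modelsByProvider` depends on the current code, but the entry you would edit might look roughly like this (field names beyond `value` are assumptions — check `src/app/api/config/route.ts`):

```typescript
// Hypothetical sketch — the real modelsByProvider in
// src/app/api/config/route.ts may use different field names.
const modelsByProvider = {
  lmstudio: [
    // The first entry is the default; `value` must match the Serve ID
    // LM Studio exposes on http://localhost:1234/v1.
    { value: "mlx-community/LFM2-2.6B-4bit", name: "LFM2 2.6B (4-bit)" },
  ],
};
```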
- Providers live in `src/app/api/config/route.ts`; add models by extending the `modelsByProvider` list for the provider you use.
- Streaming chat logic is in `src/app/api/chat/route.ts`, which reads your provider choice and system prompt.
- Prompt modes come from markdown files in `src/prompts`; copy one, edit the front matter, choose any Lucide icon name, and reload to see it.
- UI helpers for fetching provider data sit in `src/lib/ai-config.ts`, and prompt loading is cached in `src/lib/prompts-server.ts`.
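The prompt-loading step implies splitting each mode file into front matter and body. A simplified parser for that step could look like this (a sketch only — the real `prompts-server.ts` may use a dedicated library instead):

```typescript
// Simplified front-matter parser: splits a prompt markdown file into
// key/value metadata (id, name, icon, ...) and the persona instructions.
// Hypothetical helper — not necessarily how prompts-server.ts does it.
function parseFrontMatter(raw: string): { meta: Record<string, string>; body: string } {
  const match = raw.match(/^---\n([\s\S]*?)\n---\n?([\s\S]*)$/);
  if (!match) return { meta: {}, body: raw }; // no front matter block
  const meta: Record<string, string> = {};
  for (const line of match[1].split("\n")) {
    const idx = line.indexOf(":");
    if (idx > 0) meta[line.slice(0, idx).trim()] = line.slice(idx + 1).trim();
  }
  return { meta, body: match[2].trim() };
}
```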
- Next.js app ready for Vercel hosting or local dev.
- UI layer built with shadcn’s `@ai-elements` components.
- AI SDK suite: `ai`, `@ai-sdk/react`, `@ai-sdk/openai-compatible`, and `@openrouter/ai-sdk-provider`, plus `zod` for validation.
- Copy one of the markdown files in `src/prompts`, change the `id` and `name`, and write the persona instructions under the front matter.
- Keep the instructions clear and scannable: define the tone, outline the response structure (bullets, summaries, steps), and give 1-2 concrete dos/don’ts so the model stays on brief.
- Pick an icon by matching any entry from the Lucide icon list (https://lucide.dev/icons) or by browsing `node_modules/lucide-react`, then drop the plain name into the `icon` field.
- Restart the dev server or reload the page so the cache in `prompts-server` picks up the new file.
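Putting those steps together, a new mode file (say `src/prompts/code-reviewer.md` — a hypothetical name; copy an existing file so any extra front-matter keys carry over) could look like:

```markdown
---
id: code-reviewer
name: Code Reviewer
icon: bug
---
You are a rigorous but friendly code reviewer.
- Tone: direct, specific, never dismissive.
- Structure: one-line summary first, then bullet findings ordered by severity, then suggested fixes.
- Do: quote the exact line you are commenting on.
- Don't: rewrite whole files when a targeted change will do.
```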
- Persistent storage
- MCP support

