
AI Chat Verse

A unified multi-LLM chat interface for seamless switching between AI providers and models. Built for simplicity and focused on chat completions.



Features

  • Multi-Provider Support - Switch between OpenAI, Z.AI, and DeepSeek in one interface
  • Model Flexibility - Choose from various models per provider
  • Session Management - Persistent chat history across sessions
  • Lightweight Design - Focused exclusively on chat completions
  • Easy Configuration - JSON-based provider and model management

Supported Providers

OpenAI

  • gpt-4o-mini - GPT-4o Mini (Budget)
  • gpt-5-mini - GPT-5 Mini (2025 Budget)
  • gpt-4o - GPT-4o
  • gpt-4.1 - GPT-4.1

Z.AI

  • glm-4.7 - GLM-4.7
  • glm-4.6 - GLM-4.6

DeepSeek

  • deepseek-chat - DeepSeek-V3.2 (General/Chat)
  • deepseek-reasoner - DeepSeek-V3.2 (Thinking/R1)

Customization: Edit src/data/ProviderModels.json to modify providers and models.
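For orientation, an entry in src/data/ProviderModels.json might look roughly like the sketch below. The field names used here (providers, id, label, models) are assumptions for illustration only; check the file in the repository for the actual schema before editing.

```json
{
  "providers": [
    {
      "id": "openai",
      "label": "OpenAI",
      "models": [
        { "id": "gpt-4o-mini", "label": "GPT-4o Mini (Budget)" },
        { "id": "gpt-4o", "label": "GPT-4o" }
      ]
    }
  ]
}
```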


Quick Start

Clone the repository

git clone https://github.com/marksxiety/ai-chat-verse.git
cd ai-chat-verse

Install dependencies

npm install

Configure API keys

cp .env.example .env

Update .env with your credentials:

OPENAI_API_KEY=your_openai_api_key_here
ZAI_API_KEY=your_zai_api_key_here
DEEPSEEK_API_KEY=your_deepseek_api_key_here
VITE_PORT=3001  # Optional; defaults to 3001
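As a quick sanity check after editing .env, a small shell loop can confirm that each provider key has a value. The key names are taken from the example above; this helper is an ad-hoc snippet, not part of the repository.

```shell
# Warn about any provider key still missing a value in .env.
# (Ad-hoc helper, not part of the repository.)
for key in OPENAI_API_KEY ZAI_API_KEY DEEPSEEK_API_KEY; do
  grep -q "^${key}=." .env 2>/dev/null || echo "warning: ${key} is not set in .env"
done
```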

Usage

Development Mode

UI only (Frontend with hot-reload):

npm run dev

Access at http://localhost:5173

Server only (Backend API):

npm run dev:server

Access at http://localhost:3001

Full stack (Both UI and server):

npm run dev:all

Access at http://localhost:5173

Production Mode

Build application:

npm run build

Run production server (serves built frontend + API):

npm run dev:server

Access at http://localhost:3001

Note: The production build serves both the frontend and backend through the Express server on port 3001.

Docker Deployment

Pull from GitHub Container Registry:

docker pull ghcr.io/marksxiety/ai-chat-verse:latest
docker run -p 3001:3001 \
  -e OPENAI_API_KEY=your_key \
  -e ZAI_API_KEY=your_key \
  -e DEEPSEEK_API_KEY=your_key \
  ghcr.io/marksxiety/ai-chat-verse:latest

Docker Compose (Recommended):

cp .env.example .env
docker-compose up --build
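The repository ships its own docker-compose.yml; for orientation, a minimal compose file for this setup would look roughly like the sketch below. The service name and layout here are assumptions — refer to the checked-in file for the real configuration.

```yaml
services:
  app:
    build: .
    ports:
      - "3001:3001"
    env_file:
      - .env
```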

Docker (Build from source):

docker build -t ai-chat-verse .
docker run -p 3001:3001 \
  -e OPENAI_API_KEY=your_key \
  -e ZAI_API_KEY=your_key \
  -e DEEPSEEK_API_KEY=your_key \
  ai-chat-verse

Access at http://localhost:3001


Current Limitations

This project is currently focused on core chat functionality:

  • Supports only chat completions (no image generation, audio processing, etc.)
  • Limited to providers and models listed above

License

This project is licensed under the MIT License - see the LICENSE file for details.
