LLM Playground

A full-stack application for experimenting with multiple language models (OpenAI GPT, Anthropic Claude) through an interactive web interface. Built with React, TypeScript, Fastify, and Prisma.

Features

Core Functionality

  • 🤖 Multi-model support - GPT-4, GPT-4 Turbo, GPT-3.5 Turbo, Claude 3.5 Sonnet, Claude 3 Opus/Haiku
  • 💬 Real-time streaming responses - Server-Sent Events (SSE) with progressive content rendering
  • 🎛️ Adjustable parameters - Fine-tune temperature, max tokens, and top_p for each conversation
  • 💾 Conversation persistence - SQLite database with full conversation history and auto-save
  • 📊 Token counting & cost tracking - Real-time token usage and cost estimation per message and conversation

User Experience

  • ⌨️ Keyboard shortcuts - Ctrl+B (sidebar), Ctrl+K (parameters), Ctrl+N (new chat), Esc (close panels)
  • 🎨 Modern UI - Polished interface with Tailwind CSS, shadcn/ui components, and smooth animations
  • πŸ“ Markdown rendering - Full GitHub-flavored markdown with syntax highlighting via rehype-highlight
  • πŸ“‹ Code block copy - One-click copy functionality for code snippets with visual feedback
  • πŸ”„ Conversation management - Create, rename, delete, and organize your chat history with kebab menus
  • 🎯 Empty state guidance - Helpful suggestion cards for getting started

Tech Stack

Frontend:

  • React 18 + TypeScript
  • Vite (build tool & dev server)
  • Tailwind CSS + shadcn/ui components
  • Zustand (state management)
  • React Markdown + rehype-highlight (markdown rendering)
  • Lucide React (icons)

Backend:

  • Fastify + TypeScript
  • Prisma ORM + SQLite
  • OpenAI SDK (GPT models)
  • Anthropic SDK (Claude models)
  • Server-Sent Events (SSE) for streaming

Infrastructure:

  • Turborepo (monorepo management)
  • pnpm workspaces
  • ESLint + Prettier (code quality)

Prerequisites

  • Node.js >= 18.0.0
  • pnpm >= 8.0.0
  • At least one LLM API key (OpenAI or Anthropic)

Getting Started

Quick Start (Recommended)

# 1. Install dependencies
pnpm install

# 2. Set up environment variables
./setup.sh
# This will copy .env.example files and prompt for API keys

# 3. Initialize the database
cd packages/backend
pnpm prisma migrate dev
pnpm prisma generate
cd ../..

# 4. Start development servers (both frontend & backend)
pnpm dev

The frontend will be available at http://localhost:5173 and the backend at http://localhost:3000.
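
To confirm the backend started correctly and sees at least one configured provider, you can query the models endpoint from another terminal (the exact response shape isn't documented here, so treat this as a quick sanity check rather than a contract):

# Quick sanity check: list the models the backend currently exposes
curl -s http://localhost:3000/api/models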

Manual Setup

If you prefer manual setup:

# 1. Install dependencies
pnpm install

# 2. Configure backend environment
cd packages/backend
cp .env.example .env
# Edit .env and add your API keys:
# OPENAI_API_KEY=sk-...
# ANTHROPIC_API_KEY=sk-ant-...

# 3. Set up the database
pnpm prisma migrate dev
pnpm prisma generate

# 4. Configure frontend environment (optional)
cd ../frontend
cp .env.example .env
# Edit .env if you need to change the API URL

# 5. Start development servers
cd ../..
pnpm dev

Build for Production

pnpm build

Development Commands

Common Commands

pnpm dev              # Run both frontend and backend
pnpm build            # Build all packages
pnpm lint             # Lint all packages
pnpm format           # Format code with Prettier
pnpm typecheck        # TypeScript type checking

Frontend Commands

cd packages/frontend
pnpm dev              # Vite dev server (http://localhost:5173)
pnpm build            # Production build
pnpm preview          # Preview production build
pnpm lint             # ESLint

Backend Commands

cd packages/backend
pnpm dev              # Fastify with hot reload (http://localhost:3000)
pnpm build            # Compile TypeScript
pnpm start            # Run compiled version

Database Commands

cd packages/backend
pnpm prisma migrate dev      # Create and apply migrations
pnpm prisma generate         # Generate Prisma client
pnpm prisma studio           # Open Prisma Studio GUI
pnpm prisma db push          # Push schema changes without migration

Project Structure

llm-playground/
β”œβ”€β”€ packages/
β”‚   β”œβ”€β”€ frontend/              # React application
β”‚   β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”‚   β”œβ”€β”€ components/       # UI components (shadcn/ui)
β”‚   β”‚   β”‚   β”œβ”€β”€ hooks/            # Custom React hooks
β”‚   β”‚   β”‚   β”œβ”€β”€ store/            # Zustand state management
β”‚   β”‚   β”‚   β”œβ”€β”€ services/         # API client
β”‚   β”‚   β”‚   β”œβ”€β”€ types/            # TypeScript types
β”‚   β”‚   β”‚   └── lib/              # Utilities and helpers
β”‚   β”‚   └── package.json
β”‚   β”‚
β”‚   └── backend/               # Fastify API server
β”‚       β”œβ”€β”€ src/
β”‚       β”‚   β”œβ”€β”€ routes/           # API endpoints (chat, models, conversations)
β”‚       β”‚   β”œβ”€β”€ services/         # Business logic (LLM, database)
β”‚       β”‚   β”œβ”€β”€ types/            # TypeScript types
β”‚       β”‚   └── utils/            # Utility functions
β”‚       β”œβ”€β”€ prisma/
β”‚       β”‚   β”œβ”€β”€ schema.prisma     # Database schema
β”‚       β”‚   β”œβ”€β”€ migrations/       # Database migrations
β”‚       β”‚   └── dev.db            # SQLite database (dev)
β”‚       └── package.json
β”‚
β”œβ”€β”€ package.json               # Root package with Turborepo
β”œβ”€β”€ pnpm-workspace.yaml        # pnpm workspace config
└── turbo.json                 # Turborepo build config

API Endpoints

Chat & Models

  • POST /api/chat - Stream chat completions via SSE
  • GET /api/models - List available models based on configured API keys
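
A rough sketch of exercising the chat endpoint from the command line is shown below. The body fields (model, messages, temperature) are illustrative assumptions, not a documented schema; check packages/backend/src/routes for the actual request shape.

# Stream a chat completion over SSE (-N disables curl's output buffering so events appear as they arrive).
# Field names in the JSON body are assumptions; verify them against the backend route handlers.
curl -N -X POST http://localhost:3000/api/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4", "messages": [{"role": "user", "content": "Hello!"}], "temperature": 0.7}'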

Conversations

  • GET /api/conversations - List all conversations with metadata
  • POST /api/conversations - Create a new conversation
  • PATCH /api/conversations/:id - Update conversation (e.g., rename)
  • DELETE /api/conversations/:id - Delete conversation and all messages
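
A hedged example of the conversation endpoints; the "title" field is an assumption, so confirm the accepted fields in the backend route handlers before relying on it:

# Create a conversation ("title" is an illustrative field name)
curl -s -X POST http://localhost:3000/api/conversations \
  -H "Content-Type: application/json" \
  -d '{"title": "My first chat"}'

# Rename it (replace :id with the id returned above)
curl -s -X PATCH http://localhost:3000/api/conversations/:id \
  -H "Content-Type: application/json" \
  -d '{"title": "Renamed chat"}'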

Messages

  • GET /api/conversations/:id/messages - Get all messages for a conversation
  • POST /api/conversations/:id/messages - Add a message to a conversation
  • PATCH /api/conversations/:conversationId/messages/:messageId - Update message (content, tokens, cost)
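
And for messages, again with illustrative field names rather than a documented schema:

# Fetch all messages in a conversation (replace :id with a real conversation id)
curl -s http://localhost:3000/api/conversations/:id/messages

# Append a message (role/content are assumptions; verify against the backend types)
curl -s -X POST http://localhost:3000/api/conversations/:id/messages \
  -H "Content-Type: application/json" \
  -d '{"role": "user", "content": "Hello!"}'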

Environment Variables

Backend (.env)

PORT=3000                          # Server port
NODE_ENV=development               # Environment (development/production)
OPENAI_API_KEY=sk-...             # OpenAI API key (optional)
ANTHROPIC_API_KEY=sk-ant-...      # Anthropic API key (optional)
DATABASE_URL=file:./dev.db        # SQLite database path (Prisma)

Frontend (.env)

VITE_API_URL=http://localhost:3000  # Backend API URL

Note: At least one API key (OpenAI or Anthropic) must be configured for the application to function.

Development Status

✅ Completed (Phases 1-3)

  • Monorepo setup with TypeScript and Turborepo
  • Chat UI with MessageInput, MessageDisplay, ConversationList
  • Fastify API with SSE streaming
  • LLM provider abstraction (OpenAI + Anthropic)
  • Streaming API integration in frontend
  • Parameter controls UI (temperature, max_tokens, top_p)
  • SQLite + Prisma for conversation persistence
  • Token counting and cost estimation
  • Markdown rendering with syntax highlighting
  • Code block copy functionality
  • Conversation management (create, rename, delete)
  • Keyboard shortcuts
  • Auto-save functionality with progressive persistence

🚧 Future Enhancements (Phase 4+)

  • System prompt editor
  • Export conversations (JSON, Markdown)
  • Search within conversations
  • User authentication
  • Multi-user support with API key management
  • Rate limiting and usage quotas
  • Comprehensive test coverage
  • Docker containerization
  • Production deployment (Railway/Render + Vercel)
  • Database migration to PostgreSQL for production

License

MIT
