
WebUI

A clean, efficient web interface for interacting with language models running locally via Ollama. No cloud services, no accounts required. Just you and your local AI models.

WebUI screenshot

Built With

  • Vite + React
  • TypeScript
  • Tailwind
  • HeroUI

Features

Focused on efficiency in a local environment, with customizable system prompts and no required setup beyond Ollama itself.

  • CHAT: Multiple conversations with auto-generated titles, streaming responses, and live reasoning
  • MODELS: Browse local Ollama models with parameter size, quantization, and loaded status
  • IMAGES: Send images to multimodal models
  • SETTINGS: Custom system message, initial greeting, and context window override
  • STATS: Live token counts, tokens/sec, and context usage on every message
  • HISTORY: Persistent storage in your browser. Why overcomplicate things?
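The streaming and stats features map onto Ollama's /api/chat endpoint, which returns newline-delimited JSON chunks; the final chunk (done: true) carries token counts and timing. As a rough sketch of how a client could accumulate streamed text and derive tokens/sec (the chunk field names follow Ollama's API; the state shape and function names here are illustrative, not this project's actual code):

```typescript
// Shape of one NDJSON chunk from Ollama's /api/chat stream.
// prompt_eval_count / eval_count / eval_duration appear on the final chunk.
interface ChatChunk {
  message?: { role: string; content: string };
  done: boolean;
  prompt_eval_count?: number; // tokens consumed from the prompt
  eval_count?: number;        // tokens generated
  eval_duration?: number;     // generation time in nanoseconds
}

// Feed each NDJSON line into shared state: append streamed text,
// and compute tokens/sec once the terminal chunk arrives.
function consumeChunk(
  state: { text: string; tokensPerSec?: number },
  line: string,
): void {
  const chunk = JSON.parse(line) as ChatChunk;
  if (chunk.message) state.text += chunk.message.content;
  if (chunk.done && chunk.eval_count && chunk.eval_duration) {
    // eval_duration is reported in nanoseconds, so convert to seconds.
    state.tokensPerSec = chunk.eval_count / (chunk.eval_duration / 1e9);
  }
}
```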

Getting Started

The recommended setup is self-hosting. A hosted version is also available if you'd prefer not to run another local service, though it needs an Ollama CORS tweak to connect to your local instance.

All chat data lives in your browser's localStorage. Nothing is sent anywhere except your local Ollama instance.
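A localStorage-backed history needs nothing more than serializing each conversation under a stable key. A minimal sketch, assuming a key-per-conversation layout (the "webui:chat:" prefix and the Conversation shape are hypothetical, not the project's actual schema; the KVStore interface just mirrors the part of the Web Storage API these helpers use):

```typescript
// Hypothetical conversation shape for illustration.
interface Conversation {
  id: string;
  title: string;
  messages: { role: "user" | "assistant"; content: string }[];
}

// Minimal Storage-like interface so the logic also runs outside a browser;
// in the browser, window.localStorage satisfies it directly.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const PREFIX = "webui:chat:"; // hypothetical key prefix

function saveConversation(store: KVStore, convo: Conversation): void {
  store.setItem(PREFIX + convo.id, JSON.stringify(convo));
}

function loadConversation(store: KVStore, id: string): Conversation | null {
  const raw = store.getItem(PREFIX + id);
  return raw ? (JSON.parse(raw) as Conversation) : null;
}
```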

Self-hosting

Requirements:

  • Node.js v20+
  • Ollama running locally
  • pnpm

Quick start:

pnpm install
pnpm build
pnpm start

The application runs at http://localhost:3000. Ollama allows localhost origins by default, so no extra CORS configuration is needed.
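If the UI can't see your models, it may help to confirm Ollama is reachable before digging into CORS. One way to probe it from code, using Ollama's /api/tags model-listing endpoint on its default port (the function name and default base URL are illustrative):

```typescript
// Returns true if a local Ollama instance answers on its HTTP API.
// /api/tags is Ollama's model-listing endpoint; 11434 is Ollama's default port.
async function ollamaReachable(base = "http://localhost:11434"): Promise<boolean> {
  try {
    const res = await fetch(`${base}/api/tags`);
    return res.ok;
  } catch {
    // Network failure (nothing listening) rejects the fetch promise.
    return false;
  }
}
```

The same check works from a terminal with `curl http://localhost:11434/api/tags`.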

Hosted version

You can try the hosted version at https://webui.kerajarvi.com/. Note that by default it cannot connect to your local Ollama instance because of Ollama's CORS settings. To allow the connection, start Ollama with the OLLAMA_ORIGINS environment variable set to the hosted origin:

OLLAMA_ORIGINS=https://webui.kerajarvi.com ollama serve

This step is only needed for the hosted version. Self-hosters can skip it.

Development

pnpm dev           # development server
pnpm typecheck     # TypeScript checks
pnpm build         # production build
pnpm start         # production server (requires build first)
pnpm format        # format with Prettier
pnpm format:check  # check formatting

License

Licensed under the GNU AGPL v3. Feel free to contribute or fork.
