chat-o-llama 🦙

Your lightweight, private, local AI chatbot (no GPU needed).

A lightweight yet powerful web interface for Ollama with markdown rendering, syntax highlighting, and intelligent conversation management.


✨ Features

  • 💬 Multiple Conversations - Create, manage, and rename chat sessions
  • 📚 Persistent History - SQLite database storage with search functionality
  • 🤖 Model Selection - Choose from downloaded Ollama models (see the example below)
  • 🚀 Lightweight - Minimal resource usage for local development
  • 📝 Full Markdown Rendering - GitHub-flavored syntax support
  • 📊 Response Metrics - Time, tokens, and speed tracking
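
Model selection works off whatever models Ollama already has downloaded locally. As a quick sanity check outside the app (using Ollama's standard CLI and its HTTP API on the default port 11434), you can list them yourself:

ollama list                              # models available for selection in the UI
curl -s http://localhost:11434/api/tags  # same information via Ollama's HTTP API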

🚀 30-Second Quick Start

For most users (auto-install):

curl -fsSL https://github.com/ukkit/chat-o-llama/raw/main/install.sh | bash
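
If you'd rather review the installer before piping it into bash, you can download the same script, read it, and then run it:

curl -fsSL https://github.com/ukkit/chat-o-llama/raw/main/install.sh -o install.sh
less install.sh     # review what the script will do
bash install.sh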

What happens?

  • Installs Python and Ollama if missing (this can take a while)
  • Downloads the recommended model (~380MB)
  • Installs and starts chat-o-llama
  • Access it at: http://localhost:3000
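
Once the installer finishes, a quick way to check that both pieces are up (assuming the default ports above):

ollama list                     # Ollama is reachable and has at least one model
curl -I http://localhost:3000   # the chat-o-llama web interface responds
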
🔧 Advanced Setup (Manual Install)

For detailed manual installation steps, see install.md.

git clone https://github.com/ukkit/chat-o-llama.git
cd chat-o-llama
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
./chat-manager.sh start
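
For later sessions the same steps apply: re-activate the virtual environment, then start the app. As the quick fix further below shows, chat-manager.sh also accepts an alternate port:

cd chat-o-llama
source venv/bin/activate
./chat-manager.sh start         # default port (3000, as in the quick start)
./chat-manager.sh start 8080    # or pass a free port if 3000 is busy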

📸 Screenshots

📷 App Screenshots

  • First screen after installation
  • New chat screen with the default model
  • Chat bubble with a reply from the model
  • Markdown rendering in chat
  • Selecting from the list of available models
  • Styling for model "thinking" output

πŸ› οΈ Need Help?

Quick Fixes:

  • Port in use? → ./chat-manager.sh start 8080
  • No models? → ollama pull tinyllama (see the example below)
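
For the second fix, you can confirm the pull worked before retrying the app (tinyllama is just the model suggested above; any model from the Ollama library behaves the same way):

ollama pull tinyllama   # download the model
ollama list             # confirm it now shows up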

📚 Documentation Links

Document | Description
Installation Guide | Installation guide
Features | Detailed features guide
Startup & Process Guide | Startup and process management via chat-manager.sh
Config Guide | Configuration guide
Config Comparison | Compare different configs
API Guide | API guide
Troubleshooting Guide | Troubleshooting guide

βœ”οΈ Tested On (Hardware)

Device | CPU | RAM | OS
Raspberry Pi 4 Model B Rev 1.4 | ARM Cortex-A72 | 8 GB | Raspberry Pi OS
Dell Optiplex 3070 | i3-9100T | 8 GB | Debian 12
Nokia Purebook X14 | i5-10210U | 16 GB | Windows 11 Home

πŸ™ Acknowledgments

Made with ❤️ for the AI community

⭐ Star this project if you find it helpful!


MIT License - see LICENSE file.
