A lightweight desktop chat client for local LLMs. Built in Rust with egui for a minimal, fast UI.
Connects to any OpenAI API-compatible endpoint. Defaults to Ollama at localhost:11434.
- Streaming token display with stop/regenerate controls
- Markdown rendering in AI responses (code blocks, bold, lists)
- Model selector auto-populated from your endpoint
- Conversation history with SQLite persistence
- Sidebar with conversation list, search, and export
- System prompt, temperature, and max tokens settings
- Multiple saved API endpoints with per-endpoint API key support
- Works with OpenRouter, LM Studio, vLLM, and any OpenAI-compatible API
- Edit and resend previous messages
- Copy button on messages
- Token usage display
- Dark/light theme toggle
- Persistent settings via TOML config (`~/.config/hchat/config.toml`)
- Configurable font sizes and UI scale
- Cross-platform (Linux, macOS)
Download the latest release from GitHub Releases.
```sh
brew tap heath0xFF/tap
brew install hchat
```

To update to the latest release:

```sh
brew update && brew upgrade hchat
```

```sh
# Binary
tar xzf hchat-macos-arm64.tar.gz
mv hchat /usr/local/bin/

# Or use the .app bundle
unzip hChat.app.zip -d /Applications
```

```sh
sudo dpkg -i hchat_*.deb
```

```sh
# From the repo's pkg/arch directory
makepkg -si
```

Requires the Rust toolchain.

```sh
cargo run --release
```

Make sure your local LLM server is running first:

```sh
ollama serve
```

Then launch hChat:

```sh
# Linux: run detached so it doesn't tie up your terminal
hchat & disown

# macOS: open the .app bundle, or run detached
open /Applications/hChat.app
# or
hchat & disown
```

hChat connects to `http://localhost:11434/v1` by default. You can change the endpoint in the top bar.
- Click `+` next to the endpoint selector in the top bar
- Enter the API base URL (e.g. `https://openrouter.ai/api/v1`) and your API key
- Click Add, then select the new endpoint from the dropdown
- Models auto-populate from the remote API
Or configure it directly in config.toml — see example.config.toml for all options.
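As a rough sketch, an endpoint entry in `config.toml` might look like the following. This is an assumption about the schema, not the authoritative format: only `api_key` is documented above, and the `[[endpoints]]` table name and other field names here are illustrative. Check `example.config.toml` for the real field names.

```toml
# Hypothetical shape of a saved endpoint; field names other than api_key
# are illustrative. See example.config.toml for the actual schema.
[[endpoints]]
name = "openrouter"                        # illustrative field
base_url = "https://openrouter.ai/api/v1"  # illustrative field
api_key = "sk-placeholder"                 # documented; omit for local endpoints like Ollama
```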
Settings are stored in ~/.config/hchat/config.toml and persist across sessions. You can edit the file directly or use the settings panel in the app (gear icon). See example.config.toml for a fully commented example of all options.
All fields are optional. Missing fields use defaults, so existing configs won't break on upgrade.
Set font_family and mono_font_family to any font installed on your system. hChat looks up fonts by name using your platform's font system (fontconfig on Linux, Core Text on macOS). Leave empty to use egui's built-in fonts. Font changes take effect on save and restart.
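For example, assuming the Inter and JetBrains Mono families are installed on your system (any installed family name works):

```toml
# Font settings in ~/.config/hchat/config.toml; family names are examples.
font_family = "Inter"
mono_font_family = "JetBrains Mono"
```

On Linux, `fc-list : family | sort -u` prints the family names fontconfig knows about, which is a quick way to find the exact spelling to use here.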
API keys are stored per-endpoint in config.toml. Endpoints that don't need authentication (like local Ollama) simply omit the api_key field. Keys are sent as Authorization: Bearer headers.
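To illustrate the auth scheme described above, this minimal sketch builds the header hChat would send for an endpoint's key (the key value is a made-up placeholder):

```shell
# The per-endpoint key becomes a standard OpenAI-style bearer token header.
# "sk-example" is a placeholder, not a real key.
API_KEY="sk-example"
AUTH_HEADER="Authorization: Bearer ${API_KEY}"
echo "$AUTH_HEADER"

# Handy for testing an endpoint by hand, e.g.:
#   curl -H "$AUTH_HEADER" https://openrouter.ai/api/v1/models
```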
| Key | Action |
|---|---|
| Enter | Send message |
| Shift+Enter | New line |
| Ctrl+N | New conversation |