RustyChat is a lightweight desktop chat UI written in Rust using Dioxus. It provides a simple, focused chat interface that stores conversation history in a local SQLite database while sending requests to a local Ollama backend for model inference. The UI is dark-themed and supports per-chat naming and renaming, persistent history, model selection, and an interrupt button to stop an in-flight model response.
This project is intended as a compact, privacy-friendly GUI wrapper around local model hosting via Ollama.
- Desktop chat UI built with Dioxus.
- Persistent history stored in `chat.db` (SQLite).
- Model selection populated from Ollama's `/api/tags` endpoint.
- Settings modal to configure model, system prompt, temperature, top_p, max_tokens, and zoom.
- Dark theme with careful styling and responsive layout.
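The settings in the modal correspond to standard Ollama request options. As a hedged illustration only (field names follow Ollama's documented chat API; the model name and values are placeholders, and `num_predict` is Ollama's option name for the max-token limit), a request body might look like:

```json
{
  "model": "llama3",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Hello!" }
  ],
  "options": {
    "temperature": 0.7,
    "top_p": 0.9,
    "num_predict": 512
  },
  "stream": true
}
```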
- Dioxus — Rust desktop UI framework
- Ollama — local model hosting backend used for inference
If you haven't installed Ollama, visit https://ollama.com/ and follow the installation instructions. The app expects Ollama to be available at its default address, `http://localhost:11434`.

You can test the model list with:

```sh
curl http://localhost:11434/api/tags
```

- dioxus (and dioxus-desktop): UI framework used to build the desktop application and its components.
- reqwest: HTTP client used to call Ollama's REST API.
- rusqlite: SQLite bindings to persist chats and messages locally (`chat.db`).
- serde, serde_json: serialization and deserialization of JSON payloads exchanged with Ollama and for internal data flows.
- uuid: Generate UUIDs for chat identifiers.
- tokio (indirect; the runtime used by Dioxus): asynchronous runtime used by the async networking code.
These crates are chosen for their ergonomics and small, practical APIs for a local GUI chat app.
- Chats are stored in a `chats` table with columns `(id TEXT PRIMARY KEY, title TEXT NOT NULL)`, where `id` is a UUID string and `title` is the visible name.
- Messages are stored in a `messages` table with `(id INTEGER PRIMARY KEY AUTOINCREMENT, chat_id TEXT, role TEXT, content TEXT, timestamp DATETIME)`.
- Settings are persisted in a `settings` table (single row, id=1).
- The UI keeps a small in-memory buffer of the currently viewed chat's messages for immediate responsiveness, but assistant responses are always written to the DB. Assistant replies are pushed into the in-memory buffer only if the user is still viewing that chat when the response arrives, which prevents replies from appearing in the wrong visible chat.
- When interrupting a running request, the in-flight HTTP call is allowed to complete, but the code marks the request as cancelled and simply discards the final assistant output (no interruption message is inserted). The UI removes the "Thinking..." indicator immediately for a responsive feel.
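Put together, the schema described above corresponds to DDL along these lines. This is a sketch reconstructed from the column lists; the `IF NOT EXISTS` clauses, the `CHECK` constraint, and the settings columns hinted at in comments are assumptions:

```sql
CREATE TABLE IF NOT EXISTS chats (
    id    TEXT PRIMARY KEY,   -- UUID string
    title TEXT NOT NULL       -- visible chat name
);

CREATE TABLE IF NOT EXISTS messages (
    id        INTEGER PRIMARY KEY AUTOINCREMENT,
    chat_id   TEXT,           -- references chats.id
    role      TEXT,           -- e.g. "user" or "assistant"
    content   TEXT,
    timestamp DATETIME
);

-- Single-row settings table (id = 1).
CREATE TABLE IF NOT EXISTS settings (
    id INTEGER PRIMARY KEY CHECK (id = 1)
    -- plus columns for model, system prompt, temperature, top_p,
    -- max_tokens, zoom ...
);
```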
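The two discard rules above (drop a cancelled response entirely, and only mirror replies into the visible buffer for the chat being viewed) can be sketched with a hypothetical helper; the names here are illustrative, not the app's actual API:

```rust
use std::sync::atomic::{AtomicBool, Ordering};

/// Hypothetical sketch: decide what to do with a finished assistant reply.
/// Returns (persist_to_db, push_to_visible_buffer).
fn route_reply(cancelled: &AtomicBool, reply_chat_id: &str, viewed_chat_id: &str) -> (bool, bool) {
    if cancelled.load(Ordering::SeqCst) {
        // Interrupted: the HTTP call finished, but the output is discarded
        // entirely -- nothing is written and no interruption message is added.
        return (false, false);
    }
    // Replies are always persisted, but only mirrored into the in-memory
    // buffer when the user is still viewing the chat they belong to.
    (true, reply_chat_id == viewed_chat_id)
}

fn main() {
    let cancelled = AtomicBool::new(false);
    // Normal case, still viewing the same chat: persist and display.
    assert_eq!(route_reply(&cancelled, "chat-a", "chat-a"), (true, true));
    // User switched chats: persist, but don't show in the wrong chat.
    assert_eq!(route_reply(&cancelled, "chat-a", "chat-b"), (true, false));
    // User pressed the interrupt button: drop the output.
    cancelled.store(true, Ordering::SeqCst);
    assert_eq!(route_reply(&cancelled, "chat-a", "chat-a"), (false, false));
}
```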
Requirements:
- Rust
- Cargo
- Ollama running locally and hosting models
- Git
- Clone the repository:

```sh
git clone https://github.com/KPCOFGS/RustyChat.git
cd RustyChat
```

- Build the release binary:

```sh
dx build --release
```

- Run the app:

```sh
./target/dx/rusty-chat/release/linux/app/rusty-chat
```

Notes:

- The app creates/uses `chat.db` in the `./target/dx/rusty-chat/release/linux/app/` directory. Back it up if necessary before deleting.
Contributions welcome. Please open issues or PRs for bugs, feature requests, or improvements.
This project is licensed under the MIT License. See LICENSE file for more details.

