OLLMchat is a work-in-progress library and embeddable widget that provides LLM access and tool integration for applications. The project focuses on Vala and GTK4, with the main library written in pure Vala.
- Library - A library that can talk to Ollama and most OpenAI-compatible REST interfaces
  - `libollmchat.so` - Base library (no GTK dependencies)
  - `libollmchat-ui.so` - UI library (depends on the base library, includes GTK components)
- Technology Stack - Written in pure Vala, targeting GTK4 for the UI
- Tool Dependencies - Some tools will rely on third-party applications (e.g., semantic code search, which lives in a separate repository)
- Tool Calling - Supports tool calling functionality
- Permission System - Includes a permission system for secure tool access
- Prompt Manipulation - Provides prompt manipulation capabilities
- Generation - Supports text generation from LLM models
- Sample Tools - Includes working tools: ReadFile, EditFile, RunTerminalCommand
- Embeddable Widget - Reusable chat widget (`ChatWidget`) that can be embedded in applications
- Current Status - Builds two shared libraries with headers, VAPI, and GIR files. Includes test executables (`test-ollama` and `test-window`)
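As a rough illustration of embedding the widget, the sketch below shows a minimal GTK4 application in Vala. The `OLLMchat` namespace and the `ChatWidget` default constructor are assumptions; check the ollmchat-ui API reference for the actual names and signatures.

```vala
// Hypothetical sketch: embedding the chat widget in a GTK4 application.
// The OLLMchat namespace and ChatWidget constructor are assumptions.
int main (string[] args) {
    var app = new Gtk.Application ("org.example.ChatDemo",
                                   GLib.ApplicationFlags.FLAGS_NONE);
    app.activate.connect (() => {
        var window = new Gtk.ApplicationWindow (app);
        window.title = "OLLMchat demo";
        // Place the reusable chat widget as the window's only child.
        window.set_child (new OLLMchat.ChatWidget ());
        window.present ();
    });
    return app.run (args);
}
```

Compile with `valac --pkg gtk4 --pkg ollmchat-ui` (package names assumed to match the installed VAPI files).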
Demo screencast: `Screencast.From.2025-11-24.11-56-37.mp4`
Online API documentation is available:
- ollmchat API Reference - Base library documentation
- ollmchat-ui API Reference - UI library documentation
This directory contains the OLLMchat library and test applications for working with the Ollama API and prompt generation.
To build the project, follow these steps:
From the src/OLLMchat directory, run:
```
meson setup build --prefix=/usr
```
This configures the build system with Meson and sets the installation prefix to `/usr`.
After setup, compile the project using:
```
ninja -C build
```
This will build:
- `libollmchat.so` - Base library (with headers, VAPI, and GIR files)
- `libollmchat-ui.so` - UI library (with headers, VAPI, and GIR files)
- `test-ollama` - Command-line test executable
- `test-window` - GTK UI test executable
- Valadoc documentation (in `docs/ollmchat/` and `docs/ollmchat-ui/`)
- `Ollama/` - Ollama API client implementation
- `Prompt/` - Prompt generation system for different agent types
- `ChatPermission/` - Permission system for tool access control
- `Tools/` - Tool implementations (ReadFile, EditFile, RunTerminalCommand, etc.)
- `UI/` - GTK UI components for the chat interface
- `resources/` - Resource files including prompt templates
- `docs/` - Generated documentation (Valadoc) and implementation plans
Base library (libollmchat.so):
- Gee
- GLib/GIO
- json-glib
- libsoup-3.0
UI library (libollmchat-ui.so):
- All base library dependencies
- GTK4
- gtksourceview-5
Test executables:
- All dependencies above (`test-window` requires gtksourceview-5)
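The dependency lists above correspond to standard pkg-config modules. As a hypothetical sketch of how they might be declared in `meson.build` (the exact module versions used by this project are assumptions):

```meson
# Hypothetical meson.build fragment; module names/versions are assumptions.
base_deps = [
  dependency('glib-2.0'),
  dependency('gio-2.0'),
  dependency('gee-0.8'),
  dependency('json-glib-1.0'),
  dependency('libsoup-3.0'),
]
ui_deps = base_deps + [
  dependency('gtk4'),
  dependency('gtksourceview-5'),
]
```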
This project is licensed under the GNU Lesser General Public License version 3.0 (LGPL-3.0). See the LICENSE file for details.
Note: The markdown processor code in UI/MarkdownProcessor.vala is from the Planify project and is licensed under the GNU General Public License version 3.0 (GPL-3.0) or later, Copyright © 2023 Alain M. See the file header for details.
- The build system uses Meson and Ninja
- Resources are compiled into the binary using GLib's resource system
- The prompt system loads agent-specific sections from resource files
- Libraries are built as shared libraries with C headers, VAPI files, and GObject Introspection (GIR) files
- Valadoc documentation is automatically generated in `docs/ollmchat/` and `docs/ollmchat-ui/`
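To illustrate the GResource note above: with GLib's resource system, files such as prompt templates are listed in a `*.gresource.xml` manifest and compiled into the binary (typically via `glib-compile-resources` or Meson's `gnome.compile_resources()`). The prefix and file names below are assumptions, not this project's actual manifest:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical manifest; the prefix and file names are assumptions. -->
<gresources>
  <gresource prefix="/org/example/ollmchat">
    <file>prompts/agent-sections.txt</file>
  </gresource>
</gresources>
```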