roojs/OLLMchat

OLLMchat

Summary

OLLMchat is a work-in-progress library and embeddable widget that provides LLM access and tool integration for applications. It is written in pure Vala; the base library has no GTK dependency, while the UI components target GTK4.

  • Library - A library that can talk to Ollama and most OpenAI-compatible REST interfaces
    • libollmchat.so - Base library (no GTK dependencies)
    • libollmchat-ui.so - UI library (depends on base library, includes GTK components)
  • Technology Stack - Written in pure Vala, targeting GTK4 for the UI components
  • Tool Dependencies - Some tools rely on third-party applications (e.g., semantic code search, which lives in a separate repository)
  • Tool Calling - Supports LLM tool calling
  • Permission System - Includes a permission system for secure tool access
  • Prompt Manipulation - Provides prompt manipulation capabilities
  • Generation - Supports text generation from LLM models
  • Sample Tools - Includes working tools: ReadFile, EditFile, RunTerminalCommand
  • Embeddable Widget - Reusable chat widget (ChatWidget) that can be embedded in applications
  • Current Status - Builds two shared libraries with headers, VAPI, and GIR files. Includes test executables (test-ollama and test-window)

Demo

Demo screencast: Screencast.From.2025-11-24.11-56-37.mp4

Note: if the video does not render above, open the file directly in the repository.

Documentation

Online API documentation is available; it is generated with Valadoc, and a local copy is built into docs/ollmchat/ and docs/ollmchat-ui/.

Build Instructions

This directory contains the OLLMchat library and test applications for working with the Ollama API and prompt generation.

Building

To build the project, follow these steps:

1. Set up the build directory

From the src/OLLMchat directory, run:

meson setup build --prefix=/usr

This will configure the build system with Meson and set the installation prefix to /usr.
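The /usr prefix is only a default. If you prefer a user-local install, the prefix can be changed with standard Meson options (these are generic Meson commands, not project-specific ones):

```shell
# Install to a user-local prefix instead of /usr (no sudo needed for
# `ninja -C build install` this way).
meson setup build --prefix="$HOME/.local"

# Or change the prefix of an already-configured build directory:
meson configure build -Dprefix="$HOME/.local"
```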

2. Compile the project

After setup, compile the project using:

ninja -C build

This will build:

  • libollmchat.so - Base library (with headers, VAPI, and GIR files)
  • libollmchat-ui.so - UI library (with headers, VAPI, and GIR files)
  • test-ollama - Command-line test executable
  • test-window - GTK UI test executable
  • Valadoc documentation (in docs/ollmchat/ and docs/ollmchat-ui/)
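Before running the steps above, it can help to confirm the toolchain is present. A minimal pre-flight sketch — the tool names are the standard upstream ones (meson.build is the authoritative list of requirements, and your distro may package these under different names):

```shell
# Pre-flight check before `meson setup`: verify the required build
# tools are on PATH, printing one line per tool.
for tool in valac meson ninja pkg-config; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "found: $tool"
    else
        echo "missing: $tool (install it before building)"
    fi
done
```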

Project Structure

  • Ollama/ - Ollama API client implementation
  • Prompt/ - Prompt generation system for different agent types
  • ChatPermission/ - Permission system for tool access control
  • Tools/ - Tool implementations (ReadFile, EditFile, RunTerminalCommand, etc.)
  • UI/ - GTK UI components for chat interface
  • resources/ - Resource files including prompt templates
  • docs/ - Generated documentation (Valadoc) and implementation plans

Dependencies

Base library (libollmchat.so):

  • Gee
  • GLib/GIO
  • json-glib
  • libsoup-3.0

UI library (libollmchat-ui.so):

  • All base library dependencies
  • GTK4
  • gtksourceview-5

Test executables:

  • All dependencies above (test-window requires gtksourceview-5)
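The development packages above can be checked with pkg-config. A small sketch — the module names used here are the conventional pkg-config identifiers (gee-0.8, json-glib-1.0, etc.) and are an assumption; consult meson.build for the exact names the project requires:

```shell
# Report which of the (assumed) pkg-config modules are installed,
# one line per module.
for mod in gee-0.8 glib-2.0 gio-2.0 json-glib-1.0 libsoup-3.0 \
           gtk4 gtksourceview-5; do
    if pkg-config --exists "$mod" 2>/dev/null; then
        echo "ok: $mod"
    else
        echo "missing: $mod"
    fi
done
```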

License

This project is licensed under the GNU Lesser General Public License version 3.0 (LGPL-3.0). See the LICENSE file for details.

Note: The markdown processor code in UI/MarkdownProcessor.vala is from the Planify project and is licensed under the GNU General Public License version 3.0 (GPL-3.0) or later, Copyright © 2023 Alain M. See the file header for details.

Notes

  • The build system uses Meson and Ninja
  • Resources are compiled into the binary using GLib's resource system
  • The prompt system loads agent-specific sections from resource files
  • Libraries are built as shared libraries with C headers, VAPI files, and GObject Introspection (GIR) files
  • Valadoc documentation is automatically generated in docs/ollmchat/ and docs/ollmchat-ui/

About

Vala-based GTK and terminal desktop Ollama chat client with tools, designed for local LLM work.
