
Lightweight, standalone, multi-platform, and privacy focused local LLM chat interface with optional encryption


ConfiChat Logo

CI build status badges: Windows · Linux · Android · macOS · iOS






Welcome to ConfiChat – a multi-platform, privacy-focused LLM chat interface with optional encryption of chat history and assets.

ConfiChat gives you the flexibility to run fully offline or to blend offline and online capabilities:

  • Offline providers such as Ollama and LlamaCpp keep your data private by running entirely on your local machine or network, with no cloud services involved (see the sketch after this list).
  • Online providers such as OpenAI and Anthropic offer cutting-edge models via their APIs, which carry different privacy policies than their consumer chat products, giving you greater control over your data.
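
For illustration only – this is not ConfiChat source code – here is a minimal Dart sketch of how an offline provider is typically reached. Ollama exposes an HTTP API on localhost (default port 11434), so prompts and responses never leave your machine. The model name 'llama3' is an assumption; use any model you have pulled locally.

```dart
// Sketch (not ConfiChat code): querying a locally running Ollama server.
// Assumes `ollama serve` is running on its default port and that a model
// named 'llama3' has already been pulled.
import 'dart:convert';

import 'package:http/http.dart' as http;

Future<void> main() async {
  final response = await http.post(
    Uri.parse('http://localhost:11434/api/generate'),
    headers: {'Content-Type': 'application/json'},
    body: jsonEncode({
      'model': 'llama3',
      'prompt': 'Why does local inference help with privacy?',
      'stream': false, // return one JSON object instead of a token stream
    }),
  );

  final data = jsonDecode(response.body) as Map<String, dynamic>;
  print(data['response']); // the reply, generated entirely on your machine
}
```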

📦 1. Downloads

We provide pre-built binaries/executables for various platforms, making it easy to get started quickly.

Note for macOS and iOS users: Binaries are not provided due to platform restrictions. Please see the Compiling on your own section.

Note for Windows users: You may encounter a SmartScreen warning because the binaries aren't signed. Binaries downloaded directly from the Releases section are built via GitHub CI, and you can view the full build logs. And of course, you can always build from source.

❤️ If you find this app useful, consider sponsoring us through GitHub Sponsors to help us secure the certificates and accounts needed for future binary distributions.

💼 If your company needs a bespoke version with robust enterprise features, Contact Us.


📖 2. Quick Start Guides

If you're completely new to offline LLMs, check out this easy Three-Step guide – a no-coding, no-dependencies way to get started (ConfiChat setup included).

You can also get started quickly with ConfiChat by following one of our quick start guides depending on whether you want to use local models, online models, or both.


💬 3. About ConfiChat

ConfiChat is a lightweight, multi-platform chat interface designed with privacy and flexibility in mind. It supports both local and online providers.

Unlike other solutions that rely on Docker and a suite of heavy tools, ConfiChat is a standalone app that lets you focus on the models themselves rather than maintaining the UI. This makes it an ideal choice for users who prefer a streamlined, efficient interface.

All chat sessions are managed locally by the app as individual JSON files, with optional encryption available for added security.
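
The exact on-disk format isn't documented here, so the following is a hypothetical Dart sketch of what writing one unencrypted session file could look like; the field names, the 'sessions/' folder, and the encryption comment are illustrative assumptions rather than ConfiChat's actual schema.

```dart
// Hypothetical sketch (not ConfiChat's real schema): one chat session
// persisted as its own JSON file on the local filesystem.
import 'dart:convert';
import 'dart:io';

void main() {
  final session = {
    'title': 'Trip planning',
    'provider': 'ollama',
    'model': 'llama3',
    'messages': [
      {'role': 'user', 'content': 'Suggest a three-day itinerary for Kyoto.'},
      {'role': 'assistant', 'content': 'Day 1: start at Fushimi Inari...'},
    ],
  };

  // With optional encryption enabled, this plaintext JSON would be
  // encrypted before it is written to disk.
  final file = File('sessions/trip-planning.json');
  file.parent.createSync(recursive: true);
  file.writeAsStringSync(const JsonEncoder.withIndent('  ').convert(session));
}
```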

Local LLMs are particularly beneficial for applications requiring offline access, low-latency responses, or the handling of sensitive data that must remain on your device. They also provide more customization and privacy for niche tasks, such as journaling or private counseling.

In a nutshell, ConfiChat caters to users who value transparent control over their AI experience.


✨ 4. Key Features

  • Cross-Platform Compatibility: Developed in Flutter, ConfiChat runs on Windows, Linux, Android, macOS, and iOS.

  • Local Model Support (Ollama and LlamaCpp): Ollama and LlamaCpp both support a range of lightweight, open-source local models, such as Llama by Meta, Gemma by Google, and Llava for multimodal/image support. These models are designed to run efficiently even on machines with limited resources.

  • OpenAI and Anthropic Support: Seamlessly integrates with OpenAI and Anthropic to provide advanced language model capabilities using your own API key (see the sketch after this list). Note that while the API does not store conversations the way ChatGPT does, OpenAI retains input data for abuse monitoring. You can review their latest data retention and security policies; in particular, see "How does OpenAI handle data retention and monitoring for API usage?" in their FAQ (https://openai.com/enterprise-privacy/).

  • Privacy-Focused: Privacy is at the core of ConfiChat's development. The app is designed to prioritize user confidentiality, with optional chat history encryption ensuring that your data remains secure.

  • Lightweight Design: Optimized for performance with minimal resource usage.
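
To make the "bring your own API key" point above concrete, here is a rough Dart sketch (again, not ConfiChat source) of a direct request to OpenAI's Chat Completions endpoint. The model name 'gpt-4o-mini' and the OPENAI_API_KEY environment variable are assumptions; substitute whichever model and key-management approach you actually use.

```dart
// Sketch (not ConfiChat code): calling the OpenAI API with your own key.
// The key is read from the environment rather than hard-coded.
import 'dart:convert';
import 'dart:io';

import 'package:http/http.dart' as http;

Future<void> main() async {
  final apiKey = Platform.environment['OPENAI_API_KEY'];

  final response = await http.post(
    Uri.parse('https://api.openai.com/v1/chat/completions'),
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer $apiKey',
    },
    body: jsonEncode({
      'model': 'gpt-4o-mini', // example model name; use any your key allows
      'messages': [
        {'role': 'user', 'content': 'Summarise the benefits of local LLMs.'},
      ],
    }),
  );

  final data = jsonDecode(response.body) as Map<String, dynamic>;
  print(data['choices'][0]['message']['content']);
}
```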


🛠️ 5. Compiling your own build

For those who prefer to compile ConfiChat themselves, or for macOS and iOS users, we provide detailed instructions in the Compiling on your own section.


🤝 6. Contributing

We welcome contributions from the community! Whether you're interested in adding new features, fixing bugs, or improving documentation, your help is appreciated. Please see our Contributing Guide for more details.


💖 7. Sponsorship

Your support helps us maintain and improve ConfiChat. Sponsorships help fund the following:

  • Code Signing Certificates: To provide trusted binaries.
  • macOS and iOS Signing Accounts: To distribute signed binaries for macOS and iOS.
  • Continuous Feature Development: Ensuring regular updates and new features.

If you're interested in supporting ConfiChat, please visit our Sponsorship Page or if your company needs a bespoke version with robust enterprise features, Contact Us.