A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.
✨ AI interface for tinkerers (Ollama, Haystack RAG, Python)
A native macOS app for chatting with local LLMs
Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLM that supports function calling. ✨
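To make "function calling" concrete, here is a minimal sketch of the pattern against a plain local Ollama server. It is an illustration only, not Dive's code: the weather tool, its schema, and the model name are invented, and it assumes the official `ollama` Python package (0.4+) with a tool-capable model such as llama3.1 already pulled.

```python
# Hedged sketch of LLM function calling via a local Ollama server.
# Assumes: `pip install ollama` (0.4+) and `ollama pull llama3.1`.
# The weather tool and its JSON schema are made up for illustration.
import ollama

def get_current_weather(city: str) -> str:
    """Dummy tool the model may decide to call."""
    return f"It is sunny and 22 C in {city}."

response = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
)

# The model emits tool_calls when it decides a tool is needed; the host
# application executes them and can feed the results back in a follow-up turn.
for call in response.message.tool_calls or []:
    if call.function.name == "get_current_weather":
        print(get_current_weather(**call.function.arguments))
```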
Ollama client for Swift
LLMX: the easiest third-party local LLM UI for the web!
A lightning-fast, cross-platform AI chat application built with React Native.
Ollama load-balancing server | A high-performance, easy-to-configure open-source load balancer that optimizes Ollama workloads. It helps improve application availability and response times while ensuring efficient use of system resources.
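The project itself is a standalone server, so the snippet below is only a sketch of the underlying idea: rotate requests across several Ollama backends in round-robin fashion. The backend URLs and model name are assumptions, and it uses the `requests` package for plain, non-streaming /api/generate calls.

```python
# Illustrative round-robin dispatch across multiple Ollama backends
# (not this project's code). Assumes two reachable Ollama servers and
# a model named "llama3" available on both.
import itertools
import requests

BACKENDS = itertools.cycle([
    "http://10.0.0.1:11434",
    "http://10.0.0.2:11434",
])

def generate(prompt: str, model: str = "llama3") -> str:
    """Send a non-streaming generate request to the next backend in rotation."""
    base = next(BACKENDS)
    resp = requests.post(
        f"{base}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

print(generate("Why is the sky blue?"))
```

A production balancer would also track backend health and load; the cycle above only spreads requests evenly.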
MyOllama: Ollama-based LLM mobile client
A UI client for Ollama written in Compose Multiplatform, focused on running DeepSeek R1 locally
ThunderAI is a Thunderbird add-on that uses the capabilities of ChatGPT, Gemini, or Ollama to enhance email management.
Odin Runes, a Java-based GPT client, lets you interact with your preferred GPT model right from your favorite text editor. It also facilitates prompt engineering by extracting context from diverse sources using technologies such as OCR, which boosts productivity and saves costs.
A fun project using Ollama, Streamlit & PyShark to chat with PCAP/PCAPNG files locally and privately!
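A rough sketch of how such a pipeline can fit together, assuming the `pyshark` and `ollama` packages (PyShark needs tshark installed) and a locally pulled model; the file name, model, and 200-packet cap are placeholders, and the real project adds a Streamlit UI on top:

```python
# Hedged sketch: summarize a capture with PyShark, then ask a local
# Ollama model about it. Not this project's code; names are placeholders.
import pyshark
import ollama

capture = pyshark.FileCapture("traffic.pcapng")
lines = []
for pkt in capture:
    # One line per packet: timestamp, top protocol layer, and size.
    lines.append(f"{pkt.sniff_time} {pkt.highest_layer} {pkt.length} bytes")
    if len(lines) >= 200:  # keep the prompt small
        break
capture.close()

answer = ollama.chat(
    model="llama3",
    messages=[{
        "role": "user",
        "content": "Here are packet summaries from a capture:\n"
                   + "\n".join(lines)
                   + "\n\nWhich protocols and hosts stand out?",
    }],
)
print(answer.message.content)
```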
Simpler than simple: effortlessly run LLMs on your computer via Ollama, no GPU required.
Locally own and run your LLM - easy, simple, lightweight