Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLMs supporting function calling capabilities. ✨
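To illustrate what "function calling" means in this context, here is a minimal TypeScript sketch of an OpenAI-style tool definition sent along with a chat request. The endpoint URL, model name, and `get_weather` tool are illustrative assumptions, not Dive's actual configuration.

```typescript
// Minimal sketch of an OpenAI-style function-calling (tool-calling) request.
// The endpoint, model, and tool below are examples only.
const tools = [
  {
    type: "function",
    function: {
      name: "get_weather", // hypothetical tool exposed to the model
      description: "Get the current weather for a city",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  },
];

async function chatWithTools(prompt: string) {
  const res = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1", // any model that supports tool calls
      messages: [{ role: "user", content: prompt }],
      tools,
    }),
  });
  const data = await res.json();
  // If the model decides to call a tool, the tool call (name + arguments)
  // arrives in this message instead of plain text.
  return data.choices?.[0]?.message;
}

chatWithTools("What's the weather in Berlin?").then(console.log);
```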
Minimalistic UI for Ollama LLMs - This powerful React interface for LLMs drastically improves the chatbot experience and works offline.
LLMX: Easiest 3rd-party local LLM UI for the web!
A single-file tkinter-based Ollama GUI project with no external dependencies.
A UI client for Ollama written in Compose Multiplatform
Odin Runes, a Java-based GPT client, lets you interact with your preferred GPT model right from your favorite text editor. It also aids prompt engineering by extracting context from diverse sources using technologies such as OCR, enhancing overall productivity and saving costs.
Simpler than simple: run LLMs on your computer effortlessly with Ollama, no GPU required.
Simple HTML Ollama chatbot that is easy to install: just copy the HTML file to your computer and open it.
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support coming in the next version.
BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.
LocalAPI.AI is a local AI management tool for Ollama, offering Web UI management and compatibility with vLLM, LM Studio, llama.cpp, Mozilla Llamafile, Jan AI, Cortex API, Local-LLM, LiteLLM, GPT4All, and more.
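Most of the backends listed above expose OpenAI-compatible HTTP endpoints, which is presumably how a single management UI can target all of them. A rough TypeScript sketch, assuming common default ports (verify them for your setup):

```typescript
// Sketch: one client, many backends. The servers above typically expose
// OpenAI-compatible endpoints, so only the base URL changes.
// Ports are common defaults and may differ in your installation.
const backends = {
  ollama: "http://localhost:11434/v1",
  lmStudio: "http://localhost:1234/v1",
  vllm: "http://localhost:8000/v1",
  llamaCpp: "http://localhost:8080/v1",
};

async function listModels(baseUrl: string): Promise<string[]> {
  const res = await fetch(`${baseUrl}/models`); // GET /v1/models
  const data = await res.json();
  return data.data.map((m: { id: string }) => m.id);
}

listModels(backends.ollama).then((ids) => console.log("Available models:", ids));
```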
Ollama with Let's Encrypt Using Docker Compose
PuPu is a lightweight tool that makes it easy to run AI models on your own device. Designed for smooth performance and ease of use, PuPu is perfect for anyone who wants quick access to AI without technical complexity.
Full featured demo application for OllamaSharp
Transform your writing with TextLLaMA! ✍️🚀 Simplify grammar, translate effortlessly, and compose emails like a pro. 🌍📧
A Chrome extension that hosts an Ollama web UI, connecting to localhost or other servers, and helps you manage models and chat with any open-source model. 🚀💻✨
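As a rough illustration of how such browser-based UIs typically reach a local Ollama instance, here is a TypeScript sketch against Ollama's `/api/chat` endpoint. The model name is an example, and a page served from another origin may need the `OLLAMA_ORIGINS` environment variable configured on the server.

```typescript
// Sketch of how a browser-based UI talks to a local Ollama server.
// Assumes Ollama is running on its default port 11434; the model is an example.
async function chat(message: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: message }],
      stream: false, // set to true to stream tokens for a live chat UI
    }),
  });
  const data = await res.json();
  return data.message.content; // assistant reply text
}

chat("Hello!").then(console.log);
```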