The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but 100% free.
Updated Mar 7, 2025 - TypeScript
Free large language model (LLM) support for Neovim: provides commands to interact with LLMs (such as ChatGPT, ChatGLM, Kimi, DeepSeek, OpenRouter, and local models). Also supports GitHub Models.
A single-file tkinter-based Ollama GUI project with no external dependencies.
TalkNexus: Ollama Chatbot Multi-Model & RAG Interface
Chat with your PDF using your local LLM via an Ollama client. (incomplete)
Ollama with Let's Encrypt Using Docker Compose
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support planned for the next version.
PuPu is a lightweight tool that makes it easy to run AI models on your own device. Designed for smooth performance and ease of use, PuPu is perfect for anyone who wants quick access to AI without technical complexity.
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
Streamlit Chatbot using Ollama Open Source LLMs
Ollama client for Android
"A simple and lightweight client-server program for interfacing with local LLMs using ollama, and LLMs in groq using groq api."
A simple but functional chat UI for Ollama. It can easily be added to any web app to provide a floating chat widget backed by Ollama responses.
A web interface for Ollama, providing a user-friendly way to interact with local language models.
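Most of the clients and web UIs listed above wrap the same underlying call: a POST to a local Ollama server's `/api/chat` endpoint. A minimal sketch of that request, assuming an Ollama server running at the default `localhost:11434` and a locally pulled model such as `llama3` (both are assumptions, not details from the listings above):

```python
import json
from urllib import request, error

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint


def build_chat_payload(model, user_message):
    """Build the JSON body Ollama's /api/chat endpoint expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,  # request one complete JSON reply instead of a stream
    }


def chat(model, user_message):
    """Send a single chat turn to a local Ollama server and return the reply text."""
    body = json.dumps(build_chat_payload(model, user_message)).encode()
    req = request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]


if __name__ == "__main__":
    try:
        print(chat("llama3", "Hello!"))
    except (error.URLError, OSError):
        # No server running; the payload above is still the shape these UIs send.
        print("No Ollama server reachable at localhost:11434")
```

Chat history, multi-model chat, and similar features in the UIs above amount to managing the `messages` list (appending prior turns) and varying the `model` field per request.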