A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.
A fun project using Ollama, Streamlit & PyShark to chat with PCAP/PCAPNG files locally and privately!
A unified client for seamless interaction with multiple AI providers.
A command line utility that queries websites for answers using a local LLM
Elia+ 👉 An experimental, snappy, and keyboard-centric UI for interacting with AI agents and augmenting humans using AI! Chat about anything with any agent. ⚡
OllamaChat: a user-friendly GUI for interacting with the llama2 and llama2-uncensored models, hosted locally with Python and KivyMD. Requires the Ollama for Windows install. For more, visit Ollama on GitHub.
CrewAI Local LLM is a GitHub repository for a locally hosted large language model (LLM) designed to enable private, offline AI model usage and experimentation.
A simple CLI tool that streamlines managing AI models from the CivitAI platform. It can list available models, view their details, download selected variants, and remove models from local storage, and it provides a summary of each model's description using Ollama or OpenAI.
Desktop UI for Ollama made with PyQt
Automatically installs and runs a public API service for Ollama with any model from the Ollama library.
llamachan is a project that realises the "dead internet" idea for an imageboard.
A client library that makes it easy to connect microcontrollers running MicroPython to the Ollama server
A simple Ollama client built with Python and Streamlit.
A DBMS project with a Streamlit frontend for stock management simulation with backtesting.
Chat with a local LLM about your PDF and text documents, privacy ensured [llamaindex and llama3]
A frontend application for the RAG demo.
This project is a simple yet powerful chatbot that leverages Gradio for the user interface, NLTK for natural language processing, SentenceTransformers for retrieval-based information extraction, and Ollama for advanced language model integration. It interacts with a locally hosted API to generate responses, making it suitable for various use cases.