Chat with local LLM about your PDF and text documents, privacy ensured [llamaindex and llama3]
macOS app for interacting with local LLMs, currently Ollama. Embeds a PyInstaller binary into an unsigned macOS app.
📜 A quest will be assigned to you by an LLM.
A DBMS project with Streamlit Frontend for stock management simulation with Backtesting.
Desktop UI for Ollama made with PyQt
A client library that makes it easy to connect microcontrollers running MicroPython to the Ollama server
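A client library like this would typically talk to Ollama's HTTP API (POST to /api/generate). A minimal sketch, assuming a LAN address for the Ollama host; the helper name and address are illustrative, not from the library itself:

```python
import json

OLLAMA_URL = "http://192.168.1.10:11434/api/generate"  # assumed address of the Ollama host

def build_generate_request(model, prompt, stream=False):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

# On a microcontroller running MicroPython you would POST this with urequests:
#   import urequests
#   resp = urequests.post(OLLAMA_URL,
#                         data=build_generate_request("llama3", "Hello"),
#                         headers={"Content-Type": "application/json"})
#   print(resp.json()["response"])
```

Setting `stream` to false keeps the response to a single JSON object, which is easier to parse on a memory-constrained device than Ollama's default streaming output.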
OllamaChat: A user-friendly GUI to interact with the llama2 and llama2-uncensored AI models. Host them locally with Python and KivyMD; requires Ollama for Windows. For more, visit Ollama on GitHub
llamachan is a project that realises the "dead internet" idea as an imageboard
CrewAI Local LLM is a repository for hosting a large language model (LLM) locally, enabling private, offline AI usage and experimentation.
A command line utility that queries websites for answers using a local LLM
Elia+ 👉 An experimental, snappy, keyboard-centric UI for interacting with AI agents and augmenting humans with AI! Chat about anything with any agent. ⚡
A fun project using Ollama, Streamlit & PyShark to chat with PCAP/PCAPNG files locally and privately!
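The core of such a tool is turning packet metadata into a text prompt the LLM can reason about. A hypothetical sketch (the helper name and packet fields are illustrative; the actual project's internals may differ):

```python
def packets_to_prompt(packets, question):
    """Flatten packet summaries into a single prompt for a local LLM."""
    lines = [
        f"{p['src']} -> {p['dst']} {p['protocol']} len={p['length']}"
        for p in packets
    ]
    return "Packet capture:\n" + "\n".join(lines) + f"\n\nQuestion: {question}"

# With PyShark you would first read packets from a capture file:
#   import pyshark
#   cap = pyshark.FileCapture("trace.pcapng")
# then send packets_to_prompt(...) to Ollama, e.g. via its /api/chat endpoint.
```

Summarising each packet to one line keeps the prompt small enough to fit a local model's context window even for sizeable captures.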
A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.