An Offline Document Enquiry LLM for Everyone
Updated Jul 25, 2023 - Python
GPT-powered rubber duck debugger, built as a CS50 2023 final project.
A local-LLM-assisted PowerPoint (PPT) generation tool
50-line local LLM assistant in Python with Streamlit and GPT4All
Local AI Open Orca For Dummies is a user-friendly guide to running Large Language Models locally. Simplify your AI journey with easy-to-follow instructions and minimal setup. Perfect for developers tired of complex processes!
OpenAI-style, fast & lightweight local language model inference with documents
Data Whisperer is a chatbot that allows you to analyze your local files using Large Language Models (LLMs). You can upload PDFs, Markdown, Text, and Document files to the chatbot and then ask questions related to the content of these files.
Infinite Craft, but in PySide6 and Python, with a local LLM (llama3 and others) using Ollama
Implemented vector similarity algorithms to understand their inner workings, using local embedding models
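The repository above implements similarity search from scratch; the code itself isn't shown here, but the core of such a project is usually cosine similarity over embedding vectors. A minimal sketch (the vectors and document names below are illustrative, not from the repository):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Rank candidate document embeddings against a query embedding.
query = [0.9, 0.1, 0.0]
candidates = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.0, 1.0, 0.0],
}
ranked = sorted(candidates, key=lambda k: cosine_similarity(query, candidates[k]), reverse=True)
```

In a real setup the vectors would come from a local embedding model rather than being hard-coded, and the ranking would typically be accelerated with a vector index instead of a full sort.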
A constrained generation filter for local LLMs that makes them quote properly from a source document
Huginn Hears is a local app that transcribes and summarizes your meetings in Norwegian and English, using state-of-the-art models and open-source libraries. No cloud needed, run everything offline.
CrewAI Local LLM is a GitHub repository for a locally hosted large language model (LLM) designed to enable private, offline AI model usage and experimentation.
A comprehensive AI companion leveraging advanced semantic analysis, sentiment detection, and voice processing to provide personalized and context-aware interactions using Autogen, semantic-router, and VoiceProcessingToolkit.
Your fully proficient, AI-powered and local chatbot assistant🤖
Automate the batching and execution of prompts.
'Local Large Language RAG Application', an application for interfacing with a local RAG LLM.