Run state-of-the-art language models locally. Chat with AI using simple slash commands. Zero cloud, zero cost – just pure, home-brewed AI magic.
Updated Jul 2, 2024 - Python
The Customer Support Ticket Classification and Response System combines advanced AI models with RAG to automate and elevate ticket categorisation and response generation. By combining multi-model integration, sentiment analysis, urgency detection, and vector-based retrieval, it delivers precise, context-aware responses and actionable insights.
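A minimal sketch of the urgency-detection step such a ticket system might use. The keyword list and function names here are illustrative assumptions, not the repository's actual rules.

```python
# Toy urgency detector: flag a ticket "high" if it contains any
# keyword from an (assumed, illustrative) urgency vocabulary.
URGENT_WORDS = {"immediately", "urgent", "asap", "outage"}

def detect_urgency(ticket_text: str) -> str:
    """Return "high" if the ticket mentions an urgent keyword, else "normal"."""
    words = set(ticket_text.lower().split())
    return "high" if words & URGENT_WORDS else "normal"
```

A real system would likely use a classifier or an LLM prompt rather than keywords, but the interface is the same: ticket text in, priority label out.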
Local AI Open Orca For Dummies is a user-friendly guide to running Large Language Models locally. Simplify your AI journey with easy-to-follow instructions and minimal setup. Perfect for developers tired of complex processes!
A constrained generation filter for local LLMs that makes them quote properly from a source document
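The core check behind such a filter can be sketched in a few lines: verify that every quoted span in the model's output appears verbatim in the source document. This is a hedged illustration of the idea, not the repository's implementation.

```python
import re

def quotes_are_verbatim(output: str, source: str) -> bool:
    """Return True if every "..." span in `output` occurs verbatim in `source`."""
    return all(q in source for q in re.findall(r'"([^"]+)"', output))
```

A constrained-generation filter would apply this kind of check token by token during decoding, rejecting continuations that break a quote; the post-hoc version above shows only the acceptance criterion.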
Automate the batching and execution of prompts.
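Prompt batching in this style reduces to chunking a prompt list and running each chunk. A minimal sketch, where `run_model` is a hypothetical stand-in for whatever local-LLM call the repository wraps:

```python
def batched(items, batch_size):
    """Yield successive fixed-size chunks from a list of prompts."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

def run_prompts(prompts, run_model, batch_size=4):
    """Execute prompts batch by batch, preserving input order."""
    results = []
    for batch in batched(prompts, batch_size):
        results.extend(run_model(p) for p in batch)
    return results
```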
Implemented vector similarity algorithms to understand their inner workings, using local embedding models.
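Cosine similarity is the standard measure behind such from-scratch implementations; a self-contained version, assuming dense vectors as plain Python lists:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0  # define similarity with a zero vector as 0
    return dot / (norm_a * norm_b)
```

Identical directions score 1.0 and orthogonal vectors score 0.0, which is why cosine (rather than raw dot product) is the usual choice when embedding magnitudes vary.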
A Python-based documentation assistant that uses local LLMs to crawl websites, process content, and provide intelligent Q&A capabilities with source citations.
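The "answers with source citations" part of such an assistant can be sketched as a small formatting step; the function name and layout below are illustrative assumptions, not the project's API:

```python
def answer_with_citations(answer: str, sources: list[str]) -> str:
    """Append a numbered source list to an answer, if any sources were used."""
    if not sources:
        return answer
    cites = "".join(f"\n[{i + 1}] {url}" for i, url in enumerate(sources))
    return answer + "\n\nSources:" + cites
```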
SparkOllama is your go-to ollama web-UI interface powered by Streamlit for seamless, AI-powered conversations. Built on Ollama's advanced LLM, it delivers smart, dynamic responses effortlessly. Whether for work or fun, SparkOllama makes engaging with AI simple and intuitive.
I'll be your machinery.
50-line local LLM assistant in Python with Streamlit and GPT4All
GPT powered rubber duck debugger as CS50 2023 final project.
This application uses Streamlit to create an interactive web interface for chatting with a locally hosted Llama 3 language model.
CrewAI Local LLM is a GitHub repository for a locally hosted large language model (LLM) designed to enable private, offline AI model usage and experimentation.