llama-cpp
Here are 27 public repositories matching this topic...
A Genshin Impact Question Answer Project supported by Qwen1.5-14B-Chat (Updated Jun 4, 2024 - Python)
Auto-complete anything using a GGUF model (Updated Dec 4, 2023 - Python)
Lightweight implementation of the OpenAI API on top of local models (Updated Dec 18, 2023 - Python)
AgentX is an open-source library that helps people run LLMs on their own computers, or serve LLMs as easily as possible, with support for multiple backends including PyTorch, llama.cpp, Ollama, and EasyDeL (Updated May 24, 2024 - Python)
A YouTube API integration with Meta's Llama 2 to analyze comments and sentiment (Updated Dec 5, 2023 - Python)
LLM content classification with only prompt engineering (Updated Mar 31, 2024 - Python)
Transcribes videos and describes them with OpenAI APIs or local models (Updated Aug 2, 2023 - Python)
Interactive Python chatbot powered by the `llama_cpp` library for text generation, offering customizable responses based on a user-provided GGUF model file (Updated May 17, 2024 - Python)
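The `llama_cpp`-powered chatbot pattern that entry describes can be sketched in a few lines. This is a minimal illustration, not code from any listed repository: the model path, context size, and system prompt are placeholder assumptions, and it requires the llama-cpp-python package plus a local GGUF model file.

```python
# Minimal sketch of a llama-cpp-python chat loop (hypothetical example).
# Assumes: pip install llama-cpp-python, and a GGUF model file on disk.

def build_chat(history, user_msg):
    """Append a user turn to an OpenAI-style message list."""
    return history + [{"role": "user", "content": user_msg}]

def chat_loop(model_path="model.gguf"):  # placeholder path, supply your own
    # Import deferred so the sketch can be read/loaded without the library.
    from llama_cpp import Llama

    llm = Llama(model_path=model_path, n_ctx=2048)
    history = [{"role": "system", "content": "You are a helpful assistant."}]
    while True:
        user_msg = input("> ")
        if user_msg.strip().lower() in {"exit", "quit"}:
            break
        history = build_chat(history, user_msg)
        reply = llm.create_chat_completion(messages=history)
        text = reply["choices"][0]["message"]["content"]
        print(text)
        history.append({"role": "assistant", "content": text})
```

Keeping the full message history and resending it each turn is what makes the responses conversational; the GGUF file supplies both the weights and the tokenizer.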
Genshin Impact character chat models tuned with LoRA on top of LLMs (Updated Jun 3, 2024 - Python)
A simple AI chat using FastAPI, LangChain, and llama.cpp (Updated Sep 19, 2023 - Python)
Email Auto-ReplAI is a Python tool that uses AI to automate drafting responses to unread Gmail messages, streamlining email management tasks (Updated Aug 1, 2023 - Python)
✨ Your Custom Offline Role Play with LLM and Stable Diffusion on Mac and Linux (for now) 🧙‍♂️ (Updated Nov 21, 2023 - Python)
Local character AI chatbot with a Chroma vector store for memory, plus scripts to process documents for Chroma (Updated Jul 22, 2024 - Python)
BabyAGI-🦙: enhanced for Llama models (running 100% locally) with persistent memory, smart internet search based on BabyCatAGI, and LangChain document embedding based on privateGPT (Updated Jun 4, 2023 - Python)