Filling in the missing gaps with LangChain, and creating OO wrappers to simplify some workloads.
Updated May 7, 2023 - TypeScript
Control what LLMs can, and can't, say
A small Next.js app that feeds search results to the LLM as context.
An open source user experience for AI models.
Interactive LLM-based chat application in TypeScript, wrapped with Electron and utilizing Vite + React.
A desktop tool to install the Stable Diffusion web UI and chat with it
AGI for creators, professionals, entrepreneurs and organizations
A JavaScript library (with TypeScript types) to parse metadata of GGML-based GGUF files.
Wingman is the fastest and easiest way to run Llama models on your PC or Mac.
LocalChat is a ChatGPT-like chat that runs on your computer
A minimalist Docker project to get started with Node, llama-node and Express. Ready to be used in a Hugging Face Space.
Obsidian Local LLM is a plugin for Obsidian that provides access to a powerful neural network, allowing users to generate text in a wide range of styles and formats using a local LLM.
Empower your LLM to do more than you ever thought possible with these state-of-the-art prompt templates.
Starter examples for using Next.js and the Vercel AI SDK with Llama.cpp and ModelFusion.
Empower Your Productivity with Local AI Assistants
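Several of the projects above build on the GGUF container format used by Llama.cpp, including the metadata-parsing library listed earlier. As a rough illustration of what such parsing involves, here is a minimal TypeScript sketch that reads just the fixed fields at the start of a GGUF file (magic string, version, tensor count, metadata key/value count, all little-endian per the GGUF specification). The `parseGgufHeader` helper is hypothetical, written for this example, and is not the listed library's actual API.

```typescript
// Hypothetical helper: read the fixed-size GGUF header.
// Per the GGUF spec, a file begins with the ASCII magic "GGUF",
// a uint32 version, a uint64 tensor count, and a uint64 metadata
// key/value count, all little-endian.
interface GgufHeader {
  version: number;
  tensorCount: bigint;
  metadataKvCount: bigint;
}

function parseGgufHeader(buf: Buffer): GgufHeader {
  if (buf.toString("ascii", 0, 4) !== "GGUF") {
    throw new Error("Not a GGUF file");
  }
  return {
    version: buf.readUInt32LE(4),
    tensorCount: buf.readBigUInt64LE(8),
    metadataKvCount: buf.readBigUInt64LE(16),
  };
}

// Example: build a synthetic 24-byte header and parse it back.
const demo = Buffer.alloc(24);
demo.write("GGUF", 0, "ascii");
demo.writeUInt32LE(3, 4);        // version 3
demo.writeBigUInt64LE(291n, 8);  // tensor count
demo.writeBigUInt64LE(19n, 16);  // metadata key/value count
console.log(parseGgufHeader(demo));
```

The real format continues after these 24 bytes with typed metadata key/value pairs and tensor descriptors, which is where a full parser does most of its work.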