Private & local AI personal knowledge management app.
Updated Jun 8, 2024 - TypeScript
Empower Your Productivity with Local AI Assistants
The TypeScript library for building AI applications.
Wingman is the fastest and easiest way to run Llama models on your PC or Mac.
The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
VSCode AI coding assistant powered by self-hosted llama.cpp endpoint.
Control what LLMs can, and can't, say
LocalChat is a ChatGPT-like chat that runs on your computer
A JavaScript library (with TypeScript types) to parse metadata of GGML-based GGUF files.
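Parsing GGUF metadata starts with the file's fixed-size header. As a minimal sketch of that first step (not this library's actual API — `parseGgufHeader` and the field names are illustrative, though the layout follows the published GGUF spec: a 4-byte magic, a uint32 version, then uint64 tensor and metadata counts, all little-endian):

```typescript
// Hedged sketch: read the fixed GGUF header fields from a buffer.
interface GgufHeader {
  version: number;       // uint32 format version
  tensorCount: bigint;   // uint64 number of tensors
  metadataKvCount: bigint; // uint64 number of metadata key/value pairs
}

function parseGgufHeader(buf: ArrayBuffer): GgufHeader {
  const view = new DataView(buf);
  // First 4 bytes must spell "GGUF" (0x46554747 when read little-endian).
  if (view.getUint32(0, true) !== 0x46554747) {
    throw new Error("not a GGUF file");
  }
  return {
    version: view.getUint32(4, true),
    tensorCount: view.getBigUint64(8, true),
    metadataKvCount: view.getBigUint64(16, true),
  };
}

// Build a synthetic 24-byte header to demonstrate the parser.
const demo = new ArrayBuffer(24);
const w = new DataView(demo);
w.setUint32(0, 0x46554747, true); // "GGUF" magic
w.setUint32(4, 3, true);          // version 3
w.setBigUint64(8, 201n, true);    // 201 tensors
w.setBigUint64(16, 19n, true);    // 19 metadata key/value pairs

const header = parseGgufHeader(demo);
console.log(header.version, header.tensorCount, header.metadataKvCount);
```

The metadata key/value pairs follow immediately after this header; a full parser would loop `metadataKvCount` times reading typed values.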
A small Next.js app which utilises search results to feed context to the LLM.
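The pattern described above — feeding search results into the LLM's context — can be sketched as a prompt builder. This is an illustrative example, not the app's actual code; `buildPrompt` and the `SearchResult` shape are assumptions:

```typescript
// Hedged sketch: interleave search snippets into the prompt so the
// model can ground its answer in them and cite sources as [n].
interface SearchResult {
  title: string;
  snippet: string;
  url: string;
}

function buildPrompt(question: string, results: SearchResult[]): string {
  const context = results
    .map((r, i) => `[${i + 1}] ${r.title}\n${r.snippet}\n(${r.url})`)
    .join("\n\n");
  return (
    `Answer using only the sources below; cite them as [n].\n\n` +
    `${context}\n\nQuestion: ${question}\nAnswer:`
  );
}
```

The assembled string would then be sent to the model's completion endpoint along with any system instructions.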
Loz is a command-line tool that enables your preferred LLM to execute system commands and utilize Unix pipes, integrating AI capabilities with other Unix tools.
AGI for creators, professionals, entrepreneurs and organizations
Starter examples for using Next.js and the Vercel AI SDK with Llama.cpp and ModelFusion.
Empower your LLM to do more than you ever thought possible with these state-of-the-art prompt templates.
An open source user experience for AI models.
A minimalist Docker project to get started with Node, llama-node and Express. Ready to be used in a Hugging Face Space.