The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, No-code agent builder, and more.
Keep searching, reading webpages, and reasoning until it finds the answer (or exceeds the token budget).
ComfyUI-IF_AI_tools is a set of custom nodes for ComfyUI that allows you to generate prompts using a local Large Language Model (LLM) via Ollama. This tool enables you to enhance your image generation workflow by leveraging the power of language models.
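For a rough sense of how a node like this might call Ollama (the endpoint, model name, and prompt below are assumptions, not taken from the repo), a minimal sketch:

```python
# Minimal sketch: ask a local Ollama model to expand a short idea into an image prompt.
# Assumes Ollama is running on its default port (11434) and a model such as "llama3"
# has already been pulled; adjust both to your setup.
import requests

def generate_image_prompt(idea: str, model: str = "llama3") -> str:
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": model,
            "prompt": f"Write a detailed image-generation prompt for: {idea}",
            "stream": False,
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]

print(generate_image_prompt("a lighthouse in a storm"))
```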
DevoxxGenie is a plugin for IntelliJ IDEA that uses local LLMs (Ollama, LMStudio, GPT4All, Jan and Llama.cpp) and cloud-based LLMs to help review, test, and explain your project code.
Python app for LM Studio-enhanced voice conversations with local LLMs. Uses Whisper for speech-to-text and offers a privacy-focused, accessible interface.
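For context, the speech-to-text step with the open-source whisper package might look like the following minimal sketch; the audio file name and model size are placeholders, not the app's actual code:

```python
# Minimal sketch of the speech-to-text step using the open-source `whisper` package.
# The file name and model size are placeholders; ffmpeg must be installed for decoding.
import whisper

model = whisper.load_model("base")          # small, CPU-friendly model
result = model.transcribe("recording.wav")  # path to a local audio recording
print(result["text"])                       # transcribed text to send to the local LLM
```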
How to run a local server on LM Studio
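LM Studio's local server exposes an OpenAI-compatible endpoint (port 1234 by default), so a minimal client sketch looks like this; the model name and prompt are placeholders for whatever you have loaded:

```python
# Minimal sketch: query LM Studio's local server through its OpenAI-compatible API.
# The base URL is LM Studio's default; the model name is a placeholder.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

completion = client.chat.completions.create(
    model="local-model",
    messages=[{"role": "user", "content": "Say hello from the local server."}],
)
print(completion.choices[0].message.content)
```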
Genius annotations for every page on the web.
RAGLight is a lightweight and modular Python library for implementing Retrieval-Augmented Generation (RAG), Agentic RAG, and RAT (Retrieval-Augmented Thinking).
Prompt, run, edit, and deploy full-stack web applications with multi-LLM support! Extends bolt.new with OpenAI, Google, Together AI, OpenRouter, Ollama, LMStudio, Groq, DeepSeek, and Mistral integrations. Enhanced with chat file uploads, improved file editor UX, and intelligent error handling with auto-fix suggestions.
Read files (PDF/PNG/JPG) with OCR and rename them using AI.
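As a rough sketch of the OCR half of that pipeline (not the repo's code; in practice a language model would propose the new name), using pytesseract:

```python
# Minimal sketch of the OCR step: extract text from an image and derive a filename stub.
# Uses pytesseract (Tesseract must be installed); the path and naming rule are placeholders.
from pathlib import Path

import pytesseract
from PIL import Image

def suggest_name(path: str) -> str:
    text = pytesseract.image_to_string(Image.open(path))
    # Naive stand-in for the "rename using AI" step: slug from the first few words.
    words = text.split()[:5]
    return "_".join(w.lower() for w in words) or Path(path).stem

print(suggest_name("scan.png"))
```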
Meshtastic-AI - Off-grid LM-Studio / Ollama / OpenAI integration and Home Assistant API control for Meshtastic, with custom commands and emergency alerts (SMS, email, and Discord support).
A local RAG LLM with a persistent database to query your PDFs.
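To make the idea concrete, here is a generic sketch of a persistent index over PDF text, not the repository's implementation; the libraries, paths, and collection name are assumptions:

```python
# Generic sketch of a persistent RAG index over PDF text, assuming chromadb and pypdf.
# Paths, collection name, and chunking strategy are illustrative only.
import chromadb
from pypdf import PdfReader

client = chromadb.PersistentClient(path="./rag_db")        # database persists on disk
collection = client.get_or_create_collection("pdf_chunks")

# Index: store non-empty page texts with stable IDs.
reader = PdfReader("document.pdf")
pages = [t for t in (page.extract_text() or "" for page in reader.pages) if t.strip()]
collection.add(documents=pages, ids=[f"page-{i}" for i in range(len(pages))])

# Query: retrieve the most relevant chunks to pass to a local LLM as context.
results = collection.query(
    query_texts=["What does the contract say about renewal?"],
    n_results=3,
)
print(results["documents"][0])
```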