LocalAI

Lauri Ojansivu edited this page Mar 25, 2026 · 8 revisions

Ensu – Ente’s Local LLM app

MacBook M5 Pro and Qwen3.5 = Local AI Security System

Unsloth Studio

  • https://unsloth.ai/docs/new/studio
  • https://news.ycombinator.com/item?id=47414032
  • An open-source, no-code web UI for training, running, and exporting open models in one unified local interface.
  • Run GGUF and safetensors models locally on macOS, Windows, and Linux.
  • Train 500+ models 2x faster with 70% less VRAM (no accuracy loss).
  • Run and train text, vision, TTS audio, and embedding models.
  • macOS and CPU work for chat GGUF inference; MLX training is coming soon.
  • No dataset needed: auto-create datasets from PDF, CSV, JSON, DOCX, and TXT files.
  • Export or save your model to GGUF, 16-bit safetensors, etc.
  • Self-healing tool calling, web search, and code execution.
  • Auto inference parameter tuning and chat template editing.
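Unsloth Studio's auto-dataset feature turns plain files into training data. As a rough illustration of that idea (a toy sketch, not Unsloth's actual code), converting a CSV of question/answer pairs into chat-format JSONL might look like:

```python
import csv
import io
import json

def csv_to_chat_jsonl(csv_text: str) -> str:
    """Convert a CSV with 'question' and 'answer' columns into
    chat-format JSONL: one {"messages": [...]} record per row.
    (A toy sketch of auto-dataset creation, not Unsloth's code.)"""
    lines = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        record = {"messages": [
            {"role": "user", "content": row["question"]},
            {"role": "assistant", "content": row["answer"]},
        ]}
        lines.append(json.dumps(record))
    return "\n".join(lines)

sample = "question,answer\nWhat is 2+2?,4\nCapital of Finland?,Helsinki\n"
print(csv_to_chat_jsonl(sample))
```

Each JSONL line is then directly usable as one supervised fine-tuning example.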

LM Studio - Local AI on your computer

Show HN: LocalGPT – A local-first AI assistant in Rust with persistent memory

Waypoint 1.1, a local-first world model for interactive simulation

I want everything local – Building my offline AI workspace

Sourcebot is a self-hosted tool that helps you understand your codebase

OpenWebUI: Local AI FOSS web interface

Ask HN: What's Your Useful Local LLM Stack?

Show HN: Onit – Source-available ChatGPT Desktop with local mode, Claude, Gemini

SupGen

  • SupGen is a generative coding AI... except it isn't an AI.
  • There is no model and no pre-training: you give it some examples, and it gives you a program. It runs locally on a single CPU core. It can also prove theorems.
  • Demo, including a brief TT intro (the synthesis examples start at the 6-minute mark): https://x.com/VictorTaelin/status/1881392823246729640
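SupGen itself is closed, but the underlying idea of synthesizing a program from examples rather than from a trained model can be sketched as an enumerative search over a tiny arithmetic DSL. This is a hypothetical toy and says nothing about SupGen's internals:

```python
from itertools import product

# Tiny DSL: programs are pipelines of primitive int -> int functions.
PRIMITIVES = {
    "inc": lambda x: x + 1,
    "dec": lambda x: x - 1,
    "double": lambda x: x * 2,
    "square": lambda x: x * x,
}

def synthesize(examples, max_depth=3):
    """Enumerate pipelines of primitives, shortest first, and return the
    first one consistent with every (input, output) example."""
    for depth in range(1, max_depth + 1):
        for names in product(PRIMITIVES, repeat=depth):
            def run(x, names=names):
                for name in names:
                    x = PRIMITIVES[name](x)
                return x
            if all(run(i) == o for i, o in examples):
                return names
    return None

# From two examples, recover "double then inc", i.e. f(x) = 2x + 1.
print(synthesize([(1, 3), (4, 9)]))  # ('double', 'inc')
```

Real synthesizers prune this search with types and proofs instead of brute force, which is where the theorem-proving connection comes in.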

Run Llama locally with only PyTorch on CPU
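The linked post runs Llama with nothing but PyTorch on CPU. The core building block of any such port is causal self-attention; a minimal single-head sketch on CPU tensors (an illustration under assumed shapes, not the post's code) might look like:

```python
import math
import torch

def causal_self_attention(x, w_q, w_k, w_v):
    """Single-head causal self-attention on CPU tensors.
    x: (seq_len, d_model); each weight: (d_model, d_model)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = (q @ k.T) / math.sqrt(x.shape[-1])
    # Causal mask: each position may only attend to itself and the past.
    mask = torch.triu(torch.ones_like(scores, dtype=torch.bool), diagonal=1)
    scores = scores.masked_fill(mask, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v

torch.manual_seed(0)
d = 8
x = torch.randn(5, d)
out = causal_self_attention(x, torch.randn(d, d),
                            torch.randn(d, d), torch.randn(d, d))
print(out.shape)  # torch.Size([5, 8])
```

A full Llama adds multi-head attention, RoPE, RMSNorm, and SwiGLU MLPs around this kernel, but everything stays plain tensor ops, so it runs on CPU unchanged.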