Local‑first AI assistant that lives on your PC and helps automate everyday tasks.
Ambient is a desktop app that runs AI models locally via llama.cpp, with an optional cloud fallback. Think of it as a lightweight “brain for your computer”: chat with it, give it context from your screen, and let it help with routine tasks. It runs efficiently in the background and prioritizes privacy and speed by default.
Built with Tauri (Rust + Next.js).
- Local‑first inference with llama.cpp
  - Ships with a built‑in model downloader
- Floating chat window
  - Chat locally with Qwen3VL-2B
  - Optional Gemini 3 Flash and Gemini 3 Pro
- Gemini-powered computer use
  - Let Gemini take control of your computer to complete tasks on your behalf
- Screen region context with OCR
  - Snipping‑tool‑style region capture
  - Extracts text with OCR and injects it into the conversation
- Runs in the background, designed to be helpful without getting in your way
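The OCR injection step above can be sketched as a small pure function, assuming an OpenAI-style message array; the helper name `withScreenContext` and the message wording are illustrative, not Ambient's actual API:

```typescript
// Hypothetical sketch: fold an OCR'd screen snippet into the chat history
// as a system-style context message before the user's next turn.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

export function withScreenContext(
  history: ChatMessage[],
  ocrText: string,
): ChatMessage[] {
  const trimmed = ocrText.trim();
  if (trimmed.length === 0) return history; // nothing readable in the region
  return [
    ...history,
    {
      role: "system",
      content: `Text captured from the user's screen:\n${trimmed}`,
    },
  ];
}
```

Keeping this a pure transform means the same chat pipeline works whether the context came from OCR, a file, or nothing at all.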
- The Tauri/Rust backend manages windowing, screen capture, OCR, and model orchestration
- llama.cpp runs locally; the app communicates with it through a local HTTP server
- When you choose, Gemini can be used as the model
- Supabase manages auth, cloud user data, and sessions
- All local artifacts (models, database, caches) never leave your computer
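The local inference path above can be sketched as a request to llama.cpp's OpenAI-compatible server (`llama-server` exposes `/v1/chat/completions`). The port, model name, and default temperature below are assumptions for illustration, not Ambient's actual configuration:

```typescript
// Sketch of the local inference path: the frontend POSTs to llama-server's
// OpenAI-compatible endpoint on localhost.
type Msg = { role: "system" | "user" | "assistant"; content: string };

const LLAMA_BASE_URL = "http://127.0.0.1:8080"; // assumed llama-server port

export function buildChatRequest(messages: Msg[], temperature = 0.7) {
  return {
    url: `${LLAMA_BASE_URL}/v1/chat/completions`,
    body: {
      model: "qwen3vl-2b", // whichever model llama-server was started with
      messages,
      temperature,
      stream: false,
    },
  };
}

// Usage (not run here): send the request and read the first choice.
export async function chat(messages: Msg[]): Promise<string> {
  const { url, body } = buildChatRequest(messages);
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Because the wire format is OpenAI-compatible, the same request builder works unchanged when the cloud fallback is selected; only the base URL and auth differ.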
- Proactive assistance: draft emails or messages based on detected screen context
- Scheduling: suggest/create calendar events from chats or on‑screen cues
- Rich automations: configurable actions triggered by screen activity
- Cross‑platform: macOS and Linux support
## Prerequisites
- Rust toolchain (MSVC)
- Node.js LTS and pnpm
## Steps
- Clone the repo and install dependencies
- Create and fill in `app/src-tauri/.env` (see `.env.example`)
- Start the app in development mode with `pnpm run tauri dev`
## Windows PowerShell

```powershell
cd .\app
pnpm install
copy .\src-tauri\.env.example .\src-tauri\.env
# Edit .\src-tauri\.env to add your keys if needed
pnpm run tauri dev
```

- Local models via llama.cpp
- A built‑in downloader fetches models
- Default: Qwen3VL-2B
- Optional Gemini
- Region selection opens a snipping‑style overlay; the selected area is OCR’d and added to the chat context
- OCR is powered by the Rust crate `ocrs`
- Local‑first by design: inference and screen processing happen on your machine
- Optional cloud access through the Gemini API is opt‑in
- Artifacts (models, logs, OCR snippets) are stored locally
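The snipping-style overlay described above has one small piece of geometry worth spelling out: a drag can go in any direction, so the two corners must be normalized into a positive-size rectangle and clamped to the screen before capture and OCR. A minimal sketch (type and function names are hypothetical):

```typescript
// Hypothetical sketch of the overlay's selection math.
type Drag = { x1: number; y1: number; x2: number; y2: number };
type Region = { x: number; y: number; width: number; height: number };

export function normalizeRegion(d: Drag, screenW: number, screenH: number): Region {
  const clamp = (v: number, max: number) => Math.min(Math.max(v, 0), max);
  // Take the top-left-most corner regardless of drag direction,
  // then clamp both corners to the screen bounds.
  const x = clamp(Math.min(d.x1, d.x2), screenW);
  const y = clamp(Math.min(d.y1, d.y2), screenH);
  const width = clamp(Math.max(d.x1, d.x2), screenW) - x;
  const height = clamp(Math.max(d.y1, d.y2), screenH) - y;
  return { x, y, width, height };
}
```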
- `app/` – Tauri app with Next.js frontend and Rust backend
- `app/src-tauri/` – Tauri config, Rust commands, binaries, and environment
- `cloudflare/` – Cloudflare worker for Gemini completions
- `ml/` – experiments and training scripts