A local-first personal AI agent that runs on your machine. No cloud, no account required.
- Runs AI agents locally - Your data stays on your device
- Connects to Ollama - Uses local AI models (Llama, Mistral, etc.)
- MCP support - Works with 80+ tools (filesystem, Git, Slack, etc.)
- Desktop app - Clean UI for managing agents
- Persistent memory - Remembers conversations
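Agents are described by TOML files (passed to the CLI as, e.g., `my-agent.toml`). A minimal sketch — the field names here are illustrative assumptions, not the project's actual schema:

```toml
# Hypothetical agent definition; consult the project docs for the real schema.
name = "my-agent"
model = "llama3"        # any model served by your local Ollama

[memory]
persistent = true       # keep conversation history across runs
```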
Start the backend server:

```sh
cargo run --bin agent1 -- server --bind 127.0.0.1:17371
```

Then, in a second terminal, launch the desktop app:

```sh
cd desktop
npm install
npm run tauri:dev
```

| Command | What It Does |
|---|---|
| `agent1 run --agent my-agent.toml --task "hello"` | Run an agent |
| `agent1 server` | Start the API server |
| `agent1 models` | See available AI models |
| `agent1 memory write "note"` | Save a note |
| Endpoint | Description |
|---|---|
| `GET /api/agents` | List your agents |
| `POST /api/sessions/run` | Run a task |
| `GET /api/sessions/{id}/stream` | Stream progress |
| `GET /ws/events` | Real-time events |
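For scripting against the HTTP API, a run request can be built like this. The endpoint path comes from the table above; the JSON field names are assumptions — check the actual API schema:

```python
import json
from urllib import request

BASE_URL = "http://127.0.0.1:17371"  # default bind from the quick start

def build_run_request(agent: str, task: str, base: str = BASE_URL) -> request.Request:
    """Build a POST to /api/sessions/run. Body field names are assumed."""
    body = json.dumps({"agent": agent, "task": task}).encode("utf-8")
    return request.Request(
        f"{base}/api/sessions/run",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Send with: response = request.urlopen(build_run_request("my-agent.toml", "hello"))
```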
Point the backend at a local model server:

```sh
export OLLAMA_BASE_URL=http://localhost:11434
export OPENAI_BASE_URL=http://localhost:8000/v1
```

Run `ollama serve` in another terminal.
Try: `cargo run --bin agent1 -- server --bind 127.0.0.1:17372`

Check: `agent1 mcp list`
- Runs locally - data never leaves your machine
- API binds to localhost only
- Secrets are automatically redacted in logs
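The log-redaction idea can be sketched as follows — these patterns are hypothetical, not the project's actual implementation:

```python
import re

# Hypothetical secret patterns; the project's real redaction rules may differ.
SECRET_PATTERNS = [
    re.compile(r"(api[_-]?key\s*[=:]\s*)(\S+)", re.IGNORECASE),
    re.compile(r"(bearer\s+)([A-Za-z0-9._-]+)", re.IGNORECASE),
]

def redact(line: str) -> str:
    """Replace secret values with a placeholder before the line is logged."""
    for pat in SECRET_PATTERNS:
        line = pat.sub(r"\1[REDACTED]", line)
    return line
```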