A TUI (terminal UI) AI chat client that stores conversations as plain markdown files. Designed to live in your Obsidian vault — but works in any directory.
Chat with any LLM from your terminal. Every conversation is a `.md` file with proper frontmatter, `[[wikilinks]]`, and tags. Searchable, linkable, diffable, yours.
- Terminal-first — no browser needed, works over SSH, great in tmux
- Markdown-native — conversations are plain `.md` files, not locked in a database
- Obsidian-friendly — frontmatter, tags, and `[[wikilinks]]` work as expected
- Provider-agnostic — OpenRouter (100+ models), with Ollama and direct OpenAI coming next
- Vault-aware — reference other notes with `[[wikilinks]]` and they're included as context
- Git-friendly — no binary blobs, easy to diff and version control your conversations
```bash
git clone https://github.com/BeamLabEU/vaultchat.git
cd vaultchat
bun install
bun run dev
```

Grab the latest binary for your platform from Releases.
Linux x64:

```bash
curl -fsSL https://github.com/BeamLabEU/vaultchat/releases/latest/download/vaultchat-linux-x64 -o vaultchat
chmod +x vaultchat
mv vaultchat ~/.local/bin/
```

macOS Apple Silicon:

```bash
curl -fsSL https://github.com/BeamLabEU/vaultchat/releases/latest/download/vaultchat-darwin-arm64 -o vaultchat
chmod +x vaultchat
mv vaultchat ~/.local/bin/
```

If `~/.local/bin` isn't in your PATH, use `sudo mv vaultchat /usr/local/bin/` instead.
Also available: `vaultchat-linux-arm64`, `vaultchat-darwin-x64`
Coming soon: `brew install beamlabeu/tap/vaultchat`
Check for new releases:

```bash
vaultchat --check-update
```

VaultChat also checks for updates in the background when you launch the TUI — a subtle notification appears at the top if a newer version is available.
To update manually, download the latest binary from Releases and replace the old one.
On first run, VaultChat launches a setup wizard:
- Choose your LLM provider (OpenRouter to start)
- Enter your API key (get one at openrouter.ai/keys)
- Pick your default model from a searchable list
Configuration is saved to `~/.vaultchat/config.json`.
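For orientation, a minimal config covering the three wizard steps might look like this (field names are illustrative; inspect your own generated file for the actual schema):

```json
{
  "provider": "openrouter",
  "apiKey": "sk-or-...",
  "model": "anthropic/claude-sonnet-4"
}
```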
Run VaultChat from any directory — ideally your Obsidian vault:
```bash
cd ~/my-vault
vaultchat
```

Or with Bun from the repo:

```bash
cd ~/my-vault
bun run /path/to/vaultchat/src/index.tsx
```

| Key | Action |
|---|---|
| `Tab` | Switch between file tree and chat panel |
| `Enter` | Open file / send message |
| `Ctrl+N` | New chat |
| `Ctrl+M` | Switch model |
| `Ctrl+S` | Settings |
| `Ctrl+C` | Cancel streaming / quit |
| `j`/`k` or `↑`/`↓` | Navigate |
| `Esc` | Close modal / cancel stream |
- Left panel shows `.md` files in the current directory
- Right panel shows the conversation with rendered markdown
- Select `+ New Chat` or press `Ctrl+N` to start a conversation
- Type your message and press `Enter` to send
- The LLM response streams in real-time
- After the first exchange, VaultChat asks the LLM to name the file
Reference other notes in your messages:
```
How do I optimize the setup described in [[Server Infrastructure]]?
```
VaultChat resolves the link, reads the file, and includes its content as context in the API call — without modifying your message.
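The link-detection step above is simple to picture. Here is a rough sketch of extracting `[[wikilink]]` targets from a message (illustrative only, not VaultChat's actual internals):

```typescript
// Collect the note names referenced by [[wikilinks]] in a message.
// A hypothetical helper; VaultChat's real resolver also locates the
// matching file in the vault and reads its contents.
function extractWikilinks(message: string): string[] {
  const links: string[] = [];
  const re = /\[\[([^\]]+)\]\]/g;
  let match: RegExpExecArray | null;
  while ((match = re.exec(message)) !== null) {
    links.push(match[1]); // the note name between the double brackets
  }
  return links;
}
```

Each extracted name would then be resolved to a file and its content appended to the API request as context.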
You can also set permanent context in the frontmatter:
```yaml
---
title: Docker migration
model: anthropic/claude-sonnet-4
provider: openrouter
context:
  - "[[Server Infrastructure]]"
  - "[[Docker Setup Notes]]"
---
```

Every conversation is a valid markdown file:
```markdown
---
title: Caddy reverse proxy setup
date: 2026-03-28T14:32:00+02:00
model: anthropic/claude-sonnet-4
provider: openrouter
tags:
  - vaultchat
  - infrastructure
---

###### user

I need automatic SSL for a new subdomain pointing to port 8080.

-----

###### assistant

Add this to your Caddyfile:

    newapp.example.com {
        reverse_proxy container:8080
    }

Then reload: `docker exec caddy caddy reload`
```

Files use `######` role headers (H6 — subtle, won't conflict with content) and `---`/`-----` separators. They look good in any markdown viewer.
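Because the format is this regular, a conversation body can be split back into messages with a few lines of code. A rough sketch (not VaultChat's actual parser) under the assumption that frontmatter has already been stripped:

```typescript
type ChatMessage = { role: string; content: string };

// Split a conversation body into messages: blocks are separated by
// "-----" lines, and each block starts with a "###### role" header.
function parseMessages(body: string): ChatMessage[] {
  return body
    .split(/^-----$/m)
    .map((block) => block.trim())
    .filter((block) => block.length > 0)
    .map((block) => {
      const m = block.match(/^######\s+(\S+)\n([\s\S]*)$/);
      return m
        ? { role: m[1], content: m[2].trim() }
        : { role: "unknown", content: block }; // tolerate malformed blocks
    });
}
```

Note that the `^-----$` anchor matches only the five-dash separator lines, so `---` frontmatter fences and dashes inside message content are left alone.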
| Flag | Description |
|---|---|
| `--version`, `-v` | Print version and exit |
| `--help`, `-h` | Show usage help |
| `--check-update` | Check GitHub for a newer release |
| `--doctor` | Run diagnostic checks (config, API key, provider reachability) |
| `--doctor --json` | Same diagnostics, machine-readable JSON output |
If you're working from source, these scripts help catch environment issues early:
```bash
bun run smoke            # build + startup sanity check
bun run doctor           # validate config, API key, provider reachability
bun run doctor:json      # machine-readable JSON to stdout
bun run doctor:report    # persist diagnostics to reports/doctor.json
bun run hardening        # smoke + doctor
bun run hardening:strict # typecheck + hardening
```

- Runtime: Bun
- TUI: Ink (React for the terminal)
- Language: TypeScript
- Markdown: marked + marked-terminal
MIT