A pragmatic terminal-based API and utility for interacting with Large Language Models (LLMs). Zenith is designed to be directly usable from the shell and to act as the LLM backend for other tooling, most notably nucleus-shell.
Zenith provides:
- A single executable CLI (`zenith`) for querying LLMs from the terminal
- Chat history persistence on disk
- Support for multiple models via OpenRouter
- A Wikipedia fallback mode when AI is disabled
- Simple integration as a backend service for shell-based tools
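The Wikipedia fallback can be illustrated with the tools Zenith already depends on (`curl` and `jq`). The exact endpoint Zenith queries is an assumption; the MediaWiki REST summary API shown here is one common choice:

```shell
# Hypothetical sketch of a Wikipedia lookup, not Zenith's actual code.
TOPIC="quantum computing"
SLUG=$(printf '%s' "$TOPIC" | tr ' ' '_')   # crude slug; full URL-encoding omitted
URL="https://en.wikipedia.org/api/rest_v1/page/summary/$SLUG"
echo "$URL"
# Fetch the plain-text summary (requires network access):
# curl -s "$URL" | jq -r '.extract'
```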
This project intentionally favors minimal dependencies and shell interoperability over heavy SDK usage.
- C++17-compatible compiler
- CMake ≥ 3.16
- `curl`
- `jq`
- A POSIX-compatible shell environment
- An OpenRouter API key
Important
- Zenith requires an API key exposed as the environment variable `$API_KEY`.
- Only OpenRouter-hosted LLMs are supported for remote inference.
- Zenith is used as the LLM backend for nucleus-shell.
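Under the hood, a remote query amounts to a single OpenRouter chat-completions call. A minimal sketch, assuming the standard OpenRouter endpoint; the payload shape shown is an assumption, not necessarily what Zenith constructs verbatim:

```shell
MODEL="gpt-4o-mini"
QUERY="Explain transformers"
# Build the JSON request body (payload shape is an assumption).
BODY=$(printf '{"model":"%s","messages":[{"role":"user","content":"%s"}]}' "$MODEL" "$QUERY")
echo "$BODY"
# Send it (requires $API_KEY; extracts the reply text with jq):
# curl -s https://openrouter.ai/api/v1/chat/completions \
#   -H "Authorization: Bearer $API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$BODY" | jq -r '.choices[0].message.content'
```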
```sh
# bash and zsh
export API_KEY=<your_openrouter_api_key>
```

Fish uses its own syntax for exported variables:

```fish
set -gx API_KEY <your_openrouter_api_key>
```

To persist this across sessions, add the line to your shell configuration file (`.bashrc`, `.zshrc`, or `config.fish`).
```sh
cmake -S . -B build
cmake --build build
```

The resulting binary will be available as:

```
./build/zenith
```

```
Usage: zenith [--ai|-a] [--new <chatname>] [--chat <existingChatName>] [--model <model>] "<query>"
```
- `--ai`, `-a`: Enable LLM-backed responses. If omitted, Zenith falls back to Wikipedia search.
- `--new <chatname>`: Create and switch to a new chat session.
- `--chat <chatname>`: Continue an existing chat session.
- `--model <model>`: Specify the OpenRouter model to use (default: `gpt-4o-mini`).
```sh
# Wikipedia lookup
zenith "quantum computing"

# Start a new AI chat
zenith -a --new research "Explain transformers"

# Continue an existing chat with a specific model
zenith -a --chat research --model gpt-4o "Give a concrete example"
```

Chat histories are stored locally at:

```
~/.config/zenith/chats/<chatname>.txt
```
Each entry is timestamped and appended sequentially.
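The storage layout can be exercised by hand. Only the directory and per-chat file are documented above; the timestamp and entry format used here are assumptions:

```shell
# Append a timestamped entry the way Zenith's history roughly works
# (entry format is hypothetical; the path matches the documented layout).
CHAT_DIR="$HOME/.config/zenith/chats"
mkdir -p "$CHAT_DIR"
printf '[%s] user: %s\n' "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "Explain transformers" \
  >> "$CHAT_DIR/research.txt"
tail -n 1 "$CHAT_DIR/research.txt"
```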
Basic support exists for local models such as `llama` or `gpt4all` if they are available in `$PATH`. This behavior is experimental and may require manual adjustment depending on your local runtime.
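Because the experimental local mode keys off `$PATH`, backend detection can be sketched with `command -v`. The binary names are the ones mentioned above; the actual probing logic Zenith uses is an assumption:

```shell
# Report which local backends, if any, are on $PATH (sketch, not Zenith's code).
for bin in llama gpt4all; do
  if command -v "$bin" >/dev/null 2>&1; then
    echo "local backend available: $bin"
  fi
done
echo "detection finished"
```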
MIT License
Copyright (c) 2026 Zepyx