Zenith

A pragmatic terminal-based API and utility for interacting with Large Language Models (LLMs). Zenith is designed to be directly usable from the shell and to act as the LLM backend for other tooling, most notably nucleus-shell.


Overview

Zenith provides:

  • A single executable CLI (zenith) for querying LLMs from the terminal
  • Chat history persistence on disk
  • Support for multiple models via OpenRouter
  • A Wikipedia fallback mode when AI is disabled
  • Simple integration as a backend service for shell-based tools

This project intentionally favors minimal dependencies and shell interoperability over heavy SDK usage.
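As a sketch of that shell interoperability, another tool can capture Zenith's reply directly. The chat name, prompt, and availability guard below are illustrative; the flags follow the Usage section:

```shell
# Hypothetical integration sketch: capture zenith's reply into a variable.
# Guarded so the snippet degrades gracefully when zenith is not on PATH.
if command -v zenith >/dev/null 2>&1; then
    answer=$(zenith --ai --chat notes "Summarize the last build failure")
else
    answer="(zenith not installed)"
fi
printf 'LLM said: %s\n' "$answer"
```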


Requirements

  • C++17-compatible compiler
  • CMake ≥ 3.16
  • curl
  • jq
  • A POSIX-compatible shell environment
  • An OpenRouter API key

Configuration

Important

  • Zenith requires an API key exposed as the environment variable $API_KEY.
  • Only OpenRouter-hosted LLMs are supported for remote inference.
  • Zenith is used as the LLM backend for nucleus-shell.

Setting the API key

# bash and zsh
export API_KEY=<your_openrouter_api_key>

# fish
set -gx API_KEY <your_openrouter_api_key>

To persist the key across sessions, add the appropriate line to your shell configuration file (.bashrc, .zshrc, or config.fish).
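For example, one way to persist the key for bash (the key below is a placeholder, not a real value):

```shell
# Append the export line to ~/.bashrc; substitute your real key first.
echo 'export API_KEY=<your_openrouter_api_key>' >> ~/.bashrc
```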


Installation

cmake -S . -B build
cmake --build build

The resulting binary will be available as:

./build/zenith
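Optionally, the built binary can be copied onto your PATH. The destination below is a common per-user choice, not something the project mandates:

```shell
# Install the freshly built binary into a per-user bin directory.
mkdir -p ~/.local/bin
install -m 755 ./build/zenith ~/.local/bin/zenith
```

Make sure ~/.local/bin is on your PATH, or pick another directory that already is.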

Usage

Usage: zenith [--ai|-a] [--new <chatname>] [--chat <existingChatName>] [--model <model>] "<query>"

Options

  • --ai, -a Enable LLM-backed responses. If omitted, Zenith falls back to Wikipedia search.

  • --new <chatname> Create and switch to a new chat session.

  • --chat <chatname> Continue an existing chat session.

  • --model <model> Specify the OpenRouter model to use (default: gpt-4o-mini).

Examples

# Wikipedia lookup
zenith "quantum computing"

# Start a new AI chat
zenith -a --new research "Explain transformers"

# Continue an existing chat with a specific model
zenith -a --chat research --model gpt-4o "Give a concrete example"

Data Storage

Chat histories are stored locally at:

~/.config/zenith/chats/<chatname>.txt

Each entry is timestamped and appended sequentially.
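The exact on-disk entry format is defined by Zenith itself. Purely as an illustration of the layout above, a timestamped append could look like this (the chat name "example" and the entry format are made up):

```shell
# Illustrative only: append one timestamped entry using the same
# directory layout; zenith's real entry format may differ.
chatfile="$HOME/.config/zenith/chats/example.txt"
mkdir -p "$(dirname "$chatfile")"
printf '[%s] user: hello\n' "$(date -u +%Y-%m-%dT%H:%M:%SZ)" >> "$chatfile"
```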


Notes on Local Models

Basic support exists for local models such as llama or gpt4all if available in $PATH. This behavior is experimental and may require manual adjustment depending on your local runtime.
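Whether a runtime is "available in $PATH" can be probed with `command -v`; the runtime names below simply mirror the examples above:

```shell
# Probe for the local model runtimes named above; report the first found.
found=""
for runtime in llama gpt4all; do
    if command -v "$runtime" >/dev/null 2>&1; then
        found=$runtime
        break
    fi
done
echo "local runtime: ${found:-none}"
```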


License

MIT License

Copyright (c) 2026 Zepyx
