A tiny Rust CLI for streaming responses from OpenAI, Gemini, and Claude.
- Streams model output to stdout as tokens/chunks arrive.
- Reads the prompt from `--prompt` or stdin.
- Supports provider API keys from env vars or a home config file.
- Includes config commands:

  ```
  llms config set <provider> <key>
  llms config get <provider>
  llms config list
  ```
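The `--prompt`-or-stdin behavior can be sketched as follows; this is a minimal illustration, and the helper name `resolve_prompt` is hypothetical, not the CLI's actual internals:

```rust
use std::io::{self, Read};

// Sketch of prompt resolution: use the value of --prompt when given,
// otherwise read all of stdin and trim surrounding whitespace.
// (`resolve_prompt` is an illustrative name, not part of the real CLI.)
fn resolve_prompt(flag: Option<String>) -> io::Result<String> {
    match flag {
        Some(p) => Ok(p),
        None => {
            let mut buf = String::new();
            io::stdin().read_to_string(&mut buf)?;
            Ok(buf.trim().to_string())
        }
    }
}

fn main() -> io::Result<()> {
    // With --prompt given, stdin is never touched.
    let prompt = resolve_prompt(Some("hello".to_string()))?;
    println!("{prompt}");
    Ok(())
}
```

Trimming the stdin path is what lets the CLI detect an empty or whitespace-only prompt.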
Requirements:

- Rust toolchain (stable)
- Network access to the model provider API
- An API key for the provider you use
Build:

```
cargo build --release
```

Binary path: `./target/release/llms`

Or run directly with cargo:

```
cargo run -- --help
```

You can use either env vars or the built-in config file.
```
export OPENAI_API_KEY="sk-..."
export GEMINI_API_KEY="..."
export ANTHROPIC_API_KEY="..."
```

Or store keys with the config commands:

```
llms config set openai sk-xxxx
llms config set gemini your-gemini-key
llms config set claude your-anthropic-key
```

Read a key:

```
llms config get openai
```

List configured providers:

```
llms config list
```

Config file location: `~/.llms/config.json`
Example file:
```json
{
  "openai": "sk-xxxx",
  "gemini": "your-gemini-key",
  "claude": "your-anthropic-key"
}
```

Key precedence:

- Provider env var (if set and non-empty)
- `~/.llms/config.json`
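The precedence rule can be sketched like this; `resolve_key` is a hypothetical helper, and the config-file lookup is stubbed as an `Option` rather than actually parsing `~/.llms/config.json`:

```rust
use std::env;

// Hypothetical helper mirroring the precedence above: the provider's env
// var wins when set and non-empty; otherwise fall back to whatever was
// read from ~/.llms/config.json (stubbed here as an Option<&str>).
fn resolve_key(env_var: &str, config_value: Option<&str>) -> Option<String> {
    match env::var(env_var) {
        Ok(v) if !v.trim().is_empty() => Some(v),
        _ => config_value.map(str::to_string),
    }
}

fn main() {
    // Env var not set: the config-file value is used.
    let k = resolve_key("LLMS_README_UNSET_VAR", Some("sk-from-file"));
    println!("{:?}", k); // Some("sk-from-file")
}
```

Note the non-empty check: an exported-but-blank variable falls through to the config file, matching "if set and non-empty" above.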
Top-level help:
```
llms --help
```

Examples:

```
llms --provider openai --model gpt-5 --prompt "Explain SSE in one sentence"
echo "Write a haiku about rust" | llms --provider gemini --model gemini-3-flash-preview
```

If `--model` is omitted, the defaults are:

- openai: `gpt-5`
- gemini: `gemini-2.0-flash`
- claude: `claude-opus-4-6`
The CLI writes model text to stdout and errors to stderr, so normal shell piping works.
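The write loop behind this behavior might look roughly like the sketch below; the chunk source is stubbed with a plain iterator, and `stream_chunks` is an illustrative name rather than the CLI's real function:

```rust
use std::io::{self, Write};

// Minimal sketch of a streaming write loop: each chunk of model text is
// written and flushed immediately, so a downstream pipe sees output as it
// arrives instead of all at once. Errors would go to stderr separately.
fn stream_chunks<W: Write, I: IntoIterator<Item = String>>(
    out: &mut W,
    chunks: I,
) -> io::Result<()> {
    for chunk in chunks {
        out.write_all(chunk.as_bytes())?;
        out.flush()?; // per-chunk flush is what makes `| tee` feel live
    }
    out.write_all(b"\n")
}

fn main() -> io::Result<()> {
    let stdout = io::stdout();
    let mut out = stdout.lock();
    stream_chunks(&mut out, ["Hello, ", "streaming ", "world"].map(String::from))
}
```

Taking a generic `Write` keeps the loop testable against an in-memory buffer while `main` hands it a locked stdout.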
Save output to a file:

```
llms --provider gemini --prompt "hello" > out.txt
```

Pipe to another process:

```
llms --provider gemini --prompt "hello" | tee out.txt
```

Split output and errors:

```
llms --provider gemini --prompt "hello" > out.txt 2> err.log
```

If running through cargo and you do not want cargo status lines:

```
cargo run --quiet -- --provider gemini --prompt "hello"
```

Troubleshooting:

- Missing API key for `<provider>`: set the matching env var, or run `llms config set <provider> <key>`.
- No response when using stdin: ensure stdin is not empty and not just whitespace.
- Wrong provider name: valid providers are `openai`, `gemini`, and `claude`.
- API keys are stored in plain text in `~/.llms/config.json`; treat that file as sensitive.
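The provider-name check ties the three valid names to the env vars listed earlier; a sketch, with `env_var_for` as a hypothetical helper:

```rust
// Hypothetical mapping from provider name to the env var this README
// documents for it. An unknown name yields None, which is where a
// "wrong provider name" error would be raised.
fn env_var_for(provider: &str) -> Option<&'static str> {
    match provider {
        "openai" => Some("OPENAI_API_KEY"),
        "gemini" => Some("GEMINI_API_KEY"),
        "claude" => Some("ANTHROPIC_API_KEY"),
        _ => None,
    }
}

fn main() {
    assert_eq!(env_var_for("claude"), Some("ANTHROPIC_API_KEY"));
    assert_eq!(env_var_for("gpt"), None);
    println!("ok");
}
```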
License: MIT