shllm

Minimal CLI for querying LLMs. Works with any OpenAI-compatible API.

Usage

llm "What is the capital of France?"
cat error.log | llm "What's wrong?"

Conversations are automatically saved and resumed. Use -c or --clear to start fresh:

llm -c              # clear context only
llm -c "New topic"  # clear and start new conversation

Install

make install

This builds the binary, installs it to /usr/local/bin, and creates ~/.shllm/config.toml if it doesn't already exist. Edit the config to add your API key.
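
Once installed, open the generated config and fill in your credentials (the editor invocation below is just an example; the path is the documented default):

$EDITOR ~/.shllm/config.toml    # set api_url, api_key, and model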

Config

Config is loaded from ./config.toml first, then ~/.shllm/config.toml:

api_url = "https://api.openai.com/v1/chat/completions"
api_key = "sk-..."
model = "gpt-4"
system_prompt = "Be concise."
stream = true
timeout = 120
word_wrap = 100

Option         Required  Default
api_url        yes       -
api_key        yes       -
model          yes       -
system_prompt  no        none
stream         no        false
timeout        no        120s
word_wrap      no        100

Works with OpenAI, OpenRouter, Ollama, LM Studio, Azure, or any compatible endpoint.
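
For example, pointing at a local Ollama server only takes a different api_url and model: Ollama exposes an OpenAI-compatible route at /v1/chat/completions and ignores the key, though one must still be set (the model name below is an example):

api_url = "http://localhost:11434/v1/chat/completions"
api_key = "ollama"   # required by the format but ignored by Ollama
model = "llama3"     # example model name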

License

MIT
