A small bash command that talks to multiple LLM providers. One script, one interface. Switch providers with a flag.
```
$ llmshot -p openai -t "Give me a clever anagram I can impress my friends with"
Try this one:

A decimal point → I'm a dot in place
$
```
- Multiple providers: OpenAI, Google (Gemini), Anthropic (Claude), and Ollama (local).
- Flexible prompt input: inline text (`-t`), a file (`-f`), or stdin.
- Config file: API keys and default models in `~/.config/llmshot/llmshot.conf`, or via `-e`.
- Model override: use `-m` to override the default model for the chosen provider.
- No extra runtime: uses only `bash`, `curl`, and `jq`.
You need:
- Bash (4+)
- curl
- jq
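Before running the script you can verify these tools are present. The `require_cmds` helper below is an illustrative sketch, not part of llmshot itself:

```shell
# Illustrative helper (not from llmshot): report any missing tools.
require_cmds() {
  local missing=0 cmd
  for cmd in "$@"; do
    if ! command -v "$cmd" >/dev/null 2>&1; then
      echo "missing: $cmd" >&2
      missing=1
    fi
  done
  return "$missing"
}

# Example: require_cmds bash curl jq || exit 1
```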
### Debian / Ubuntu / Linux Mint

```
sudo apt update
sudo apt install curl jq
```

(Bash is usually already installed.)

### Fedora / RHEL / CentOS / Rocky

```
sudo dnf install curl jq
```

### Arch Linux

```
sudo pacman -S curl jq
```

### openSUSE

```
sudo zypper install curl jq
```

### Alpine

```
sudo apk add curl jq
```

### Homebrew (recommended)

```
brew install curl jq
```

macOS ships with an old Bash; the script runs with `/usr/bin/env bash` (often Bash 3.2). For a newer Bash:

```
brew install bash
```

Then make sure your PATH picks up the Homebrew bash ahead of the system one if you rely on Bash 4+ features.
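A script can also guard against an old interpreter at startup by checking the Bash major version. A minimal sketch (the `need_bash4` name is illustrative, not from llmshot):

```shell
# Illustrative check: warn when the Bash major version is below 4.
need_bash4() {
  if [ "${1:-0}" -lt 4 ]; then
    echo "warning: Bash major version ${1:-unknown}; some features need Bash 4+" >&2
    return 1
  fi
}

# BASH_VERSINFO[0] holds the major version of the running interpreter.
need_bash4 "${BASH_VERSINFO[0]:-3}" || true
```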
The script is written for Bash, so run it in a Bash environment:
### Option A: WSL (Windows Subsystem for Linux)

Use your WSL distro (e.g. Ubuntu) and install as on Debian/Ubuntu:

```
sudo apt update && sudo apt install curl jq
```

### Option B: Git for Windows (Git Bash)

Git for Windows includes Bash, curl, and other basic tools. Install jq separately:

- Download it from jqlang/jq (e.g. `jq-win64.exe`), rename it to `jq.exe`, and put it in your `PATH`, or
- if you have Chocolatey: `choco install jq`

Then run llmshot from Git Bash.

### Option C: MSYS2 / Cygwin
Install the curl and jq packages in your MSYS2 or Cygwin environment and run the script from that shell.
```
curl -fsSL https://raw.githubusercontent.com/markabrahams/llmshot/main/install/install.sh | bash
```

This installs llmshot to `~/.local/bin` (or `/usr/local/bin` if run as root). To use a custom directory:

```
curl -fsSL https://raw.githubusercontent.com/markabrahams/llmshot/main/install/install.sh | bash -s -- -d /usr/local/bin
# or
INSTALL_DIR=/usr/local/bin curl -fsSL https://raw.githubusercontent.com/markabrahams/llmshot/main/install/install.sh | bash
```

Make sure `~/.local/bin` is in your PATH (e.g. add `export PATH="$HOME/.local/bin:$PATH"` to `~/.bashrc` or `~/.profile`).
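Before editing your shell profile, you can check whether the directory is already on your PATH. A small sketch (the `path_has` name is just for illustration):

```shell
# Illustrative: succeed if the given directory is already on PATH.
path_has() {
  case ":$PATH:" in
    *":$1:"*) return 0 ;;
    *)        return 1 ;;
  esac
}

# Append ~/.local/bin only if it is not there yet.
path_has "$HOME/.local/bin" || export PATH="$HOME/.local/bin:$PATH"
```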
If you have a Homebrew tap that includes the formula (see scripts/llmshot.rb template in this repo):
```
brew install markabrahams/llmshot/llmshot
```

- Clone or download this repo (or copy the `llmshot` script from `bin/`).
- Make the script executable:

  ```
  chmod +x bin/llmshot
  ```

- Put it on your PATH, e.g.:

  ```
  sudo cp bin/llmshot /usr/local/bin/
  # or
  mkdir -p ~/.local/bin && cp bin/llmshot ~/.local/bin/ && export PATH="$HOME/.local/bin:$PATH"
  ```
Create a config file so llmshot can find your API keys and defaults.
- Default location: `~/.config/llmshot/llmshot.conf`
- Override: use `-e /path/to/file` to point to another env/config file.

Use the same variable names you would for environment variables. Example:
```
# OpenAI (optional if you only use other providers)
export OPENAI_API_KEY="sk-..."
export OPENAI_MODEL="gpt-4o"               # optional; script has a default

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."
export ANTHROPIC_MODEL="claude-sonnet-4-5" # optional

# Google (Gemini)
export GOOGLE_API_KEY="..."
export GOOGLE_MODEL="gemini-2.5-flash"     # optional

# Ollama (local; no API key)
export OLLAMA_MODEL="llama3.1"             # optional
export OLLAMA_URL="http://localhost:11434" # optional
```

Only set the keys and variables for the providers you use. The script sources this file, so `export` is optional but keeps things clear.
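Because the script sources the file, the loading step behaves roughly like the sketch below (function and variable names are illustrative, not the script's actual internals); `set -a` makes plain `VAR=value` lines export as well:

```shell
# Illustrative sketch of config loading (not llmshot's actual code).
load_conf() {
  local conf="${1:-$HOME/.config/llmshot/llmshot.conf}"
  if [ ! -f "$conf" ]; then
    echo "no environment file found: $conf" >&2
    return 1
  fi
  set -a   # auto-export: plain VAR=value lines behave like `export VAR=value`
  . "$conf"
  set +a
}
```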
Since this file holds API keys, restrict its permissions:

```
chmod 600 ~/.config/llmshot/llmshot.conf
```

Usage:

```
llmshot -p <provider> [-m <model>] [-t <text>] [-f <file>] [-e <env_file>] [-u <url>]

  -p, --provider   Provider: openai, google, anthropic, ollama
  -m, --model      Override the model from config (or the provider default if unset)
  -t, --text       Prompt text (takes precedence over -f and stdin)
  -f, --file       Prompt file (used if -t is not given; otherwise stdin is read)
  -e, --env        Environment/config file (optional; see above for the default location)
  -u, --url        URL for the ollama provider (default: http://localhost:11434)
```
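Option handling like the above maps naturally onto a standard `while`/`case` loop. A trimmed, illustrative sketch covering only three of the options (not the script's verbatim code):

```shell
# Illustrative option loop for -p/-m/-t only (not llmshot's actual parser).
parse_args() {
  PROVIDER="" MODEL="" TEXT=""
  while [ $# -gt 0 ]; do
    case "$1" in
      -p|--provider) PROVIDER="$2"; shift 2 ;;
      -m|--model)    MODEL="$2";    shift 2 ;;
      -t|--text)     TEXT="$2";     shift 2 ;;
      *) echo "Unknown option: $1" >&2; return 1 ;;
    esac
  done
}
```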
Inline prompt (OpenAI):

```
./llmshot -p openai -t "Explain recursion in one sentence."
```

Prompt from file (Anthropic):

```
./llmshot -p anthropic -f ./prompt.txt
```

Stdin (e.g. pipe):

```
echo "Summarize the following: ..." | ./llmshot -p google
```

Ollama with custom URL and model:

```
./llmshot -p ollama -u http://192.168.1.10:11434 -m llama3.2 -t "Hello"
```

Override model for one call:

```
./llmshot -p openai -m gpt-4o-mini -t "Short joke about shells"
```

Use a specific config file:

```
./llmshot -p openai -e ~/work/llmshot.env -t "Hello"
```

If you don't set a model in config or with `-m`, the script uses:
| Provider  | Default model     |
|-----------|-------------------|
| openai    | gpt-5             |
| google    | gemini-2.5-flash  |
| anthropic | claude-sonnet-4-5 |
| ollama    | llama3.1          |
(Update these in the script or in your config if your provider uses different model names.)
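The fallback order (config variable first, then the built-in default) can be expressed with Bash's `${VAR:-default}` expansion. An illustrative sketch using the defaults from the table above (not the script's actual code):

```shell
# Illustrative: resolve the model for a provider, preferring the config
# variable (e.g. OPENAI_MODEL) and falling back to the built-in default.
default_model() {
  case "$1" in
    openai)    echo "${OPENAI_MODEL:-gpt-5}" ;;
    google)    echo "${GOOGLE_MODEL:-gemini-2.5-flash}" ;;
    anthropic) echo "${ANTHROPIC_MODEL:-claude-sonnet-4-5}" ;;
    ollama)    echo "${OLLAMA_MODEL:-llama3.1}" ;;
    *) echo "Unknown provider: $1" >&2; return 1 ;;
  esac
}
```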
- **"no environment file found"**: create `~/.config/llmshot/llmshot.conf` (or use `-e <file>`) with at least the API key for the provider you're using.
- **"Unknown provider"**: use one of `openai`, `google`, `anthropic`, `ollama` (case-insensitive).
- **jq/curl not found**: install `jq` and `curl` as described in Prerequisites.
- **Ollama connection errors**: make sure Ollama is running (e.g. `ollama serve`) and that `OLLAMA_URL` (or `-u`) matches your server (default `http://localhost:11434`).
- **API errors (401, 403, etc.)**: check that the corresponding API key in your config is correct and has access to the requested model.
MIT License
Copyright (c) 2026 Mark Abrahams
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
This tool can interact with external AI providers such as OpenAI, Google, and Anthropic. This project is not affiliated with or endorsed by any of these providers. Users are responsible for complying with the respective API terms of service.