

PromptBox allows maintaining libraries of LLM prompt templates which can be filled in and submitted from the command line. It can submit prompts to the OpenAI API, Ollama, or LM Studio.

Template Files

  • are built in TOML
  • can use Liquid templating, and reference templates in other files
  • define command-line arguments, which can include references to files
  • have the filename format `<NAME>.pb.toml`
```toml
# File: summarize.pb.toml

description = "Summarize some files"

# Optional system prompt
# Or `system_prompt_path` to read from a template file
system_prompt = "You are a great summarizer."

# This can also be `template_path` to read from another file.
template = '''
Create a {{style}} summary of the below files
which are on the topic of {{topic}}. The summary should be about {{ len }} sentences long.

{% for f in file -%}
File {{ f.filename }}:
{{ f.contents }}

{%- endfor %}
'''

# These model options can also be defined in a config file to apply to the whole directory of templates.
model = "gpt-3.5-turbo"
temperature = 0.7
# Also supports top_p, frequency_penalty, presence_penalty, stop, and max_tokens,
# as well as format = "json".

len = { type = "int", description = "The length of the summary", default = 4 }
topic = { type = "string", description = "The topic of the summary" }
style = { type = "string", default = "concise" }
file = { type = "file", array = true, description = "The files to summarize" }
```

Then to run it:

```
> promptbox run summarize --topic software --file
The file provides an overview of the PromptBox utility, which is used for maintaining libraries of
LLM prompt templates that can be filled in and submitted from the command line. It explains that template files
are built in TOML and can use Liquid templating. The file also includes an example template for summarizing files
on a specific topic, with options for length, formality, and the files to summarize. Additionally, it mentions the
presence of configuration files that can set default model options and inherit settings from parent directories.
```

```
> promptbox run summarize --topic software --file --style excited
Introducing PromptBox, a powerful utility for maintaining libraries of LLM prompt templates! With PromptBox, you can
easily fill in and submit prompt templates from the command line. These template files, built in TOML, can utilize
Liquid templating and reference templates in other files. They also define command-line arguments, including references
to files. The excitement doesn't stop there! PromptBox even supports configuration files, allowing you to set default
model options and inherit settings from parent directories. Get ready to revolutionize your software experience
with PromptBox!
```

Additional Input

PromptBox can take additional input from extra command-line arguments, or have it piped in from another command.

cat "transcript.txt" | pb run summarize "Here is the transcript:"

By default, this content is appended to the end of the prompt, but the template can reference it as {{extra}} to have it placed elsewhere in the prompt, as in this example.

```
Below is a transcript of a video named "{{title}}":

{{extra}}

Create a detailed outline of the above transcript.
```

This can be helpful when using this mode with models that work best when their instructions are at the end of the prompt.

Model Choice

PromptBox supports a few model hosts, and uses a very simple logic to choose the host:

  • Any model name starting with "gpt-3.5" or "gpt-4" will result in a call to the OpenAI API.
  • The value "lm-studio" will result in a call to LM Studio. LM Studio's API currently does not support selecting a model, so you will need to switch it yourself in the GUI.
  • Any other model name indicates a model from Ollama.
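The rules above can be sketched as a small routing function. This is an illustrative sketch of the documented behavior, not PromptBox's actual (Rust) source:

```python
def choose_host(model: str) -> str:
    """Pick an API host from a model name, following PromptBox's documented rules."""
    # Names starting with "gpt-3.5" or "gpt-4" go to the OpenAI API.
    if model.startswith("gpt-3.5") or model.startswith("gpt-4"):
        return "openai"
    # The literal value "lm-studio" routes to LM Studio's local API.
    if model == "lm-studio":
        return "lm-studio"
    # Any other name is treated as an Ollama model.
    return "ollama"

choose_host("gpt-4")             # "openai"
choose_host("lm-studio")         # "lm-studio"
choose_host("deepseek-coder:7b") # "ollama"
```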

Models can use aliases as well. In either the template or a configuration file, you can add a `model.alias` section.

```toml
[model.alias]
phind = "phind-codellama:34b-v2-q5_K_M"
deepseek = "deepseek-coder:7b"
```

These model aliases can then be used in place of the actual model name.
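Conceptually, alias resolution is a simple lookup before the model name is used: if the name matches an alias, the full model name is substituted; otherwise the name is used as-is. A hedged sketch of that assumed behavior:

```python
# Aliases as they would appear in a [model.alias] section.
aliases = {
    "phind": "phind-codellama:34b-v2-q5_K_M",
    "deepseek": "deepseek-coder:7b",
}

def resolve_model(name: str) -> str:
    """Return the full model name for an alias, or the name unchanged."""
    return aliases.get(name, name)

resolve_model("phind")   # "phind-codellama:34b-v2-q5_K_M"
resolve_model("llama2")  # "llama2" (not an alias, passed through)
```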

Configuration Files

Each directory of templates can contain a configuration file, which can set default model options. Config files are read from the current directory up through the parent directories.

In each directory searched, PromptBox will look for a configuration file in that directory and in a `promptbox` subdirectory.

A global configuration file, such as `~/.config/promptbox/promptbox.toml`, is read as well.

A configuration file inherits settings from the config files in its parent directories as well, for those options that it does not set itself. All settings in a configuration file are optional.

```toml
# By default the templates are in the same directory as the configuration file,
# but this can be overridden by setting the `templates` option.
templates = ["template_dir"]

# This can be set to true to tell PromptBox to stop looking in parent directories for
# configurations and templates.
top_level = false

# Set this to false to tell PromptBox to not read the global configuration file.
use_global_config = true

# Set a default model. All the other options from the template's `model` section can be used here.
model = "gpt-3.5-turbo"
```
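The inheritance rule above (a child config overrides its parents only for the options it sets itself) amounts to merging configs from the root down to the current directory. A minimal sketch of that assumed semantics, with hypothetical option values:

```python
def merge_configs(configs_root_to_leaf: list[dict]) -> dict:
    """Merge configs so that options set closer to the current directory win."""
    merged: dict = {}
    for cfg in configs_root_to_leaf:  # parents first, current directory last
        merged.update(cfg)  # later (closer) configs override earlier ones
    return merged

parent = {"model": "gpt-3.5-turbo", "temperature": 0.7}
child = {"model": "gpt-4"}  # sets only `model`; inherits `temperature`
merge_configs([parent, child])  # {"model": "gpt-4", "temperature": 0.7}
```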