tgalal/runprompt
runprompt

A single-file Python script for running Dotprompt files.

Quick start | dotprompt | Examples | Configuration | Providers | Spec compliance

Quick start

curl -O https://raw.githubusercontent.com/chr15m/runprompt/main/runprompt
chmod +x runprompt

Create hello.prompt:

---
model: anthropic/claude-sonnet-4-20250514
---
Say hello to {{name}}!

Run it:

export ANTHROPIC_API_KEY="your-key"
echo '{"name": "World"}' | ./runprompt hello.prompt

(You can get an Anthropic API key here: https://console.anthropic.com/settings/keys)

What is Dotprompt?

Dotprompt is an executable prompt template format for GenAI. A .prompt file contains both the prompt template and metadata (model, schema, config) in a single file.

runprompt is a minimal, single-file Python implementation with no dependencies.

Examples

In addition to the following, see the tests folder for more example .prompt files.

Basic prompt with stdin

---
model: anthropic/claude-sonnet-4-20250514
---
Summarize this text: {{STDIN}}

Run it:

cat article.txt | ./runprompt summarize.prompt

The special {{STDIN}} variable always contains the raw stdin as a string.
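The behaviour can be sketched like this (an illustrative Python sketch, not runprompt's actual code, assuming JSON objects on stdin also become template variables as in the quick start):

```python
import json


def build_variables(raw_stdin: str) -> dict:
    """Sketch: expose raw stdin as STDIN; if stdin parses as a JSON
    object, its keys become template variables too."""
    variables = {"STDIN": raw_stdin}
    try:
        parsed = json.loads(raw_stdin)
        if isinstance(parsed, dict):
            variables.update(parsed)
    except json.JSONDecodeError:
        pass  # plain text: only {{STDIN}} is available
    return variables
```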

Structured JSON output

Extract structured data using an output schema:

---
model: anthropic/claude-sonnet-4-20250514
input:
  schema:
    text: string
output:
  format: json
  schema:
    name?: string, the person's name
    age?: number, the person's age
    occupation?: string, the person's job
---
Extract info from: {{text}}

Run it:

echo "John is a 30 year old teacher" | ./runprompt extract.prompt
# {"name": "John", "age": 30, "occupation": "teacher"}

Schema uses Picoschema format. Fields ending with ? are optional. The format is field: type, description.
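As a sketch of how a Picoschema field line decomposes (hypothetical parsing code for illustration, not runprompt's implementation):

```python
def parse_picoschema_field(key: str, value: str) -> dict:
    """Split a Picoschema entry like 'age?: number, the person's age'
    into name, optionality, type, and description."""
    optional = key.endswith("?")
    name = key.rstrip("?")
    type_name, _, description = value.partition(",")
    return {
        "name": name,
        "optional": optional,
        "type": type_name.strip(),
        "description": description.strip(),
    }
```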

Chaining prompts

Pipe structured output between prompts:

echo "John is 30" | ./runprompt extract.prompt | ./runprompt generate-bio.prompt

The JSON output from the first prompt becomes template variables in the second.
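Conceptually the handoff works like this (illustrative sketch with a minimal substitution function; the prompt text is made up):

```python
import json
import re


def render(template: str, variables: dict) -> str:
    """Minimal {{variable}} substitution, for illustration only."""
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(variables.get(m.group(1), "")),
        template,
    )


# Output of the first prompt (extract.prompt) ...
step_one_output = '{"name": "John", "age": 30, "occupation": "teacher"}'

# ... becomes the variable set for the second (generate-bio.prompt).
variables = json.loads(step_one_output)
bio_prompt = render(
    "Write a short bio of {{name}}, a {{age}}-year-old {{occupation}}.",
    variables,
)
```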

Executable prompt files

Make .prompt files directly executable with a shebang:

#!/usr/bin/env runprompt
model: anthropic/claude-sonnet-4-20250514
---
Hello, I'm {{name}}!

Then make it executable and run it:

chmod +x hello.prompt
echo '{"name": "World"}' | ./hello.prompt

Note: runprompt must be in your PATH, or use a relative/absolute path in the shebang (e.g. #!/usr/bin/env ./runprompt).

CLI overrides

Override any frontmatter value from the command line:

./runprompt --model anthropic/claude-haiku-4-20250514 hello.prompt
./runprompt --name "Alice" hello.prompt

Template syntax

Templates use Handlebars syntax. Supported features:

  • Variable interpolation: {{variableName}}, {{object.property}}
  • Comments: {{! this is a comment }}
  • Iteration: {{#each items}}...{{/each}} with @index, @first, @last, @key
  • Sections: {{#key}}...{{/key}} (renders if truthy)
  • Inverted sections: {{^key}}...{{/key}} (renders if falsy)
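For example, a template combining several of these features might look like this (illustrative; the guests field is made up):

```
---
model: anthropic/claude-sonnet-4-20250514
---
{{! greet each guest on the list }}
{{#each guests}}
{{@index}}: Hello, {{name}}!
{{/each}}
{{^guests}}
Nobody to greet.
{{/guests}}
```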

Configuration

Environment variables

Set API keys for your providers:

export ANTHROPIC_API_KEY="..."  # https://console.anthropic.com/settings/keys
export OPENAI_API_KEY="..."     # https://platform.openai.com/api-keys
export GOOGLE_API_KEY="..."     # https://aistudio.google.com/app/apikey
export OPENROUTER_API_KEY="..." # https://openrouter.ai/settings/keys

RUNPROMPT_* overrides

Override any frontmatter value via environment variables prefixed with RUNPROMPT_:

export RUNPROMPT_MODEL="anthropic/claude-haiku-4-20250514"
./runprompt hello.prompt

This is useful for setting defaults across multiple prompt runs.
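One way such overrides could be resolved (a sketch under the assumption that environment values simply replace same-named frontmatter keys):

```python
import os


def apply_env_overrides(frontmatter: dict, environ=os.environ) -> dict:
    """Sketch: any RUNPROMPT_FOO variable overrides frontmatter key 'foo'."""
    merged = dict(frontmatter)
    for key, value in environ.items():
        if key.startswith("RUNPROMPT_"):
            merged[key[len("RUNPROMPT_"):].lower()] = value
    return merged
```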

Verbose mode

Use -v to see request/response details:

./runprompt -v hello.prompt

Providers

Models are specified as provider/model-name:

Provider    Model format                                     API key
Anthropic   anthropic/claude-sonnet-4-20250514               https://console.anthropic.com/settings/keys
OpenAI      openai/gpt-4o                                    https://platform.openai.com/api-keys
Google AI   googleai/gemini-1.5-pro                          https://aistudio.google.com/app/apikey
OpenRouter  openrouter/anthropic/claude-sonnet-4-20250514    https://openrouter.ai/settings/keys

OpenRouter provides access to models from many providers (Anthropic, Google, Meta, etc.) through a single API key.
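Splitting at the first slash is what makes OpenRouter's longer specs work, since the model part itself contains a slash. A sketch of that lookup (illustrative; the real script may structure this differently):

```python
# Provider prefix -> API key environment variable (from the
# Configuration section above).
PROVIDER_KEY_VARS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "googleai": "GOOGLE_API_KEY",
    "openrouter": "OPENROUTER_API_KEY",
}


def split_model(spec: str) -> tuple:
    """Split 'provider/model-name' at the first slash only, so
    'openrouter/anthropic/...' keeps its full model path."""
    provider, _, model = spec.partition("/")
    return provider, model
```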

Spec compliance

This is a minimal implementation of the Dotprompt specification. Not yet supported:

  • Multi-message prompts ({{role}}, {{history}})
  • Conditionals ({{#if}}, {{#unless}}, {{else}})
  • Helpers ({{json}}, {{media}}, {{section}})
  • Model config (temperature, maxOutputTokens, etc.)
  • Partials ({{>partialName}})
  • Nested Picoschema (objects, arrays of objects, enums)

See TODO.md for the full roadmap.
