# runprompt

A single-file Python script for running Dotprompt files.
## Quick start

```sh
curl -O https://raw.githubusercontent.com/chr15m/runprompt/main/runprompt
chmod +x runprompt
```

Create `hello.prompt`:
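The contents of `hello.prompt` are not shown above; a minimal version might look like the following (the model name and template text here are illustrative assumptions):

```
---
model: openai/gpt-4o
---
Say hello to {{name}}.
```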
Run it:

```sh
export OPENAI_API_KEY="your-key"
echo '{"name": "World"}' | ./runprompt hello.prompt
```

(You can get an OpenAI API key here: https://platform.openai.com/api-keys)
## dotprompt

Dotprompt is an executable prompt template format for GenAI. A `.prompt` file contains both the prompt template and its metadata (model, schema, config) in a single file.

runprompt is a minimal, single-file Python implementation with no dependencies.
## Examples

In addition to the examples below, see the `tests` folder for more example `.prompt` files.
Summarize text from stdin:

```sh
cat article.txt | ./runprompt summarize.prompt
```

The special `{{STDIN}}` variable always contains the raw stdin as a string.
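The `summarize.prompt` file itself is not shown here; a plausible version using `{{STDIN}}` might be (the model choice and wording are assumptions):

```
---
model: openai/gpt-4o
---
Summarize the following article in three sentences:

{{STDIN}}
```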
Extract structured data using an output schema:

```sh
echo "John is a 30 year old teacher" | ./runprompt extract.prompt
# {"name": "John", "age": 30, "occupation": "teacher"}
```

The schema uses Picoschema format. Fields ending with `?` are optional. The format is `field: type, description`.
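An `extract.prompt` producing output like the above might look like this (a sketch only; the field descriptions and model are assumptions, with the schema written in Picoschema format):

```
---
model: openai/gpt-4o
output:
  schema:
    name: string, the person's name
    age: integer, age in years
    occupation?: string, their job
---
Extract the person's details from this text:

{{STDIN}}
```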
Pipe structured output between prompts:

```sh
echo "John is 30" | ./runprompt extract.prompt | ./runprompt generate-bio.prompt
```

The JSON output from the first prompt becomes template variables in the second.
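For illustration, a `generate-bio.prompt` could consume those variables directly (a hypothetical file; the model is an assumption):

```
---
model: openai/gpt-4o
---
Write a short one-paragraph bio for {{name}}, age {{age}}.
```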
Make `.prompt` files directly executable with a shebang:

```sh
chmod +x hello.prompt
echo '{"name": "World"}' | ./hello.prompt
```

Note: `runprompt` must be in your `PATH`, or use a relative/absolute path in the shebang (e.g. `#!/usr/bin/env ./runprompt`).
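Concretely, the shebang goes on the first line of the file, before the frontmatter (the contents below are illustrative, assuming runprompt skips the shebang line when parsing):

```
#!/usr/bin/env runprompt
---
model: openai/gpt-4o
---
Hello, {{name}}!
```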
Override any frontmatter value from the command line:

```sh
./runprompt --model anthropic/claude-haiku-4-20250514 hello.prompt
./runprompt --name "Alice" hello.prompt
```

Templates use Handlebars syntax. Supported features:
- Variable interpolation: `{{variableName}}`, `{{object.property}}`
- Comments: `{{! this is a comment }}`
- Iteration: `{{#each items}}...{{/each}}` with `@index`, `@first`, `@last`, `@key`
- Sections: `{{#key}}...{{/key}}` (renders if truthy)
- Inverted sections: `{{^key}}...{{/key}}` (renders if falsy)
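As a sketch, a template combining these features might look like the following (the variable names are illustrative, and resolving `{{name}}` to each item's property inside `{{#each}}` is assumed to follow standard Handlebars behavior):

```
{{! list each user with their position }}
{{#each users}}
{{@index}}. {{name}}
{{/each}}

{{#admin}}You have admin access.{{/admin}}
{{^admin}}You have standard access.{{/admin}}
```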
## Configuration

Set API keys for your providers:

```sh
export ANTHROPIC_API_KEY="..."  # https://console.anthropic.com/settings/keys
export OPENAI_API_KEY="..."     # https://platform.openai.com/api-keys
export GOOGLE_API_KEY="..."     # https://aistudio.google.com/app/apikey
export OPENROUTER_API_KEY="..." # https://openrouter.ai/settings/keys
```

Override any frontmatter value via environment variables prefixed with `RUNPROMPT_`:
```sh
export RUNPROMPT_MODEL="anthropic/claude-haiku-4-20250514"
./runprompt hello.prompt
```

This is useful for setting defaults across multiple prompt runs.
Use `-v` to see request/response details:

```sh
./runprompt -v hello.prompt
```

## Providers

Models are specified as `provider/model-name`:
| Provider | Model format | API key |
|---|---|---|
| Anthropic | `anthropic/claude-sonnet-4-20250514` | [Get key](https://console.anthropic.com/settings/keys) |
| OpenAI | `openai/gpt-4o` | [Get key](https://platform.openai.com/api-keys) |
| Google AI | `googleai/gemini-1.5-pro` | [Get key](https://aistudio.google.com/app/apikey) |
| OpenRouter | `openrouter/anthropic/claude-sonnet-4-20250514` | [Get key](https://openrouter.ai/settings/keys) |
OpenRouter provides access to models from many providers (Anthropic, Google, Meta, etc.) through a single API key.
## Spec compliance

This is a minimal implementation of the Dotprompt specification. Not yet supported:

- Multi-message prompts (`{{role}}`, `{{history}}`)
- Conditionals (`{{#if}}`, `{{#unless}}`, `{{else}}`)
- Helpers (`{{json}}`, `{{media}}`, `{{section}}`)
- Model config (`temperature`, `maxOutputTokens`, etc.)
- Partials (`{{>partialName}}`)
- Nested Picoschema (objects, arrays of objects, enums)
See TODO.md for the full roadmap.