White Paper v1.0.22
Author: Rogério Figurelli
Date: May 8, 2025
Prompy proposes a minimalist, declarative language designed to turn compact task descriptions into complete Python scripts using large language models (LLMs). Unlike traditional code generators or domain-specific languages (DSLs), Prompy reduces the user’s input to a short, structured YAML-like file that can be directly interpreted by prompt-based systems to yield working, self-documenting code. This architecture enables seamless automation and empowers both developers and non-programmers to generate functional software with near-zero boilerplate.
To illustrate the simplicity and power of Prompy, consider a minimal example: the canonical "Hello, World!" script. Using Prompy, this script can be defined with just two lines:
agent: Hello World
task: Print 'Hello, World!' to the console
This .prompy file generates the following natural language prompt:
Create a Python script named "Hello World" that prints 'Hello, World!' to the console.
And from that, a complete and executable Python file:
"""
Hello World Script
Author: Generated by Prompy
Date: May 8, 2025
"""
print('Hello, World!')

This demonstrates how even the most basic scripts can be defined declaratively and generated reliably, establishing a baseline for more complex use cases.
The rise of generative AI and large language models has made it possible to create software through natural language. However, existing tools often demand verbose inputs, proprietary syntax, or code scaffolding. Prompy aims to solve this by introducing an archetype: a structured micro-language for expressing intent with the fewest possible words, and converting it to code through prompt-based generation.
Users frequently need custom automation scripts (e.g., scrapers, notifiers, bots) but:
- Lack the technical skills to write or scaffold them.
- Spend too much time crafting precise prompts.
- Cannot easily reuse or version these LLM-based interactions.
What if we had a micro-language that:
- Captures intent concisely?
- Converts to prompts with clear structure?
- Outputs reliable Python scripts?
Prompy also introduces the concept of optional inputs, allowing users to define runtime or configuration parameters — such as credentials, API keys, or URLs — directly in the .prompy file. These values are injected into the generated code in a secure and structured manner. This helps avoid hardcoded secrets and keeps code reusable across environments.
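As a minimal sketch of how such injection might look in a generated script (the config.json file name, the PROMPY_CONFIG environment variable, and the field names below are illustrative assumptions, not a fixed Prompy convention), the values declared under `inputs:` could be written to an external configuration file that the script reads at runtime instead of embedding them as literals:

```python
import json
import os

# Illustrative assumption: inputs from the .prompy file live in an external
# config.json, whose path can be overridden per environment.
CONFIG_PATH = os.environ.get("PROMPY_CONFIG", "config.json")

def load_inputs(path: str = CONFIG_PATH) -> dict:
    """Return the runtime values declared under `inputs:` in the .prompy file."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)

# Example usage inside a generated script:
# config = load_inputs()
# api_key = config.get("api_key", "")   # credentials stay out of the source code
```

This keeps secrets out of version control and lets the same generated script run unchanged across environments.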
Prompy is a specification format (.prompy) that:
- Defines a task in as little as two fields: agent and task.
- Supports optional metadata like model and author.
- Converts into one or more prompts targeting LLMs.
- Outputs .py scripts that are ready to run, with dependencies, logic, and documentation.
- Minimalist Design: A .prompy file should be understandable at a glance.
- Prompt-Driven Generation: Code is derived from structured, deterministic prompts.
- LLM Alignment: Prompy is designed to be optimally consumable by AI.
- Declarative Intent: The user expresses what they want, not how to build it.
Prompy introduces several key innovations that distinguish it from existing prompt-based and DSL-based code generation tools:
1. Minimal Input Surface
Prompy uses an ultra-compact input format. With as few as two lines (agent and task), it generates usable Python code. This lowers the barrier to entry for both technical and non-technical users.
2. Deterministic Prompt Generation Rather than leaving prompt crafting to the user, Prompy standardizes prompt generation. It maps structured fields to natural language prompts that are consistent, repeatable, and optimized for LLM consumption.
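A minimal sketch of this mapping, assuming the flat field names used in the examples in this paper (the exact template wording is an assumption, not the canonical Prompy phrasing):

```python
def build_prompt(spec: dict) -> str:
    """Map structured .prompy fields to a consistent natural language prompt."""
    prompt = (f'Create a Python script named "{spec["agent"]}". '
              f'The task of this agent is to {spec["task"]}.')
    if spec.get("model"):
        prompt += f' Structure the script according to the {spec["model"]} model.'
    if spec.get("inputs"):
        prompt += ' Load configuration values from an external JSON file.'
    return prompt

print(build_prompt({"agent": "Hello World",
                    "task": "print 'Hello, World!' to the console"}))
# Create a Python script named "Hello World". The task of this agent is to
# print 'Hello, World!' to the console.
```

Because the mapping is a pure function of the spec, the same .prompy file always yields the same prompt, which is what makes the generation repeatable.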
3. Model-Aware Design Prompy supports logical modeling like MCP (Perception–Decision–Action) natively. Users can specify high-level cognitive or functional models, which influence how prompts are framed and how code is structured.
4. YAML-Like Specification Without DSL Overhead
Unlike tools like DSL Copilot or Metaphor, Prompy doesn’t require users to define grammars or learn new programming abstractions. The .prompy file is human-readable and immediately intuitive.
5. Full Code Output, Not Fragments
Where many systems output partial suggestions, Prompy’s goal is full .py files: complete, executable, and self-documenting scripts ready for use or deployment.
6. Focus on Practical Automation While some tools lean toward prompt experimentation, Prompy targets repeatable task automation: bots, data scrapers, alerts, agents. Its architecture is built for reliability, not just creativity.
7. Extensibility Through Metadata
Prompy can grow organically. Fields like author, model, schedule, and output enable richer code generation scenarios without compromising its minimalist core.
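For example, a richer .prompy spec might combine these fields as follows (the semantics of schedule and output shown here are assumptions inferred from the field names, not defined behavior):

```yaml
agent: Daily Report
task: Summarize yesterday's sales data and save the result as a CSV file
model: MCP
author: Rogério Figurelli
schedule: daily at 08:00        # assumed: hint for cron-style scheduling
output: reports/daily_sales.py  # assumed: target path for the generated script
```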
These innovations make Prompy not just an LLM companion, but a candidate standard for natural-language-based programming interfaces.
| Solution | Prompt-Based | DSL Required | Output Scope | Code Generation | Key Differentiator |
|---|---|---|---|---|---|
| Prompy | ✅ | ❌ | Full Python script | ✅ | Ultra-minimal, structured prompting |
| Impromptu | ✅ | ✅ | Prompt fragments | ❌ | Modular prompt management |
| DSL Copilot | ✅ | ✅ | Full code | ✅ | Requires DSL definition |
| Metaphor | ✅ | ✅ | Maintenance hints | ❌ | Code refactoring and logic description |
| Vibe Coding | ✅ | ❌ | SQL only | ✅ | Task-specific to SQL |
This section describes the technical architecture of the Prompy system, outlining how data flows through its layers—from input acquisition to contextual reasoning and final output delivery.
The architecture is modular, composed of:
- Input Layer: Accepts .prompy files with task + optional metadata.
- Reasoning Layer:
  - Converts structured data into natural language prompts.
  - Chains and optimizes the prompt for LLMs.
  - Optionally uses templates based on known task patterns.
- Output Layer: Receives LLM responses and formats them into .py files.
- Application Interfaces: CLI tools, schedulers, or GUI integrations for end-users.
Prompy Architecture
1. Inputs
└── .prompy files (YAML-style specs)
2. Input Layer
└── Minimal parser that extracts fields (agent, task, etc.)
3. Reasoning Layer
├── Prompt generator engine
├── Task template matcher (e.g., MCP model)
├── LLM interface (OpenAI API, etc.)
4. Output Layer
├── Generated .py script with logic + comments
└── Metadata header (author, date)
5. Application Interfaces
├── CLI (e.g., prompy run task.prompy)
├── Web UI for rapid prototyping
└── Integration with schedulers (e.g., cron)
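A compact sketch of this flow, assuming flat `key: value` fields and a placeholder LLM call (call_llm is not a real API; it stands in for whichever provider the Reasoning Layer is wired to):

```python
from pathlib import Path

def parse_prompy(path: str) -> dict:
    """Input Layer: extract top-level fields (agent, task, model, ...) from a .prompy file."""
    spec = {}
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        if ":" in line and not line.startswith((" ", "#")):
            key, _, value = line.partition(":")
            spec[key.strip()] = value.strip()
    return spec

def build_prompt(spec: dict) -> str:
    """Reasoning Layer: convert structured fields into a deterministic prompt."""
    return (f'Create a Python script named "{spec["agent"]}". '
            f'The task of this agent is to {spec["task"]}.')

def call_llm(prompt: str) -> str:
    """Placeholder for the LLM interface (OpenAI API, local model, etc.)."""
    raise NotImplementedError("Connect this to an LLM provider.")

def run(path: str) -> Path:
    """Output Layer: write the LLM response to a ready-to-run .py file."""
    spec = parse_prompy(path)
    code = call_llm(build_prompt(spec))
    out_file = Path(path).with_suffix(".py")
    out_file.write_text(code, encoding="utf-8")
    return out_file

if __name__ == "__main__":
    import sys
    print(run(sys.argv[1]))  # CLI usage, e.g.: python prompy_cli.py task.prompy
```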
Problem and Context
Many consumers and e-commerce analysts need to track the price changes of specific products across multiple websites. Doing this manually is time-consuming, and traditional scraping scripts are difficult to maintain or generate quickly for non-technical users. This is where Prompy enables high-leverage automation: writing one short .prompy file generates a complete, working Python agent with email alerts.
.prompy Input
agent: Price Monitor
task: Track product prices from websites and send an email if they change
model: MCP
inputs:
  author: Rogério Figurelli
  email:
    sender: your@email.com
    receiver: receiver@email.com
    smtp_server: smtp.yourdomain.com
    smtp_port: 587
    username: your@email.com
    password: your_password
  products:
    Product A: https://example.com/productA
    Product B: https://example.com/productB
    Product C: https://example.com/productC
Prompt Generated by Prompy
Create a Python script called "Price Monitor" based on the MCP model (Perception–Decision–Action).
The task of this agent is to track product prices from websites and send an email if any prices change.
Use libraries such as requests, BeautifulSoup, pandas, and smtplib.
The script should detect changes compared to a previously saved file and sort the output by price.
Load email configuration and author name from an external JSON file.
Include the author name dynamically in the docstring.
Generated Python Script
"""
Product Price Monitoring Agent using MCP (Perception–Decision–Action)
This Python agent follows the MCP model (Perception–Decision–Action) and is designed to:
1. PERCEPTION: Scrape product prices from multiple web pages.
2. DECISION: Detect any change in the prices compared to the previous state.
3. ACTION: Build a ranked list of the products by price and send an email notification
whenever a change is detected.
Requirements:
- requests
- beautifulsoup4
- pandas
- smtplib (built-in)
- A proper HTML selector for each product page
Note: Adjust the CSS selectors in `get_price()` according to the actual structure of the websites.
Author: " + AUTHOR
Date: May 8, 2025
"""
import requests
from bs4 import BeautifulSoup
import pandas as pd
import smtplib
from email.message import EmailMessage
import hashlib
import os
import json
HEADERS = {'User-Agent': 'Mozilla/5.0'}
PREVIOUS_DATA_FILE = 'previous_prices.json'

# Load runtime configuration (email settings, author, product URLs) from config.json
with open('config.json', 'r') as config_file:
    config = json.load(config_file)

EMAIL_CONFIG = config.get('email', {})
AUTHOR = config.get('author', 'Unknown')
PRODUCT_URLS = config.get('products', {})
def get_price(url):
response = requests.get(url, headers=HEADERS)
soup = BeautifulSoup(response.content, 'html.parser')
price_tag = soup.select_one('.price') or soup.select_one('.product-price')
if price_tag:
text = price_tag.get_text().strip().replace('$', '').replace(',', '.')
return float(''.join(c for c in text if c.isdigit() or c == '.'))
return None
def has_changed(new_data):
if not os.path.exists(PREVIOUS_DATA_FILE):
return True
with open(PREVIOUS_DATA_FILE, 'r') as f:
old_data = json.load(f)
return hashlib.md5(json.dumps(new_data, sort_keys=True).encode()).hexdigest() != \
hashlib.md5(json.dumps(old_data, sort_keys=True).encode()).hexdigest()
def save_current_data(data):
with open(PREVIOUS_DATA_FILE, 'w') as f:
json.dump(data, f, indent=2)
def send_email(ranked_df):
msg = EmailMessage()
msg['Subject'] = 'Updated Product Price Ranking'
msg['From'] = EMAIL_CONFIG['sender']
msg['To'] = EMAIL_CONFIG['receiver']
content = ranked_df.to_string(index=False)
    msg.set_content(f"Updated product price ranking:\n\n{content}")
with smtplib.SMTP(EMAIL_CONFIG['smtp_server'], EMAIL_CONFIG['smtp_port']) as server:
server.starttls()
server.login(EMAIL_CONFIG['username'], EMAIL_CONFIG['password'])
server.send_message(msg)
def price_agent():
prices = {}
for name, url in PRODUCT_URLS.items():
price = get_price(url)
if price:
prices[name] = price
if not prices:
print("No prices found.")
return
if has_changed(prices):
print("Changes detected, generating ranking and sending email...")
save_current_data(prices)
df = pd.DataFrame(prices.items(), columns=["Product", "Price"]).sort_values(by="Price")
send_email(df)
else:
print("No changes in prices.")
if __name__ == "__main__":
    price_agent()

- Auto-generation of microservices from .prompy specs.
- Education tools for teaching code via natural prompts.
- Autonomous agents with .prompy + memory/state feedback.
- Real-time agents that continuously regenerate logic from updated .prompy files.
Creative Commons Attribution 4.0 International (CC BY 4.0) © 2025 Rogério Figurelli. This is a conceptual framework provided “as is” without warranty.