systemgen

CLI tool that builds reusable AI context from your stack and turns it into a single prompt you can send to an assistant.

Building with AI works better when the model knows your framework, database, folder layout, and conventions. Re-typing that context for every task wastes time and drifts from how your repo actually works. systemgen captures those choices once as a named profile, then merges them with whatever you want to build next. Profiles can live as TOML (terminal tools) or JSON (IDE assistants), or both.

Install

```
pip install systemgen
```

Quick start

```
systemgen init
# Answer prompts for stack, structure, and conventions; pick TOML, JSON, or both; name the profile.

systemgen gen --profile myapp
# If you omit --profile you'll be prompted to pick one; choose how to supply the task prompt, then send the result where you need it.
```

Commands

| Command | What it does |
| --- | --- |
| `systemgen init` | Create a new profile from the full question flow. |
| `systemgen gen` | Build a prompt from a saved profile (`--profile <name>` optional). |
| `systemgen update` | Show current profile values, then choose a full re-run of all questions, field-by-field edits, or cancel (`--profile <name>` optional); saves under the same name. |
| `systemgen profile list` | List saved profiles and optional follow-up actions. |
| `systemgen profile delete` | Remove a profile (TOML, JSON, or both). |
| `systemgen profile library` | Import from built-in templates or start from scratch. |

When you have at least one saved profile, running systemgen with no subcommand opens the interactive menu.

Profile formats

Profiles are plain data. TOML suits terminal workflows (e.g. Claude Code, Ollama); JSON suits IDE tooling (e.g. Cursor, Windsurf). Example shape (FastAPI-style), shortened:

TOML (~/.systemgen/profiles/<name>.toml):

```toml
[project]
type = "Backend API"
framework = "FastAPI"
language = "python"
language_version = "3.11"

[database]
engine = "PostgreSQL"
orm = "sqlalchemy"
mode = "async"

[auth]
method = "JWT"
```

JSON (~/.systemgen/profiles/<name>.json):

```json
{
  "project": {
    "type": "Backend API",
    "framework": "FastAPI",
    "language": "python",
    "language_version": "3.11"
  },
  "database": {
    "engine": "PostgreSQL",
    "orm": "sqlalchemy",
    "mode": "async"
  },
  "auth": {
    "method": "JWT"
  }
}
```

Prompt input options (gen)

After you pick a profile, you choose how to add the task:

| Option | When to use it |
| --- | --- |
| Type it now | You know the request and want to enter it in the terminal. |
| Load from a text file | The spec lives in a `.txt` or `.md` file (paths are re-checked until the file exists). |
| Skip — I'll write the prompt in my AI tool | You only want the profile context; you will add the task inside the assistant. |

If you skip the task text, the next step only offers copy, save to file, or exit (no pipe to Claude/Ollama).

Pipe to (gen)

If you supplied a task prompt, you can send the combined output to:

| Destination | What it does |
| --- | --- |
| Claude (terminal) | Starts the Claude Code CLI with your prompt as the opening message so you can keep chatting. |
| Ollama | Runs `ollama run <model>` with the full text as input (the model list comes from `ollama list` when available). |
| Copy to clipboard | Copies the full prompt for pasting anywhere. |
| Save to file | Writes the prompt to a path you enter. |
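Mechanically, the Ollama destination comes down to piping the combined text into a subprocess's stdin. A sketch of that pattern, using `cat` as a stand-in so it runs anywhere; systemgen would invoke `["ollama", "run", model]` instead:

```python
import subprocess

full_prompt = "project.framework: FastAPI\n\nTask: add a /health endpoint"

# Stand-in for ["ollama", "run", "<model>"]: any command that reads stdin.
result = subprocess.run(
    ["cat"], input=full_prompt, capture_output=True, text=True, check=True
)
print(result.stdout)
```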

Profiles are saved to `~/.systemgen/profiles/`.

Languages and stacks

systemgen is not limited to Python. You pick language (e.g. TypeScript, Dart, Go), framework, database, auth, validation, and folder layout from prompts or templates—use it for APIs, frontends, mobile, or CLIs on whatever stack you use.
