Vision
Wave today is a CLI tool that runs AI agents in sequence — give it a task, it breaks it into steps, assigns each step to a specialized AI persona (planner, implementer, reviewer), runs them one after another in isolated workspaces, and validates their output against contracts. Single model (Claude), single direction (forward), single user (you at the terminal).
Wave after this epic is an orchestration engine where AI agents can loop, branch, judge each other's work, use different models for different jobs, and pause for human decisions — while a server keeps it running 24/7 for the whole team. A fix fails tests? It loops back automatically. A plan needs approval? It waits for you. A simple task? Use the cheap model. A hard one? Use the expensive one. A run went sideways at step 5? Fork from step 4 and try again without losing work. Every run generates a retrospective telling you what went well and what didn't.
The one-liner: Wave goes from "linear pipeline runner" to "self-correcting development system."
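To make the loop-back behavior concrete, here is a minimal sketch of the control flow described above: each step produces an artifact, a contract validates it, and a failed contract sends execution back a step instead of aborting. All names (`Step`, `execute`, `max_retries`) are hypothetical illustrations, not Wave's actual API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Step:
    name: str
    run: Callable[[], str]            # produces an artifact (e.g. a diff)
    contract: Callable[[str], bool]   # validates it (e.g. runs the test suite)
    max_retries: int = 2

def execute(steps: list[Step]) -> None:
    """Run steps forward; on contract failure, loop back one step and retry."""
    i = 0
    attempts: dict[str, int] = {}
    while i < len(steps):
        step = steps[i]
        artifact = step.run()
        if step.contract(artifact):
            i += 1                    # contract satisfied: move forward
            continue
        attempts[step.name] = attempts.get(step.name, 0) + 1
        if attempts[step.name] > step.max_retries:
            raise RuntimeError(f"step {step.name!r} exhausted retries")
        # Contract failed: loop back to the previous step
        # (e.g. re-run the implementer after a failed review).
        i = max(i - 1, 0)
```

The same loop is where a human gate would slot in: instead of returning a boolean, a contract could block on an approval before letting `i` advance.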
Context
Based on a deep competitive analysis of Fabro (#569). Fabro is a Rust-based workflow engine that defines workflows as Graphviz DOT graphs, with a built-in agent runtime, seven LLM providers, CSS-like model stylesheets, graph loops, human gates, automatic retrospectives, and a server mode. Solo founder (Bryan Helmkamp / qlty.sh), MIT license, 563 stars in 12 days.
Child Issues
Critical
High Priority
Medium Priority
Additive
Wave's Enduring USPs (Preserve These)
- Named persona system with per-persona permissions
- Declarative contract validation (JSON schema, TypeScript, test suites)
- Per-step workspace isolation with mount modes
- Navigator-first architecture (read-only analysis before implementation)
- Forge-agnostic design (GitHub, GitLab, Gitea, Bitbucket)
- Fresh memory by default at step boundaries
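Two of these USPs can be sketched in a few lines: per-persona permissions as an allow-list of tools (with the navigator restricted to read-only tools), and a declarative contract as a predicate over a step's output. This is an illustrative sketch only; the class and function names are invented and Wave's real configuration surface may differ.

```python
import json
from dataclasses import dataclass

@dataclass(frozen=True)
class Persona:
    name: str
    allowed_tools: frozenset[str]     # per-persona permissions

    def can(self, tool: str) -> bool:
        return tool in self.allowed_tools

# Navigator-first: the navigator gets read-only tools, so analysis
# happens before any persona is allowed to write files.
navigator = Persona("navigator", frozenset({"read_file", "search"}))
implementer = Persona("implementer",
                      frozenset({"read_file", "write_file", "run_tests"}))

def json_contract(required_keys: set[str]):
    """Declarative contract: output must be JSON with the given keys."""
    def check(artifact: str) -> bool:
        try:
            data = json.loads(artifact)
        except ValueError:
            return False
        return isinstance(data, dict) and required_keys <= data.keys()
    return check
```

A contract built this way stays data-driven (the required keys are the whole spec), which is what makes it checkable without trusting the agent's own report.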