Welcome to your new Mastra project! We're excited to see what you'll build.
We don't cache answers. We cache understanding — and replay exploration.
The Learning Sandwich is an ABC-driven agentic understanding system for local repos: A (framing layer) defines success, B (Explorer agent) inspects the repo, C (Verifier) deterministically gates persistence. The system loops until the Candidate passes verification, then persists a canonical context with provenance (A hash, C hash, repo surface hash) for cache invalidation.
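The A→B→C loop described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the type shapes, the `verify` heuristic, and the function names are assumptions; the real Explorer and Verifier are Mastra agents and deterministic checks.

```typescript
// Minimal sketch of the A/B/C loop. The real agents call an LLM; these stubs don't.
type Frame = { criteria: string[] };                       // A: what "understood" means
type Candidate = { summary: string; citations: string[] }; // B: the Explorer's output
type Verdict = { pass: boolean; reasons: string[] };       // C: the deterministic gate

// Hypothetical verifier: pass only if every criterion is covered and cited.
function verify(frame: Frame, c: Candidate): Verdict {
  const missing = frame.criteria.filter((k) => !c.summary.includes(k));
  return { pass: missing.length === 0 && c.citations.length > 0, reasons: missing };
}

// Loop until the Candidate passes verification or the iteration budget runs out.
function runLoop(
  frame: Frame,
  explore: (feedback: string[]) => Candidate,
  maxIterations = 5,
): Candidate | null {
  let feedback: string[] = [];
  for (let i = 0; i < maxIterations; i++) {
    const candidate = explore(feedback);      // B inspects the repo
    const verdict = verify(frame, candidate); // C gates persistence
    if (verdict.pass) return candidate;       // would persist the canonical context
    feedback = verdict.reasons;               // re-explore with what's missing
  }
  return null; // verification never passed within the budget
}
```

The key property is that C is deterministic: the loop terminates on a reproducible pass/fail signal rather than on the model's self-assessment.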
Run for a local repo:

```sh
bun run index.ts -- --repo /path/to/repo [--template protocol] [--max-iterations N] [--explorer-max-steps N] [--framer-max-steps N]
```

Templates: Use `--template protocol` (or `--protocol`) for Go/Ethereum-style protocol repos (e.g. go-ethereum). The protocol template adds acceptance criteria (architecture map, state transition, EVM, consensus, sync, txpool, RPC, storage, config), a guided exploration plan, and deterministic C checks (export-backed API, subsystem citations, entrypoint trace).
Limits (tunable): Each agent gets a maximum number of steps (tool-call rounds) before it must output JSON. Defaults: Framer 25, Explorer 50. The explore–verify loop runs up to 5 times by default (--max-iterations). For very large repos (e.g. go-ethereum), increase Explorer steps so the model has room to list/read and then emit the candidate: e.g. --explorer-max-steps 80.
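Putting those flags together, a large-repo run might look like this (an illustrative invocation assembled from the flags above; the `../go-ethereum` path is just an example):

```shell
# Protocol template plus a bigger Explorer step budget than the default 50,
# so the model has room to list/read files and still emit the candidate JSON.
bun run index.ts -- --repo ../go-ethereum --template protocol --explorer-max-steps 80
```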
View a saved context (after a successful run):

```sh
bun run index.ts -- view <contextId>              # text to stdout
bun run index.ts -- view --repo /path [--latest]  # show latest for repo
bun run index.ts -- view <contextId> --md         # write Markdown to .fingerling/reports/
bun run index.ts -- view <contextId> --html       # write HTML report to .fingerling/reports/
```

Canonical contexts are stored under `.fingerling/` (local-first; no embeddings or vector DB).
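The provenance hashes mentioned earlier (A hash, C hash, repo surface hash) drive cache invalidation: if any of them changes, the saved context is stale. A sketch of that idea, assuming a sorted path+content digest for the repo surface (the field names and hashing scheme here are illustrative, not the project's actual schema):

```typescript
import { createHash } from "node:crypto";

function sha256(s: string): string {
  return createHash("sha256").update(s).digest("hex");
}

// Hash file paths and contents in sorted order so the digest is deterministic
// regardless of filesystem enumeration order.
function surfaceHash(files: Record<string, string>): string {
  const entries = Object.keys(files)
    .sort()
    .map((p) => `${p}\0${sha256(files[p])}`);
  return sha256(entries.join("\n"));
}

type Provenance = { aHash: string; cHash: string; surface: string };

// A cached context is stale if the frame (A), the checks (C),
// or the repo surface changed since it was persisted.
function isStale(saved: Provenance, current: Provenance): boolean {
  return (
    saved.aHash !== current.aHash ||
    saved.cHash !== current.cHash ||
    saved.surface !== current.surface
  );
}
```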
Refine-node (single ontology node) after a macro run: Framer and Explorer use Anthropic (default claude-haiku-4-5). If the default model fails verification or returns bad output, the workflow tries claude-sonnet-4-5, then claude-opus-4-5. Set ANTHROPIC_API_KEY (in your environment or in a .env file) to run:

```sh
bun run index.ts -- refine-node --repo /path/to/repo --node-id <nodeId>
bun run test:consensus   # refine consensus node (repo defaults to ../go-ethereum)
```

Model escalation is logged to .fingerling/refine-model-attempts.jsonl and summarized at the end of refine-all.
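The escalation ladder can be sketched like this. The model names come from the README; the retry and logging plumbing is illustrative, not the project's actual implementation:

```typescript
// Try each model in order until one passes verification.
const LADDER = ["claude-haiku-4-5", "claude-sonnet-4-5", "claude-opus-4-5"];

type Attempt = { model: string; ok: boolean };

// Record every attempt, mirroring the append-only
// .fingerling/refine-model-attempts.jsonl log.
function escalate(
  run: (model: string) => boolean,
): { model: string | null; log: Attempt[] } {
  const log: Attempt[] = [];
  for (const model of LADDER) {
    const ok = run(model);
    log.push({ model, ok });
    if (ok) return { model, log }; // first passing model wins
  }
  return { model: null, log }; // every model failed verification
}
```

Escalating only on verification failure keeps the common case on the cheapest model while still giving hard nodes a stronger one.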
Start the development server:

```sh
bun run dev
```

Open http://localhost:4111 in your browser to access Mastra Studio. It provides an interactive UI for building and testing your agents, along with a REST API that exposes your Mastra application as a local service. This lets you start building without worrying about integration right away.
You can start editing files inside the src/mastra directory. The development server will automatically reload whenever you make changes.
To learn more about Mastra, visit our documentation. Your bootstrapped project includes example code for agents, tools, workflows, scorers, and observability.
If you're new to AI agents, check out our course and YouTube videos. You can also join our Discord community to get help and share your projects.
Mastra Cloud gives you a serverless agent environment with atomic deployments. Access your agents from anywhere and monitor performance. Make sure they don't go off the rails with evals and tracing.
Check out the deployment guide for more details.