Conversation
Add preflight-validator and file-bloat-analyzer inline sub-agents (claude-haiku-4.5) to reduce token usage and free turn budget for actual editing work.

- Add `features.inline-agents: true` to frontmatter
- Replace Phase 0 multi-step pre-flight with a preflight-validator invocation
- Replace Section 4 manual analysis with a file-bloat-analyzer invocation
- Add preflight-validator agent block with 3 bash checks returning pass/fail JSON
- Add file-bloat-analyzer agent block returning a structured bloat-inventory JSON
- Recompile lock file

Agent-Logs-Url: https://github.com/github/gh-aw/sessions/6ed6679b-167a-45aa-a6df-335bd9ac3aaa
Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
@copilot move the common precomputation into a step in the action too.
…ecomputation

Replace the preflight-validator sub-agent with a deterministic pre-agent-steps block that runs the three bash checks (dir exists, editable file count, cache state) before the AI engine starts. Writes `preflight.json` and `candidate-files.txt` to `/tmp/gh-aw/agent/` so the main agent reads pre-computed data instead of running any AI inference. Only file-bloat-analyzer (per-file, runtime selection) remains as an inline sub-agent.

Agent-Logs-Url: https://github.com/github/gh-aw/sessions/c9d6ac71-d6c0-4ad1-98dc-4459bd97a66e
Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
Done in d3e5fd7.

Warning: firewall rules blocked me from connecting to one or more addresses. I tried to connect to the following addresses, but was blocked by firewall rules:
If you need me to access, download, or install something from one of these locations, you can either:
Pull request overview
Updates the unbloat-docs agentic workflow to reduce turn count by moving preflight data gathering into deterministic pre-agent steps and extracting per-file “bloat inventory” into a Haiku inline sub-agent.
Changes:
- Enable inline sub-agents via `features.inline-agents: true`.
- Add `pre-agent-steps` that writes `/tmp/gh-aw/agent/preflight.json` and `/tmp/gh-aw/agent/candidate-files.txt` before the main agent runs.
- Replace manual file-analysis instructions with a `file-bloat-analyzer` inline agent returning structured JSON.
| File | Description |
|---|---|
| .github/workflows/unbloat-docs.md | Adds deterministic preflight and an inline Haiku sub-agent to offload extractive analysis. |
| .github/workflows/unbloat-docs.lock.yml | Regenerated compiled workflow/lock with inline-agent restoration and the preflight step. |
Copilot's findings
- Files reviewed: 2/2 changed files
- Comments generated: 3
```shell
TOTAL=$(find docs/src/content/docs -path '*/blog*' -prune \
  -o -name '*.md' -type f ! -name 'frontmatter-full.md' -print \
  | xargs grep -rL 'disable-agentic-editing: true' 2>/dev/null \
  | wc -l)
```
```shell
CLEANED=$(awk -v cutoff="$RECENT_CUTOFF" \
  'NF>0 && $1>=cutoff{count++} END{print count+0}' \
  /tmp/gh-aw/cache-memory/cleaned-files.txt 2>/dev/null || echo "0")
UNCLEANED=$(( TOTAL - CLEANED ))
if [ "$UNCLEANED" -le 0 ]; then
  echo '{"pass":false,"reason":"Pre-flight check: all eligible documentation files were cleaned recently — nothing to do this run."}' \
    > /tmp/gh-aw/agent/preflight.json
  exit 0
fi
```
```yaml
        GH_AW_SUB_AGENT_EXT: ".md"
      run: bash "${RUNNER_TEMP}/gh-aw/actions/restore_inline_sub_agents.sh"
    - name: Pre-flight checks
      run: |
        mkdir -p /tmp/gh-aw/agent

        # Check 1: verify docs directory structure exists
        DIR_COUNT=$(find docs/src/content/docs -maxdepth 1 -type d 2>/dev/null | wc -l)
        if [ "$DIR_COUNT" -eq 0 ]; then
          echo '{"pass":false,"reason":"Pre-flight failed: docs/src/content/docs directory not found — documentation structure is missing or repository is not set up correctly."}' \
            > /tmp/gh-aw/agent/preflight.json
          exit 0
        fi

        # Check 2: count editable markdown files
        TOTAL=$(find docs/src/content/docs -path '*/blog*' -prune \
          -o -name '*.md' -type f ! -name 'frontmatter-full.md' -print \
          | xargs grep -rL 'disable-agentic-editing: true' 2>/dev/null \
          | wc -l)
        if [ "$TOTAL" -eq 0 ]; then
          echo '{"pass":false,"reason":"Pre-flight failed: no editable markdown files found in docs/src/content/docs (all files may be protected or excluded)."}' \
            > /tmp/gh-aw/agent/preflight.json
          exit 0
        fi

        # Check 3: count uncleaned candidates (not cleaned in the past 7 days)
        RECENT_CUTOFF=$(date -d '7 days ago' '+%Y-%m-%d' 2>/dev/null \
          || date -v-7d '+%Y-%m-%d' 2>/dev/null \
          || echo "0000-00-00")
        CLEANED=$(awk -v cutoff="$RECENT_CUTOFF" \
          'NF>0 && $1>=cutoff{count++} END{print count+0}' \
          /tmp/gh-aw/cache-memory/cleaned-files.txt 2>/dev/null || echo "0")
        UNCLEANED=$(( TOTAL - CLEANED ))
        if [ "$UNCLEANED" -le 0 ]; then
          echo '{"pass":false,"reason":"Pre-flight check: all eligible documentation files were cleaned recently — nothing to do this run."}' \
            > /tmp/gh-aw/agent/preflight.json
          exit 0
        fi

        # All checks passed — write candidate file list and preflight result
        find docs/src/content/docs -path '*/blog*' -prune \
          -o -name '*.md' -type f ! -name 'frontmatter-full.md' -print \
          | xargs grep -rL 'disable-agentic-editing: true' 2>/dev/null \
          > /tmp/gh-aw/agent/candidate-files.txt
        printf '{"pass":true,"reason":"All pre-flight checks passed. %d uncleaned candidates available.","uncleaned":%d,"total":%d}\n' \
          "$UNCLEANED" "$UNCLEANED" "$TOTAL" \
          > /tmp/gh-aw/agent/preflight.json

        echo "Pre-flight passed: $UNCLEANED uncleaned candidates out of $TOTAL eligible files"
        echo "Candidate files written to /tmp/gh-aw/agent/candidate-files.txt"
```
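Every exit path in the step above writes `preflight.json`, so a consumer only needs to test the `pass` field. A minimal sketch (not from the PR) of how a later step or the agent prompt could gate on the pre-computed result; it uses `grep` against the compact JSON the step emits, so no JSON tooling is assumed:

```shell
# Gate on the pre-computed preflight result. The file path matches the PR;
# the fallback JSON written here is a hypothetical safety net for a missing file.
PREFLIGHT=/tmp/gh-aw/agent/preflight.json
mkdir -p /tmp/gh-aw/agent
[ -f "$PREFLIGHT" ] || echo '{"pass":false,"reason":"preflight.json missing"}' > "$PREFLIGHT"

if grep -q '"pass":true' "$PREFLIGHT"; then
  echo "Proceeding: candidates listed in /tmp/gh-aw/agent/candidate-files.txt"
else
  echo "Skipping run: $(cat "$PREFLIGHT")"
fi
```

The `grep -q '"pass":true'` match is safe here only because the step always emits compact JSON with no whitespace around the colon; a tool like `jq` would be the robust choice if the format were less controlled.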
The unbloat-docs workflow was hitting its 90-turn cap with zero write actions: roughly 50% of turns went to data gathering that can be offloaded to deterministic steps and sub-agents. This change replaces two heavyweight Sonnet phases with zero-inference precomputation and a targeted Haiku sub-agent.
Changes
- `features: inline-agents: true` added to frontmatter
- `pre-agent-steps:` block that runs the three checks (docs dir exists, editable file count, cache state) as pure bash before the AI engine starts. Results are written to `/tmp/gh-aw/agent/preflight.json` and `/tmp/gh-aw/agent/candidate-files.txt`; the agent prompt reads the pre-computed files, so zero AI inference is required.
- `file-bloat-analyzer` invocation; the agent returns a structured JSON inventory that the main model consumes in one turn

Pre-agent step: Pre-flight checks

Runs deterministically as a GitHub Actions step before the engine starts:
- verifies the `docs/src/content/docs` directory exists
- counts editable markdown files and uncleaned candidates
- on success, writes `preflight.json` + `candidate-files.txt` to `/tmp/gh-aw/agent/`
- on failure, writes `preflight.json` with `"pass": false` and a reason string

New agent block: `file-bloat-analyzer` (claude-haiku-4.5)

`file-bloat-analyzer` reads one doc file and returns `bullet_count`, `heavy_bullet_sections`, `duplicate_headings`, `repetitive_pattern_count`, `bloat_score`, `top_bloat_reason`. Purely extractive, a canonical Haiku task. Kept as a sub-agent since the file to analyze is only known at runtime, after the agent selects a candidate.

Estimated impact: ~18% token reduction (~1.3M effective tokens/run vs ~1.6M), 8–12 turns freed for actual editing work.
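To make the sub-agent's output concrete: two of its fields are simple enough to approximate deterministically. The sketch below is illustrative only (the real `file-bloat-analyzer` runs a Haiku model and also produces judgment-based fields like `bloat_score`); the field names match the PR description, and the sample markdown is invented:

```shell
# Deterministic approximation of bullet_count and duplicate_headings
# for a synthetic markdown file.
f=$(mktemp)
printf '%s\n' '# Intro' '- a' '- b' '## Setup' '- c' '## Setup' > "$f"

# Lines starting with "- " or "* " (optionally indented) are bullets.
bullet_count=$(grep -c '^[[:space:]]*[-*] ' "$f")

# Heading lines that occur more than once are duplicates.
duplicate_headings=$(grep '^#' "$f" | sort | uniq -d | wc -l)

printf '{"bullet_count":%d,"duplicate_headings":%d}\n' \
  "$bullet_count" "$duplicate_headings"
# prints {"bullet_count":3,"duplicate_headings":1}
rm -f "$f"
```

Fields like `heavy_bullet_sections` or `top_bloat_reason` require actual reading comprehension, which is why the PR keeps this as a model-backed sub-agent rather than another bash step.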