Problem
Today the skill blocks for up to 10 minutes while Copilot runs, and during that time Claude can do nothing else. The net effect: in many cases, delegation is slower than Claude doing the work directly.
Why it matters
This is the feature that flips the value proposition. If Claude can fire a handoff, return control to the user, and keep working on the main thread while the work runs on a cheaper, parallel vendor, the skill earns its weight. Without it, you're paying skill complexity for a synchronous vendor swap.
Approach
Two modes:
Mode 1: background invocation
- `/ghcp-handoff` with `--background` flag
- Fires Copilot via `Bun.spawn` (not spawnSync), detaches
- Writes PID + started-at to .meta.json
- Returns control immediately
Mode 2: status polling
- `/ghcp-handoff status [slug|pr]` checks the background job
- If still running: report elapsed + recent progress
- If finished: surface result (including rabbit-holes, usage, pr_number)
- If failed/timed out: surface stderr tail
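The status check can be sketched as below, assuming the same hypothetical directory layout (`.meta.json`, `result.json`, `stderr.log`) as the background-invocation mode. Liveness is checked with signal 0, which probes for process existence without sending anything:

```typescript
// Sketch: poll a background handoff. File names and result shape are
// assumptions matching the background-invocation sketch, not a fixed contract.
import { existsSync, readFileSync } from "node:fs";
import { join } from "node:path";

type HandoffStatus =
  | { state: "running"; elapsedMs: number }
  | { state: "done"; result: unknown }       // rabbit-holes, usage, pr_number, …
  | { state: "failed"; stderrTail: string };

function isAlive(pid: number): boolean {
  try {
    process.kill(pid, 0); // signal 0: existence check only
    return true;
  } catch {
    return false;
  }
}

function checkHandoff(dir: string): HandoffStatus {
  const meta = JSON.parse(readFileSync(join(dir, ".meta.json"), "utf8"));
  if (isAlive(meta.pid)) {
    return { state: "running", elapsedMs: Date.now() - Date.parse(meta.startedAt) };
  }
  const resultPath = join(dir, "result.json");
  if (existsSync(resultPath)) {
    return { state: "done", result: JSON.parse(readFileSync(resultPath, "utf8")) };
  }
  // No result file after exit: treat as failed and surface the stderr tail.
  const stderrPath = join(dir, "stderr.log");
  const stderr = existsSync(stderrPath) ? readFileSync(stderrPath, "utf8") : "";
  return { state: "failed", stderrTail: stderr.split("\n").slice(-20).join("\n") };
}
```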
Notification (nice-to-have):
- Hook that fires on completion — Claude Code can surface a toast
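One way to drive such a hook, sketched under the assumption that a small watcher process is acceptable: wait for the background PID to disappear, then run an arbitrary notify command. The notify command itself (whatever surfaces the toast in Claude Code) is a placeholder here:

```typescript
// Sketch: poll a PID until it exits, then run a notify command.
// `notifyCmd` is a hypothetical placeholder for the actual toast mechanism.
import { spawnSync } from "node:child_process";

async function notifyOnExit(pid: number, notifyCmd: string[], pollMs = 2000): Promise<void> {
  for (;;) {
    try {
      process.kill(pid, 0); // signal 0: still alive, keep waiting
    } catch {
      break; // process gone
    }
    await new Promise((resolve) => setTimeout(resolve, pollMs));
  }
  spawnSync(notifyCmd[0], notifyCmd.slice(1));
}
```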
Review reference
Adversarial review — called out as the single most important strategic feature alongside G1