Objective
Document how to integrate AgentV eval results into GitHub Actions workflows — annotations, job summaries, and PR comments — using post-processing scripts that parse AgentV output artifacts.
Design Latitude
- Add an Astro docs page or section covering GitHub Actions integration
- Include example workflow snippets for:
  - Parse `results.jsonl` → `::error::`/`::warning::` workflow commands for native annotations
  - Parse `results.json` → `$GITHUB_STEP_SUMMARY` markdown job summary
  - Use JUnit XML with existing GitHub Actions test reporters
  - Use the `--threshold` flag (once "feat(cli): --threshold flag for suite-level quality gates" #698 lands) for CI quality gates
- Optionally create a reusable skill for annotation generation
- Keep this outside of AgentV core — CI integration is a downstream concern
Acceptance Signals
- Astro docs page with example GitHub Actions workflow YAML
- At least one example showing `::error::` annotation generation from results
- At least one example showing job summary generation
- Examples are copy-pasteable into a real workflow
Non-Goals
- No changes to AgentV core output pipeline
- No built-in annotation writer
- Not a GitHub Actions reusable action/workflow (separate scope if needed later)
Context
Originally proposed as a built-in GitHub annotations output writer. Reconsidered: annotation emission is a presentation concern better handled in the CI workflow itself. This matches how microsoft/skills does it: their `skill-evaluation.yml` reads `results.json` inline and emits `::error::` commands via a Node.js script step.
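In workflow terms, that pattern could look like the fragment below. The script paths are illustrative assumptions, and the step that produces the results artifact is elided, since the AgentV CLI invocation is out of scope for this issue.

```yaml
# Illustrative fragment — script names/paths are assumptions.
# `if: always()` keeps the reporting steps running even when the
# eval step itself fails the job.
- name: Annotate failing evals
  if: always()
  run: node scripts/annotate.js results.jsonl

- name: Write job summary
  if: always()
  run: node scripts/summary.js results.json
```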