PR workflow: Improve review status signals and reduce noise #6938

Draft
Copilot wants to merge 4 commits into master from copilot/improve-review-status-signals

Conversation

Contributor

Copilot AI commented Mar 13, 2026

The PR review workflow had a poor signal-to-noise ratio: copilot-visual-review reported SUCCESS when it was skipped, preview URLs were duplicated across three comments, and bot comments lacked clear agent identification.

Changes

Status signaling

  • Add Job 4 (report-skipped) to explicitly report when visual review is skipped, along with the reason
  • resolve-review-urls.js now outputs skipped and skip-reason for downstream jobs

Reduce noise

  • Visual review comment now shows the first 10 URLs, with a note to see the PR Preview comment for the full list
  • Removed redundant URL listing that was duplicated between PR Preview and Doc Review comments
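
The URL truncation above could look something like the following sketch. The helper name and the 10-URL limit's wording are assumptions; the PR description only states that the comment shows the first 10 URLs and points at the PR Preview comment for the rest.

```javascript
// Hypothetical sketch: cap the visual-review URL list at `limit` entries
// and point readers at the PR Preview comment for the remainder.
function truncateUrlList(urls, limit = 10) {
  const shown = urls.slice(0, limit).map((u) => `- ${u}`);
  if (urls.length > limit) {
    shown.push(`...and ${urls.length - limit} more; see the PR Preview comment for the full list.`);
  }
  return shown.join("\n");
}
```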

Agent personas

  • All bot comments now have distinct headers: 📦 PR Preview — Preview Bot, 🔍 Visual Review — Doc Review Bot
  • Consistent status tables with emojis (✅ ⏭️ ⏱️ ❌) and timestamps
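
The persona headers and status tables described above might be composed like this. This is an illustrative sketch, not the PR's actual implementation: the helper name, status keys, and table columns are assumptions; only the header format ("📦 PR Preview — Preview Bot") and the emoji set come from the PR description.

```javascript
// Hypothetical sketch of composing a bot comment with a distinct persona
// header, an emoji status table, and a timestamp.
const STATUS_EMOJI = {
  success: "✅",
  skipped: "⏭️",
  pending: "⏱️",
  failure: "❌",
};

function buildStatusComment(persona, title, rows, timestamp = new Date().toISOString()) {
  // e.g. "## 📦 PR Preview — Preview Bot"
  const header = `## ${title} — ${persona}`;
  const table = [
    "| Check | Status |",
    "|-------|--------|",
    ...rows.map(({ name, status }) => `| ${name} | ${STATUS_EMOJI[status]} ${status} |`),
  ].join("\n");
  return `${header}\n\n${table}\n\n_Last updated: ${timestamp}_`;
}
```

Putting the persona in the first heading line means a human scanning collapsed comments can identify the producing bot without expanding anything.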

Completion signal

  • Updated the copilot-visual-review.md template with a required response format:

```markdown
## 🔍 Visual Review Complete — Copilot

| Status | Details |
|--------|---------|
| **Result** | ✅ APPROVED / ⚠️ CHANGES REQUESTED / 🔍 NEEDS HUMAN REVIEW |
| **Pages Reviewed** | X page(s) |
| **Issues Found** | X BLOCKING, X WARNING, X INFO |
```

This makes it easy to scan a PR and immediately see which checks ran, their status, and which bot produced each comment.

Original prompt

This section describes the original issue you should resolve

<issue_title>PR workflow: Improve review status signals and reduce noise</issue_title>
<issue_description>PR visual review and related workflow issues

Summary

During the v1.12.3 release (PR #6872), several PR review workflow issues surfaced that reduce the signal-to-noise ratio for human reviewers.

Problems

1. copilot-visual-review silently skips and reports SUCCESS

Is it running at all? @copilot is only invoked by a comment when the mention appears first in the comment body, but the template appears to invoke it from further down in the body. Is it actually working?

The copilot-visual-review CI check can pass without actually performing a review. On PR #6872, the check reported SUCCESS but no visual review comment
was posted. This is a false positive — the green checkmark implies review happened when it didn't.

Related task: https://github.com/influxdata/docs-v2/tasks/d9d0ed4e-26ae-4bbf-8ce3-f9bc71daf79a

Expected behavior: If the visual review doesn't run, the check should report as skipped or failed, not SUCCESS.

2. Preview URLs listed redundantly across multiple comments

PR #6872 had preview URLs listed in three separate bot comments:

  • PR Preview Action comment
  • PR Preview detail comment (pages list)
  • Doc Review visual review prompt

This makes the PR noisy and harder for humans to find the actual status of checks and reviews.

Suggested fix: Consolidate preview URL listings into a single comment, or have later comments reference the first one instead of duplicating.
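
The "reference the first comment" approach could be sketched as below. The hidden HTML marker and function name are assumptions for illustration; in practice the comments array would come from the GitHub REST API (e.g. listing issue comments), and each comment object carries a `body` and an `html_url`.

```javascript
// Hypothetical sketch of the suggested fix: later bot comments locate the
// first PR Preview comment by a hidden HTML marker and link to it instead
// of repeating the URL list. The marker string is illustrative.
const PREVIEW_MARKER = "<!-- pr-preview-comment -->";

function referencePreviewComment(comments, urls) {
  const preview = comments.find((c) => c.body.includes(PREVIEW_MARKER));
  if (preview) {
    // Reference the existing comment rather than duplicating every URL.
    return `See the [PR Preview comment](${preview.html_url}) for the full list of preview URLs.`;
  }
  // Fall back to listing URLs only when no preview comment exists yet.
  return urls.map((u) => `- ${u}`).join("\n");
}
```

Anchoring bot comments with an invisible marker is a common pattern for "sticky" comments; it also lets the workflow update a single comment in place instead of posting new ones.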

3. No clear completion signal for visual review

When visual review does run (or is performed manually), there's no standardized comment or artifact that signals "visual review complete." Humans
scanning the PR can't easily tell whether it happened.

Suggested fix: Agents/checks that perform reviews should post a clearly identified completion comment (e.g., "Visual Review — [Agent Name]: No
issues found" or "Visual Review — [Agent Name]: 2 issues found"). The agent identity and outcome should be immediately visible.

4. Agent personas lack clear identification

Multiple bot comments from github-actions make it hard to tell which workflow produced which comment. Each automated review step should have a
distinct, recognizable name so humans can quickly scan for what ran and what didn't.

Context

Comments on the Issue (you are @copilot in this section)


Copilot AI and others added 2 commits March 13, 2026 17:26
- Add Job 4 (report-skipped) to explicitly report when visual review is skipped
- Update resolve-review-urls.js to output skip status and reason
- Add clear agent persona headers to all bot comments (Preview Bot, Doc Review Bot)
- Reduce URL duplication by having visual review reference PR Preview comment
- Update copilot-visual-review.md template with completion signal format
- Add consistent status tables with emojis for clear at-a-glance status

Co-authored-by: jstirnaman <212227+jstirnaman@users.noreply.github.com>
- Remove unnecessary comment about fallback default in resolve-review-urls.js
- Use plain text sanitization for skip reasons in preview comments
- Add clarifying comment about string comparison in GitHub Actions outputs

Co-authored-by: jstirnaman <212227+jstirnaman@users.noreply.github.com>
Copilot AI changed the title [WIP] Fix PR workflow to improve review status signals PR workflow: Improve review status signals and reduce noise Mar 13, 2026
Copilot AI requested a review from jstirnaman March 13, 2026 17:32
@jstirnaman jstirnaman added the hold off Should not be merged or address yet label Mar 17, 2026