Conversation
@copilot add a step to the agentic workflow that runs the copilot PR search request (using gh cli, JSON format, one month) and saves it to a file so that the LLM can use jq to process the data. Import jqschema.md
Updated the copilot-agent-analysis.md workflow to limit historical data collection to a maximum of 7 days (1 week) instead of 30 days. Key changes:

- Updated Phase 4.1: changed "up to last 30 days" to "up to last 7 days maximum"
- Updated Phase 4.2.1 Rebuilding Strategy: removed references to 30-day collection
- Updated Phase 4.4 Trend Analysis: removed Monthly Summary section (30+ days)
- Updated Discussion Template: removed Monthly Summary section
- Updated Cache Memory Management: changed retention from "90 days" to "7 days"
- Updated Success Criteria: changed "30+ days" to "7 days" for trends
- Removed the redundant "Historical Comparison" section

The workflow now focuses on weekly trends only and will not attempt to collect more than 7 days of historical data, reducing context usage and API rate limit concerns.

Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
Added a preparation step that runs before the AI agent starts:

- Imports shared/jqschema.md for JSON schema utilities
- Adds a step that uses the gh CLI to search for Copilot PRs from the last 30 days
- Saves results to /tmp/gh-aw/pr-data/copilot-prs.json in JSON format
- Generates a schema file at /tmp/gh-aw/pr-data/copilot-prs-schema.json
- Adds bash permissions for gh, jq, and jqschema.sh
- Updates Phase 1 instructions to reference the pre-fetched data file

The LLM can now use jq to efficiently process the pre-fetched PR data instead of making multiple GitHub API calls.

Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
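In shape, the pre-fetch step might look like the sketch below. The repo slug, `--json` field list, and result limit are assumptions, and the `gh` invocation is only printed rather than executed here, since it needs an authenticated token at runtime:

```shell
# Sketch of the pre-fetch step; repo slug, fields, and limit are assumptions.
set -euo pipefail

# Output directory the workflow uses for pre-fetched data
mkdir -p /tmp/gh-aw/pr-data

# 30 days back, GNU date syntax (as on GitHub's ubuntu runners)
DATE=$(date -d '30 days ago' +%Y-%m-%d)

# Qualifiers live in the search query string, not in command-line flags
CMD=(gh search prs "repo:owner/name" "created:>=$DATE"
     --json number,title,state,createdAt,closedAt,url
     --limit 1000)

# The workflow would run: "${CMD[@]}" > /tmp/gh-aw/pr-data/copilot-prs.json
printf '%s\n' "${CMD[*]}"
```

The same directory is where the schema file would later be written, so the agent can discover both artifacts under one path.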
Added the requested changes in commit
The LLM can now use jq to efficiently process the pre-fetched PR data instead of making multiple GitHub API calls.
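For illustration, processing the pre-fetched file with jq could look like this. The sample records are hypothetical stand-ins mirroring the shape of `gh ... --json` output; the field names are assumptions:

```shell
# Hypothetical sample standing in for the real gh output; fields are assumptions.
mkdir -p /tmp/gh-aw/pr-data
cat > /tmp/gh-aw/pr-data/copilot-prs.json <<'EOF'
[
  {"number": 101, "title": "Fix build", "state": "closed",
   "createdAt": "2025-10-01T10:00:00Z", "closedAt": "2025-10-01T12:30:00Z"},
  {"number": 102, "title": "Add tests", "state": "open",
   "createdAt": "2025-10-02T09:00:00Z", "closedAt": null}
]
EOF

# Count PRs by state without any extra API calls
jq 'group_by(.state) | map({state: .[0].state, count: length})' \
  /tmp/gh-aw/pr-data/copilot-prs.json

# Pull just the open PR numbers
jq -c '[.[] | select(.state == "open") | .number]' \
  /tmp/gh-aw/pr-data/copilot-prs.json   # -> [102]
```

Each question about the data becomes one local jq filter over the file, rather than another round-trip to the GitHub API.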
    - "/tmp/gh-aw/jqschema.sh"

    steps:
      - name: Fetch Copilot PR data
Added GITHUB_TOKEN to the step's environment variables in commit e8c4542. The gh CLI now has proper authentication when searching for pull requests.
Added the GITHUB_TOKEN secret as an environment variable to the "Fetch Copilot PR data" step. This ensures the gh CLI has proper authentication when searching for pull requests.

Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
@copilot the gh cli is wrong. Test it out locally. Here is the error: `10-19T19:04:21.7417523Z ##[group]Run # Create output directory`
Fixed the gh CLI command to use the correct syntax:

- Changed from the --repo flag (which doesn't exist) to the query format "repo:owner/name created:>=DATE"
- Added the GH_TOKEN environment variable (the gh CLI's preferred token variable)
- Updated the --json fields to only use available fields (removed: mergedAt, comments, additions, deletions, changedFiles, commits)
- Added available fields: body, labels, assignees, repository

The command now uses the proper GitHub search query syntax instead of non-existent command-line flags.

Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
Fixed the gh CLI syntax in commit
The command now uses the proper GitHub search query syntax.
@copilot fix permissions for gh request: `2025-10-19T19:21:51.5901765Z ##[group]Run # Create output directory`
Changed workflow permissions from specific scopes to read-all as requested. Also fixed the gh search prs query syntax to avoid quote escaping issues:

- Changed permissions to read-all (compiled as permissions: read-all at the job level)
- Removed the outer quotes from the search query to prevent escaping issues
- Query format: gh search prs repo:REPO created:">=$DATE"

This fixes the "you do not have permission to view them" error by granting broader read permissions, and prevents query parsing errors from escaped quotes.

Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
Fixed in commit
This prevents quote escaping issues that were causing the invalid search query error.
Updated permissions to read-all in commit
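The quoting fix can be seen by printing the arguments the shell actually passes. This is a sketch with a placeholder repo slug and a fixed example date:

```shell
# Example date; in the workflow this would be computed with `date`
DATE=2025-09-19

# Quote only what the shell needs quoted; gh then receives clean qualifiers
set -- repo:owner/name "created:>=$DATE"
printf '%s\n' "$@"
# repo:owner/name
# created:>=2025-09-19

# Wrapping the whole query in an extra layer of quotes instead would embed
# literal quote characters in the search string and break the query.
```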
- Average duration trend (faster/slower/stable)
- Comment count trend (more engagement/less engagement)
- Volume trend (more/fewer PRs)
Restored monthly summary in commit 8f5c3d1. The monthly summary section is now back in:
- Phase 4.4 trend analysis (after week-over-week comparison)
- Discussion template with full 30-day metrics, weekly trends table, and monthly trends analysis
- Success criteria updated to include monthly summary requirement
### Cache Memory Management
- **Organize data**: Keep historical data well-structured in JSON format
- **Limit retention**: Consider keeping only last 90 days of daily data
- **Limit retention**: Keep only last 7 days of daily data (maximum)
@copilot keep 1 year of data, the cache can be cleared to delete the data
Updated cache retention to 1 year in commit 8f5c3d1. The cache memory management now keeps the last 365 days of daily data, with a note that the cache can be cleared to delete old data.
As requested in code review:

1. Restored the monthly summary section in Phase 4.4 trend analysis
2. Restored the monthly summary in the discussion template with:
   - Performance metrics (30-day averages)
   - Weekly trends table (4 weeks)
   - Monthly trends analysis
   - Historical comparison section
3. Extended cache retention from 7 days to 1 year (365 days)
4. Updated the success criteria to include the monthly summary requirement

The workflow now supports both weekly (7+ days) and monthly (30+ days) trend analysis, while maintaining the 7-day maximum for historical data rebuilds. The cache can store up to 1 year of data and can be cleared as needed.

Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
Agentic Changeset Generator triggered by this pull request.
🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Updated the copilot-agent-analysis workflow to optimize data collection and enable comprehensive trend analysis while avoiding context overload.
Changes Made
Historical Data Rebuild Optimization
gh CLI Data Pre-fetch
- Imports `shared/jqschema.md` for JSON schema utilities
- Saves results to `/tmp/gh-aw/pr-data/copilot-prs.json` in JSON format
- Generates a schema file at `/tmp/gh-aw/pr-data/copilot-prs-schema.json`
- Query syntax: `gh search prs repo:REPO created:">=$DATE"`

Permissions and Security

- Uses `read-all` permissions for broader repository access

Trend Analysis and Cache Management
Architecture
The workflow now uses a balanced approach.
Testing
The workflow now efficiently balances active data collection with comprehensive long-term trend analysis capabilities, reducing API calls while enabling both weekly and monthly performance insights.
Original prompt