420 changes: 391 additions & 29 deletions .github/workflows/semantic-function-refactor.lock.yml

Large diffs are not rendered by default.

66 changes: 52 additions & 14 deletions .github/workflows/semantic-function-refactor.md
@@ -18,14 +18,18 @@ imports:
- shared/reporting.md

safe-outputs:
close-issue:
required-title-prefix: "[refactor] "
target: "*"
max: 10
create-issue:
title-prefix: "[refactor] "
labels: [refactoring, code-quality, automated-analysis]
max: 1

tools:
github:
toolsets: [default]
toolsets: [default, issues]
edit:
bash:
- "find pkg -name '*.go' ! -name '*_test.go' -type f"
@@ -47,12 +47,15 @@ You are an AI agent that analyzes Go code to identify potential refactoring oppo

## Mission

**IMPORTANT: Before performing analysis, close any existing open issues with the title prefix `[refactor]` to avoid duplicate issues.**

Analyze all Go source files (`.go` files, excluding test files) in the repository to:
1. Collect all function names per file
2. Cluster functions semantically by name and purpose
3. Identify outliers (functions that might be in the wrong file)
4. Use Serena's semantic analysis to detect potential duplicates
5. Suggest refactoring fixes
1. **First, close existing open issues** with the `[refactor]` prefix
2. Collect all function names per file
3. Cluster functions semantically by name and purpose
4. Identify outliers (functions that might be in the wrong file)
5. Use Serena's semantic analysis to detect potential duplicates
6. Suggest refactoring fixes

## Important Constraints

@@ -70,11 +70,42 @@ The Serena MCP server is configured for this workspace:
- **Context**: codex
- **Language service**: Go (gopls)

## Close Existing Refactor Issues (CRITICAL FIRST STEP)

**Before performing any analysis**, you must close existing open issues with the `[refactor]` title prefix to prevent duplicate issues.

Use the GitHub API tools to:
1. Search for open issues with title containing `[refactor]` in repository ${{ github.repository }}
2. Close each found issue with a comment explaining a new analysis is being performed
3. Use the `close_issue` safe output to close these issues

**Important**: The `close-issue` safe output is configured with:
- `required-title-prefix: "[refactor]"` - Only issues starting with this prefix will be closed
- `target: "*"` - Can close any issue by number (not just triggering issue)
- `max: 10` - Can close up to 10 issues in one run

To close an existing refactor issue, emit:
```
close_issue(issue_number=123, body="Closing this issue as a new semantic function refactoring analysis is being performed.")
```

**Do not proceed with analysis until all existing `[refactor]` issues are closed.**

## Task Steps

### 1. Activate Serena Project
### 1. Close Existing Refactor Issues

**CRITICAL FIRST STEP**: Before performing any analysis, close existing open issues with the `[refactor]` prefix to prevent duplicate issues.

1. Use GitHub search to find open issues with `[refactor]` in the title
2. For each found issue, use `close_issue` to close it with an explanatory comment
3. Example: `close_issue(issue_number=4542, body="Closing this issue as a new semantic function refactoring analysis is being performed.")`

**Do not proceed to step 2 until all existing `[refactor]` issues are closed.**

### 2. Activate Serena Project

First, activate the project in Serena to enable semantic analysis:
After closing existing issues, activate the project in Serena to enable semantic analysis:

```bash
# Serena's activate_project tool should be called with the workspace path
@@ -83,7 +121,7 @@ First, activate the project in Serena to enable semantic analysis:

Use Serena's `activate_project` tool with the workspace path.

### 2. Discover Go Source Files
### 3. Discover Go Source Files

Find all non-test Go files in the repository:

@@ -94,7 +132,7 @@ find pkg -name "*.go" ! -name "*_test.go" -type f | sort

Group files by package/directory to understand the organization.

### 3. Collect Function Names Per File
### 4. Collect Function Names Per File

For each discovered Go file:

@@ -117,7 +155,7 @@ Functions:
- validateFrontmatter(fm map[string]interface{}) error
```

### 4. Semantic Clustering Analysis
### 5. Semantic Clustering Analysis

Analyze the collected functions to identify patterns:

@@ -141,7 +179,7 @@ Look for functions that don't match their file's primary purpose:
- Helper functions scattered across multiple files
- Generic utility functions not in a dedicated utils file

### 5. Use Serena for Semantic Duplicate Detection
### 6. Use Serena for Semantic Duplicate Detection

For each cluster of similar functions:

@@ -160,7 +198,7 @@ Example Serena tool usage:
# Use search_for_pattern to find similar implementations
```

### 6. Deep Reasoning Analysis
### 7. Deep Reasoning Analysis

Apply deep reasoning to identify refactoring opportunities:

@@ -177,7 +215,7 @@ Apply deep reasoning to identify refactoring opportunities:
- **Use Generics**: When similar functions differ only by type
- **Extract Interface**: When similar methods are defined on different types

### 7. Generate Refactoring Report
### 8. Generate Refactoring Report

Create a comprehensive issue with findings:

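Taken together, the frontmatter hunks above leave the workflow with a `safe-outputs` block roughly like the following. This is a sketch reassembled from the visible diff lines only; the rest of the frontmatter (triggers, imports, tools) is omitted.

```yaml
safe-outputs:
  close-issue:
    required-title-prefix: "[refactor] "
    target: "*"
    max: 10
  create-issue:
    title-prefix: "[refactor] "
    labels: [refactoring, code-quality, automated-analysis]
    max: 1
```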
53 changes: 53 additions & 0 deletions pkg/parser/schemas/main_workflow_schema.json
@@ -2537,6 +2537,59 @@
}
]
},
"close-issue": {
"oneOf": [
{
"type": "object",
"description": "Configuration for closing GitHub issues with comment from agentic workflow output",
"properties": {
"required-labels": {
"type": "array",
"items": {
"type": "string"
},
"description": "Only close issues that have all of these labels"
},
"required-title-prefix": {
"type": "string",
"description": "Only close issues with this title prefix"
},
"target": {
"type": "string",
"description": "Target for closing: 'triggering' (default, current issue), or '*' (any issue with issue_number field)"
},
"max": {
"type": "integer",
"description": "Maximum number of issues to close (default: 1)",
"minimum": 1,
"maximum": 100
},
"target-repo": {
"type": "string",
"description": "Target repository in format 'owner/repo' for cross-repository operations. Takes precedence over trial target repo settings."
},
"github-token": {
"$ref": "#/$defs/github_token",
"description": "GitHub token to use for this specific output type. Overrides global github-token if specified."
}
},
"additionalProperties": false,
"examples": [
{
"required-title-prefix": "[refactor] "
},
{
"required-labels": ["automated", "stale"],
"max": 10
}
]
},
{
"type": "null",
"description": "Enable issue closing with default configuration"
}
]
},
"add-comment": {
"oneOf": [
{
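For orientation, the two `oneOf` branches of the new `close-issue` schema entry correspond to frontmatter shapes like these. The values below are a hedged sketch drawn from the schema's property descriptions and its `examples`, not from a real workflow.

```yaml
# Object form: restrict which issues may be closed and how many per run.
safe-outputs:
  close-issue:
    required-title-prefix: "[refactor] "
    required-labels: [automated, stale]
    target: "*"   # "triggering" (default) closes only the triggering issue; "*" allows any issue_number
    max: 10       # 1-100, defaults to 1
---
# Null form: enable issue closing with the default configuration.
safe-outputs:
  close-issue:
```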
140 changes: 140 additions & 0 deletions pkg/workflow/close_issue.go
@@ -0,0 +1,140 @@
package workflow

import (
"fmt"
"strings"

"github.com/githubnext/gh-aw/pkg/logger"
)

var closeIssueLog = logger.New("workflow:close_issue")

// CloseIssuesConfig holds configuration for closing GitHub issues from agent output
type CloseIssuesConfig struct {
BaseSafeOutputConfig `yaml:",inline"`
RequiredLabels []string `yaml:"required-labels,omitempty"` // Required labels for closing
RequiredTitlePrefix string `yaml:"required-title-prefix,omitempty"` // Required title prefix for closing
Target string `yaml:"target,omitempty"` // Target for close: "triggering" (default), "*" (any issue), or explicit number
TargetRepoSlug string `yaml:"target-repo,omitempty"` // Target repository for cross-repo operations
}

// parseCloseIssuesConfig handles close-issue configuration
func (c *Compiler) parseCloseIssuesConfig(outputMap map[string]any) *CloseIssuesConfig {
if configData, exists := outputMap["close-issue"]; exists {
closeIssueLog.Print("Parsing close-issue configuration")
closeIssuesConfig := &CloseIssuesConfig{}

if configMap, ok := configData.(map[string]any); ok {
// Parse required-labels
if requiredLabels, exists := configMap["required-labels"]; exists {
if labelList, ok := requiredLabels.([]any); ok {
for _, label := range labelList {
if labelStr, ok := label.(string); ok {
closeIssuesConfig.RequiredLabels = append(closeIssuesConfig.RequiredLabels, labelStr)
}
}
}
closeIssueLog.Printf("Required labels configured: %v", closeIssuesConfig.RequiredLabels)
}

// Parse required-title-prefix
if requiredTitlePrefix, exists := configMap["required-title-prefix"]; exists {
if prefix, ok := requiredTitlePrefix.(string); ok {
closeIssuesConfig.RequiredTitlePrefix = prefix
closeIssueLog.Printf("Required title prefix configured: %q", prefix)
}
}

// Parse target
if target, exists := configMap["target"]; exists {
if targetStr, ok := target.(string); ok {
closeIssuesConfig.Target = targetStr
closeIssueLog.Printf("Target configured: %q", targetStr)
}
}

// Parse target-repo using shared helper with validation
targetRepoSlug, isInvalid := parseTargetRepoWithValidation(configMap)
if isInvalid {
closeIssueLog.Print("Invalid target-repo configuration")
return nil // Invalid configuration, return nil to cause validation error
}
if targetRepoSlug != "" {
closeIssueLog.Printf("Target repository configured: %s", targetRepoSlug)
}
closeIssuesConfig.TargetRepoSlug = targetRepoSlug

// Parse common base fields with default max of 1
c.parseBaseSafeOutputConfig(configMap, &closeIssuesConfig.BaseSafeOutputConfig, 1)
} else {
// If configData is nil or not a map (e.g., "close-issue:" with no value),
// still set the default max
closeIssuesConfig.Max = 1
}

return closeIssuesConfig
}

return nil
}

// buildCreateOutputCloseIssueJob creates the close_issue job
func (c *Compiler) buildCreateOutputCloseIssueJob(data *WorkflowData, mainJobName string) (*Job, error) {
closeIssueLog.Printf("Building close_issue job for workflow: %s", data.Name)

if data.SafeOutputs == nil || data.SafeOutputs.CloseIssues == nil {
return nil, fmt.Errorf("safe-outputs.close-issue configuration is required")
}

// Build custom environment variables specific to close-issue
var customEnvVars []string

if len(data.SafeOutputs.CloseIssues.RequiredLabels) > 0 {
customEnvVars = append(customEnvVars, fmt.Sprintf(" GH_AW_CLOSE_ISSUE_REQUIRED_LABELS: %q\n", strings.Join(data.SafeOutputs.CloseIssues.RequiredLabels, ",")))
}
if data.SafeOutputs.CloseIssues.RequiredTitlePrefix != "" {
customEnvVars = append(customEnvVars, fmt.Sprintf(" GH_AW_CLOSE_ISSUE_REQUIRED_TITLE_PREFIX: %q\n", data.SafeOutputs.CloseIssues.RequiredTitlePrefix))
}
if data.SafeOutputs.CloseIssues.Target != "" {
customEnvVars = append(customEnvVars, fmt.Sprintf(" GH_AW_CLOSE_ISSUE_TARGET: %q\n", data.SafeOutputs.CloseIssues.Target))
}
closeIssueLog.Printf("Configured %d custom environment variables for issue close", len(customEnvVars))

// Add standard environment variables (metadata + staged/target repo)
customEnvVars = append(customEnvVars, c.buildStandardSafeOutputEnvVars(data, data.SafeOutputs.CloseIssues.TargetRepoSlug)...)

// Create outputs for the job
outputs := map[string]string{
"issue_number": "${{ steps.close_issue.outputs.issue_number }}",
"issue_url": "${{ steps.close_issue.outputs.issue_url }}",
"comment_url": "${{ steps.close_issue.outputs.comment_url }}",
}

// Build job condition with issue event check only for "triggering" target
// If target is "*" (any issue) or explicitly set, allow agent to provide issue_number
jobCondition := BuildSafeOutputType("close_issue")
if data.SafeOutputs.CloseIssues != nil &&
(data.SafeOutputs.CloseIssues.Target == "" || data.SafeOutputs.CloseIssues.Target == "triggering") {
// Only require event issue context for "triggering" target
eventCondition := buildOr(
BuildPropertyAccess("github.event.issue.number"),
BuildPropertyAccess("github.event.comment.issue.number"),
)
jobCondition = buildAnd(jobCondition, eventCondition)
}

// Use the shared builder function to create the job
return c.buildSafeOutputJob(data, SafeOutputJobConfig{
JobName: "close_issue",
StepName: "Close Issue",
StepID: "close_issue",
MainJobName: mainJobName,
CustomEnvVars: customEnvVars,
Script: getCloseIssueScript(),
Permissions: NewPermissionsContentsReadIssuesWrite(),
Outputs: outputs,
Condition: jobCondition,
Token: data.SafeOutputs.CloseIssues.GitHubToken,
TargetRepoSlug: data.SafeOutputs.CloseIssues.TargetRepoSlug,
})
}
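To illustrate what `buildCreateOutputCloseIssueJob` contributes to the compiled lock file for this PR's frontmatter (`required-title-prefix: "[refactor] "`, `target: "*"`), the fragments below sketch the job outputs and the close-issue-specific environment variables. The full generated job (needs, condition, the script step, and the standard safe-output variables) is assumed and not reproduced here.

```yaml
close_issue:
  outputs:
    issue_number: ${{ steps.close_issue.outputs.issue_number }}
    issue_url: ${{ steps.close_issue.outputs.issue_url }}
    comment_url: ${{ steps.close_issue.outputs.comment_url }}
  # Close-issue-specific env vars emitted by the builder for this configuration:
  #   GH_AW_CLOSE_ISSUE_REQUIRED_TITLE_PREFIX: "[refactor] "
  #   GH_AW_CLOSE_ISSUE_TARGET: "*"
```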
1 change: 1 addition & 0 deletions pkg/workflow/compiler.go
@@ -255,6 +255,7 @@ type SafeOutputsConfig struct {
CreateIssues *CreateIssuesConfig `yaml:"create-issues,omitempty"`
CreateDiscussions *CreateDiscussionsConfig `yaml:"create-discussions,omitempty"`
CloseDiscussions *CloseDiscussionsConfig `yaml:"close-discussions,omitempty"`
CloseIssues *CloseIssuesConfig `yaml:"close-issue,omitempty"`
AddComments *AddCommentsConfig `yaml:"add-comments,omitempty"`
CreatePullRequests *CreatePullRequestsConfig `yaml:"create-pull-requests,omitempty"`
CreatePullRequestReviewComments *CreatePullRequestReviewCommentsConfig `yaml:"create-pull-request-review-comments,omitempty"`
18 changes: 18 additions & 0 deletions pkg/workflow/compiler_jobs.go
@@ -208,6 +208,24 @@ func (c *Compiler) buildSafeOutputsJobs(data *WorkflowData, jobName, markdownPat
safeOutputJobNames = append(safeOutputJobNames, closeDiscussionJob.Name)
}

// Build close_issue job if safe-outputs.close-issue is configured
if data.SafeOutputs.CloseIssues != nil {
closeIssueJob, err := c.buildCreateOutputCloseIssueJob(data, jobName)
if err != nil {
return fmt.Errorf("failed to build close_issue job: %w", err)
}
// Safe-output jobs should depend on agent job (always) AND detection job (if enabled)
if threatDetectionEnabled {
closeIssueJob.Needs = append(closeIssueJob.Needs, constants.DetectionJobName)
// Add detection success check to the job condition
closeIssueJob.If = AddDetectionSuccessCheck(closeIssueJob.If)
}
if err := c.jobManager.AddJob(closeIssueJob); err != nil {
return fmt.Errorf("failed to add close_issue job: %w", err)
}
safeOutputJobNames = append(safeOutputJobNames, closeIssueJob.Name)
}

// Build create_pull_request job if output.create-pull-request is configured
// NOTE: This is built BEFORE add_comment so that add_comment can depend on it
if data.SafeOutputs.CreatePullRequests != nil {