diff --git a/agent/a.1.assessment.md b/agent/a.1.assessment.md deleted file mode 100644 index 869cbf5c5..000000000 --- a/agent/a.1.assessment.md +++ /dev/null @@ -1,211 +0,0 @@ -# Assessment: CLI Studio Express Integration - -## Current State Analysis - -### How CLI Currently Handles Studio-UI - -The CLI's `studio` command currently: - -1. **Bundled Static Assets**: Studio-UI is built as a static Vue.js app and bundled into the CLI package - - Built via `npm run build:studio-ui` in CLI build process - - Assets copied to `dist/studio-ui-assets/` via `embed:studio-ui` script - - Served via Node.js `serve-static` middleware with dynamic config injection - -2. **Process Management**: CLI manages processes directly in Node.js - - **CouchDB**: Uses `CouchDBManager` class to spawn Docker containers - - **Studio-UI Server**: Creates HTTP server using Node.js `http` module - - **Process Lifecycle**: Handles graceful shutdown via SIGINT/SIGTERM handlers - -3. **Configuration Injection**: Dynamic config injection for studio-ui - - Injects CouchDB connection details into `window.STUDIO_CONFIG` - - Modifies `index.html` at runtime with database connection info - - Uses SPA fallback routing for client-side routing - -### Express Backend Architecture - -The Express backend (`@vue-skuilder/express`) is: - -1. **Standalone Service**: Designed as independent Node.js/Express application - - Main entry: `src/app.ts` - - Hardcoded port: 3000 - - Manages own CouchDB connections via `nano` client - - Handles authentication, course management, classroom operations - -2. **External Dependencies**: Requires external CouchDB instance - - Connects to CouchDB via environment configuration - - Manages multiple databases (courses, classrooms, users) - - Includes its own initialization and setup logic - -3. 
**Heavyweight Service**: Full-featured API server - - Authentication middleware - - File upload processing - - Complex business logic for course/classroom management - - Logging and error handling - -## Integration Options Analysis - -### Option A: Bundle Express and Run as Subprocess - -**Approach**: Bundle express into CLI and spawn it as a child process - -**Pros**: -- Clean separation of concerns -- Express runs in its own process space -- Can leverage existing Express configuration -- Easy to manage process lifecycle (start/stop) -- Familiar process management pattern (similar to CouchDB) - -**Cons**: -- Requires bundling entire Express app with CLI -- Multiple Node.js processes running -- More complex communication between CLI and Express -- Harder to pass configuration dynamically -- Potential port conflicts - -**Technical Implementation**: -```typescript -// Similar to how CLI spawns CouchDB -const expressProcess = spawn('node', [expressDistPath], { - env: { ...process.env, COUCHDB_URL: couchUrl } -}); -``` - -### Option B: Import Express Directly (Same Process) - -**Approach**: Import Express app and run it in the same Node.js process as CLI - -**Pros**: -- Single process - more efficient resource usage -- Direct communication between CLI and Express -- Easy to pass configuration objects -- Simpler deployment (single Node.js process) -- Can share CouchDB connection instances - -**Cons**: -- Tight coupling between CLI and Express -- Harder to isolate Express errors from CLI -- Express initialization could block CLI startup -- More complex to handle Express-specific configuration -- Potential conflicts with CLI's HTTP server - -**Technical Implementation**: -```typescript -// Import Express app and configure it -import { createExpressApp } from '@vue-skuilder/express'; -const expressApp = createExpressApp(couchConfig); -expressApp.listen(3000); -``` - -### Option C: Expect Express Running Separately - -**Approach**: CLI expects Express to be running as 
separate service - -**Pros**: -- Complete separation of concerns -- Express can be managed independently -- No changes needed to CLI architecture -- Easy to scale Express separately -- Clear service boundaries - -**Cons**: -- Additional setup complexity for users -- Need to coordinate between multiple services -- User must manage Express lifecycle manually -- Harder to provide "one-command" studio experience -- Complex error handling when Express is down - -**Technical Implementation**: -```typescript -// CLI just checks if Express is available -const expressHealthCheck = await fetch('http://localhost:3000/health'); -if (!expressHealthCheck.ok) { - throw new Error('Express server not running'); -} -``` - -### Option D: Hybrid Approach - Express Module - -**Approach**: Refactor Express into a configurable module that CLI can import and control - -**Pros**: -- Best of both worlds - modularity with integration -- CLI maintains control over process lifecycle -- Express can be configured per CLI session -- Clean API boundaries -- Reusable Express module - -**Cons**: -- Requires significant refactoring of Express package -- Breaking changes to Express architecture -- More complex implementation -- Need to maintain backward compatibility - -**Technical Implementation**: -```typescript -// Express as configurable module -import { ExpressService } from '@vue-skuilder/express'; -const expressService = new ExpressService({ - port: 3001, - couchdb: couchConfig, - logger: cliLogger -}); -await expressService.start(); -``` - -## Key Considerations - -### 1. **Process Management Consistency** -- CLI already manages CouchDB via subprocess (Docker) -- Studio-UI runs as HTTP server within CLI process -- Express subprocess would follow CouchDB pattern - -### 2. **Configuration Management** -- CLI injects config into Studio-UI at runtime -- Express needs CouchDB connection details -- Studio-UI needs to know Express endpoint - -### 3. 
**Port Management** -- CLI finds available ports dynamically (Studio-UI: 7174+) -- Express hardcoded to port 3000 -- Need to avoid port conflicts - -### 4. **Error Handling & Lifecycle** -- CLI handles graceful shutdown for all services -- Express needs to integrate with CLI's process management -- Studio-UI depends on both CouchDB and Express - -### 5. **User Experience** -- Current: Single `skuilder studio` command starts everything -- Goal: Maintain single-command simplicity -- Express adds complexity but provides powerful features - -## Recommendation - -**Option A: Bundle Express and Run as Subprocess** is the best approach because: - -1. **Architectural Consistency**: Matches existing CouchDB subprocess pattern -2. **Clean Separation**: Express runs independently but managed by CLI -3. **Minimal Changes**: Can reuse existing Express code with minimal refactoring -4. **Process Management**: Leverages CLI's existing process lifecycle handling -5. **Configuration**: Can pass config via environment variables (established pattern) - -### Implementation Plan - -1. **Express Modifications**: - - Make port configurable via environment variable - - Add health check endpoint - - Ensure clean shutdown on SIGTERM/SIGINT - -2. **CLI Integration**: - - Add Express process management (similar to CouchDB) - - Bundle Express dist in CLI build process - - Dynamic port allocation for Express - - Update Studio-UI config injection to include Express endpoint - -3. **Process Orchestration**: - - Start CouchDB first (as currently done) - - Start Express with CouchDB connection details - - Start Studio-UI with both CouchDB and Express endpoints - - Coordinate shutdown of all services - -This approach maintains the current architecture's clarity while adding the powerful Express backend capabilities that users need for full studio functionality. 
\ No newline at end of file diff --git a/cron/README.md b/cron/README.md new file mode 100644 index 000000000..4cf102942 --- /dev/null +++ b/cron/README.md @@ -0,0 +1,41 @@ +# Cron Automation Scripts + +This directory contains dev-local cron automation scripts for agentic quality control and CI/CD monitoring. + +## Scripts + +### `nightly-ci-check.ts` + +**Purpose**: Automated nightly CI health check and fix attempt + +**Requirements**: +- Node.js with `tsx` for TypeScript execution +- GitHub CLI (`gh`) authenticated +- Claude Code CLI (`claude`) +- Git 2.5+ with worktree support + +**Schedule**: Runs at 2:00 AM daily via cron + +**Workflow**: +1. Checks recent GitHub workflow runs using GitHub CLI +2. Filters for scheduled workflows with unresolved failures/cancellations +3. For each failure, creates a dedicated worktree using `nt` command +4. Collects failure data: logs, metadata, commit ranges, PR information +5. Invokes Claude Code for root cause analysis in isolated worktree +6. Generates assessment reports and attempts fixes based on confidence level + +### Configure Cron Job + +Add to your crontab (`crontab -e`): + +```cron +# Nightly CI health check at 2:00 AM +0 2 * * * cd /path/to/your/repo && npx tsx cron/nightly-ci-check.ts +``` + +### Manual Testing + +```bash +cd /path/to/your/repo +npx tsx cron/nightly-ci-check.ts +``` \ No newline at end of file diff --git a/cron/nightly-ci-check.ts b/cron/nightly-ci-check.ts new file mode 100755 index 000000000..1297a4b3c --- /dev/null +++ b/cron/nightly-ci-check.ts @@ -0,0 +1,539 @@ +#!/usr/bin/env tsx + +import { execSync, spawn } from 'child_process'; +import { writeFileSync, mkdirSync, existsSync } from 'fs'; +import { join, dirname } from 'path'; +import { fileURLToPath } from 'url'; + +// Configuration +const CONFIG = { + CLAUDE_TIMEOUT: 300000, // 5 minutes in milliseconds for testing + REPO_DIR: join(dirname(fileURLToPath(import.meta.url)), '..'), + REPORTS_DIR: '', + LOG_FILE: '', + DATE_STAMP: new 
Date().toISOString().slice(0, 10).replace(/-/g, '') +}; + +// Initialize paths +CONFIG.REPORTS_DIR = join(CONFIG.REPO_DIR, 'cron', 'reports'); +CONFIG.LOG_FILE = join(CONFIG.REPORTS_DIR, `nightly-ci-${CONFIG.DATE_STAMP}.log`); + +// Types +interface WorkflowRun { + databaseId: number; + workflowName: string; + conclusion: string | null; + headSha: string; + url: string; + createdAt: string; + event: string; +} + +interface FailureInfo { + runId: number; + workflowName: string; + headSha: string; + url: string; +} + +// Logging utility +function log(message: string): void { + const timestamp = new Date().toISOString().replace('T', ' ').slice(0, 19); + const logLine = `[${timestamp}] ${message}`; + + // Write to log file + try { + writeFileSync(CONFIG.LOG_FILE, logLine + '\n', { flag: 'a' }); + } catch (err) { + // Append fails if the reports directory doesn't exist yet; create it and retry + mkdirSync(CONFIG.REPORTS_DIR, { recursive: true }); + writeFileSync(CONFIG.LOG_FILE, logLine + '\n', { flag: 'a' }); + } + + // Also output to stderr for real-time feedback + console.error(logLine); +} + +// Execute shell command and return output +function execCommand(command: string, options: { cwd?: string; silent?: boolean } = {}): string { + try { + const result = execSync(command, { + encoding: 'utf8', + cwd: options.cwd || CONFIG.REPO_DIR, + stdio: options.silent ? 
'pipe' : ['inherit', 'pipe', 'inherit'] + }); + return result.trim(); + } catch (error: any) { + throw new Error(`Command failed: ${command}\n${error.message}`); + } +} + +// Clean workflow name for file/directory names +function cleanWorkflowName(name: string): string { + return name.replace(/[^a-zA-Z0-9]/g, '-').replace(/-+/g, '-').replace(/^-|-$/g, ''); +} + +// Get scheduled workflow failures that haven't been superseded by recent successes +async function getScheduledFailures(): Promise<FailureInfo[]> { + log("Checking for scheduled workflow failures..."); + + try { + // Get recent scheduled runs + const runsJson = execCommand( + 'gh run list --limit 50 --json status,conclusion,workflowName,createdAt,headSha,url,event,databaseId', + { silent: true } + ); + + const allRuns: WorkflowRun[] = JSON.parse(runsJson); + const scheduledRuns = allRuns.filter(run => run.event === 'schedule'); + + log(`DEBUG: Found ${scheduledRuns.length} scheduled runs`); + + // Group by workflow and find most recent run per workflow + const workflowGroups = new Map<string, WorkflowRun[]>(); + scheduledRuns.forEach(run => { + if (!workflowGroups.has(run.workflowName)) { + workflowGroups.set(run.workflowName, []); + } + workflowGroups.get(run.workflowName)!.push(run); + }); + + const failures: FailureInfo[] = []; + + for (const [workflowName, runs] of workflowGroups) { + // Sort by creation date (newest first) + runs.sort((a, b) => new Date(b.createdAt).getTime() - new Date(a.createdAt).getTime()); + const mostRecent = runs[0]; + + log(`DEBUG: Processing workflow: ${workflowName}`); + log(`DEBUG: Latest run has conclusion: ${mostRecent.conclusion}`); + + if (mostRecent.conclusion === 'failure' || mostRecent.conclusion === 'cancelled') { + log(`DEBUG: Adding ${workflowName} to failures list`); + failures.push({ + runId: mostRecent.databaseId, + workflowName: mostRecent.workflowName, + headSha: mostRecent.headSha, + url: mostRecent.url + }); + } + } + + log(`DEBUG: Final failures list has ${failures.length} entries`); + + if 
(failures.length === 0) { + log("No unresolved scheduled workflow failures found"); + return []; + } + + log(`Found ${failures.length} unresolved scheduled workflow failures`); + return failures; + } catch (error: any) { + log(`ERROR: Failed to get scheduled failures: ${error.message}`); + return []; + } +} + +// Setup worktree for a specific failure +async function setupWorktreeForFailure(runId: number, workflowName: string): Promise<string> { + const cleanWorkflow = cleanWorkflowName(workflowName); + const worktreeName = `cc-resolve-${CONFIG.DATE_STAMP}-${cleanWorkflow}`; + + log(`Setting up worktree for ${workflowName} failure (run: ${runId})`); + + try { + // Create worktree using git worktree command directly (force override if exists) + const worktreePath = join('..', worktreeName); + + // Remove existing worktree if it exists + try { + execCommand(`git worktree remove -f "${worktreePath}"`, { silent: true }); + } catch (e) { + // Ignore errors if worktree doesn't exist + } + + execCommand(`git worktree add "${worktreePath}"`, { silent: true }); + + // Create reports directory in worktree + mkdirSync(join(worktreePath, 'cron', 'reports'), { recursive: true }); + + log(`Created worktree: ${worktreePath}`); + return worktreePath; + } catch (error: any) { + log(`ERROR: Failed to create worktree ${worktreeName}: ${error.message}`); + throw error; + } +} + +// Collect failure data for analysis +async function collectFailureData( + runId: number, + workflowName: string, + headSha: string, + runUrl: string, + worktreePath: string +): Promise<void> { + log(`Collecting failure data for ${workflowName} (run: ${runId})`); + + const cleanWorkflow = cleanWorkflowName(workflowName); + const reportsDir = join(worktreePath, 'cron', 'reports'); + + try { + // Get failure logs + log("Fetching failure logs..."); + const failureLogs = execCommand(`gh run view ${runId} --log`, { silent: true }); + writeFileSync(join(reportsDir, `failure-logs-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), 
failureLogs); + + // Get detailed run info + const runDetails = execCommand( + `gh run view ${runId} --json jobs,conclusion,workflowName,headSha,url,createdAt`, + { silent: true } + ); + writeFileSync(join(reportsDir, `failure-details-${cleanWorkflow}-${CONFIG.DATE_STAMP}.json`), runDetails); + + // Find last successful run of same workflow + log(`Finding last successful run of ${workflowName}...`); + try { + const successfulRunsJson = execCommand( + `gh run list --workflow="${workflowName}" --status=success --limit 1 --json status,conclusion,workflowName,createdAt,headSha,url,event,databaseId`, + { silent: true } + ); + + const successfulRuns: WorkflowRun[] = JSON.parse(successfulRunsJson); + const lastGoodRun = successfulRuns.find(run => run.event === 'schedule'); + + if (lastGoodRun) { + const goodSha = lastGoodRun.headSha; + const goodRunId = lastGoodRun.databaseId; + + log(`Found last good run: ${goodRunId} (sha: ${goodSha})`); + + // Get commits between good and bad + log(`Analyzing commits between ${goodSha} and ${headSha}...`); + try { + // Basic commit range + const commitRange = execCommand( + `git log --oneline --pretty=format:'%h|%s|%an|%ad' --date=short ${goodSha}..${headSha}`, + { silent: true } + ); + writeFileSync(join(reportsDir, `commit-range-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), commitRange); + + // Detailed git log with full messages + const detailedLog = execCommand( + `git log --pretty=format:'%H%n%an <%ae>%n%ad%n%s%n%n%b%n---COMMIT-END---' --date=iso ${goodSha}..${headSha}`, + { silent: true } + ); + writeFileSync(join(reportsDir, `detailed-commits-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), detailedLog); + + // Git diff summary (files changed) + const diffStat = execCommand( + `git diff --stat ${goodSha}..${headSha}`, + { silent: true } + ); + writeFileSync(join(reportsDir, `diff-stat-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), diffStat); + + // Git diff --name-status (files and change types) + const nameStatus = execCommand( + `git 
diff --name-status ${goodSha}..${headSha}`, + { silent: true } + ); + writeFileSync(join(reportsDir, `diff-name-status-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), nameStatus); + + // Full git diff (careful - this could be large) + try { + const fullDiff = execCommand( + `git diff ${goodSha}..${headSha}`, + { silent: true } + ); + // Only write if diff is reasonable size (< 1MB) + if (fullDiff.length < 1024 * 1024) { + writeFileSync(join(reportsDir, `full-diff-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), fullDiff); + } else { + writeFileSync(join(reportsDir, `full-diff-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), + `Diff too large (${Math.round(fullDiff.length / 1024)}KB) - skipped for performance`); + } + } catch (diffError) { + writeFileSync(join(reportsDir, `full-diff-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), "Could not generate full diff"); + } + + } catch (gitError) { + log("Could not get commit range - continuing without it"); + writeFileSync(join(reportsDir, `commit-range-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), "No commit range available"); + writeFileSync(join(reportsDir, `detailed-commits-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), "No detailed commits available"); + writeFileSync(join(reportsDir, `diff-stat-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), "No diff stat available"); + writeFileSync(join(reportsDir, `diff-name-status-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), "No name status available"); + writeFileSync(join(reportsDir, `full-diff-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), "No full diff available"); + } + + // Get PR information for commits in range + const prInfoFile = join(reportsDir, `pr-info-${cleanWorkflow}-${CONFIG.DATE_STAMP}.json`); + try { + const commits = execCommand( + `git log --pretty=format:'%H' ${goodSha}..${headSha}`, + { silent: true } + ).split('\n').filter(sha => sha.trim()); + + const prInfo = []; + for (const commitSha of commits) { + try { + const prData = execCommand( + `gh pr list --search "${commitSha}" 
--json number,title,author,mergedAt,url --limit 1`, + { silent: true } + ); + const prs = JSON.parse(prData); + if (prs.length > 0) { + prInfo.push({ + commit: commitSha, + pr: prs[0] + }); + } + } catch (prError) { + // Skip if can't find PR for this commit + continue; + } + } + writeFileSync(prInfoFile, JSON.stringify(prInfo, null, 2)); + } catch (prError) { + writeFileSync(prInfoFile, '[]'); + } + + writeFileSync(join(reportsDir, `last-good-run-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), `${goodSha}|${goodRunId}`); + } else { + log(`WARNING: No successful scheduled run found for ${workflowName}`); + writeFileSync(join(reportsDir, `last-good-run-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), "No successful run found"); + } + } catch (error) { + log(`WARNING: Could not find successful runs for ${workflowName}`); + writeFileSync(join(reportsDir, `last-good-run-${cleanWorkflow}-${CONFIG.DATE_STAMP}.txt`), "No successful run found"); + } + + // Create metadata file + const metadata = { + runId, + workflowName, + headSha, + runUrl, + analysisDate: new Date().toISOString(), + worktreePath + }; + writeFileSync(join(reportsDir, `metadata-${cleanWorkflow}-${CONFIG.DATE_STAMP}.json`), JSON.stringify(metadata, null, 2)); + + log(`Data collection completed for ${workflowName}`); + } catch (error: any) { + log(`ERROR: Failed to collect data for ${workflowName}: ${error.message}`); + throw error; + } +} + +// Invoke Claude analysis in worktree +async function invokeClaudeAnalysis(workflowName: string, worktreePath: string): Promise<boolean> { + const cleanWorkflow = cleanWorkflowName(workflowName); + log(`Invoking Claude analysis for ${workflowName} in ${worktreePath}`); + + const claudePrompt = `# CI Failure Analysis Task + +You are in a dedicated worktree for analyzing a GitHub Actions workflow failure. + +## Your Task + +1. **Analyze the failure data** in the cron/reports/ directory +2. **Identify the root cause** of the ${workflowName} workflow failure +3. 
**Create a detailed assessment** report +4. **Suggest concrete fixes** if the confidence level is high +5. **Write your findings** to a file called "analysis-${cleanWorkflow}-${CONFIG.DATE_STAMP}.md" + +## Available Data + +The cron/reports/ directory contains: +- failure-logs-*.txt: Full workflow failure logs +- failure-details-*.json: Structured failure information +- commit-range-*.txt: Commits between last success and failure +- detailed-commits-*.txt: Full commit messages and details +- diff-stat-*.txt: Files changed summary +- full-diff-*.txt: Complete code changes +- pr-info-*.json: Related pull request information +- metadata-*.json: Run metadata + +## Analysis Framework + +Please structure your analysis as follows: + +### 1. Executive Summary +- Brief description of the failure +- Impact assessment +- Confidence level in diagnosis (High/Medium/Low) + +### 2. Root Cause Analysis +- Primary cause of failure +- Contributing factors +- Timeline of events + +### 3. Code Analysis +- Specific changes that triggered the failure +- Code quality issues identified +- Test coverage gaps + +### 4. Recommendations +- Immediate fixes needed +- Long-term improvements +- Prevention strategies + +### 5. 
Implementation Plan +- Step-by-step fix instructions +- Testing recommendations +- Risk assessment + +Focus on actionable insights that will help prevent similar failures.`; + + try { + // Test write permissions in worktree + try { + writeFileSync(join(worktreePath, 'write-test.tmp'), 'test'); + execCommand(`rm -f "${join(worktreePath, 'write-test.tmp')}"`, { silent: true }); + log(`DEBUG: Write permissions confirmed in ${worktreePath}`); + } catch (permError) { + log(`WARNING: Write permission issues in ${worktreePath}: ${permError}`); + } + + // Write prompt to file for debugging + writeFileSync(join(worktreePath, 'claude-prompt.txt'), claudePrompt); + log(`DEBUG: Wrote Claude prompt to ${join(worktreePath, 'claude-prompt.txt')}`); + + // Invoke Claude for actual CI analysis + log(`Starting CI failure analysis for ${workflowName}...`); + + try { + // Use Claude CLI to analyze the failure with all collected data + const analysisResult = execSync(`claude -p "${claudePrompt.replace(/"/g, '\\"')}"`, { + encoding: 'utf8', + cwd: worktreePath, + env: { ...process.env }, + stdio: 'pipe', + timeout: CONFIG.CLAUDE_TIMEOUT + }).trim(); + + log(`Claude analysis completed for ${workflowName}`); + log(`Analysis result length: ${analysisResult.length} characters`); + + // Write the analysis result to a file + const analysisFile = join(worktreePath, 'cron', 'reports', `claude-analysis-${cleanWorkflow}-${CONFIG.DATE_STAMP}.md`); + writeFileSync(analysisFile, analysisResult); + log(`Analysis written to: ${analysisFile}`); + + return true; + } catch (claudeError: any) { + log(`ERROR: Claude analysis failed: ${claudeError.message}`); + if (claudeError.stderr) { + log(`ERROR: Claude stderr: ${claudeError.stderr}`); + } + if (claudeError.stdout) { + log(`ERROR: Claude stdout: ${claudeError.stdout}`); + } + return false; + } + } catch (error: any) { + log(`ERROR: Claude analysis failed for ${workflowName}: ${error.message}`); + return false; + } +} + +// Main execution +async function 
main(): Promise<void> { + log(`Starting nightly CI check for ${CONFIG.REPO_DIR}`); + + try { + // Ensure we're in a git repository + execCommand('git rev-parse --is-inside-work-tree', { silent: true }); + } catch (error) { + log("ERROR: Not in a git repository. Exiting."); + process.exit(1); + } + + // Check if required commands are available + const requiredCommands = ['gh', 'claude']; + for (const cmd of requiredCommands) { + try { + execCommand(`command -v ${cmd}`, { silent: true }); + } catch (error) { + log(`ERROR: ${cmd} command not found. Please ensure it's available in PATH.`); + process.exit(1); + } + } + + // Check if git worktree is available + try { + execCommand('git worktree --help', { silent: true }); + } catch (error) { + log("ERROR: git worktree not available. Please ensure you have Git 2.5+."); + process.exit(1); + } + + log("Starting automated CI analysis..."); + + try { + // Get scheduled workflow failures + const failures = await getScheduledFailures(); + + if (failures.length === 0) { + log("No scheduled workflow failures to process. 
Exiting."); + return; + } + + let failureCount = 0; + let successCount = 0; + + // Process each failure + for (const failure of failures) { + failureCount++; + log(`Processing failure ${failureCount}: ${failure.workflowName} (run: ${failure.runId})`); + + try { + // Setup worktree for this failure + const worktreePath = await setupWorktreeForFailure(failure.runId, failure.workflowName); + + // Collect failure data + await collectFailureData( + failure.runId, + failure.workflowName, + failure.headSha, + failure.url, + worktreePath + ); + + // Invoke Claude analysis + const success = await invokeClaudeAnalysis(failure.workflowName, worktreePath); + if (success) { + successCount++; + log(`Successfully completed analysis for ${failure.workflowName}`); + } else { + log(`ERROR: Analysis failed for ${failure.workflowName}`); + } + } catch (error: any) { + log(`ERROR: Failed to process ${failure.workflowName}: ${error.message}`); + } + } + + log(`Nightly CI check completed. Processed ${failureCount} failures, ${successCount} successful analyses.`); + log(`See full log at: ${CONFIG.LOG_FILE}`); + + if (successCount > 0) { + log(`Analysis reports generated in worktree directories:`); + for (const failure of failures) { + const cleanWorkflow = cleanWorkflowName(failure.workflowName); + const worktreePath = join('..', `cc-resolve-${CONFIG.DATE_STAMP}-${cleanWorkflow}`); + log(` - ${worktreePath}/cron/reports/`); + } + } + } catch (error: any) { + log(`ERROR: Main execution failed: ${error.message}`); + process.exit(1); + } +} + +// Run if this is the main module +if (import.meta.url === `file://${process.argv[1]}`) { + main().catch((error) => { + console.error('Unhandled error:', error); + process.exit(1); + }); +} \ No newline at end of file diff --git a/packages/cli/package.json b/packages/cli/package.json index 14cbf845b..ca5943550 100644 --- a/packages/cli/package.json +++ b/packages/cli/package.json @@ -27,7 +27,7 @@ "lint": "npx eslint .", "lint:fix": "npx eslint . 
--fix", "lint:check": "npx eslint . --max-warnings 0", - "try:init": "node dist/cli.js init testproject --dangerously-clobber --no-interactive --data-layer static --import-course-data --import-server-url http://localhost:5984 --import-username admin --import-password password --import-course-ids 2aeb8315ef78f3e89ca386992d00825b && cd testproject && npm i && npm install --save-dev @vue-skuilder/cli@file:.. && npm install @vue-skuilder/db@file:../../db && npm install @vue-skuilder/courses@file:../../courses && npm install @vue-skuilder/common-ui@file:../../common-ui" + "try:init": "node dist/cli.js init testproject --dangerously-clobber --no-interactive --data-layer static --import-course-data --import-server-url http://localhost:5984 --import-username admin --import-password password --import-course-ids 2aeb8315ef78f3e89ca386992d00825b && cd testproject && npm i && npm install --save-dev @vue-skuilder/cli@file:.. && npm install @vue-skuilder/db@file:../../db @vue-skuilder/courses@file:../../courses @vue-skuilder/common-ui@file:../../common-ui" }, "keywords": [ "cli", diff --git a/packages/common-ui/src/components/CourseCardBrowser.vue b/packages/common-ui/src/components/CourseCardBrowser.vue index 3268b6fae..c10834aa7 100644 --- a/packages/common-ui/src/components/CourseCardBrowser.vue +++ b/packages/common-ui/src/components/CourseCardBrowser.vue @@ -32,14 +32,13 @@ {{ cardPreview[c.id] }} - {{ c.id.split('-').length === 3 ? c.id.split('-')[2] : '' }} + ELO: {{ cardElos[idParse(c.id)]?.global.score || '(unknown)' }}
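

The failure-detection step in `nightly-ci-check.ts` reduces to: group runs by workflow, sort each group newest-first, and flag workflows whose latest scheduled run failed or was cancelled. A minimal standalone sketch of that reduction — the `Run` shape and the sample data below are invented for illustration, not the script's actual `gh` output:

```typescript
interface Run {
  workflowName: string;
  conclusion: string | null;
  createdAt: string; // ISO 8601 timestamp
}

// Return names of workflows whose most recent run failed or was cancelled.
function unresolvedFailures(runs: Run[]): string[] {
  // Group runs by workflow name
  const groups = new Map<string, Run[]>();
  for (const run of runs) {
    const list = groups.get(run.workflowName) ?? [];
    list.push(run);
    groups.set(run.workflowName, list);
  }

  const failures: string[] = [];
  for (const [name, group] of groups) {
    // Newest first, so a later success supersedes an earlier failure
    group.sort((a, b) => Date.parse(b.createdAt) - Date.parse(a.createdAt));
    const latest = group[0];
    if (latest.conclusion === 'failure' || latest.conclusion === 'cancelled') {
      failures.push(name);
    }
  }
  return failures;
}

// 'nightly' failed but later succeeded, so only 'e2e' is reported.
console.log(unresolvedFailures([
  { workflowName: 'nightly', conclusion: 'failure', createdAt: '2024-01-01T02:00:00Z' },
  { workflowName: 'nightly', conclusion: 'success', createdAt: '2024-01-02T02:00:00Z' },
  { workflowName: 'e2e', conclusion: 'cancelled', createdAt: '2024-01-02T02:00:00Z' },
])); // → [ 'e2e' ]
```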