🚀 Cursor Git Workflow

Automated AI-powered code review system for GitHub pull requests. Get comprehensive feedback in minutes.

License: MIT

Overview

Cursor Git Workflow is a GitHub Actions-based tool that automatically reviews code changes using AI (GPT-4) and provides detailed feedback directly on pull requests. It helps development teams maintain high code quality standards through automated analysis.

Key Features

  • 🤖 AI-Powered Code Reviews - Leverages GPT-4 for intelligent code analysis
  • 📊 Quality Metrics - Provides objective scoring (0-10) for code changes
  • 💡 Actionable Feedback - Specific suggestions with line-level annotations
  • 🔧 Auto-Formatting - Optionally applies style fixes automatically
  • 📥 IDE Integration - Download feedback directly to Cursor/VS Code
  • ⚙️ Configurable - Customize review criteria and standards via YAML

Quick Start

See QUICKSTART_5MIN.md for detailed setup instructions.

Installation Summary

```bash
# 1. Clone the repository
git clone https://github.com/jxwalker/cursor-git-workflow
cd cursor-git-workflow

# 2. Run setup
./scripts/setup-environment.sh

# 3. Copy the workflow into your project
cp -r .github/workflows /your/project/
cp -r scripts/ci-cd /your/project/scripts/

# 4. Add OPENAI_API_KEY to your GitHub repository secrets

# 5. Create a pull request to trigger a review
```

How It Works

  1. Developer creates a pull request
  2. GitHub Actions triggers the review workflow
  3. AI analyzes code changes against configured standards
  4. Detailed feedback appears as PR comments
  5. Optional: Feedback downloads to developer's IDE
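The review step at the heart of this flow can be sketched in Python. Everything below is illustrative: the function name, prompt wording, and strictness handling are assumptions rather than the project's actual code — the real workflow assembles a prompt along these lines and sends it to the OpenAI API.

```python
# Illustrative sketch of step 3: turning a PR diff into a review prompt.
# The prompt wording and strictness levels here are assumptions, not the
# project's actual implementation.

def build_review_prompt(diff: str, strictness: str = "moderate") -> str:
    """Assemble the prompt that would be sent to the AI reviewer."""
    return (
        f"You are a {strictness} code reviewer. Score the change 0-10, "
        "flag issues with line numbers, and suggest concrete fixes.\n\n"
        f"Diff:\n{diff}"
    )

diff = "+def add(a, b):\n+    return a + b"
prompt = build_review_prompt(diff)
```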

Example Review Output

```markdown
## 🤖 AI Code Review

### 🟢 Overall Score: 9/10
### ✅ Ready to Merge: Yes
### 🚀 Production Readiness: 95%

**Summary:** Code demonstrates good structure and error handling...

### 🚨 Issues Found
- ⚠️ **High - Security** (Line 45): Potential SQL injection vulnerability
  💡 **Suggestion:** Use parameterized queries instead

### ✅ Good Practices Found
- Comprehensive error handling
- Well-documented functions
- Consistent code style
```
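A "ready to merge" gate on output like the above could be implemented by parsing the score out of the review comment. This is a hypothetical sketch — the regex, helper name, and default threshold (matching the `passing_score: 8` from the configuration section) are assumptions about how such a gate might work, not the workflow's actual code.

```python
import re

# Hypothetical helper: extract the overall score from a review comment
# and compare it with a configured passing score (8 assumed here).

def passes_review(comment: str, passing_score: int = 8) -> bool:
    match = re.search(r"Overall Score:\s*(\d+)/10", comment)
    if not match:
        return False  # no score found -> treat as not passing
    return int(match.group(1)) >= passing_score

comment = "### 🟢 Overall Score: 9/10\n### ✅ Ready to Merge: Yes"
```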

Requirements

  • GitHub repository
  • OpenAI API key (available from the OpenAI Platform)
  • Python 3.8+ (for local development tools)

Configuration

Basic Configuration

Add to GitHub repository secrets:

  • OPENAI_API_KEY - Your OpenAI API key

Advanced Configuration

Create .cursor-workflow.yml in your repository root:

```yaml
# AI Model Settings
ai:
  model: gpt-4  # or gpt-3.5-turbo for lower cost
  temperature: 0.1
  max_tokens: 2000

# Review Standards
review:
  strictness: moderate  # lenient, moderate, or strict
  passing_score: 8

# File Filtering
files:
  exclude:
    - "*.md"
    - "tests/*"
    - ".github/*"
```
See .cursor-workflow.example.yml for all options.
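To illustrate how the `files.exclude` patterns behave, here is a minimal sketch using Python's standard `fnmatch` module with the patterns from the example config above. How the real workflow matches paths is an assumption; note that `fnmatch`'s `*` also matches path separators, so `*.md` excludes Markdown files in any directory.

```python
from fnmatch import fnmatch

# Exclude patterns copied from the example config above. The matching
# semantics of the real workflow are an assumption; fnmatch is one
# plausible implementation.
EXCLUDE = ["*.md", "tests/*", ".github/*"]

def should_review(path: str) -> bool:
    """Return True if no exclude pattern matches the file path."""
    return not any(fnmatch(path, pattern) for pattern in EXCLUDE)
```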

Local Development Tools

Feedback Download

Monitor and download AI feedback locally:

```bash
# One-time check
./scripts/auto-download-feedback.sh

# Continuous monitoring
./scripts/auto-download-feedback.sh --watch
```

Background Monitoring

Run a daemon to monitor all PRs:

```bash
# Start monitoring
./scripts/feedback-daemon.sh start

# Check status
./scripts/feedback-daemon.sh status

# Stop monitoring
./scripts/feedback-daemon.sh stop
```
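Conceptually, the daemon is a polling loop. The sketch below shows the shape of such a loop in Python; `check_for_feedback` and `on_feedback` are hypothetical stand-ins for the script's GitHub API query and its download step, and the actual shell script's behavior may differ.

```python
import time

# Conceptual sketch of a feedback-polling daemon loop. The callback names
# and the 60-second default interval are assumptions for illustration.

def watch(check_for_feedback, on_feedback, interval=60.0, max_polls=None):
    """Poll for new feedback, invoking on_feedback for each item found."""
    polls = 0
    while max_polls is None or polls < max_polls:
        for comment in check_for_feedback():
            on_feedback(comment)
        polls += 1
        time.sleep(interval)
```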

Cost Considerations

Typical costs per pull request:

  • GPT-4: ~$0.03
  • GPT-3.5-turbo: ~$0.002

Monitor usage at OpenAI Platform.

Troubleshooting

Common Issues

Workflow not triggering?

  • Verify workflow file exists in .github/workflows/
  • Check Actions tab for error messages

No AI comments appearing?

  • Confirm OPENAI_API_KEY is set in repository secrets
  • Review workflow logs for specific errors

Rate limit errors?

  • Workflow includes automatic retry logic
  • Consider using GPT-3.5-turbo for higher rate limits
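The retry logic mentioned above typically takes the form of exponential backoff. This is a generic sketch of that pattern, not the workflow's actual code — the attempt count, delays, and the `RuntimeError` stand-in for a rate-limit error are all assumptions.

```python
import time

# Generic exponential-backoff retry sketch. The real workflow's attempt
# counts, delays, and error types are assumptions here.

def with_retries(call, attempts=3, base_delay=1.0):
    """Invoke call(), retrying with doubling delays on transient errors."""
    for attempt in range(attempts):
        try:
            return call()
        except RuntimeError:  # stand-in for an API rate-limit error
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...
```

Production retry loops usually also add random jitter to the delay so that many clients do not retry in lockstep.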

For detailed troubleshooting, see docs/TROUBLESHOOTING.md.

Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.

Development Setup

```bash
# Create a virtual environment
python -m venv venv
source venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Run tests
pytest
```

License

MIT License - see LICENSE file for details.

Support

Questions or bug reports? Please open an issue in this repository.

Built to help teams maintain high code quality standards through intelligent automation.
