
🚀 Workflow - AI-Powered Development Workflow System

An intelligent workflow system that generates structured implementation plans from natural language requests using AI and repository context analysis.

Python 3.8+ | License: MIT | Code style: black

✨ Features

  • 🤖 AI-Powered Planning: Converts natural language requests into detailed implementation plans
  • 📋 Structured Output: Generates JSON plans with phases, file targets, and validation steps
  • 🔗 Repository Integration: Automatically fetches and analyzes repository context via Gitea
  • 🌐 Cross-Platform: Works on Windows, Linux, and macOS
  • ⚡ Easy Setup: Simple configuration and one-command execution
  • 🚀 MCP Integration: Uses Model Context Protocol for efficient repository access
  • 💾 Smart Caching: File-based caching for improved performance

πŸ—οΈ Architecture

User Input β†’ Gitea-MCP β†’ Context Builder β†’ AI Model β†’ Implementation Plan

Core Components

  • workflow/main.py: Main orchestrator and entry point
  • workflow/trigger.py: Input parsing and validation
  • workflow/gitea_connector.py: Gitea repository context fetching via MCP
  • workflow/context_builder.py: Context combination and analysis
  • workflow/model_interface.py: AI model interface and plan generation
  • mcp/server.py: Gitea MCP server implementation
  • mcp/client.py: Simplified MCP client for repository operations

🚀 Quick Start

1. Setup

# Clone and navigate to the project
cd workflow

# Install dependencies and create secure config templates
python setup.py

# Configure your credentials (IMPORTANT: Use local config files)
# Linux/Mac:
cp config/config.sh config/local_config.sh
nano config/local_config.sh  # Edit with your actual credentials
source config/local_config.sh

# Windows:
copy config\config.bat config\local_config.bat
notepad config\local_config.bat  # Edit with your actual credentials
config\local_config.bat

🔒 Security Note: Never commit local_config.* files to version control. They contain your sensitive credentials and are automatically excluded by .gitignore.

2. Usage

cd workflow
python main.py

The system will guide you through an interactive workflow:

  1. Select Project: Choose from available projects or enter custom
  2. Select Branch: Choose from common branches or enter custom
  3. Enter Query: Describe what you want to implement
  4. Confirm & Execute: Review and run the workflow
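The numbered selection menus can be sketched as below. These helpers are illustrative only; the actual input handling lives in workflow/trigger.py and may differ.

```python
# Illustrative sketch of the numbered-menu prompts used by the
# interactive workflow; not the actual workflow/trigger.py code.

def parse_choice(choice: str, n: int):
    """Return a 0-based index if `choice` is a valid 1..n selection, else None."""
    if choice.isdigit() and 1 <= int(choice) <= n:
        return int(choice) - 1
    return None

def select_option(title, options):
    """Print a numbered menu and return the chosen value."""
    print(f"🔍 {title}:")
    for i, opt in enumerate(options, start=1):
        print(f"  {i}. {opt}")
    while True:
        raw = input(f"Select option (1-{len(options)}): ").strip()
        idx = parse_choice(raw, len(options))
        if idx is not None:
            return options[idx]
        print("Invalid selection, try again.")
```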

3. Example Session

πŸ” Select a project:
  1. Forge/maestro
  2. Test/Oracle
  3. Enter custom project...

Select option (1-3): 1
βœ… Selected project: Forge/maestro

πŸ” Select a branch:
  1. main
  2. master
  3. develop
  4. Enter custom branch...

Select option (1-4): 2
βœ… Selected branch: master

πŸ€– Your request: Add dark mode toggle to the UI

πŸš€ Proceed with workflow? (y/N): y

📋 Configuration

🔒 IMPORTANT: Always use local configuration files for your credentials:

Secure Configuration Method

  1. Create local config files:

    # Linux/Mac
    cp config/config.sh config/local_config.sh
    
    # Windows
    copy config\config.bat config\local_config.bat
  2. Edit with your actual credentials:

    • GITEA_URL: Your Gitea instance URL (e.g., https://gitea.example.com)
    • GITEA_ACCESS_TOKEN: Personal access token from Gitea Settings > Applications
    • GITMCP_SERVER_URL: Gitea-MCP server endpoint
    • OPENAI_API_BASE_URL: OpenAI-compatible API endpoint
  3. Alternative: Use .env file:

    cp config/env.example .env
    nano .env  # Edit with your credentials
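However the credentials are loaded, a startup check can fail fast when something is missing. The sketch below uses the variable names listed above; `validate_config` is a hypothetical helper, not the project's actual API.

```python
# Hypothetical startup check: verify all required settings are present
# before running the workflow. Variable names match the configuration
# options listed above.
import os

REQUIRED = [
    "GITEA_URL",
    "GITEA_ACCESS_TOKEN",
    "GITMCP_SERVER_URL",
    "OPENAI_API_BASE_URL",
]

def validate_config(env=None):
    """Return the required settings, or raise if any are missing or empty."""
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED if not env.get(name)]
    if missing:
        raise RuntimeError(f"Missing required settings: {', '.join(missing)}")
    return {name: env[name] for name in REQUIRED}
```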

⚠️ Security Warning:

  • Never commit local_config.* or .env files
  • Never edit the template files (config.sh, config.bat) with real credentials
  • Use HTTPS URLs when possible
  • Generate tokens with minimal required permissions

πŸ“ Output

The system generates:

  • plan.json: Structured implementation plan
  • Phase-based breakdown: Clear steps with file targets and validation
  • Context-aware suggestions: Based on actual repository analysis
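The exact plan.json schema isn't documented here; as an illustration, a phase-based plan might look like the following. The field names (phases, file_targets, validation) are assumptions, not the confirmed schema.

```python
# Illustrative shape of a phase-based plan; the real plan.json schema
# may differ. Field names are assumptions.
import json

sample_plan = {
    "request": "Add dark mode toggle to the UI",
    "phases": [
        {
            "name": "Phase 1: Theme state",
            "file_targets": ["src/theme.py"],
            "steps": ["Add a theme setting with light/dark values"],
            "validation": ["Setting persists across restarts"],
        },
        {
            "name": "Phase 2: UI toggle",
            "file_targets": ["src/ui/toolbar.py"],
            "steps": ["Add a toggle control bound to the theme setting"],
            "validation": ["Toggle switches the rendered theme"],
        },
    ],
}

def validate_plan(plan):
    """Minimal structural check for a phase-based plan."""
    phases = plan.get("phases")
    assert isinstance(phases, list) and phases, "plan needs at least one phase"
    for phase in phases:
        assert phase.get("name"), "each phase needs a name"
        assert isinstance(phase.get("file_targets"), list)
    return True
```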

📋 Dependencies

System Requirements

  • Python 3.8+ (3.11+ recommended)
  • Git (recommended for repository operations)
  • Access to Gitea instance (for repository context)
  • OpenAI-compatible LLM server (for plan generation)

Python Dependencies

Core Dependencies

requests>=2.26.0          # HTTP requests for API interactions
python-dotenv>=0.19.0     # Environment variable management
pydantic>=1.8.0           # JSON handling and validation
GitPython>=3.1.0          # Git operations (optional)

AI/LLM Integration

langchain>=0.0.267        # LangChain framework
langchain-openai>=0.0.1   # OpenAI integration
langchain-core>=0.0.1     # LangChain core components
langchain-community       # Community integrations (VLLMOpenAI)

MCP (Model Context Protocol)

mcp>=0.0.1                    # Base MCP package
langchain-mcp-adapters>=0.0.1 # LangChain MCP integration
python-multipart>=0.0.6       # Multipart request support

Server Components

aiohttp>=3.8.0            # Async HTTP server (for MCP server)
# asyncio ships with the Python 3.8+ standard library; no install needed

Optional Dependencies

Development Tools

pytest>=6.0.0             # Testing framework
pytest-cov>=2.0.0         # Coverage reporting
black>=22.0.0             # Code formatting
flake8>=4.0.0             # Code linting
mypy>=0.900               # Type checking

Performance & Monitoring

memory-profiler           # Memory usage profiling
psutil                    # System monitoring

Installation

Quick Install

# Install all dependencies
pip install -r requirements.txt

# Or use the setup script
python setup.py

Manual Installation

# Core dependencies
pip install requests python-dotenv pydantic GitPython

# AI/LLM components
pip install langchain langchain-openai langchain-core langchain-community

# MCP components
pip install mcp langchain-mcp-adapters python-multipart

# Server components
pip install aiohttp

Development Setup

# Install development dependencies
pip install pytest pytest-cov black flake8 mypy memory-profiler

# Or install from dev requirements
pip install -r requirements-dev.txt  # If available

External Services

Required Services

  • Gitea Server: Repository hosting and API access
  • OpenAI-compatible LLM: For plan generation (e.g., OpenAI API, local models via vLLM, Ollama)

Optional Services

  • Redis/Memcached: For advanced caching (future enhancement)
  • PostgreSQL/MySQL: For persistent storage (future enhancement)

πŸ“ Project Structure

workflow/
β”œβ”€β”€ workflow/               # Core workflow components
β”‚   β”œβ”€β”€ __init__.py
β”‚   β”œβ”€β”€ main.py            # Main entry point
β”‚   β”œβ”€β”€ trigger.py         # User input handling
β”‚   β”œβ”€β”€ gitea_connector.py # Repository context fetching
β”‚   β”œβ”€β”€ context_builder.py # Context combination
β”‚   └── model_interface.py # AI model interface
β”œβ”€β”€ mcp/                   # MCP server and client
β”‚   β”œβ”€β”€ __init__.py
β”‚   β”œβ”€β”€ server.py         # Gitea MCP server
β”‚   └── client.py         # Simplified MCP client
β”œβ”€β”€ utils/                 # Utility modules
β”‚   β”œβ”€β”€ __init__.py
β”‚   β”œβ”€β”€ banner.py         # UI utilities
β”‚   β”œβ”€β”€ context_optimizer.py
β”‚   └── file_cache.py     # Caching system
β”œβ”€β”€ config/               # Configuration files
β”‚   β”œβ”€β”€ config.sh         # Linux/Mac config
β”‚   └── config.bat        # Windows config
β”œβ”€β”€ docs/                 # Documentation
β”‚   β”œβ”€β”€ ARCHITECTURE.md   # System architecture
β”‚   └── DEVELOPMENT.md    # Development guide
β”œβ”€β”€ examples/             # Example files
β”œβ”€β”€ tests/                # Test suite
β”œβ”€β”€ .github/              # GitHub workflows
β”œβ”€β”€ requirements.txt      # Dependencies
β”œβ”€β”€ requirements-dev.txt  # Development dependencies
β”œβ”€β”€ setup.py             # Package setup
β”œβ”€β”€ .gitignore           # Git ignore rules
β”œβ”€β”€ LICENSE              # MIT License
└── README.md            # This file

🚀 Advanced Usage

Running MCP Server Separately

# Start the MCP server in background
python mcp/server.py &

# Run workflow with existing server
cd workflow
python main.py

Programmatic Usage

from workflow import WorkflowTrigger, GiteaMCPConnector, ContextBuilder, OpenAIModelInterface

# Initialize components
trigger = WorkflowTrigger()
connector = GiteaMCPConnector()
builder = ContextBuilder()
model = OpenAIModelInterface()

# Get user input
user_input = trigger.submit_issue("Forge/maestro,main,Add dark mode toggle")

# Fetch repository context
repo_context = connector.get_project_repository_context("Forge/maestro", "main")

# Build combined context
combined_context = builder.build_combined_context(user_input, repo_context)

# Generate plan
plan = model.generate_llm_template_and_send(combined_context)

Custom Configuration

# Create custom config
cp config/config.sh my_config.sh
# Edit my_config.sh with your settings
source my_config.sh

# Run with custom config
cd workflow
python main.py

🔧 Troubleshooting

Common Issues

  1. MCP Server Connection Failed

    # Check if server is running
    curl http://localhost:8080/version
    
    # Start server manually
    python mcp/server.py
  2. Gitea Authentication Failed

    # Verify token in config
    echo $GITEA_ACCESS_TOKEN
    
    # Test Gitea API access
    curl -H "Authorization: token $GITEA_ACCESS_TOKEN" $GITEA_URL/api/v1/user
  3. AI Model Connection Failed

    # Test model server
    curl $OPENAI_API_BASE_URL/models
    
    # Check model availability
    curl -X POST $OPENAI_API_BASE_URL/chat/completions \
      -H "Content-Type: application/json" \
      -d '{"model":"your-model","messages":[{"role":"user","content":"test"}]}'

Debug Mode

# Enable debug logging
export PYTHONPATH=.
cd workflow
python -c "
import logging
logging.basicConfig(level=logging.DEBUG)
from main import main
main()
"

📊 Performance

  • File Caching: Reduces API calls by 80-90% for repeated requests
  • Context Optimization: Handles repositories up to 10,000 files efficiently
  • Async Operations: MCP server supports concurrent requests
  • Memory Efficient: Streaming file processing for large repositories
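A file-based cache in the spirit of utils/file_cache.py might look like this minimal sketch; the real implementation and its API may differ.

```python
# Minimal sketch of file-based caching of repository API responses.
# Keys are hashed to filenames; entries expire after a TTL.
import hashlib
import json
import time
from pathlib import Path

class FileCache:
    def __init__(self, cache_dir=".cache", ttl_seconds=3600):
        self.dir = Path(cache_dir)
        self.dir.mkdir(parents=True, exist_ok=True)
        self.ttl = ttl_seconds

    def _path(self, key):
        digest = hashlib.sha256(key.encode()).hexdigest()
        return self.dir / f"{digest}.json"

    def get(self, key):
        """Return the cached value, or None if absent or expired."""
        path = self._path(key)
        if not path.exists():
            return None
        entry = json.loads(path.read_text())
        if time.time() - entry["saved_at"] > self.ttl:
            path.unlink()  # drop the expired entry
            return None
        return entry["value"]

    def set(self, key, value):
        entry = {"saved_at": time.time(), "value": value}
        self._path(key).write_text(json.dumps(entry))
```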

🔒 Security

This project follows security best practices:

  • πŸ” Credential Protection: All sensitive data in local config files (excluded from version control)
  • πŸ›‘οΈ Input Validation: Sanitized user inputs and API responses
  • 🌐 HTTPS Support: SSL/TLS validation for secure connections
  • πŸ“ Secure File Permissions: Restricted access to configuration files
  • 🚫 No Hardcoded Secrets: No credentials committed to repository
  • ⚑ Minimal Permissions: API tokens require only necessary access rights
  • πŸ”„ Regular Updates: Dependencies monitored for security vulnerabilities

Security Features

  • Automatic .gitignore rules for sensitive files
  • Cross-platform secure file permissions
  • Environment variable validation
  • SSL certificate verification
  • Request timeout protection
  • File size limits for uploads
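Several of these practices (timeouts, certificate verification, size limits) amount to a few lines around each outbound request. The sketch below uses only the standard library; the helper names are illustrative, not the project's actual API.

```python
# Illustrative request hardening: explicit timeout, TLS certificate
# verification, and a response size cap.
import json
import ssl
import urllib.request

MAX_RESPONSE_BYTES = 5 * 1024 * 1024  # cap downloads at 5 MiB

def auth_headers(token):
    """Gitea token auth header; keep the token itself out of logs."""
    return {"Authorization": f"token {token}"}

def fetch_json(url, token, timeout=10):
    """GET a JSON endpoint with timeout, TLS verification, and a size limit."""
    req = urllib.request.Request(url, headers=auth_headers(token))
    ctx = ssl.create_default_context()  # verifies certificates by default
    with urllib.request.urlopen(req, timeout=timeout, context=ctx) as resp:
        body = resp.read(MAX_RESPONSE_BYTES + 1)
        if len(body) > MAX_RESPONSE_BYTES:
            raise ValueError("response exceeds size limit")
        return json.loads(body)
```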

📖 For detailed security guidelines, see docs/SECURITY.md

🤝 Contributing

We welcome contributions! Please see our Development Guide for details.

Quick Start for Contributors

# Fork and clone the repository
git clone https://github.com/your-username/workflow.git
cd workflow

# Create virtual environment
python -m venv venv
source venv/bin/activate  # Linux/Mac
# or venv\Scripts\activate  # Windows

# Install development dependencies
pip install -r requirements-dev.txt

# Run tests
pytest tests/

# Format code
black .

# Submit pull request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments
