diff --git a/.claudeignore b/.claudeignore
index e2698d3b0..1450cf91a 100644
--- a/.claudeignore
+++ b/.claudeignore
@@ -1,9 +1,6 @@
# Claude Code ignore file
# Directories and files that Claude Code should not analyze
-# Archived demos - outdated system, should not be used for documentation
-demos/rfe-builder-archived/
-
# Git internal files
.git/
diff --git a/.gitignore b/.gitignore
index 6d23e808c..32afc53a4 100644
--- a/.gitignore
+++ b/.gitignore
@@ -102,8 +102,6 @@ venv.bak/
.dmypy.json
dmypy.json
-demos/rfe-builder/.streamlit/secrets.toml
-
.claude/settings.local.json
# macOS system files
diff --git a/archive/mcp_client_integration/INSTALLATION.md b/archive/mcp_client_integration/INSTALLATION.md
deleted file mode 100644
index e706db7a1..000000000
--- a/archive/mcp_client_integration/INSTALLATION.md
+++ /dev/null
@@ -1,415 +0,0 @@
-# MCP Client Integration - Installation Guide
-
-This guide covers installation and usage of the MCP Client Integration library for various deployment scenarios.
-
-## Quick Start
-
-### Prerequisites
-
-- Python 3.8 or higher
-- Virtual environment (recommended)
-
-### Basic Installation
-
-```bash
-# From the vTeam project root
-cd src/mcp_client_integration
-pip install -e .
-```
-
-### Development Installation
-
-```bash
-# Install with development dependencies
-cd src/mcp_client_integration
-pip install -e ".[dev]"
-```
-
-## Installation Scenarios
-
-### 1. Local Development
-
-For local development and testing:
-
-```bash
-# Clone the repository
-git clone https://github.com/red-hat-data-services/vTeam.git
-cd vTeam/src/mcp_client_integration
-
-# Create virtual environment
-python -m venv venv
-source venv/bin/activate # Linux/Mac
-# or
-venv\Scripts\activate # Windows
-
-# Install in development mode
-pip install -e ".[dev]"
-```
-
-### 2. Integration in Other Projects
-
-#### Option A: Direct Installation from Source
-
-Add to your `requirements.txt`:
-```
--e git+https://github.com/red-hat-data-services/vTeam.git#egg=mcp-client-integration&subdirectory=src/mcp_client_integration
-```
-
-#### Option B: Local Path Installation
-
-```bash
-# Install from local path
-pip install -e /path/to/vTeam/src/mcp_client_integration
-```
-
-#### Option C: Copy Package
-
-Copy the entire `src/mcp_client_integration` directory to your project and install:
-
-```bash
-pip install -e ./mcp_client_integration
-```
-
-### 3. Production Deployment
-
-For production deployments, create a wheel package:
-
-```bash
-cd src/mcp_client_integration
-pip install build
-python -m build
-pip install dist/mcp_client_integration-1.0.0-py3-none-any.whl
-```
-
-## Configuration
-
-### Environment Variables
-
-Set up MCP servers via environment variable:
-
-```bash
-export MCP_SERVERS='{
- "atlassian": "https://mcp-atlassian.apps.cluster.com/sse",
- "github": "https://mcp-github.apps.cluster.com/sse",
- "confluence": "mcp-confluence.default.svc.cluster.local:8080"
-}'
-```
-
-### Security Configuration
-
-For production deployments, use production mode for enhanced security:
-
-```python
-from mcp_client_integration.common import MCPConfigurationManager
-
-# Production mode - strict security validation
-config_manager = MCPConfigurationManager(production_mode=True)
-
-# Development mode - more permissive for testing
-config_manager = MCPConfigurationManager(production_mode=False)
-```
-
-⚠️ **Security Note**: Always use `production_mode=True` in production environments for enhanced security validation. See [SECURITY.md](SECURITY.md) for detailed security guidelines.
-
-### Advanced Configuration
-
-```bash
-export MCP_SERVERS='{
- "atlassian": {
- "endpoint": "https://mcp-atlassian.example.com/sse",
- "timeout": 60,
- "connection_type": "external_route",
- "enabled": true,
- "metadata": {"team": "platform"}
- },
- "confluence": {
- "endpoint": "mcp-confluence.default.svc.cluster.local:8080",
- "timeout": 30,
- "connection_type": "cluster_service",
- "enabled": true
- }
-}'
-```
-
-## Usage Examples
-
-### Basic Usage
-
-```python
-import asyncio
-from mcp_client_integration import SimpleMCPClient
-
-async def main():
- # Initialize client
- client = SimpleMCPClient()
-
- # Connect to all servers
- await client.connect_all()
-
- # Send queries
- response = await client.query("What Jira tickets are assigned to me?")
- print(response)
-
- # Health check
- health = await client.health_check()
- print(f"Server health: {health}")
-
- # Cleanup
- await client.disconnect_all()
-
-# Run the client
-asyncio.run(main())
-```
-
-### LlamaIndex Integration
-
-```python
-from mcp_client_integration import MCPLlamaIndexTool
-
-# Create tool
-mcp_tool = MCPLlamaIndexTool()
-
-# Use with LlamaIndex (requires llama-index to be installed separately)
-try:
- from llama_index.core.agent import ReActAgent
-
- # Create agent with MCP tool
- agent = ReActAgent.from_tools([mcp_tool.to_llama_index_tool()])
- response = agent.chat("Search for recent Jira tickets")
- print(response)
-except ImportError:
- # Use directly without LlamaIndex
- result = mcp_tool("Search for recent Jira tickets")
- print(result)
-```
-
-### Endpoint Validation
-
-```python
-import asyncio
-from mcp_client_integration import MCPEndpointConnector
-
-async def validate_endpoints():
- connector = MCPEndpointConnector()
-
- # Validate endpoint formats
- valid = connector.validate_endpoint_config("https://mcp-server.com/sse")
- print(f"Endpoint valid: {valid}")
-
- # Test connectivity
- result = await connector.test_connectivity("https://mcp-server.com/sse")
- print(f"Connectivity: {result}")
-
-asyncio.run(validate_endpoints())
-```
-
-## Integration with demos/rfe-builder
-
-The MCP client integration is designed to work seamlessly with the RFE builder demo:
-
-```python
-# In demos/rfe-builder application
-from mcp_client_integration import SimpleMCPClient, MCPLlamaIndexTool
-
-# Use in your RFE workflow
-async def enhance_rfe_with_data():
- client = SimpleMCPClient()
- await client.connect_all()
-
- # Query for related tickets
- jira_data = await client.query("Find related tickets for this RFE", "atlassian")
-
- # Process with LlamaIndex
- tool = MCPLlamaIndexTool()
- enhanced_result = await tool.call("Analyze this data for RFE insights")
-
- return enhanced_result
-```
-
-## Docker Deployment
-
-### Dockerfile Example
-
-```dockerfile
-FROM python:3.11-slim
-
-WORKDIR /app
-
-# Copy MCP client integration
-COPY src/mcp_client_integration ./mcp_client_integration
-
-# Install dependencies
-RUN pip install -e ./mcp_client_integration
-
-# Copy your application code
-COPY your_app ./your_app
-
-# Set environment variables
-ENV MCP_SERVERS='{"atlassian": "https://mcp-atlassian.svc.cluster.local/sse"}'
-
-CMD ["python", "-m", "your_app"]
-```
-
-### Docker Compose Example
-
-```yaml
-version: '3.8'
-services:
- mcp-client-app:
- build: .
- environment:
-      - 'MCP_SERVERS={"atlassian": "https://mcp-atlassian.apps.cluster.local/sse"}'
- depends_on:
- - mcp-atlassian-server
-
- mcp-atlassian-server:
- image: mcp-atlassian:latest
- ports:
- - "8080:8080"
-```
-
-## Kubernetes Deployment
-
-### ConfigMap for Configuration
-
-```yaml
-apiVersion: v1
-kind: ConfigMap
-metadata:
- name: mcp-client-config
-data:
- MCP_SERVERS: |
- {
- "atlassian": "https://mcp-atlassian.apps.cluster.local/sse",
- "github": "mcp-github.default.svc.cluster.local:8080"
- }
-```
-
-### Deployment with ConfigMap
-
-```yaml
-apiVersion: apps/v1
-kind: Deployment
-metadata:
- name: mcp-client-app
-spec:
- replicas: 1
- selector:
- matchLabels:
- app: mcp-client-app
- template:
- metadata:
- labels:
- app: mcp-client-app
- spec:
- containers:
- - name: app
- image: your-app:latest
- envFrom:
- - configMapRef:
- name: mcp-client-config
-```
-
-## Testing
-
-### Run Tests
-
-```bash
-# Run all tests
-pytest
-
-# Run with coverage
-pytest --cov=. --cov-report=html
-
-# Run specific test types
-pytest tests/unit/
-pytest tests/integration/
-```
-
-### Mock Mode for Testing
-
-```python
-# Use mock mode for testing
-client = SimpleMCPClient(mock=True)
-await client.connect_all() # Uses mock connections
-
-# Mock responses for specific tests
-from unittest.mock import patch
-with patch.object(client.connection_pool, 'send_message') as mock_send:
- mock_send.return_value = {"result": "test data"}
- response = await client.query("test query")
-```
-
-## Troubleshooting
-
-### Common Issues
-
-1. **Import Errors**
- ```bash
- # Ensure package is installed
- pip install -e .
-
- # Check Python path
- python -c "import mcp_client_integration; print(mcp_client_integration.__file__)"
- ```
-
-2. **Configuration Errors**
- ```bash
- # Validate JSON configuration
-   python -c 'import json, os; json.loads(os.environ["MCP_SERVERS"]); print("Valid JSON")'
-
- # Test basic connectivity
- python -c "from mcp_client_integration import MCPEndpointConnector; print('Import successful')"
- ```
-
-3. **Connection Issues**
- ```python
- # Debug connectivity
- import asyncio
- from mcp_client_integration import MCPEndpointConnector
-
- async def debug():
- connector = MCPEndpointConnector()
- result = await connector.test_connectivity("your-endpoint")
- print(result)
-
- asyncio.run(debug())
- ```
-
-### Logging
-
-Enable debug logging:
-
-```python
-import logging
-logging.basicConfig(level=logging.DEBUG)
-
-# MCP-specific logging
-mcp_logger = logging.getLogger('mcp_client_integration')
-mcp_logger.setLevel(logging.DEBUG)
-```
-
-## Dependencies
-
-### Core Dependencies
-- `httpx[http2]>=0.28.1` - HTTP client with HTTP/2 support
-- `websockets>=13.1` - WebSocket support for SSE
-- `certifi>=2024.8.30` - Up-to-date CA certificates for secure connections
-- `cryptography>=41.0.0` - Cryptographic operations for secure connections
-
-### Development Dependencies
-- `pytest>=8.3.5` - Testing framework
-- `pytest-asyncio>=0.24.0` - Async test support
-- `pytest-cov>=5.0.0` - Test coverage reporting
-- `bandit` - Security linting
-- `safety` - Dependency vulnerability scanning
-- Development tools (black, isort, flake8, mypy)
-
-### Optional Dependencies
-- LlamaIndex components (install separately as needed)
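-
-For consumers pinning the core set themselves, a `requirements.txt` fragment mirroring the versions listed above might look like this (adjust versions to your environment):
-
-```
-httpx[http2]>=0.28.1
-websockets>=13.1
-certifi>=2024.8.30
-cryptography>=41.0.0
-```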
-
-## Support
-
-For issues and questions:
-- GitHub Issues: https://github.com/red-hat-data-services/vTeam/issues
-- Documentation: See README.md in the package directory
-- Examples: Check the `demos/rfe-builder` integration
\ No newline at end of file
diff --git a/archive/mcp_client_integration/README.md b/archive/mcp_client_integration/README.md
deleted file mode 100644
index 2be5e922f..000000000
--- a/archive/mcp_client_integration/README.md
+++ /dev/null
@@ -1,176 +0,0 @@
-# MCP Client Integration
-
-A Python library for Model Context Protocol (MCP) client integration with llama-index and other AI workflows.
-
-## Features
-
-- **Multi-server MCP client support** with JSON configuration
-- **Connection pooling** and health monitoring
-- **OpenShift service discovery** patterns (external routes vs cluster services)
-- **Standardized error handling** and validation
-- **LlamaIndex integration** for AI workflows
-- **Test-driven development** with comprehensive test coverage
-
-## Installation
-
-### As a dependency in your project
-
-```bash
-# Install from source (development)
-pip install -e /path/to/vTeam/src/mcp_client_integration
-
-# Or add to your requirements.txt
--e git+https://github.com/red-hat-data-services/vTeam.git#egg=mcp-client-integration&subdirectory=src/mcp_client_integration
-```
-
-### For development
-
-```bash
-cd src/mcp_client_integration
-pip install -e ".[dev]"
-```
-
-## Quick Start
-
-### Basic MCP Client Usage
-
-```python
-import asyncio
-from mcp_client_integration import SimpleMCPClient
-
-async def main():
- # Initialize client with JSON configuration from environment
- client = SimpleMCPClient()
-
- # Connect to all configured MCP servers
- await client.connect_all()
-
- # Send queries with automatic capability routing
- response = await client.query("What Jira tickets are assigned to me?")
-
- # Health check
- health = await client.health_check()
- print(f"Server health: {health}")
-
- # Cleanup
- await client.disconnect_all()
-
-# Run the client
-asyncio.run(main())
-```
-
-### Configuration
-
-Set up MCP servers via environment variable:
-
-```bash
-export MCP_SERVERS='{
- "atlassian": "https://mcp-atlassian.apps.cluster.com/sse",
- "github": "https://mcp-github.apps.cluster.com/sse",
- "confluence": "mcp-confluence.default.svc.cluster.local:8080"
-}'
-```
-
-### LlamaIndex Integration
-
-```python
-from mcp_client_integration import MCPLlamaIndexTool
-
-# Create LlamaIndex tool
-mcp_tool = MCPLlamaIndexTool()
-
-# Add to your LlamaIndex agent
-from llama_index.core.agent import ReActAgent
-
-agent = ReActAgent.from_tools([mcp_tool.to_llama_index_tool()])
-response = agent.chat("Search for recent Jira tickets")
-```
-
-## Architecture
-
-### Core Components
-
-- **SimpleMCPClient**: Main client class with multi-server support
-- **MCPConnectionPool**: Connection management and health monitoring
-- **MCPConfigurationManager**: JSON configuration loading and validation
-- **MCPEndpointConnector**: Endpoint validation and connectivity testing
-- **MCPLlamaIndexTool**: LlamaIndex integration tool
-
-### Common Utilities
-
-- **Connection Management**: Standardized connection interfaces and pooling
-- **Validation**: Endpoint and configuration validation utilities
-- **Error Handling**: Structured error handling with context
-- **Configuration**: Unified configuration management
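-
-A minimal sketch tying these components together (method names as documented in this README; treat this as illustrative, not a verified API surface):
-
-```python
-import asyncio
-from mcp_client_integration import SimpleMCPClient
-
-async def run():
-    client = SimpleMCPClient()  # reads MCP_SERVERS from the environment
-    await client.connect_all()
-    try:
-        # health_check() reports per-server health, as in the Quick Start above
-        print(await client.health_check())
-        print(await client.query("What Jira tickets are assigned to me?"))
-    finally:
-        await client.disconnect_all()
-
-asyncio.run(run())
-```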
-
-## Testing
-
-```bash
-# Run tests
-pytest
-
-# With coverage
-pytest --cov=. --cov-report=html
-
-# Run specific test types
-pytest tests/unit/
-pytest tests/integration/
-```
-
-## Development
-
-### Code Quality
-
-```bash
-# Format code
-black .
-isort .
-
-# Lint
-flake8 .
-
-# Type check
-mypy .
-```
-
-### Contributing
-
-1. Follow TDD methodology - write tests first
-2. Maintain >90% unit test coverage, >80% integration test coverage
-3. Use common utilities to avoid code duplication
-4. Follow the established error handling patterns
-
-## Configuration Examples
-
-### Simple Configuration
-
-```json
-{
- "atlassian": "https://mcp-atlassian.example.com/sse",
- "github": "https://mcp-github.example.com/sse"
-}
-```
-
-### Advanced Configuration
-
-```json
-{
- "atlassian": {
- "endpoint": "https://mcp-atlassian.example.com/sse",
- "timeout": 60,
- "connection_type": "external_route",
- "enabled": true,
- "metadata": {"team": "platform"}
- },
- "confluence": {
- "endpoint": "mcp-confluence.default.svc.cluster.local:8080",
- "timeout": 30,
- "connection_type": "cluster_service",
- "enabled": true
- }
-}
-```
-
-## License
-
-MIT License - see LICENSE file for details.
\ No newline at end of file
diff --git a/archive/mcp_client_integration/RFE_BUILDER_INTEGRATION.md b/archive/mcp_client_integration/RFE_BUILDER_INTEGRATION.md
deleted file mode 100644
index 669452202..000000000
--- a/archive/mcp_client_integration/RFE_BUILDER_INTEGRATION.md
+++ /dev/null
@@ -1,1477 +0,0 @@
-# MCP Client Integration with RFE Builder - Step-by-Step Guide
-
-This document provides detailed, granular steps to integrate the MCP Client Integration library with the `demos/rfe-builder` application.
-
-## Overview
-
-The integration will enable the RFE Builder to:
-- Connect to MCP servers (Atlassian, GitHub, Confluence, etc.)
-- Fetch real-time data from external systems
-- Enhance RFE content with actual project data
-- Provide contextual information for better RFE quality
-
-## Prerequisites
-
-Before starting the integration:
-
-1. **RFE Builder Setup**: The `demos/rfe-builder` application should be running
-2. **MCP Client Library**: Installed and tested (from `src/mcp_client_integration`)
-3. **MCP Servers**: At least one MCP server (Atlassian, GitHub, etc.) deployed and accessible
-4. **Python Environment**: Python 3.11+ with virtual environment
-
-## Step 1: Install MCP Client Integration in RFE Builder
-
-### 1.1 Update RFE Builder Dependencies
-
-First, add the MCP client integration to the RFE Builder's `pyproject.toml`:
-
-```bash
-cd demos/rfe-builder
-```
-
-Edit `pyproject.toml`:
-
-```toml
-[project]
-name = "backend"
-version = "0.1.0"
-description = "RHOAI AI Feature Sizing Backend with Multi-Agent RAG"
-readme = "README.md"
-requires-python = ">=3.11,<3.14"
-dependencies = [
- # ... existing dependencies ...
-
- # MCP Client Integration
- "mcp-client-integration",
- # OR for local development:
- # {path = "../../src/mcp_client_integration", develop = true},
-]
-```
-
-### 1.2 Install the Local Package
-
-For development, install the local MCP client integration:
-
-```bash
-# From demos/rfe-builder directory
-uv add --editable ../../src/mcp_client_integration
-```
-
-Or manually add to pyproject.toml:
-
-```toml
-[tool.uv.sources]
-mcp-client-integration = { path = "../../src/mcp_client_integration", editable = true }
-```
-
-### 1.3 Verify Installation
-
-Test the installation:
-
-```bash
-cd demos/rfe-builder
-uv run python -c "from mcp_client_integration import SimpleMCPClient; print('✅ MCP Integration installed successfully')"
-```
-
-## Step 2: Configure MCP Servers
-
-### 2.1 Create MCP Configuration
-
-Create or update the `.env` file in `demos/rfe-builder/src/`:
-
-```bash
-# MCP Server Configuration
-MCP_SERVERS='{
- "atlassian": "https://mcp-atlassian-route.apps.cluster.com/sse",
- "github": "https://mcp-github-route.apps.cluster.com/sse",
- "confluence": "mcp-confluence.vteam-mcp.svc.cluster.local:8000"
-}'
-
-# MCP Security Settings
-MCP_PRODUCTION_MODE=false
-MCP_VERIFY_SSL=true
-MCP_MAX_CONNECTIONS=5
-MCP_TIMEOUT=30
-```
-
-### 2.2 Environment Variable Validation
-
-Add validation script `demos/rfe-builder/src/mcp_config_validator.py`:
-
-```python
-#!/usr/bin/env python3
-"""
-MCP Configuration Validator for RFE Builder
-
-Validates MCP server configuration before application startup.
-"""
-
-import os
-import json
-import logging
-from typing import Dict, Any, Optional
-
-from mcp_client_integration.common import (
- MCPConfigurationManager,
- MCPSecurityValidator,
- MCPConfigurationError
-)
-
-logger = logging.getLogger(__name__)
-
-class RFEBuilderMCPConfig:
- """MCP configuration validator for RFE Builder"""
-
- def __init__(self, production_mode: Optional[bool] = None):
- """
- Initialize MCP configuration for RFE Builder.
-
- Args:
- production_mode: Override production mode detection
- """
- # Auto-detect production mode from environment
- if production_mode is None:
- production_mode = os.getenv("MCP_PRODUCTION_MODE", "false").lower() == "true"
-
- self.production_mode = production_mode
- self.config_manager = MCPConfigurationManager(production_mode=production_mode)
- self.security_validator = MCPSecurityValidator(production_mode=production_mode)
-
- logger.info(f"MCP Configuration initialized (production_mode={production_mode})")
-
- def validate_configuration(self) -> Dict[str, Any]:
- """
- Validate current MCP configuration.
-
- Returns:
- Dict with validation results and configuration details
- """
- try:
- # Load and validate configuration
- config = self.config_manager.load_configuration()
-
- # Get configuration summary
- summary = self.config_manager.get_configuration_summary(config)
-
- # Validate security
- servers = config.get_server_endpoints()
- security_result = self.security_validator.validate_configuration_security(servers)
-
- return {
- "valid": True,
- "production_mode": self.production_mode,
- "summary": summary,
- "security_validation": security_result.to_dict(),
- "servers": servers
- }
-
- except MCPConfigurationError as e:
- logger.error(f"MCP configuration validation failed: {e}")
- return {
- "valid": False,
- "error": str(e),
- "production_mode": self.production_mode
- }
-
- def get_mcp_client(self) -> 'SimpleMCPClient':
- """
- Get configured MCP client for RFE Builder.
-
- Returns:
- Configured SimpleMCPClient instance
- """
- from mcp_client_integration import SimpleMCPClient
-
- return SimpleMCPClient()
-
-def validate_mcp_config_for_rfe_builder() -> Dict[str, Any]:
- """
- Validate MCP configuration for RFE Builder startup.
-
- Returns:
- Configuration validation results
- """
- validator = RFEBuilderMCPConfig()
- return validator.validate_configuration()
-
-if __name__ == "__main__":
- # Command-line validation
- result = validate_mcp_config_for_rfe_builder()
-
- if result["valid"]:
- print("✅ MCP Configuration is valid")
- print(f"Production Mode: {result['production_mode']}")
- print(f"Servers: {list(result['servers'].keys())}")
- else:
- print("❌ MCP Configuration validation failed")
- print(f"Error: {result['error']}")
- exit(1)
-```
-
-### 2.3 Test Configuration
-
-```bash
-cd demos/rfe-builder
-uv run python src/mcp_config_validator.py
-```
-
-## Step 3: Create MCP Service Layer
-
-### 3.1 Create MCP Service Module
-
-Create `demos/rfe-builder/src/services/mcp_service.py`:
-
-```python
-#!/usr/bin/env python3
-"""
-MCP Service Layer for RFE Builder
-
-Provides high-level MCP operations for RFE Builder workflows.
-"""
-
-import logging
-import asyncio
-from typing import Dict, Any, List, Optional, Union
-from datetime import datetime
-
-from mcp_client_integration import SimpleMCPClient
-from mcp_client_integration.common import MCPError, MCPConnectionError
-
-logger = logging.getLogger(__name__)
-
-class RFEBuilderMCPService:
- """
- High-level MCP service for RFE Builder operations.
-
- This service provides RFE-specific operations using MCP servers.
- """
-
- def __init__(self, auto_connect: bool = True):
- """
- Initialize MCP service for RFE Builder.
-
- Args:
- auto_connect: Whether to automatically connect to MCP servers
- """
- self.client = SimpleMCPClient()
- self._connected = False
- self._connection_status = {}
-
-        if auto_connect:
-            # Note: create_task() requires a running event loop; construct this
-            # service from async code, or call _initialize_connections() directly.
-            asyncio.create_task(self._initialize_connections())
-
- async def _initialize_connections(self) -> None:
- """Initialize connections to all configured MCP servers."""
- try:
- await self.client.connect_all()
- self._connected = True
- self._connection_status = await self.client.health_check()
-
- logger.info(f"MCP Service initialized. Healthy servers: {sum(self._connection_status.values())}")
-
- except Exception as e:
- logger.error(f"Failed to initialize MCP connections: {e}")
- self._connected = False
-
- async def get_connection_status(self) -> Dict[str, Any]:
- """
- Get current MCP connection status.
-
- Returns:
- Dictionary with connection status information
- """
- if not self._connected:
- await self._initialize_connections()
-
- status = self.client.get_server_status()
- health = await self.client.health_check()
-
- return {
- "connected": self._connected,
- "servers": status,
- "health": health,
- "healthy_count": sum(health.values()),
- "total_count": len(health),
- "last_check": datetime.now().isoformat()
- }
-
- async def search_jira_tickets(self,
- project_key: Optional[str] = None,
- query: Optional[str] = None,
- max_results: int = 10) -> Dict[str, Any]:
- """
- Search for JIRA tickets related to RFE context.
-
- Args:
- project_key: JIRA project key to search in
- query: Free-text search query
- max_results: Maximum number of results to return
-
- Returns:
- Dictionary with search results and metadata
- """
- try:
- # Build search query for Atlassian MCP server
- search_params = {
- "action": "search_tickets",
- "project": project_key,
- "query": query,
- "max_results": max_results
- }
-
- # Remove None values
- search_params = {k: v for k, v in search_params.items() if v is not None}
-
- # Query Atlassian MCP server
- response = await self.client.query(
- f"Search JIRA tickets: {query or 'all tickets'}",
- capability="atlassian"
- )
-
- return {
- "success": True,
- "tickets": response.get("data", []),
- "query": search_params,
- "server": "atlassian",
- "timestamp": datetime.now().isoformat()
- }
-
- except MCPConnectionError as e:
- logger.error(f"Failed to search JIRA tickets: {e}")
- return {
- "success": False,
- "error": f"Connection error: {e}",
- "tickets": [],
- "fallback_used": False
- }
-
- except MCPError as e:
- logger.error(f"MCP error searching JIRA tickets: {e}")
- return {
- "success": False,
- "error": f"MCP error: {e}",
- "tickets": []
- }
-
- async def get_github_repository_info(self,
- repo_owner: str,
- repo_name: str) -> Dict[str, Any]:
- """
- Get GitHub repository information for RFE context.
-
- Args:
- repo_owner: GitHub repository owner
- repo_name: GitHub repository name
-
- Returns:
- Dictionary with repository information
- """
- try:
- query = f"Get repository information for {repo_owner}/{repo_name}"
-
- response = await self.client.query(query, capability="github")
-
- return {
- "success": True,
- "repository": response.get("data", {}),
- "owner": repo_owner,
- "name": repo_name,
- "server": "github",
- "timestamp": datetime.now().isoformat()
- }
-
- except MCPError as e:
- logger.error(f"Failed to get GitHub repository info: {e}")
- return {
- "success": False,
- "error": str(e),
- "repository": {}
- }
-
- async def search_confluence_docs(self,
- search_query: str,
- space_key: Optional[str] = None,
- max_results: int = 5) -> Dict[str, Any]:
- """
- Search Confluence documentation for RFE context.
-
- Args:
- search_query: Search query for Confluence
- space_key: Optional Confluence space to search in
- max_results: Maximum number of results
-
- Returns:
- Dictionary with search results
- """
- try:
- query_text = f"Search Confluence for: {search_query}"
- if space_key:
- query_text += f" in space {space_key}"
-
- response = await self.client.query(query_text, capability="confluence")
-
- return {
- "success": True,
- "documents": response.get("data", []),
- "query": search_query,
- "space": space_key,
- "server": "confluence",
- "timestamp": datetime.now().isoformat()
- }
-
- except MCPError as e:
- logger.error(f"Failed to search Confluence: {e}")
- return {
- "success": False,
- "error": str(e),
- "documents": []
- }
-
- async def get_contextual_data_for_rfe(self,
- rfe_title: str,
- rfe_description: str,
- project_context: Optional[Dict] = None) -> Dict[str, Any]:
- """
- Get contextual data from all MCP servers for an RFE.
-
- Args:
- rfe_title: Title of the RFE
- rfe_description: Description of the RFE
- project_context: Optional project context (repo, JIRA project, etc.)
-
- Returns:
- Aggregated contextual data from all available MCP servers
- """
- contextual_data = {
- "rfe_title": rfe_title,
- "rfe_description": rfe_description,
- "project_context": project_context or {},
- "data_sources": {},
- "summary": {},
- "timestamp": datetime.now().isoformat()
- }
-
- # Get connection status
- status = await self.get_connection_status()
- healthy_servers = [server for server, healthy in status["health"].items() if healthy]
-
- if not healthy_servers:
- logger.warning("No healthy MCP servers available for contextual data")
- contextual_data["summary"]["error"] = "No healthy MCP servers available"
- return contextual_data
-
- # Parallel data collection from available servers
- tasks = []
-
- # JIRA tickets if Atlassian is available
- if "atlassian" in healthy_servers:
- tasks.append(("jira_tickets", self.search_jira_tickets(
- query=f"{rfe_title} {rfe_description}"[:100], # Limit query length
- max_results=5
- )))
-
- # GitHub repository info if GitHub is available and project context provided
- if "github" in healthy_servers and project_context:
- repo_owner = project_context.get("github_owner")
- repo_name = project_context.get("github_repo")
- if repo_owner and repo_name:
- tasks.append(("github_repo", self.get_github_repository_info(
- repo_owner, repo_name
- )))
-
- # Confluence documentation if Confluence is available
- if "confluence" in healthy_servers:
- tasks.append(("confluence_docs", self.search_confluence_docs(
- search_query=f"{rfe_title} {rfe_description}"[:100],
- max_results=3
- )))
-
- # Execute all queries in parallel
- if tasks:
- results = await asyncio.gather(*[task[1] for task in tasks], return_exceptions=True)
-
-            for data_type, result in zip((task[0] for task in tasks), results):
- if isinstance(result, Exception):
- logger.error(f"Error fetching {data_type}: {result}")
- contextual_data["data_sources"][data_type] = {
- "success": False,
- "error": str(result)
- }
- else:
- contextual_data["data_sources"][data_type] = result
-
- # Generate summary
- successful_sources = [k for k, v in contextual_data["data_sources"].items() if v.get("success")]
- contextual_data["summary"] = {
- "healthy_servers": healthy_servers,
- "successful_sources": successful_sources,
- "total_data_points": sum(
-                len(v.get("tickets") or v.get("documents")
-                    or ([v["repository"]] if v.get("repository") else []))
- for v in contextual_data["data_sources"].values()
- if v.get("success")
- )
- }
-
- return contextual_data
-
- async def close(self) -> None:
- """Close MCP connections."""
- if self._connected:
- await self.client.disconnect_all()
- self._connected = False
- logger.info("MCP Service connections closed")
-
-# Global MCP service instance
-_mcp_service: Optional[RFEBuilderMCPService] = None
-
-def get_mcp_service() -> RFEBuilderMCPService:
- """
- Get or create the global MCP service instance.
-
- Returns:
- RFEBuilderMCPService instance
- """
- global _mcp_service
- if _mcp_service is None:
- _mcp_service = RFEBuilderMCPService()
- return _mcp_service
-
-async def initialize_mcp_service() -> RFEBuilderMCPService:
- """
- Initialize MCP service for the application.
-
- Returns:
- Initialized RFEBuilderMCPService
- """
- service = get_mcp_service()
- if not service._connected:
- await service._initialize_connections()
- return service
-```
-
-### 3.2 Create Services Directory
-
-```bash
-mkdir -p demos/rfe-builder/src/services
-touch demos/rfe-builder/src/services/__init__.py
-```
-
-Add to `demos/rfe-builder/src/services/__init__.py`:
-
-```python
-"""
-Services package for RFE Builder
-
-Provides high-level service interfaces for external system integration.
-"""
-
-from .mcp_service import (
- RFEBuilderMCPService,
- get_mcp_service,
- initialize_mcp_service
-)
-
-__all__ = [
- "RFEBuilderMCPService",
- "get_mcp_service",
- "initialize_mcp_service"
-]
-```
-
-## Step 4: Integrate with RFE Builder Workflow
-
-### 4.1 Enhance the RFE Builder Workflow
-
-Edit `demos/rfe-builder/src/rfe_builder_workflow.py` to integrate MCP:
-
-```python
-# Add these imports at the top
-from .services.mcp_service import get_mcp_service, initialize_mcp_service
-from .mcp_config_validator import RFEBuilderMCPConfig
-
-# Add this method to the RFEBuilderWorkflow class
-class RFEBuilderWorkflow:
- # ... existing code ...
-
- async def initialize_mcp_integration(self) -> Dict[str, Any]:
- """
- Initialize MCP integration for enhanced RFE building.
-
- Returns:
- MCP initialization status
- """
- try:
- # Validate MCP configuration
- config_validator = RFEBuilderMCPConfig()
- config_result = config_validator.validate_configuration()
-
- if not config_result["valid"]:
- logger.warning(f"MCP configuration invalid: {config_result.get('error')}")
- return {
- "mcp_enabled": False,
- "error": config_result.get('error'),
- "status": "configuration_invalid"
- }
-
- # Initialize MCP service
- mcp_service = await initialize_mcp_service()
- connection_status = await mcp_service.get_connection_status()
-
- logger.info(f"MCP Integration initialized. Healthy servers: {connection_status['healthy_count']}")
-
- return {
- "mcp_enabled": True,
- "connection_status": connection_status,
- "status": "initialized"
- }
-
- except Exception as e:
- logger.error(f"Failed to initialize MCP integration: {e}")
- return {
- "mcp_enabled": False,
- "error": str(e),
- "status": "initialization_failed"
- }
-
- async def enhance_rfe_with_mcp_data(self,
- rfe_content: Dict[str, Any],
- project_context: Optional[Dict] = None) -> Dict[str, Any]:
- """
- Enhance RFE content with data from MCP servers.
-
- Args:
- rfe_content: Current RFE content
- project_context: Optional project context for targeted queries
-
- Returns:
- Enhanced RFE content with MCP data
- """
- try:
- mcp_service = get_mcp_service()
-
- # Get contextual data from MCP servers
- contextual_data = await mcp_service.get_contextual_data_for_rfe(
- rfe_title=rfe_content.get("title", ""),
- rfe_description=rfe_content.get("description", ""),
- project_context=project_context
- )
-
- # Enhance RFE content with contextual data
- enhanced_content = rfe_content.copy()
- enhanced_content["mcp_context"] = contextual_data
-
- # Add related tickets to requirements if available
- jira_data = contextual_data["data_sources"].get("jira_tickets", {})
- if jira_data.get("success") and jira_data.get("tickets"):
- enhanced_content["related_tickets"] = jira_data["tickets"]
-
- # Add repository information if available
- github_data = contextual_data["data_sources"].get("github_repo", {})
- if github_data.get("success") and github_data.get("repository"):
- enhanced_content["repository_context"] = github_data["repository"]
-
- # Add documentation references if available
- confluence_data = contextual_data["data_sources"].get("confluence_docs", {})
- if confluence_data.get("success") and confluence_data.get("documents"):
- enhanced_content["documentation_references"] = confluence_data["documents"]
-
- logger.info(f"Enhanced RFE with {contextual_data['summary']['total_data_points']} data points from MCP")
-
- return enhanced_content
-
- except Exception as e:
- logger.error(f"Failed to enhance RFE with MCP data: {e}")
- # Return the original content, annotated with the error, if enhancement fails
- rfe_content["mcp_enhancement_error"] = str(e)
- return rfe_content
-```
-
-### 4.2 Update Settings Integration
-
-Edit `demos/rfe-builder/src/settings.py` to include MCP configuration:
-
-```python
-# Add this import at the top
-import os
-from typing import Optional, Dict, Any, Type
-
-# Add this function at the end
-def configure_mcp_integration(config_override: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
- """
- Configure MCP integration for RFE Builder.
-
- Args:
- config_override: Optional configuration overrides
-
- Returns:
- MCP configuration dictionary
- """
- from .mcp_config_validator import RFEBuilderMCPConfig
-
- # Create MCP configuration
- mcp_config = RFEBuilderMCPConfig()
-
- # Validate configuration
- validation_result = mcp_config.validate_configuration()
-
- # Apply any overrides
- if config_override:
- # Apply configuration overrides here if needed
- pass
-
- return {
- "mcp_validation": validation_result,
- "mcp_enabled": validation_result.get("valid", False),
- "production_mode": mcp_config.production_mode
- }
-
-def init_settings_with_mcp(
- llm_provider: Optional[str] = None,
- embedding_provider: Optional[str] = None,
- llm_config: Optional[Dict[str, Any]] = None,
- embedding_config: Optional[Dict[str, Any]] = None,
- mcp_config: Optional[Dict[str, Any]] = None,
- **global_settings,
-) -> Dict[str, Any]:
- """
- Initialize LlamaIndex settings with MCP integration.
-
- Returns:
- Initialization results including MCP status
- """
- # Initialize LlamaIndex settings
- init_settings(llm_provider, embedding_provider, llm_config, embedding_config, **global_settings)
-
- # Configure MCP integration
- mcp_status = configure_mcp_integration(mcp_config)
-
- return {
- "llama_index_initialized": True,
- "mcp_status": mcp_status
- }
-```
-
-## Step 5: Add MCP Health Check and Monitoring
-
-### 5.1 Create Health Check Endpoint
-
-Create `demos/rfe-builder/src/health_check.py`:
-
-```python
-#!/usr/bin/env python3
-"""
-Health Check Module for RFE Builder with MCP Integration
-
-Provides health check endpoints for monitoring MCP connections.
-"""
-
-import asyncio
-import logging
-from typing import Dict, Any
-from datetime import datetime
-
-from .services.mcp_service import get_mcp_service
-from .mcp_config_validator import validate_mcp_config_for_rfe_builder
-
-logger = logging.getLogger(__name__)
-
-class HealthChecker:
- """Health check utility for RFE Builder with MCP integration."""
-
- async def get_system_health(self) -> Dict[str, Any]:
- """
- Get comprehensive system health including MCP status.
-
- Returns:
- Dictionary with system health information
- """
- health_data = {
- "timestamp": datetime.now().isoformat(),
- "overall_status": "unknown",
- "components": {}
- }
-
- # Check MCP configuration
- try:
- config_result = validate_mcp_config_for_rfe_builder()
- health_data["components"]["mcp_config"] = {
- "status": "healthy" if config_result["valid"] else "unhealthy",
- "details": config_result
- }
- except Exception as e:
- health_data["components"]["mcp_config"] = {
- "status": "error",
- "error": str(e)
- }
-
- # Check MCP service connections
- try:
- mcp_service = get_mcp_service()
- connection_status = await mcp_service.get_connection_status()
-
- healthy_servers = connection_status["healthy_count"]
- total_servers = connection_status["total_count"]
-
- if healthy_servers == total_servers and total_servers > 0:
- mcp_health = "healthy"
- elif healthy_servers > 0:
- mcp_health = "degraded"
- else:
- mcp_health = "unhealthy"
-
- health_data["components"]["mcp_connections"] = {
- "status": mcp_health,
- "healthy_servers": healthy_servers,
- "total_servers": total_servers,
- "details": connection_status
- }
-
- except Exception as e:
- health_data["components"]["mcp_connections"] = {
- "status": "error",
- "error": str(e)
- }
-
- # Determine overall status
- component_statuses = [comp["status"] for comp in health_data["components"].values()]
-
- if all(status == "healthy" for status in component_statuses):
- health_data["overall_status"] = "healthy"
- elif any(status == "healthy" for status in component_statuses):
- health_data["overall_status"] = "degraded"
- else:
- health_data["overall_status"] = "unhealthy"
-
- return health_data
-
- async def test_mcp_connectivity(self) -> Dict[str, Any]:
- """
- Test MCP connectivity with sample queries.
-
- Returns:
- Connectivity test results
- """
- test_results = {
- "timestamp": datetime.now().isoformat(),
- "tests": {}
- }
-
- try:
- mcp_service = get_mcp_service()
-
- # Test Atlassian connection
- try:
- jira_result = await mcp_service.search_jira_tickets(
- query="test connectivity",
- max_results=1
- )
- test_results["tests"]["atlassian"] = {
- "status": "success" if jira_result["success"] else "failed",
- "details": jira_result
- }
- except Exception as e:
- test_results["tests"]["atlassian"] = {
- "status": "error",
- "error": str(e)
- }
-
- # Test GitHub connection
- try:
- # Use a well-known public repository for testing
- github_result = await mcp_service.get_github_repository_info(
- repo_owner="octocat",
- repo_name="Hello-World"
- )
- test_results["tests"]["github"] = {
- "status": "success" if github_result["success"] else "failed",
- "details": github_result
- }
- except Exception as e:
- test_results["tests"]["github"] = {
- "status": "error",
- "error": str(e)
- }
-
- # Test Confluence connection
- try:
- confluence_result = await mcp_service.search_confluence_docs(
- search_query="test",
- max_results=1
- )
- test_results["tests"]["confluence"] = {
- "status": "success" if confluence_result["success"] else "failed",
- "details": confluence_result
- }
- except Exception as e:
- test_results["tests"]["confluence"] = {
- "status": "error",
- "error": str(e)
- }
-
- except Exception as e:
- test_results["error"] = str(e)
-
- return test_results
-
-# Global health checker instance
-_health_checker: HealthChecker = HealthChecker()
-
-async def get_health() -> Dict[str, Any]:
- """Get system health status."""
- return await _health_checker.get_system_health()
-
-async def test_connectivity() -> Dict[str, Any]:
- """Test MCP connectivity."""
- return await _health_checker.test_mcp_connectivity()
-```
-
-## Step 6: Create CLI Integration Commands
-
-### 6.1 Add MCP Commands to RFE Builder CLI
-
-Create `demos/rfe-builder/src/mcp_cli.py`:
-
-```python
-#!/usr/bin/env python3
-"""
-MCP CLI Commands for RFE Builder
-
-Provides command-line interface for MCP operations.
-"""
-
-import asyncio
-import click
-import json
-from typing import Optional
-
-from .health_check import get_health, test_connectivity
-from .services.mcp_service import initialize_mcp_service
-from .mcp_config_validator import validate_mcp_config_for_rfe_builder
-
-@click.group()
-def mcp():
- """MCP (Model Context Protocol) management commands."""
- pass
-
-@mcp.command()
-def validate():
- """Validate MCP configuration."""
- click.echo("Validating MCP configuration...")
-
- result = validate_mcp_config_for_rfe_builder()
-
- if result["valid"]:
- click.echo("✅ MCP configuration is valid")
- click.echo(f"Production mode: {result['production_mode']}")
- click.echo(f"Configured servers: {list(result['servers'].keys())}")
- else:
- click.echo("❌ MCP configuration validation failed")
- click.echo(f"Error: {result['error']}")
- # Exit non-zero so probes and scripts can detect the failure
- raise SystemExit(1)
-
-@mcp.command()
-def health():
- """Check MCP system health."""
- click.echo("Checking MCP system health...")
-
- async def _check_health():
- return await get_health()
-
- result = asyncio.run(_check_health())
-
- click.echo(f"Overall status: {result['overall_status']}")
-
- for component, details in result["components"].items():
- status_icon = {"healthy": "✅", "degraded": "⚠️", "unhealthy": "❌", "error": "💥"}
- click.echo(f"{status_icon.get(details['status'], '❓')} {component}: {details['status']}")
-
-@mcp.command()
-def test():
- """Test MCP connectivity with sample queries."""
- click.echo("Testing MCP connectivity...")
-
- async def _test_connectivity():
- return await test_connectivity()
-
- result = asyncio.run(_test_connectivity())
-
- if "error" in result:
- click.echo(f"❌ Test failed: {result['error']}")
- return
-
- for server, test_result in result["tests"].items():
- status_icon = {"success": "✅", "failed": "❌", "error": "💥"}
- click.echo(f"{status_icon.get(test_result['status'], '❓')} {server}: {test_result['status']}")
-
- if test_result["status"] != "success":
- click.echo(f" Error: {test_result.get('error', 'Unknown error')}")
-
-@mcp.command()
-@click.option("--query", required=True, help="Search query for JIRA tickets")
-@click.option("--project", help="JIRA project key")
-@click.option("--max-results", default=5, help="Maximum number of results")
-def search_jira(query: str, project: Optional[str], max_results: int):
- """Search JIRA tickets via MCP."""
- click.echo(f"Searching JIRA tickets: {query}")
-
- async def _search():
- service = await initialize_mcp_service()
- return await service.search_jira_tickets(
- query=query,
- project_key=project,
- max_results=max_results
- )
-
- result = asyncio.run(_search())
-
- if result["success"]:
- click.echo(f"✅ Found {len(result['tickets'])} tickets")
- for i, ticket in enumerate(result["tickets"], 1):
- click.echo(f"{i}. {ticket}")
- else:
- click.echo(f"❌ Search failed: {result['error']}")
-
-@mcp.command()
-@click.option("--owner", required=True, help="GitHub repository owner")
-@click.option("--repo", required=True, help="GitHub repository name")
-def github_info(owner: str, repo: str):
- """Get GitHub repository information via MCP."""
- click.echo(f"Getting GitHub repository info: {owner}/{repo}")
-
- async def _get_info():
- service = await initialize_mcp_service()
- return await service.get_github_repository_info(owner, repo)
-
- result = asyncio.run(_get_info())
-
- if result["success"]:
- click.echo("✅ Repository information retrieved")
- repo_info = result["repository"]
- click.echo(f"Repository: {repo_info}")
- else:
- click.echo(f"❌ Failed to get repository info: {result['error']}")
-
-@mcp.command()
-@click.option("--query", required=True, help="Search query for Confluence")
-@click.option("--space", help="Confluence space key")
-@click.option("--max-results", default=3, help="Maximum number of results")
-def search_confluence(query: str, space: Optional[str], max_results: int):
- """Search Confluence documentation via MCP."""
- click.echo(f"Searching Confluence: {query}")
-
- async def _search():
- service = await initialize_mcp_service()
- return await service.search_confluence_docs(
- search_query=query,
- space_key=space,
- max_results=max_results
- )
-
- result = asyncio.run(_search())
-
- if result["success"]:
- click.echo(f"✅ Found {len(result['documents'])} documents")
- for i, doc in enumerate(result["documents"], 1):
- click.echo(f"{i}. {doc}")
- else:
- click.echo(f"❌ Search failed: {result['error']}")
-
-if __name__ == "__main__":
- mcp()
-```
-
-### 6.2 Add MCP Commands to Main CLI
-
-Edit `demos/rfe-builder/src/ingestion.py` to include MCP commands:
-
-```python
-# Add this import at the top
-from .mcp_cli import mcp
-
-# Register the MCP command group with the main CLI group
-cli.add_command(mcp)
-```
-
-## Step 7: Update Application Initialization
-
-### 7.1 Modify Application Startup
-
-Edit the main application file to initialize MCP on startup. The exact location depends on how RFE Builder is structured, but this typically belongs in a `main.py` or `app.py` entry point:
-
-```python
-# Add these imports
-import asyncio
-from src.services.mcp_service import initialize_mcp_service
-from src.settings import init_settings_with_mcp
-
-async def initialize_application():
- """Initialize RFE Builder application with MCP integration."""
-
- # Initialize LlamaIndex settings with MCP
- settings_result = init_settings_with_mcp()
-
- if settings_result["mcp_status"]["mcp_enabled"]:
- # Initialize MCP service
- try:
- mcp_service = await initialize_mcp_service()
- print("✅ MCP Integration initialized successfully")
-
- # Test connectivity
- status = await mcp_service.get_connection_status()
- healthy = status["healthy_count"]
- total = status["total_count"]
- print(f"MCP Status: {healthy}/{total} servers healthy")
-
- except Exception as e:
- print(f"⚠️ MCP Integration failed: {e}")
- else:
- print("ℹ️ MCP Integration disabled or not configured")
-
- return settings_result
-
-# Call during application startup
-if __name__ == "__main__":
- # Initialize application
- init_result = asyncio.run(initialize_application())
-
- # Continue with normal application startup
- # ... rest of application code ...
-```
-
-## Step 8: Testing the Integration
-
-### 8.1 Configuration Testing
-
-```bash
-cd demos/rfe-builder
-
-# Test MCP configuration
-uv run python -m src.mcp_config_validator
-
-# Test MCP CLI commands
-uv run python -m src.mcp_cli validate
-uv run python -m src.mcp_cli health
-uv run python -m src.mcp_cli test
-```
-
-### 8.2 Integration Testing
-
-Create `demos/rfe-builder/test_mcp_integration.py`:
-
-```python
-#!/usr/bin/env python3
-"""
-Integration tests for MCP integration with RFE Builder.
-"""
-
-import asyncio
-import pytest
-from src.services.mcp_service import RFEBuilderMCPService, initialize_mcp_service
-from src.mcp_config_validator import validate_mcp_config_for_rfe_builder
-
-@pytest.mark.asyncio
-async def test_mcp_configuration():
- """Test MCP configuration validation."""
- result = validate_mcp_config_for_rfe_builder()
- assert isinstance(result, dict)
- assert "valid" in result
-
-@pytest.mark.asyncio
-async def test_mcp_service_initialization():
- """Test MCP service initialization."""
- service = await initialize_mcp_service()
- assert isinstance(service, RFEBuilderMCPService)
-
- status = await service.get_connection_status()
- assert "healthy_count" in status
- assert "total_count" in status
-
-@pytest.mark.asyncio
-async def test_contextual_data_retrieval():
- """Test retrieval of contextual data for RFE."""
- service = await initialize_mcp_service()
-
- result = await service.get_contextual_data_for_rfe(
- rfe_title="Test RFE",
- rfe_description="This is a test RFE for integration testing",
- project_context={
- "github_owner": "octocat",
- "github_repo": "Hello-World"
- }
- )
-
- assert "rfe_title" in result
- assert "data_sources" in result
- assert "summary" in result
-
-if __name__ == "__main__":
- pytest.main([__file__])
-```
-
-### 8.3 End-to-End Testing
-
-```bash
-# Run integration tests
-cd demos/rfe-builder
-uv run python test_mcp_integration.py
-
-# Test with actual RFE workflow
-uv run python -c "
-import asyncio
-from src.rfe_builder_workflow import RFEBuilderWorkflow
-
-async def test_enhanced_rfe():
- workflow = RFEBuilderWorkflow()
-
- # Initialize MCP
- mcp_status = await workflow.initialize_mcp_integration()
- print(f'MCP Status: {mcp_status}')
-
- # Test RFE enhancement
- sample_rfe = {
- 'title': 'Add authentication to API',
- 'description': 'Implement OAuth2 authentication for REST API endpoints'
- }
-
- enhanced_rfe = await workflow.enhance_rfe_with_mcp_data(
- sample_rfe,
- {'github_owner': 'your-org', 'github_repo': 'your-repo'}
- )
-
- print('Enhanced RFE:', enhanced_rfe)
-
-asyncio.run(test_enhanced_rfe())
-"
-```
-
-## Step 9: Production Deployment Considerations
-
-### 9.1 Environment Configuration
-
-For production deployment, create `demos/rfe-builder/.env.production`:
-
-```bash
-# Production MCP Configuration
-# Note: keep the JSON on a single line — most .env loaders do not support multi-line values
-MCP_SERVERS='{"atlassian": "https://mcp-atlassian-route.apps.production.com/sse", "github": "https://mcp-github-route.apps.production.com/sse", "confluence": "mcp-confluence.production.svc.cluster.local:8000"}'
-
-# Security Settings
-MCP_PRODUCTION_MODE=true
-MCP_VERIFY_SSL=true
-MCP_MAX_CONNECTIONS=10
-MCP_TIMEOUT=30
-
-# Logging
-LOG_LEVEL=INFO
-MCP_LOG_LEVEL=INFO
-```
-
-### 9.2 Docker Integration
-
-Update `demos/rfe-builder/Dockerfile` to include MCP integration:
-
-```dockerfile
-# Build with the repository root as the Docker build context:
-# COPY cannot reference paths outside the build context, so ../../src will not work
-COPY src/mcp_client_integration ./mcp_client_integration
-RUN pip install -e ./mcp_client_integration
-
-# Copy MCP configuration (path relative to the repository-root build context)
-COPY demos/rfe-builder/.env.production .env
-
-# Health check that includes MCP (the CMD script must fit on a single line)
-HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
- CMD python -c "import asyncio; from src.health_check import get_health; r = asyncio.run(get_health()); exit(0 if r['overall_status'] in ['healthy', 'degraded'] else 1)"
-```
-
-### 9.3 Kubernetes Deployment
-
-Update Kubernetes manifests to include MCP configuration:
-
-```yaml
-apiVersion: v1
-kind: ConfigMap
-metadata:
- name: rfe-builder-mcp-config
-data:
- MCP_SERVERS: |
- {
- "atlassian": "https://mcp-atlassian-route.apps.cluster.com/sse",
- "github": "https://mcp-github-route.apps.cluster.com/sse",
- "confluence": "mcp-confluence.vteam-mcp.svc.cluster.local:8000"
- }
- MCP_PRODUCTION_MODE: "true"
- MCP_VERIFY_SSL: "true"
- MCP_MAX_CONNECTIONS: "10"
- MCP_TIMEOUT: "30"
----
-apiVersion: apps/v1
-kind: Deployment
-metadata:
- name: rfe-builder
-spec:
- template:
- spec:
- containers:
- - name: rfe-builder
- envFrom:
- - configMapRef:
- name: rfe-builder-mcp-config
- # Health check
- livenessProbe:
- exec:
- command:
- - python
- - -c
- - |
- import asyncio
- from src.health_check import get_health
- result = asyncio.run(get_health())
- exit(0 if result['overall_status'] in ['healthy', 'degraded'] else 1)
- initialDelaySeconds: 30
- periodSeconds: 30
- readinessProbe:
- exec:
- command:
- - python
- - -m
- - src.mcp_cli
- - validate
- initialDelaySeconds: 10
- periodSeconds: 10
-```
-
-## Step 10: Usage Examples
-
-### 10.1 Basic RFE Enhancement
-
-```python
-from src.rfe_builder_workflow import RFEBuilderWorkflow
-
-async def create_enhanced_rfe():
- workflow = RFEBuilderWorkflow()
-
- # Initialize MCP integration
- await workflow.initialize_mcp_integration()
-
- # Create RFE content
- rfe = {
- "title": "Implement user authentication",
- "description": "Add OAuth2 authentication to the web application",
- "priority": "high"
- }
-
- # Enhance with MCP data
- enhanced_rfe = await workflow.enhance_rfe_with_mcp_data(
- rfe,
- project_context={
- "github_owner": "my-org",
- "github_repo": "web-app",
- "jira_project": "WEBAPP"
- }
- )
-
- # The enhanced RFE now includes:
- # - Related JIRA tickets
- # - GitHub repository context
- # - Confluence documentation
- # - Additional contextual information
-
- return enhanced_rfe
-```
-
-### 10.2 Manual MCP Queries
-
-```python
-from src.services.mcp_service import get_mcp_service
-
-async def get_project_context():
- mcp_service = get_mcp_service()
-
- # Search for related tickets
- tickets = await mcp_service.search_jira_tickets(
- query="authentication OAuth",
- project_key="WEBAPP",
- max_results=10
- )
-
- # Get repository information
- repo_info = await mcp_service.get_github_repository_info(
- repo_owner="my-org",
- repo_name="web-app"
- )
-
- # Search documentation
- docs = await mcp_service.search_confluence_docs(
- search_query="authentication implementation guide",
- space_key="DEV"
- )
-
- return {
- "tickets": tickets,
- "repository": repo_info,
- "documentation": docs
- }
-```
-
-## Troubleshooting
-
-### Common Issues
-
-1. **MCP Server Connection Failed**
- ```bash
- uv run python -m src.mcp_cli test
- ```
- Check network connectivity and server URLs.
-
-2. **Configuration Validation Failed**
- ```bash
- uv run python -m src.mcp_cli validate
- ```
- Verify MCP_SERVERS environment variable format.
-
-3. **Import Errors**
- ```bash
- uv run python -c "from mcp_client_integration import SimpleMCPClient; print('OK')"
- ```
- Ensure MCP client integration is properly installed.
-
-4. **SSL Certificate Issues**
- Set `MCP_VERIFY_SSL=false` for development environments.
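   For local development only, SSL verification can be relaxed via the environment (a sketch using the variable names from this guide — never do this in production):

   ```shell
   # Development only — disabling verification exposes you to MITM attacks
   export MCP_VERIFY_SSL=false

   # Re-run the connectivity test after changing the setting, e.g.:
   # uv run python -m src.mcp_cli test
   ```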
-
-### Logging and Debugging
-
-Enable debug logging:
-```python
-import logging
-logging.basicConfig(level=logging.DEBUG)
-logging.getLogger('mcp_client_integration').setLevel(logging.DEBUG)
-```
-
-Check MCP service logs:
-```bash
-uv run python -c "
-import asyncio
-from src.health_check import get_health
-result = asyncio.run(get_health())
-print(result)
-"
-```
-
-## Summary
-
-This integration guide provides a complete step-by-step process to integrate the MCP Client Integration library with the RFE Builder application. The integration enables:
-
-- **Real-time data fetching** from JIRA, GitHub, and Confluence
-- **Enhanced RFE content** with contextual information
-- **Health monitoring** and connectivity testing
-- **Production-ready deployment** with proper security
-- **CLI tools** for management and debugging
-
-The integration is designed to be robust, with fallback mechanisms and comprehensive error handling to ensure the RFE Builder continues to function even when MCP servers are unavailable.
\ No newline at end of file
diff --git a/archive/mcp_client_integration/SECURITY.md b/archive/mcp_client_integration/SECURITY.md
deleted file mode 100644
index 9fc252482..000000000
--- a/archive/mcp_client_integration/SECURITY.md
+++ /dev/null
@@ -1,268 +0,0 @@
-# MCP Client Integration - Security Guide
-
-This guide covers security considerations and best practices for the MCP Client Integration library.
-
-## Security Features
-
-### 1. Configuration Validation
-
-The library includes comprehensive security validation:
-
-- **Input Sanitization**: All configuration inputs are validated to guard against injection attacks
-- **Size Limits**: Configuration size is limited to prevent DoS attacks
-- **Schema Validation**: JSON structure is strictly validated
-- **Endpoint Validation**: URLs are validated for malicious patterns
-
-### 2. Connection Security
-
-#### SSL/TLS Configuration
-
-```python
-from mcp_client_integration import SimpleMCPClient, MCPConfigurationManager
-
-# Production mode - strict security (recommended)
-client = SimpleMCPClient() # Default: verify_ssl=True
-
-# Development mode - relaxed validation (testing only)
-config = MCPConfigurationManager(production_mode=False)
-```
-
-#### Production vs Development Mode
-
-**Production Mode** (recommended for production):
-- Only HTTPS connections allowed
-- SSL certificate validation enforced
-- Private IP addresses blocked
-- Strict timeout limits
-
-**Development Mode** (testing only):
-- HTTP connections allowed for localhost
-- SSL verification can be disabled
-- More permissive validation
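The mode rules above can be sketched as a standalone check. This is illustrative only — the library's `MCPSecurityValidator` is the authoritative implementation, and `endpoint_allowed` is a hypothetical helper:

```python
import ipaddress
from urllib.parse import urlparse

def endpoint_allowed(url: str, production_mode: bool) -> bool:
    """Apply the mode rules: HTTPS-only and no private/loopback hosts in production."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if production_mode:
        if parsed.scheme != "https":
            return False  # only HTTPS connections allowed
        try:
            addr = ipaddress.ip_address(host)
            if addr.is_private or addr.is_loopback:
                return False  # private IP addresses blocked
        except ValueError:
            pass  # hostname rather than a literal IP
        return host != "localhost"
    # Development mode: plain HTTP is tolerated for localhost only
    if parsed.scheme == "http":
        return host in ("localhost", "127.0.0.1")
    return parsed.scheme == "https"
```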
-
-### 3. Environment Variable Security
-
-#### Secure Configuration
-
-```bash
-# ✅ SECURE: Use environment variables
-export MCP_SERVERS='{"atlassian": "https://mcp-server.company.com/sse"}'
-
-# ❌ INSECURE: Don't hardcode in source code
-client = SimpleMCPClient(config={"server": "https://secret-server.com"})
-```
-
-#### Configuration Validation
-
-```python
-from mcp_client_integration.common import MCPSecurityValidator
-
-# Validate configuration before use
-validator = MCPSecurityValidator(production_mode=True)
-result = validator.validate_configuration_security(config_data)
-
-if not result.valid:
- raise ValueError(f"Security validation failed: {result.error_message}")
-```
-
-## Security Best Practices
-
-### 1. Network Security
-
-```python
-# ✅ SECURE: Always use HTTPS in production
-{
- "atlassian": "https://mcp-atlassian.company.com/sse",
- "github": "https://mcp-github.company.com/sse"
-}
-
-# ❌ INSECURE: HTTP connections
-{
- "atlassian": "http://mcp-atlassian.company.com/sse" # Vulnerable to MITM
-}
-```
-
-### 2. Credential Management
-
-```python
-# ✅ SECURE: Use environment variables or secret management
-import os
-
-mcp_config = {
- "atlassian": os.getenv("MCP_ATLASSIAN_URL"),
- "auth_token": os.getenv("MCP_AUTH_TOKEN") # If auth is implemented
-}
-
-# ❌ INSECURE: Hardcoded credentials
-mcp_config = {
- "atlassian": "https://user:password@mcp-server.com/sse"
-}
-```
-
-### 3. Timeout Configuration
-
-```python
-# ✅ SECURE: Set reasonable timeouts
-{
- "atlassian": {
- "endpoint": "https://mcp-atlassian.com/sse",
- "timeout": 30, # 30 seconds maximum
- "connection_type": "external_route"
- }
-}
-
-# ❌ INSECURE: No timeout or excessive timeout
-{
- "atlassian": {
- "endpoint": "https://mcp-atlassian.com/sse",
- "timeout": 3600 # 1 hour - too long, enables DoS
- }
-}
-```
-
-### 4. Connection Pool Limits
-
-```python
-from mcp_client_integration.common import MCPConnectionPool
-
-# ✅ SECURE: Limit connection pool size
-pool = MCPConnectionPool(max_connections=10) # Reasonable limit
-
-# ❌ INSECURE: Unlimited connections
-pool = MCPConnectionPool(max_connections=1000) # Resource exhaustion risk
-```
-
-## Security Validation Examples
-
-### Basic Security Check
-
-```python
-from mcp_client_integration.common import MCPSecurityValidator
-
-validator = MCPSecurityValidator(production_mode=True)
-
-# Valid configuration
-config = {
- "atlassian": "https://mcp-atlassian.company.com/sse",
- "timeout": 30
-}
-
-result = validator.validate_configuration_security(config)
-if result.valid:
- print("✅ Configuration is secure")
-else:
- print(f"❌ Security issue: {result.error_message}")
-```
-
-### Production Security Validation
-
-```python
-# Production-grade validation
-validator = MCPSecurityValidator(production_mode=True)
-
-# This will fail in production mode
-insecure_config = {
- "local": "http://localhost:8080/sse", # Blocked in production
- "internal": "http://192.168.1.100/sse" # Private IP blocked
-}
-
-result = validator.validate_configuration_security(insecure_config)
-# Result: valid=False, error_message="Localhost/loopback addresses not allowed in production mode"
-```
-
-## Common Security Issues
-
-### 1. Configuration Injection
-
-```python
-# ❌ DANGEROUS: User input directly used in configuration
-user_input = request.json.get("mcp_server")
-config = json.dumps({"server": user_input}) # Potential injection
-
-# ✅ SAFE: Validate user input
-from mcp_client_integration.common import MCPSecurityValidator
-
-validator = MCPSecurityValidator(production_mode=True)
-result = validator.validate_configuration_security({"server": user_input})
-
-if result.valid:
- config = json.dumps({"server": user_input})
-else:
- raise ValueError("Invalid server configuration")
-```
-
-### 2. SSL Certificate Issues
-
-```python
-from mcp_client_integration.common import ExternalRouteMCPConnection
-
-# ❌ DANGEROUS: Disabling SSL verification
-connection = ExternalRouteMCPConnection(
- "https://mcp-server.com/sse",
- verify_ssl=False # Vulnerable to MITM attacks
-)
-
-# ✅ SAFE: Always verify SSL in production
-connection = ExternalRouteMCPConnection(
- "https://mcp-server.com/sse",
- verify_ssl=True # Default and recommended
-)
-```
-
-### 3. Resource Exhaustion
-
-```python
-# ❌ DANGEROUS: No limits on configuration size
-large_config = {"server_" + str(i): f"https://server{i}.com" for i in range(10000)}
-
-# ✅ SAFE: Library automatically enforces limits
-# MCPSecurityValidator.MAX_ENDPOINTS = 50
-# MCPSecurityValidator.MAX_CONFIG_SIZE = 50KB
-```
-
-## Security Checklist
-
-Before deploying to production:
-
-- [ ] **Configuration Validation**: All configurations pass security validation
-- [ ] **HTTPS Only**: All external connections use HTTPS
-- [ ] **SSL Verification**: Certificate validation is enabled
-- [ ] **Environment Variables**: Secrets stored in environment variables or secret management
-- [ ] **Timeout Limits**: Reasonable timeout values configured
-- [ ] **Connection Limits**: Connection pool size limits set
-- [ ] **Production Mode**: `production_mode=True` for production deployments
-- [ ] **Logging**: Security events are logged appropriately
-- [ ] **Updates**: Dependencies are up-to-date with security patches
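Parts of this checklist can be automated with a quick preflight script (a sketch using the environment variable names from this guide; `preflight_check` is a hypothetical helper, not part of the library):

```python
import json
import os
from typing import List

def preflight_check() -> List[str]:
    """Return a list of security problems found in the MCP environment."""
    problems = []
    raw = os.getenv("MCP_SERVERS", "")
    try:
        servers = json.loads(raw) if raw else {}
    except json.JSONDecodeError:
        return ["MCP_SERVERS is not valid JSON"]
    for name, endpoint in servers.items():
        if isinstance(endpoint, str) and endpoint.startswith("http://"):
            problems.append(f"{name}: uses plain HTTP instead of HTTPS")
    if os.getenv("MCP_PRODUCTION_MODE", "").lower() != "true":
        problems.append("MCP_PRODUCTION_MODE is not enabled")
    if os.getenv("MCP_VERIFY_SSL", "").lower() == "false":
        problems.append("SSL certificate verification is disabled")
    return problems
```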
-
-## Reporting Security Issues
-
-If you discover a security vulnerability in the MCP Client Integration library:
-
-1. **Do not** open a public GitHub issue
-2. Email security concerns to: [security contact from project]
-3. Include:
- - Description of the vulnerability
- - Steps to reproduce
- - Potential impact assessment
- - Suggested mitigation
-
-## Security Dependencies
-
-The library includes these security-focused dependencies:
-
-- `certifi>=2024.8.30`: Up-to-date CA certificates
-- `cryptography>=41.0.0`: Secure cryptographic operations
-- `httpx[http2]>=0.28.1`: Secure HTTP client with HTTP/2 support
-
-## Development Security Tools
-
-For development and testing:
-
-```bash
-# Install security scanning tools
-pip install bandit safety
-
-# Run security linting
-bandit -r src/
-
-# Check for known vulnerabilities
-safety check
-```
\ No newline at end of file
diff --git a/archive/mcp_client_integration/__init__.py b/archive/mcp_client_integration/__init__.py
deleted file mode 100644
index 12711e5c7..000000000
--- a/archive/mcp_client_integration/__init__.py
+++ /dev/null
@@ -1,46 +0,0 @@
-"""
-MCP Client Integration for Llama Index
-
-This module provides MCP (Model Context Protocol) client integration for llama index
-deployments, enabling access to Jira and Confluence data through standardized interfaces.
-
-Based on SPIKE-001 and SPIKE-002 validated patterns.
-Refactored with common utilities for reduced code duplication and standardized interfaces.
-"""
-
-from .simple_mcp_client import SimpleMCPClient
-from .endpoint_connector import MCPEndpointConnector
-from .llama_integration import MCPEnhancedLlamaIndex
-from .llama_index_tool import MCPLlamaIndexTool, create_mcp_tool
-
-# Export common utilities for advanced users
-from .common import (
- MCPConnectionPool,
- MCPConfigurationManager,
- MCPEndpointValidator,
- MCPErrorHandler,
- MCPError,
- MCPConnectionError,
- MCPConfigurationError,
- handle_mcp_errors
-)
-
-__version__ = "1.0.0"
-__all__ = [
- # Main client classes
- "SimpleMCPClient",
- "MCPEndpointConnector",
- "MCPEnhancedLlamaIndex",
- "MCPLlamaIndexTool",
- "create_mcp_tool",
-
- # Common utilities
- "MCPConnectionPool",
- "MCPConfigurationManager",
- "MCPEndpointValidator",
- "MCPErrorHandler",
- "MCPError",
- "MCPConnectionError",
- "MCPConfigurationError",
- "handle_mcp_errors"
-]
\ No newline at end of file
diff --git a/archive/mcp_client_integration/common/__init__.py b/archive/mcp_client_integration/common/__init__.py
deleted file mode 100644
index cfeae33d0..000000000
--- a/archive/mcp_client_integration/common/__init__.py
+++ /dev/null
@@ -1,91 +0,0 @@
-"""
-Common utilities for MCP client integration.
-
-This package provides shared utilities for connection management, validation,
-error handling, and configuration management across MCP components.
-
-This refactored architecture eliminates code duplication and provides
-standardized interfaces for MCP operations.
-"""
-
-# Connection management
-from .connection_manager import (
- MCPConnectionInterface,
- MCPConnectionFactory,
- MCPConnectionPool,
- MockMCPConnection,
- ExternalRouteMCPConnection,
- ClusterServiceMCPConnection
-)
-
-# Validation utilities
-from .validation import (
- MCPEndpointValidator,
- MCPConfigurationValidator,
- MCPSecurityValidator,
- ValidationResult
-)
-
-# Error handling
-from .error_handler import (
- MCPError,
- MCPConnectionError,
- MCPConfigurationError,
- MCPValidationError,
- MCPProtocolError,
- MCPTimeoutError,
- MCPErrorHandler,
- MCPErrorCategory,
- MCPErrorContext,
- handle_mcp_errors,
- default_error_handler
-)
-
-# Configuration management
-from .configuration import (
- MCPConfigurationManager,
- MCPConfiguration,
- MCPServerConfig,
- load_mcp_configuration,
- create_simple_configuration,
- validate_mcp_configuration_dict
-)
-
-__version__ = "1.0.0"
-
-__all__ = [
- # Connection management
- "MCPConnectionInterface",
- "MCPConnectionFactory",
- "MCPConnectionPool",
- "MockMCPConnection",
- "ExternalRouteMCPConnection",
- "ClusterServiceMCPConnection",
-
- # Validation
- "MCPEndpointValidator",
- "MCPConfigurationValidator",
- "MCPSecurityValidator",
- "ValidationResult",
-
- # Error handling
- "MCPError",
- "MCPConnectionError",
- "MCPConfigurationError",
- "MCPValidationError",
- "MCPProtocolError",
- "MCPTimeoutError",
- "MCPErrorHandler",
- "MCPErrorCategory",
- "MCPErrorContext",
- "handle_mcp_errors",
- "default_error_handler",
-
- # Configuration
- "MCPConfigurationManager",
- "MCPConfiguration",
- "MCPServerConfig",
- "load_mcp_configuration",
- "create_simple_configuration",
- "validate_mcp_configuration_dict"
-]
\ No newline at end of file
diff --git a/archive/mcp_client_integration/common/configuration.py b/archive/mcp_client_integration/common/configuration.py
deleted file mode 100644
index efd1a80b3..000000000
--- a/archive/mcp_client_integration/common/configuration.py
+++ /dev/null
@@ -1,381 +0,0 @@
-#!/usr/bin/env python3
-"""
-MCP Configuration Management Utilities
-
-This module provides standardized configuration management for MCP clients,
-consolidating configuration loading and validation logic from across the codebase.
-"""
-
-import json
-import os
-import logging
-from typing import Dict, Any, Optional, List, Union
-from dataclasses import dataclass
-
-from .validation import MCPConfigurationValidator, MCPSecurityValidator, ValidationResult
-from .error_handler import MCPConfigurationError, handle_mcp_errors
-
-# Configure logging
-logger = logging.getLogger(__name__)
-
-
-@dataclass
-class MCPServerConfig:
- """Configuration for a single MCP server."""
- capability: str
- endpoint: str
- timeout: int = 30
- connection_type: Optional[str] = None
- enabled: bool = True
- metadata: Optional[Dict[str, Any]] = None
-
- def to_dict(self) -> Dict[str, Any]:
- """Convert to dictionary format."""
- return {
- "capability": self.capability,
- "endpoint": self.endpoint,
- "timeout": self.timeout,
- "connection_type": self.connection_type,
- "enabled": self.enabled,
- "metadata": self.metadata or {}
- }
-
-
-@dataclass
-class MCPConfiguration:
- """Complete MCP configuration with multiple servers."""
- servers: Dict[str, MCPServerConfig]
- default_timeout: int = 30
- health_check_interval: int = 300 # 5 minutes
- max_retries: int = 3
- metadata: Optional[Dict[str, Any]] = None
-
- def get_enabled_servers(self) -> Dict[str, MCPServerConfig]:
- """Get only enabled servers."""
- return {
- capability: config
- for capability, config in self.servers.items()
- if config.enabled
- }
-
- def get_server_endpoints(self) -> Dict[str, str]:
- """Get mapping of capability to endpoint for enabled servers."""
- return {
- capability: config.endpoint
- for capability, config in self.get_enabled_servers().items()
- }
-
- def to_dict(self) -> Dict[str, Any]:
- """Convert to dictionary format."""
- return {
- "servers": {
- capability: config.to_dict()
- for capability, config in self.servers.items()
- },
- "default_timeout": self.default_timeout,
- "health_check_interval": self.health_check_interval,
- "max_retries": self.max_retries,
- "metadata": self.metadata or {}
- }
-
-
-class MCPConfigurationManager:
- """
- Unified configuration management for MCP clients.
-
- This class consolidates configuration loading, validation, and management
- logic, providing a single interface for all MCP configuration operations.
- """
-
- def __init__(self, default_timeout: int = 30, production_mode: bool = False):
- """
- Initialize configuration manager.
-
- Args:
- default_timeout: Default timeout for connections
- production_mode: Whether to use production-grade security validation
- """
- self.default_timeout = default_timeout
- self.production_mode = production_mode
- self.validator = MCPConfigurationValidator()
- self.security_validator = MCPSecurityValidator(production_mode)
- self._cached_config: Optional[MCPConfiguration] = None
-
- logger.debug(f"MCPConfigurationManager initialized (production_mode={production_mode})")
-
- @handle_mcp_errors("load_configuration")
- def load_configuration(self, env_var: str = "MCP_SERVERS") -> MCPConfiguration:
- """
- Load and validate MCP server configuration from environment.
-
- Args:
- env_var: Environment variable name containing configuration
-
- Returns:
- Validated MCP configuration
-
- Raises:
- MCPConfigurationError: If configuration is invalid
- """
- # Get configuration from environment
- env_value = os.getenv(env_var)
-
- if not env_value:
- # Return default configuration
- logger.info(f"No {env_var} found, using default configuration")
- return self._create_default_configuration()
-
- # Validate environment variable format
- validation_result = self.validator.validate_environment_config(env_value)
-
- if not validation_result.valid:
- raise MCPConfigurationError(validation_result.error_message)
-
- # Security validation
- security_result = self.security_validator.validate_configuration_security(env_value)
-
- if not security_result.valid:
- raise MCPConfigurationError(f"Security validation failed: {security_result.error_message}")
-
- # Parse JSON configuration (async-safe)
- try:
- # Use json.loads which is CPU-bound but fast for config sizes
- # For very large configs, could use asyncio.to_thread in future
- config_data = json.loads(env_value)
- except json.JSONDecodeError as e:
- raise MCPConfigurationError(f"Invalid JSON in {env_var}: {e}")
-
- # Convert to MCPConfiguration
- config = self._parse_configuration_dict(config_data)
-
- # Cache the configuration
- self._cached_config = config
-
- logger.info(f"Loaded configuration with {len(config.servers)} servers: {list(config.servers.keys())}")
- return config
-
- def _create_default_configuration(self) -> MCPConfiguration:
- """Create default configuration with a single server."""
- default_server = MCPServerConfig(
- capability="default",
- endpoint="https://mcp-server/sse",
- timeout=self.default_timeout
- )
-
- return MCPConfiguration(
- servers={"default": default_server},
- default_timeout=self.default_timeout
- )
-
- def _parse_configuration_dict(self, config_data: Dict[str, Any]) -> MCPConfiguration:
- """
- Parse configuration dictionary into MCPConfiguration.
-
- Args:
- config_data: Raw configuration dictionary
-
- Returns:
- Parsed MCPConfiguration
- """
- servers = {}
-
- for capability, endpoint in config_data.items():
- if isinstance(endpoint, str):
- # Simple endpoint string
- servers[capability] = MCPServerConfig(
- capability=capability,
- endpoint=endpoint,
- timeout=self.default_timeout
- )
- elif isinstance(endpoint, dict):
- # Complex endpoint configuration
- servers[capability] = MCPServerConfig(
- capability=capability,
- endpoint=endpoint["endpoint"],
- timeout=endpoint.get("timeout", self.default_timeout),
- connection_type=endpoint.get("connection_type"),
- enabled=endpoint.get("enabled", True),
- metadata=endpoint.get("metadata")
- )
- else:
- raise MCPConfigurationError(
- f"Invalid endpoint configuration for '{capability}': {endpoint}"
- )
-
- return MCPConfiguration(
- servers=servers,
- default_timeout=self.default_timeout
- )
-
- @handle_mcp_errors("validate_configuration")
- def validate_configuration(self, config: MCPConfiguration) -> ValidationResult:
- """
- Validate an MCPConfiguration object.
-
- Args:
- config: Configuration to validate
-
- Returns:
- ValidationResult with validation outcome
- """
- # Validate server endpoints
- endpoint_map = config.get_server_endpoints()
- return self.validator.validate_configuration_dict(endpoint_map)
-
- def get_cached_configuration(self) -> Optional[MCPConfiguration]:
- """Get cached configuration if available."""
- return self._cached_config
-
- def reload_configuration(self, env_var: str = "MCP_SERVERS") -> MCPConfiguration:
- """
- Reload configuration from environment.
-
- Args:
- env_var: Environment variable name containing configuration
-
- Returns:
- Reloaded configuration
- """
- self._cached_config = None
- return self.load_configuration(env_var)
-
- @handle_mcp_errors("create_configuration_from_dict")
- def create_configuration_from_dict(self, config_dict: Dict[str, Any]) -> MCPConfiguration:
- """
- Create configuration from dictionary.
-
- Args:
- config_dict: Configuration dictionary
-
- Returns:
- Created MCPConfiguration
-
- Raises:
- MCPConfigurationError: If configuration is invalid
- """
- # Validate the dictionary
- validation_result = self.validator.validate_json_config(config_dict)
-
- if not validation_result.valid:
- raise MCPConfigurationError(validation_result.error_message)
-
- # Parse and return configuration
- return self._parse_configuration_dict(config_dict)
-
- @handle_mcp_errors("save_configuration")
- def save_configuration_to_env(self, config: MCPConfiguration, env_var: str = "MCP_SERVERS") -> None:
- """
- Save configuration to environment variable format.
-
- Args:
- config: Configuration to save
- env_var: Environment variable name to save to
- """
- # Convert to simple endpoint mapping for environment storage
- endpoint_map = config.get_server_endpoints()
-
- # Convert to JSON string
- config_json = json.dumps(endpoint_map, indent=2)
-
- # Set environment variable (for current process)
- os.environ[env_var] = config_json
-
- logger.info(f"Configuration saved to {env_var}")
-
- def create_kubernetes_configmap_data(self, config: MCPConfiguration) -> Dict[str, str]:
- """
- Create Kubernetes ConfigMap data from configuration.
-
- Args:
- config: Configuration to convert
-
- Returns:
- Dictionary suitable for Kubernetes ConfigMap data
- """
- endpoint_map = config.get_server_endpoints()
-
- return {
- "MCP_SERVERS": json.dumps(endpoint_map, indent=2),
- "MCP_DEFAULT_TIMEOUT": str(config.default_timeout),
- "MCP_HEALTH_CHECK_INTERVAL": str(config.health_check_interval),
- "MCP_MAX_RETRIES": str(config.max_retries)
- }
-
- def get_configuration_summary(self, config: Optional[MCPConfiguration] = None) -> Dict[str, Any]:
- """
- Get configuration summary for logging/debugging.
-
- Args:
- config: Optional configuration, uses cached if not provided
-
- Returns:
- Configuration summary dictionary
- """
- if config is None:
- config = self._cached_config
-
- if config is None:
- return {"status": "no_configuration_loaded"}
-
- enabled_servers = config.get_enabled_servers()
-
- summary = {
- "total_servers": len(config.servers),
- "enabled_servers": len(enabled_servers),
- "capabilities": list(enabled_servers.keys()),
- "default_timeout": config.default_timeout,
- "health_check_interval": config.health_check_interval,
- "server_details": {}
- }
-
- for capability, server_config in enabled_servers.items():
- summary["server_details"][capability] = {
- "endpoint": server_config.endpoint,
- "timeout": server_config.timeout,
- "connection_type": server_config.connection_type or "auto-detect"
- }
-
- return summary
-
-
-# Convenience functions for common operations
-def load_mcp_configuration(env_var: str = "MCP_SERVERS") -> MCPConfiguration:
- """
- Convenience function to load MCP configuration.
-
- Args:
- env_var: Environment variable name containing configuration
-
- Returns:
- Loaded MCPConfiguration
- """
- manager = MCPConfigurationManager()
- return manager.load_configuration(env_var)
-
-
-def create_simple_configuration(servers: Dict[str, str]) -> MCPConfiguration:
- """
- Convenience function to create simple configuration.
-
- Args:
- servers: Dictionary mapping capability to endpoint
-
- Returns:
- Created MCPConfiguration
- """
- manager = MCPConfigurationManager()
- return manager.create_configuration_from_dict(servers)
-
-
-def validate_mcp_configuration_dict(config_dict: Dict[str, Any]) -> ValidationResult:
- """
- Convenience function to validate configuration dictionary.
-
- Args:
- config_dict: Configuration dictionary to validate
-
- Returns:
- ValidationResult
- """
- validator = MCPConfigurationValidator()
- return validator.validate_json_config(config_dict)
\ No newline at end of file
diff --git a/archive/mcp_client_integration/common/connection_manager.py b/archive/mcp_client_integration/common/connection_manager.py
deleted file mode 100644
index c9c2f1622..000000000
--- a/archive/mcp_client_integration/common/connection_manager.py
+++ /dev/null
@@ -1,550 +0,0 @@
-#!/usr/bin/env python3
-"""
-MCP Connection Management Utilities
-
-This module provides standardized connection interfaces and factories for MCP clients,
-eliminating code duplication and providing consistent connection lifecycle management.
-"""
-
-import asyncio
-import logging
-import ssl
-from abc import ABC, abstractmethod
-from typing import Dict, Any, Optional, Union
-from urllib.parse import urlparse
-
-# Configure logging
-logger = logging.getLogger(__name__)
-
-
-class MCPConnectionInterface(ABC):
- """
- Standard interface for all MCP connections.
-
- This abstract base class defines the contract that all MCP connection
- implementations must follow, ensuring consistency across different
- connection types (external routes, cluster services, etc.).
- """
-
- @abstractmethod
- async def send_message(self, message: Dict[str, Any]) -> Dict[str, Any]:
- """
- Send a message through the connection.
-
- Args:
- message: The message to send
-
- Returns:
- Response from the server
-
- Raises:
- ConnectionError: If the connection is not available
- TimeoutError: If the message times out
- """
- pass
-
- @abstractmethod
- async def close(self) -> None:
- """
- Close the connection and cleanup resources.
-
- This method should be idempotent and safe to call multiple times.
- """
- pass
-
- @property
- @abstractmethod
- def connected(self) -> bool:
- """
- Check if the connection is currently active.
-
- Returns:
- True if connected, False otherwise
- """
- pass
-
- @property
- @abstractmethod
- def endpoint(self) -> str:
- """
- Get the endpoint this connection is connected to.
-
- Returns:
- The endpoint URL or address
- """
- pass
-
-
-class MockMCPConnection(MCPConnectionInterface):
- """
- Mock MCP connection for testing and development.
-
- This implementation provides a standardized mock connection that can be
- used across all MCP components for testing purposes, eliminating the
- duplicate MockConnection classes found in the original implementation.
- """
-
- def __init__(self, endpoint: str, simulate_failure: bool = False):
- """
- Initialize mock connection.
-
- Args:
- endpoint: The endpoint to simulate connection to
- simulate_failure: Whether to simulate connection failures
- """
- self._endpoint = endpoint
- self._connected = True
- self._simulate_failure = simulate_failure
- self._message_count = 0
-
- logger.debug(f"Created mock connection to {endpoint}")
-
- async def send_message(self, message: Dict[str, Any]) -> Dict[str, Any]:
- """Send message through mock connection."""
- if self._simulate_failure:
- raise ConnectionError("Simulated connection failure")
-
- if not self._connected:
- raise ConnectionError("Connection is not active")
-
- self._message_count += 1
-
- # Simulate message processing
- response = {
- "status": "ok",
- "endpoint": self._endpoint,
- "message_id": self._message_count,
- "echo": message,
- "timestamp": asyncio.get_event_loop().time()
- }
-
- logger.debug(f"Mock connection sent message {self._message_count} to {self._endpoint}")
- return response
-
- async def close(self) -> None:
- """Close mock connection."""
- if self._connected:
- self._connected = False
- logger.debug(f"Mock connection to {self._endpoint} closed")
-
- @property
- def connected(self) -> bool:
- """Check if mock connection is active."""
- return self._connected
-
- @property
- def endpoint(self) -> str:
- """Get mock connection endpoint."""
- return self._endpoint
-
- def set_failure_mode(self, simulate_failure: bool) -> None:
- """Set whether to simulate failures."""
- self._simulate_failure = simulate_failure
-
-
-class ExternalRouteMCPConnection(MCPConnectionInterface):
- """
- MCP connection for external routes (HTTPS/SSE).
-
- This implementation handles connections to external MCP servers via
- HTTPS routes, typically used in OpenShift environments.
- """
-
- def __init__(self, endpoint: str, timeout: int = 30, verify_ssl: bool = True):
- """
- Initialize external route connection.
-
- Args:
- endpoint: The HTTPS endpoint URL
- timeout: Connection timeout in seconds
- verify_ssl: Whether to verify SSL certificates (should be True in production)
- """
- self._endpoint = endpoint
- self._timeout = timeout
- self._verify_ssl = verify_ssl
- self._connected = False
- self._session = None
-
- # Validate endpoint format
- parsed = urlparse(endpoint)
- if parsed.scheme not in ['http', 'https']:
- raise ValueError(f"Invalid endpoint scheme: {parsed.scheme}")
-
- # Warn about insecure configurations
- if parsed.scheme == 'http':
- logger.warning(f"Insecure HTTP connection to {endpoint} - consider using HTTPS")
-
- if not verify_ssl:
- logger.warning(f"SSL verification disabled for {endpoint} - this is insecure for production")
-
- # Create SSL context for secure connections
- if parsed.scheme == 'https':
- self._ssl_context = ssl.create_default_context()
- if not verify_ssl:
- self._ssl_context.check_hostname = False
- self._ssl_context.verify_mode = ssl.CERT_NONE
- else:
- self._ssl_context = None
-
- logger.debug(f"Created external route connection to {endpoint} (SSL verify: {verify_ssl})")
-
- async def connect(self) -> None:
- """Establish connection to external route."""
- try:
- # In a real implementation, this would establish the SSE connection
- # For now, we simulate a successful connection
- self._connected = True
- logger.info(f"Connected to external route: {self._endpoint}")
- except Exception as e:
- logger.error(f"Failed to connect to {self._endpoint}: {e}")
- raise ConnectionError(f"Failed to connect to {self._endpoint}: {e}")
-
- async def send_message(self, message: Dict[str, Any]) -> Dict[str, Any]:
- """Send message through external route connection."""
- if not self._connected:
- raise ConnectionError("Connection not established")
-
- try:
- # In a real implementation, this would send via HTTPS/SSE
- # For now, we simulate a successful message exchange
- response = {
- "status": "ok",
- "endpoint": self._endpoint,
- "type": "external_route",
- "message": message,
- "timestamp": asyncio.get_event_loop().time()
- }
-
- logger.debug(f"Sent message via external route to {self._endpoint}")
- return response
-
- except Exception as e:
- logger.error(f"Failed to send message to {self._endpoint}: {e}")
- raise ConnectionError(f"Message send failed: {e}")
-
- async def close(self) -> None:
- """Close external route connection."""
- if self._connected:
- self._connected = False
- if self._session:
- await self._session.close()
- self._session = None
- logger.debug(f"External route connection to {self._endpoint} closed")
-
- @property
- def connected(self) -> bool:
- """Check if external route connection is active."""
- return self._connected
-
- @property
- def endpoint(self) -> str:
- """Get external route endpoint."""
- return self._endpoint
-
-
-class ClusterServiceMCPConnection(MCPConnectionInterface):
- """
- MCP connection for cluster-internal services.
-
- This implementation handles connections to MCP servers running as
- Kubernetes services within the same cluster.
- """
-
- def __init__(self, endpoint: str, timeout: int = 30):
- """
- Initialize cluster service connection.
-
- Args:
- endpoint: The cluster service endpoint
- timeout: Connection timeout in seconds
- """
- self._endpoint = endpoint
- self._timeout = timeout
- self._connected = False
- self._websocket = None
-
- # Parse cluster service format
- if '.svc.cluster.local' not in endpoint:
- raise ValueError(f"Invalid cluster service format: {endpoint}")
-
- logger.debug(f"Created cluster service connection to {endpoint}")
-
- async def connect(self) -> None:
- """Establish connection to cluster service."""
- try:
- # In a real implementation, this would establish WebSocket connection
- # For now, we simulate a successful connection
- self._connected = True
- logger.info(f"Connected to cluster service: {self._endpoint}")
- except Exception as e:
- logger.error(f"Failed to connect to {self._endpoint}: {e}")
- raise ConnectionError(f"Failed to connect to {self._endpoint}: {e}")
-
- async def send_message(self, message: Dict[str, Any]) -> Dict[str, Any]:
- """Send message through cluster service connection."""
- if not self._connected:
- raise ConnectionError("Connection not established")
-
- try:
- # In a real implementation, this would send via WebSocket
- # For now, we simulate a successful message exchange
- response = {
- "status": "ok",
- "endpoint": self._endpoint,
- "type": "cluster_service",
- "message": message,
- "timestamp": asyncio.get_event_loop().time()
- }
-
- logger.debug(f"Sent message via cluster service to {self._endpoint}")
- return response
-
- except Exception as e:
- logger.error(f"Failed to send message to {self._endpoint}: {e}")
- raise ConnectionError(f"Message send failed: {e}")
-
- async def close(self) -> None:
- """Close cluster service connection."""
- if self._connected:
- self._connected = False
- if self._websocket:
- await self._websocket.close()
- self._websocket = None
- logger.debug(f"Cluster service connection to {self._endpoint} closed")
-
- @property
- def connected(self) -> bool:
- """Check if cluster service connection is active."""
- return self._connected
-
- @property
- def endpoint(self) -> str:
- """Get cluster service endpoint."""
- return self._endpoint
-
-
-class MCPConnectionFactory:
- """
- Factory for creating appropriate MCP connection types.
-
- This factory eliminates code duplication by providing a centralized
- way to create connections based on endpoint format and requirements.
- """
-
- @staticmethod
- def create_connection(
- endpoint: str,
- connection_type: Optional[str] = None,
- timeout: int = 30,
- mock: bool = False,
- verify_ssl: bool = True
- ) -> MCPConnectionInterface:
- """
- Create appropriate connection type based on endpoint.
-
- Args:
- endpoint: The endpoint to connect to
- connection_type: Optional explicit connection type
- timeout: Connection timeout in seconds
- mock: Whether to create a mock connection
- verify_ssl: Whether to verify SSL certificates for HTTPS connections
-
- Returns:
- Appropriate connection implementation
-
- Raises:
- ValueError: If endpoint format is invalid
- """
- if mock:
- return MockMCPConnection(endpoint)
-
- # Auto-detect connection type if not specified
- if connection_type is None:
- if endpoint.startswith(('http://', 'https://')):
- connection_type = 'external_route'
- elif '.svc.cluster.local' in endpoint:
- connection_type = 'cluster_service'
- else:
- raise ValueError(f"Cannot determine connection type for endpoint: {endpoint}")
-
- # Create appropriate connection type
- if connection_type == 'external_route':
- return ExternalRouteMCPConnection(endpoint, timeout, verify_ssl)
- elif connection_type == 'cluster_service':
- return ClusterServiceMCPConnection(endpoint, timeout)
- elif connection_type == 'mock':
- return MockMCPConnection(endpoint)
- else:
- raise ValueError(f"Unsupported connection type: {connection_type}")
-
- @staticmethod
- def create_mock_connection(endpoint: str, simulate_failure: bool = False) -> MockMCPConnection:
- """
- Create a mock connection for testing.
-
- Args:
- endpoint: The endpoint to mock
- simulate_failure: Whether to simulate failures
-
- Returns:
- Mock connection instance
- """
- return MockMCPConnection(endpoint, simulate_failure)
-
-
-class MCPConnectionPool:
- """
- Connection pool for managing multiple MCP connections.
-
- This utility provides centralized connection lifecycle management,
- eliminating the need for each component to manage connections individually.
- """
-
- def __init__(self, timeout: int = 30, max_connections: int = 10):
- """
- Initialize connection pool.
-
- Args:
- timeout: Default connection timeout
- max_connections: Maximum number of connections allowed
- """
- self._connections: Dict[str, MCPConnectionInterface] = {}
- self._health: Dict[str, bool] = {}
- self._timeout = timeout
- self._max_connections = max_connections
- self._connection_count = 0
-
- logger.debug(f"Initialized MCP connection pool (max_connections={max_connections})")
-
- async def add_connection(
- self,
- capability: str,
- endpoint: str,
- connection_type: Optional[str] = None,
- mock: bool = False
- ) -> bool:
- """
- Add a connection to the pool.
-
- Args:
- capability: The capability name for this connection
- endpoint: The endpoint to connect to
- connection_type: Optional explicit connection type
- mock: Whether to create a mock connection
-
- Returns:
- True if connection was successfully added
- """
- # Check connection pool limits
- if self._connection_count >= self._max_connections:
- logger.error(f"Connection pool limit reached ({self._max_connections}). Cannot add '{capability}'")
- return False
-
- try:
- connection = MCPConnectionFactory.create_connection(
- endpoint, connection_type, self._timeout, mock
- )
-
- # Establish connection for non-mock connections
- if not mock and hasattr(connection, 'connect'):
- await connection.connect()
-
- self._connections[capability] = connection
- self._health[capability] = True
- self._connection_count += 1
-
- logger.info(f"Added connection for capability '{capability}' to {endpoint} ({self._connection_count}/{self._max_connections})")
- return True
-
- except Exception as e:
- logger.error(f"Failed to add connection for '{capability}': {e}")
- self._health[capability] = False
- return False
-
- async def send_message(
- self,
- capability: str,
- message: Dict[str, Any]
- ) -> Dict[str, Any]:
- """
- Send message through a specific capability connection.
-
- Args:
- capability: The capability to send message through
- message: The message to send
-
- Returns:
- Response from the server
-
- Raises:
- KeyError: If capability not found
- ConnectionError: If connection is not healthy
- """
- if capability not in self._connections:
- raise KeyError(f"No connection found for capability: {capability}")
-
- if not self._health.get(capability, False):
- raise ConnectionError(f"Connection for '{capability}' is not healthy")
-
- try:
- connection = self._connections[capability]
- return await connection.send_message(message)
- except Exception as e:
- # Mark connection as unhealthy on failure
- self._health[capability] = False
- logger.warning(f"Connection for '{capability}' marked unhealthy: {e}")
- raise
-
- async def health_check(self) -> Dict[str, bool]:
- """
- Check health of all connections.
-
- Returns:
- Dictionary mapping capability to health status
- """
- health_results = {}
-
- for capability, connection in self._connections.items():
- try:
- # Simple ping to check health
- await connection.send_message({"type": "ping"})
- health_results[capability] = True
- except Exception as e:
- logger.warning(f"Health check failed for '{capability}': {e}")
- health_results[capability] = False
-
- # Update internal health tracking
- self._health.update(health_results)
- return health_results
-
- async def close_all(self) -> None:
- """Close all connections in the pool."""
- for capability, connection in self._connections.items():
- try:
- await connection.close()
- logger.debug(f"Closed connection for capability '{capability}'")
- except Exception as e:
- logger.warning(f"Error closing connection for '{capability}': {e}")
-
- self._connections.clear()
- self._health.clear()
- self._connection_count = 0
- logger.info("All connections closed")
-
- def get_health_status(self) -> Dict[str, bool]:
- """Get current health status of all connections."""
- return self._health.copy()
-
- def get_connection_info(self) -> Dict[str, Dict[str, Any]]:
- """Get detailed information about all connections."""
- info = {}
-
- for capability, connection in self._connections.items():
- info[capability] = {
- "endpoint": connection.endpoint,
- "connected": connection.connected,
- "healthy": self._health.get(capability, False),
- "type": type(connection).__name__
- }
-
- return info
\ No newline at end of file
diff --git a/archive/mcp_client_integration/common/error_handler.py b/archive/mcp_client_integration/common/error_handler.py
deleted file mode 100644
index d239d5c97..000000000
--- a/archive/mcp_client_integration/common/error_handler.py
+++ /dev/null
@@ -1,429 +0,0 @@
-#!/usr/bin/env python3
-"""
-MCP Error Handling Utilities
-
-This module provides standardized error handling for MCP operations,
-consolidating error handling patterns and providing consistent error
-responses across all MCP components.
-"""
-
-import asyncio
-import logging
-import traceback
-from contextlib import asynccontextmanager
-from typing import Dict, Any, Optional, Union, Type
-from dataclasses import dataclass
-from enum import Enum
-
-# Configure logging
-logger = logging.getLogger(__name__)
-
-
-class MCPErrorCategory(Enum):
- """Categories of MCP errors for structured error handling."""
- CONNECTION = "connection"
- CONFIGURATION = "configuration"
- VALIDATION = "validation"
- PROTOCOL = "protocol"
- TIMEOUT = "timeout"
- AUTHENTICATION = "authentication"
- UNKNOWN = "unknown"
-
-
-@dataclass
-class MCPErrorContext:
- """Context information for MCP errors."""
- operation: str
- endpoint: Optional[str] = None
- capability: Optional[str] = None
- message_id: Optional[str] = None
- additional_info: Optional[Dict[str, Any]] = None
-
-
-class MCPError(Exception):
- """
- Base MCP error class with structured error information.
-
- This class provides a standardized way to represent errors
- across all MCP operations.
- """
-
- def __init__(
- self,
- message: str,
- category: MCPErrorCategory = MCPErrorCategory.UNKNOWN,
- context: Optional[MCPErrorContext] = None,
- original_error: Optional[Exception] = None
- ):
- """
- Initialize MCP error.
-
- Args:
- message: Human-readable error message
- category: Error category for classification
- context: Context information about the error
- original_error: Original exception that caused this error
- """
- super().__init__(message)
- self.message = message
- self.category = category
- self.context = context or MCPErrorContext(operation="unknown")
- self.original_error = original_error
-
- logger.debug(f"MCPError created: {category.value} - {message}")
-
- def to_dict(self) -> Dict[str, Any]:
- """Convert error to dictionary format."""
- error_dict = {
- "error": True,
- "message": self.message,
- "category": self.category.value,
- "operation": self.context.operation
- }
-
- if self.context.endpoint:
- error_dict["endpoint"] = self.context.endpoint
-
- if self.context.capability:
- error_dict["capability"] = self.context.capability
-
- if self.context.message_id:
- error_dict["message_id"] = self.context.message_id
-
- if self.context.additional_info:
- error_dict["additional_info"] = self.context.additional_info
-
- if self.original_error:
- error_dict["original_error"] = str(self.original_error)
- error_dict["original_error_type"] = type(self.original_error).__name__
-
- return error_dict
-
-
-class MCPConnectionError(MCPError):
- """Error related to MCP connection issues."""
-
- def __init__(self, message: str, endpoint: str, original_error: Optional[Exception] = None):
- context = MCPErrorContext(operation="connection", endpoint=endpoint)
- super().__init__(message, MCPErrorCategory.CONNECTION, context, original_error)
-
-
-class MCPConfigurationError(MCPError):
- """Error related to MCP configuration issues."""
-
- def __init__(self, message: str, original_error: Optional[Exception] = None):
- context = MCPErrorContext(operation="configuration")
- super().__init__(message, MCPErrorCategory.CONFIGURATION, context, original_error)
-
-
-class MCPValidationError(MCPError):
- """Error related to MCP validation issues."""
-
- def __init__(self, message: str, endpoint: Optional[str] = None, original_error: Optional[Exception] = None):
- context = MCPErrorContext(operation="validation", endpoint=endpoint)
- super().__init__(message, MCPErrorCategory.VALIDATION, context, original_error)
-
-
-class MCPProtocolError(MCPError):
- """Error related to MCP protocol issues."""
-
- def __init__(self, message: str, capability: Optional[str] = None, message_id: Optional[str] = None, original_error: Optional[Exception] = None):
- context = MCPErrorContext(operation="protocol", capability=capability, message_id=message_id)
- super().__init__(message, MCPErrorCategory.PROTOCOL, context, original_error)
-
-
-class MCPTimeoutError(MCPError):
- """Error related to MCP timeout issues."""
-
- def __init__(self, message: str, endpoint: str, timeout_seconds: int, original_error: Optional[Exception] = None):
- context = MCPErrorContext(
- operation="timeout",
- endpoint=endpoint,
- additional_info={"timeout_seconds": timeout_seconds}
- )
- super().__init__(message, MCPErrorCategory.TIMEOUT, context, original_error)
-
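Aside for readers skimming this removal: each deleted subclass pre-fills its category and context so call sites only supply the message and endpoint details. A self-contained sketch of that pattern (hypothetical names, not the archived `mcp_client_integration` API):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Any, Dict, Optional

class ErrorCategory(Enum):
    CONNECTION = "connection"
    TIMEOUT = "timeout"
    UNKNOWN = "unknown"

@dataclass
class ErrorContext:
    operation: str
    endpoint: Optional[str] = None
    additional_info: Optional[Dict[str, Any]] = None

class BaseError(Exception):
    """Base error carrying a category and structured context."""
    def __init__(self, message: str,
                 category: ErrorCategory = ErrorCategory.UNKNOWN,
                 context: Optional[ErrorContext] = None):
        super().__init__(message)
        self.message = message
        self.category = category
        self.context = context or ErrorContext(operation="unknown")

    def to_dict(self) -> Dict[str, Any]:
        # Serialize only the fields that are actually set.
        d = {"error": True, "message": self.message,
             "category": self.category.value,
             "operation": self.context.operation}
        if self.context.endpoint:
            d["endpoint"] = self.context.endpoint
        if self.context.additional_info:
            d["additional_info"] = self.context.additional_info
        return d

class ProbeTimeoutError(BaseError):
    """Subclass pre-fills category and context, as the removed MCPTimeoutError did."""
    def __init__(self, message: str, endpoint: str, timeout_seconds: int):
        ctx = ErrorContext("timeout", endpoint,
                           {"timeout_seconds": timeout_seconds})
        super().__init__(message, ErrorCategory.TIMEOUT, ctx)
```

Callers can then branch on `err.category` or ship `err.to_dict()` over the wire without string parsing.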
-
-class MCPErrorHandler:
- """
- Standardized error handling for MCP operations.
-
- This class provides consistent error handling patterns,
- logging, and error response formatting across all MCP components.
- """
-
- def __init__(self, logger_name: Optional[str] = None, max_error_history: int = 1000):
- """
- Initialize error handler.
-
- Args:
- logger_name: Optional logger name, defaults to class module
- max_error_history: Maximum number of error entries to track (prevents memory leaks)
- """
- self.logger = logging.getLogger(logger_name or __name__)
- self._error_counts: Dict[str, int] = {}
- self._max_error_history = max_error_history
-
-        self.logger.debug(f"MCPErrorHandler initialized (max_error_history={max_error_history})")
-
- @asynccontextmanager
- async def handle_connection_errors(self, operation_name: str, endpoint: str):
- """
- Context manager for connection error handling.
-
- Args:
- operation_name: Name of the operation being performed
- endpoint: Endpoint being connected to
- """
- try:
- yield
-        except asyncio.TimeoutError as e:
-            error_msg = f"Connection timeout during {operation_name} to {endpoint}"
-            self.logger.error(error_msg)
-            self._increment_error_count(f"timeout_{operation_name}")
-            raise MCPTimeoutError(error_msg, endpoint, 30, e) from e
-        except ConnectionError as e:
-            error_msg = f"Connection failed during {operation_name} to {endpoint}: {e}"
-            self.logger.error(error_msg)
-            self._increment_error_count(f"connection_{operation_name}")
-            raise MCPConnectionError(error_msg, endpoint, e) from e
-        except Exception as e:
-            error_msg = f"Unexpected error during {operation_name} to {endpoint}: {e}"
-            self.logger.error(error_msg, exc_info=True)
-            self._increment_error_count(f"unknown_{operation_name}")
-            raise MCPError(error_msg, MCPErrorCategory.UNKNOWN,
-                           MCPErrorContext(operation_name, endpoint), e) from e
-
- @asynccontextmanager
- async def handle_protocol_errors(self, operation_name: str, capability: Optional[str] = None):
- """
- Context manager for protocol error handling.
-
- Args:
- operation_name: Name of the operation being performed
- capability: Optional capability being used
- """
- try:
- yield
- except MCPError:
- # Re-raise MCP errors as-is
- raise
- except asyncio.TimeoutError as e:
- error_msg = f"Protocol timeout during {operation_name}"
- if capability:
- error_msg += f" for capability {capability}"
- self.logger.error(error_msg)
- self._increment_error_count(f"protocol_timeout_{operation_name}")
- raise MCPTimeoutError(error_msg, capability or "unknown", 30, e)
- except Exception as e:
- error_msg = f"Protocol error during {operation_name}: {e}"
- self.logger.error(error_msg, exc_info=True)
- self._increment_error_count(f"protocol_{operation_name}")
- raise MCPProtocolError(error_msg, capability, original_error=e)
-
- def handle_configuration_error(self, operation_name: str, error: Exception) -> MCPConfigurationError:
- """
- Handle configuration errors.
-
- Args:
- operation_name: Name of the operation that failed
- error: Original error
-
- Returns:
- MCPConfigurationError with standardized format
- """
- error_msg = f"Configuration error during {operation_name}: {error}"
- self.logger.error(error_msg)
- self._increment_error_count(f"config_{operation_name}")
- return MCPConfigurationError(error_msg, error)
-
- def handle_validation_error(self, operation_name: str, endpoint: Optional[str], error: Exception) -> MCPValidationError:
- """
- Handle validation errors.
-
- Args:
- operation_name: Name of the operation that failed
- endpoint: Optional endpoint being validated
- error: Original error
-
- Returns:
- MCPValidationError with standardized format
- """
- error_msg = f"Validation error during {operation_name}"
- if endpoint:
- error_msg += f" for endpoint {endpoint}"
- error_msg += f": {error}"
-
- self.logger.error(error_msg)
- self._increment_error_count(f"validation_{operation_name}")
- return MCPValidationError(error_msg, endpoint, error)
-
- def format_error_response(self, error: Union[Exception, MCPError], context: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
- """
- Standardize error response format.
-
- Args:
- error: The error to format
- context: Optional additional context
-
- Returns:
- Standardized error response dictionary
- """
- if isinstance(error, MCPError):
- response = error.to_dict()
- else:
- response = {
- "error": True,
- "message": str(error),
- "category": MCPErrorCategory.UNKNOWN.value,
- "operation": "unknown",
- "original_error_type": type(error).__name__
- }
-
- # Add additional context if provided
- if context:
- response["context"] = context
-
- # Add timestamp
- response["timestamp"] = asyncio.get_event_loop().time()
-
- # Add error tracking info
- response["error_counts"] = self.get_error_summary()
-
- return response
-
- def create_success_response(self, data: Any, operation: str, context: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
- """
- Create standardized success response.
-
- Args:
- data: The success data
- operation: Operation that succeeded
- context: Optional additional context
-
- Returns:
- Standardized success response dictionary
- """
- response = {
- "error": False,
- "success": True,
- "data": data,
- "operation": operation,
- "timestamp": asyncio.get_event_loop().time()
- }
-
- if context:
- response["context"] = context
-
- return response
-
-    def _increment_error_count(self, error_type: str) -> None:
-        """Increment error count for tracking with memory limit protection."""
-        # Prevent unbounded growth: once the limit is hit, drop half of the
-        # tracked keys. Keys are sorted alphabetically, so this is a simple
-        # size cap rather than an age-based (LRU) eviction.
-        if len(self._error_counts) >= self._max_error_history:
-            sorted_keys = sorted(self._error_counts.keys())
-            keys_to_remove = sorted_keys[:len(sorted_keys) // 2]
-            for key in keys_to_remove:
-                del self._error_counts[key]
-            self.logger.debug(f"Cleaned up {len(keys_to_remove)} old error count entries")
-
-        self._error_counts[error_type] = self._error_counts.get(error_type, 0) + 1
-
- def get_error_summary(self) -> Dict[str, int]:
- """Get summary of error counts."""
- return self._error_counts.copy()
-
-    def reset_error_counts(self) -> None:
-        """Reset error count tracking."""
-        self._error_counts.clear()
-        self.logger.debug("Error counts reset")
-
- def log_error_with_context(
- self,
- error: Exception,
- operation: str,
- endpoint: Optional[str] = None,
- capability: Optional[str] = None,
- additional_context: Optional[Dict[str, Any]] = None
- ) -> None:
- """
- Log error with structured context information.
-
- Args:
- error: The error to log
- operation: Operation that failed
- endpoint: Optional endpoint
- capability: Optional capability
- additional_context: Optional additional context
- """
- context_info = {
- "operation": operation,
- "error_type": type(error).__name__,
- "error_message": str(error)
- }
-
- if endpoint:
- context_info["endpoint"] = endpoint
-
- if capability:
- context_info["capability"] = capability
-
- if additional_context:
- context_info.update(additional_context)
-
- self.logger.error(
- f"MCP operation failed: {operation}",
- extra={"mcp_context": context_info},
- exc_info=True
- )
-
-
-# Global error handler instance for convenience
-default_error_handler = MCPErrorHandler()
-
-
-def handle_mcp_errors(operation: str, endpoint: Optional[str] = None):
- """
- Decorator for automatic MCP error handling.
-
- Args:
- operation: Name of the operation
- endpoint: Optional endpoint being operated on
- """
-    def decorator(func):
-        if asyncio.iscoroutinefunction(func):
-            @functools.wraps(func)  # preserves __name__/__doc__; needs "import functools" at module top
-            async def async_wrapper(*args, **kwargs):
-                try:
-                    return await func(*args, **kwargs)
-                except MCPError:
-                    raise  # Re-raise MCP errors as-is
-                except Exception as e:
-                    default_error_handler.log_error_with_context(
-                        e, operation, endpoint
-                    )
-                    raise MCPError(
-                        f"Error in {operation}: {e}",
-                        MCPErrorCategory.UNKNOWN,
-                        MCPErrorContext(operation, endpoint),
-                        e
-                    ) from e
-            return async_wrapper
-
-        @functools.wraps(func)
-        def sync_wrapper(*args, **kwargs):
-            try:
-                return func(*args, **kwargs)
-            except MCPError:
-                raise  # Re-raise MCP errors as-is
-            except Exception as e:
-                default_error_handler.log_error_with_context(
-                    e, operation, endpoint
-                )
-                raise MCPError(
-                    f"Error in {operation}: {e}",
-                    MCPErrorCategory.UNKNOWN,
-                    MCPErrorContext(operation, endpoint),
-                    e
-                ) from e
-        return sync_wrapper
-    return decorator
\ No newline at end of file
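The decorator removed above wraps both sync and async callables and converts foreign exceptions into the library's own error type. The same pattern in a standalone, runnable form (hypothetical names, not the archived module):

```python
import asyncio
import functools

class WrappedError(Exception):
    """Uniform error type; keeps a handle on the original exception."""
    def __init__(self, message: str, original: Exception):
        super().__init__(message)
        self.original = original

def handle_errors(operation: str):
    """Convert any non-WrappedError raised by the target into WrappedError."""
    def decorator(func):
        if asyncio.iscoroutinefunction(func):
            @functools.wraps(func)
            async def async_wrapper(*args, **kwargs):
                try:
                    return await func(*args, **kwargs)
                except WrappedError:
                    raise  # already in our error taxonomy
                except Exception as e:
                    raise WrappedError(f"Error in {operation}: {e}", e) from e
            return async_wrapper

        @functools.wraps(func)
        def sync_wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except WrappedError:
                raise
            except Exception as e:
                raise WrappedError(f"Error in {operation}: {e}", e) from e
        return sync_wrapper
    return decorator

@handle_errors("divide")
def divide(a: float, b: float) -> float:
    return a / b
```

`functools.wraps` keeps the wrapped function's name and docstring intact, and `raise ... from e` preserves the original traceback for debugging.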
diff --git a/archive/mcp_client_integration/common/validation.py b/archive/mcp_client_integration/common/validation.py
deleted file mode 100644
index 30bc674cd..000000000
--- a/archive/mcp_client_integration/common/validation.py
+++ /dev/null
@@ -1,701 +0,0 @@
-#!/usr/bin/env python3
-"""
-MCP Endpoint Validation Utilities
-
-This module provides standardized validation utilities for MCP endpoints,
-consolidating validation logic from across the codebase into a single,
-reusable interface.
-"""
-
-import re
-import logging
-import json
-from dataclasses import dataclass
-from typing import Dict, Any, Optional, List
-from urllib.parse import urlparse
-
-# Configure logging
-logger = logging.getLogger(__name__)
-
-
-@dataclass
-class ValidationResult:
- """
- Standardized validation result structure.
-
- This class provides a consistent way to return validation results
- across all validation operations.
- """
- valid: bool
- error_message: Optional[str] = None
- endpoint_type: Optional[str] = None
- parsed_info: Optional[Dict[str, Any]] = None
- details: Optional[Dict[str, Any]] = None
-
- def to_dict(self) -> Dict[str, Any]:
- """Convert validation result to dictionary."""
- return {
- "valid": self.valid,
- "error_message": self.error_message,
- "endpoint_type": self.endpoint_type,
- "parsed_info": self.parsed_info
- }
-
-
-class MCPEndpointValidator:
- """
- Unified endpoint validation utility.
-
- This class consolidates all endpoint validation logic from the original
- implementation, providing a single interface for validating MCP endpoints
- in various formats.
- """
-
- def __init__(self):
- """Initialize validator with compiled regex patterns."""
- # Compiled regex patterns for performance
- self._external_route_pattern = re.compile(
- r'^https?://[a-zA-Z0-9\-\.]+\.[a-zA-Z]{2,}(?::\d+)?(?:/[a-zA-Z0-9\-\.\/]*)?(?:\?.*)?$'
- )
-
- self._cluster_service_pattern = re.compile(
- r'^[a-zA-Z0-9\-]+\.[a-zA-Z0-9\-]+\.svc\.cluster\.local(?::\d+)?$'
- )
-
- self._hostname_pattern = re.compile(
- r'^[a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?(\.[a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)*$'
- )
-
- self._k8s_name_pattern = re.compile(
- r'^[a-z0-9]([a-z0-9\-]{0,61}[a-z0-9])?$'
- )
-
- logger.debug("MCPEndpointValidator initialized with compiled patterns")
-
- def validate_endpoint(self, endpoint: str) -> ValidationResult:
- """
- Validate any MCP endpoint format.
-
- Args:
- endpoint: The endpoint to validate
-
- Returns:
- ValidationResult with validation outcome and details
- """
- if not endpoint or not isinstance(endpoint, str):
- return ValidationResult(
- valid=False,
- error_message="Endpoint must be a non-empty string"
- )
-
- endpoint = endpoint.strip()
-
- # Check for external route format
- if self._is_external_route(endpoint):
- return self._validate_external_route(endpoint)
-
- # Check for cluster service format
- if self._is_cluster_service(endpoint):
- return self._validate_cluster_service(endpoint)
-
- return ValidationResult(
- valid=False,
- error_message=f"Endpoint format not recognized: {endpoint}"
- )
-
- def validate_configuration(self, config: Dict[str, str]) -> Dict[str, ValidationResult]:
- """
- Validate entire MCP server configuration.
-
- Args:
- config: Dictionary mapping capability names to endpoints
-
- Returns:
- Dictionary mapping capability names to validation results
- """
- results = {}
-
- for capability, endpoint in config.items():
- results[capability] = self.validate_endpoint(endpoint)
-
- return results
-
- def validate_configuration_dict(self, config: Dict[str, str]) -> ValidationResult:
- """
- Validate configuration dictionary as a whole.
-
- Args:
- config: Configuration dictionary to validate
-
- Returns:
- Overall validation result
- """
- if not isinstance(config, dict):
- return ValidationResult(
- valid=False,
- error_message="Configuration must be a dictionary"
- )
-
- if not config:
- return ValidationResult(
- valid=False,
- error_message="Configuration must contain at least one server definition"
- )
-
- # Validate each endpoint
- invalid_endpoints = []
- results = self.validate_configuration(config)
-
- for capability, result in results.items():
- if not result.valid:
- invalid_endpoints.append(f"{capability}: {result.error_message}")
-
- if invalid_endpoints:
- return ValidationResult(
- valid=False,
- error_message=f"Invalid endpoints found: {'; '.join(invalid_endpoints)}"
- )
-
- return ValidationResult(
- valid=True,
- parsed_info={"endpoint_count": len(config), "capabilities": list(config.keys())}
- )
-
- def _is_external_route(self, endpoint: str) -> bool:
- """Check if endpoint is an external route format."""
- return endpoint.startswith('http://') or endpoint.startswith('https://')
-
- def _is_cluster_service(self, endpoint: str) -> bool:
- """Check if endpoint is a cluster service format."""
- return '.svc.cluster.local' in endpoint
-
- def _validate_external_route(self, endpoint: str) -> ValidationResult:
- """
- Validate external route endpoint format.
-
- Args:
- endpoint: External route endpoint to validate
-
- Returns:
- ValidationResult with validation outcome
- """
- try:
- parsed = urlparse(endpoint)
-
- # Must have scheme
- if not parsed.scheme or parsed.scheme not in ['http', 'https']:
- return ValidationResult(
- valid=False,
- error_message=f"Invalid scheme: {parsed.scheme}. Must be http or https"
- )
-
- # Must have hostname
- if not parsed.hostname:
- return ValidationResult(
- valid=False,
- error_message="Missing hostname in URL"
- )
-
- # Validate hostname format
- if not self._is_valid_hostname(parsed.hostname):
- return ValidationResult(
- valid=False,
- error_message=f"Invalid hostname format: {parsed.hostname}"
- )
-
- # Validate port if specified
- if parsed.port is not None:
- if not (1 <= parsed.port <= 65535):
- return ValidationResult(
- valid=False,
- error_message=f"Invalid port number: {parsed.port}"
- )
-
- # Path validation (optional)
- if parsed.path and not self._is_valid_path(parsed.path):
- return ValidationResult(
- valid=False,
- error_message=f"Invalid path format: {parsed.path}"
- )
-
- return ValidationResult(
- valid=True,
- endpoint_type="external_route",
- parsed_info={
- "scheme": parsed.scheme,
- "hostname": parsed.hostname,
- "port": parsed.port,
- "path": parsed.path
- }
- )
-
- except Exception as e:
- return ValidationResult(
- valid=False,
- error_message=f"URL parsing error: {e}"
- )
-
- def _validate_cluster_service(self, endpoint: str) -> ValidationResult:
- """
- Validate cluster service endpoint format.
-
- Args:
- endpoint: Cluster service endpoint to validate
-
- Returns:
- ValidationResult with validation outcome
- """
- try:
- # Split endpoint and port if present
- port = None
- if ':' in endpoint:
- service_part, port_part = endpoint.rsplit(':', 1)
- try:
- port = int(port_part)
- if not (1 <= port <= 65535):
- return ValidationResult(
- valid=False,
- error_message=f"Invalid port number: {port}"
- )
- except ValueError:
- return ValidationResult(
- valid=False,
- error_message=f"Invalid port format: {port_part}"
- )
- else:
- service_part = endpoint
-
- # Validate service name format: service.namespace.svc.cluster.local
- parts = service_part.split('.')
- if len(parts) < 5: # minimum: service.namespace.svc.cluster.local
- return ValidationResult(
- valid=False,
- error_message="Incomplete cluster service format. Expected: service.namespace.svc.cluster.local"
- )
-
- if parts[-3:] != ['svc', 'cluster', 'local']:
- return ValidationResult(
- valid=False,
- error_message="Invalid cluster service domain. Must end with .svc.cluster.local"
- )
-
- # Validate service and namespace names
- service_name = parts[0]
- namespace = parts[1]
-
- if not self._is_valid_k8s_name(service_name):
- return ValidationResult(
- valid=False,
- error_message=f"Invalid service name format: {service_name}"
- )
-
- if not self._is_valid_k8s_name(namespace):
- return ValidationResult(
- valid=False,
- error_message=f"Invalid namespace format: {namespace}"
- )
-
- return ValidationResult(
- valid=True,
- endpoint_type="cluster_service",
- parsed_info={
- "service": service_name,
- "namespace": namespace,
- "domain": '.'.join(parts[2:]),
- "port": port
- }
- )
-
- except Exception as e:
- return ValidationResult(
- valid=False,
- error_message=f"Cluster service parsing error: {e}"
- )
-
- def _is_valid_hostname(self, hostname: str) -> bool:
- """
- Validate hostname format.
-
- Args:
- hostname: Hostname to validate
-
- Returns:
- True if valid, False otherwise
- """
- if not hostname or len(hostname) > 253:
- return False
-
- return bool(self._hostname_pattern.match(hostname))
-
- def _is_valid_path(self, path: str) -> bool:
- """
- Validate URL path format.
-
- Args:
- path: URL path to validate
-
- Returns:
- True if valid, False otherwise
- """
- if not path:
- return True
-
- # Basic path validation - allow alphanumeric, hyphens, slashes, dots
- path_pattern = re.compile(r'^[a-zA-Z0-9\-\./]*$')
- return bool(path_pattern.match(path))
-
- def _is_valid_k8s_name(self, name: str) -> bool:
- """
- Validate Kubernetes resource name format.
-
- Args:
- name: Kubernetes name to validate
-
- Returns:
- True if valid, False otherwise
- """
- if not name or len(name) > 63:
- return False
-
- return bool(self._k8s_name_pattern.match(name))
-
- def get_endpoint_info(self, endpoint: str) -> Dict[str, Any]:
- """
- Get detailed information about an endpoint.
-
- Args:
- endpoint: The endpoint to analyze
-
- Returns:
- Dictionary with endpoint information
- """
- result = self.validate_endpoint(endpoint)
-
- info = {
- "endpoint": endpoint,
- "valid": result.valid,
- "type": result.endpoint_type,
- "parsed": result.parsed_info
- }
-
- if not result.valid:
- info["error"] = result.error_message
-
- return info
-
-
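The cluster-service branch of the validator above boils down to: optionally split a port, require at least `service.namespace.svc.cluster.local`, and check both leading labels against the Kubernetes name rules. A compact standalone sketch of that logic (not the archived class):

```python
import re
from typing import Optional

# RFC 1123 label: lowercase alphanumerics and hyphens, no leading/trailing hyphen.
K8S_NAME = re.compile(r'^[a-z0-9]([a-z0-9\-]{0,61}[a-z0-9])?$')

def parse_cluster_service(endpoint: str) -> Optional[dict]:
    """Parse 'service.namespace.svc.cluster.local[:port]'; return None if malformed."""
    host, port = endpoint, None
    if ':' in endpoint:
        host, port_part = endpoint.rsplit(':', 1)
        if not port_part.isdigit() or not 1 <= int(port_part) <= 65535:
            return None
        port = int(port_part)
    parts = host.split('.')
    if len(parts) < 5 or parts[-3:] != ['svc', 'cluster', 'local']:
        return None
    service, namespace = parts[0], parts[1]
    if not (K8S_NAME.match(service) and K8S_NAME.match(namespace)):
        return None
    return {"service": service, "namespace": namespace, "port": port}
```

Returning a parsed dict (rather than just a boolean) lets callers log which namespace a capability resolves to, mirroring the `parsed_info` field of `ValidationResult`.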
-class MCPConfigurationValidator:
- """
- Configuration validation utility for MCP servers.
-
- This class provides high-level configuration validation,
- combining endpoint validation with configuration format validation.
- """
-
- def __init__(self):
- """Initialize configuration validator."""
- self.endpoint_validator = MCPEndpointValidator()
- logger.debug("MCPConfigurationValidator initialized")
-
- def validate_json_config(self, json_data: Any) -> ValidationResult:
- """
- Validate JSON configuration data.
-
- Args:
- json_data: Parsed JSON data to validate
-
- Returns:
- ValidationResult with validation outcome
- """
- # Check if it's a dictionary
- if not isinstance(json_data, dict):
- return ValidationResult(
- valid=False,
- error_message="MCP_SERVERS must be a JSON object with server definitions"
- )
-
- # Check if it's empty
- if not json_data:
- return ValidationResult(
- valid=False,
- error_message="MCP_SERVERS must contain at least one server definition"
- )
-
- # Validate each server configuration
- return self.endpoint_validator.validate_configuration_dict(json_data)
-
- def validate_environment_config(self, env_value: str) -> ValidationResult:
- """
- Validate environment variable configuration.
-
- Args:
- env_value: Environment variable value to validate
-
- Returns:
- ValidationResult with validation outcome
- """
- if not env_value or not isinstance(env_value, str):
- return ValidationResult(
- valid=False,
- error_message="Environment variable must be a non-empty string"
- )
-
-        try:
-            json_data = json.loads(env_value)
-            return self.validate_json_config(json_data)
- except json.JSONDecodeError as e:
- return ValidationResult(
- valid=False,
- error_message=f"Invalid JSON in MCP_SERVERS environment variable: {e}"
- )
-
-
-class MCPSecurityValidator:
- """
- Security-focused validation for MCP configurations.
-
- This validator implements additional security checks to prevent
- common security vulnerabilities in MCP configurations.
- """
-
- # Security constraints
- MAX_CONFIG_SIZE = 50 * 1024 # 50KB max config size
- MAX_ENDPOINTS = 50 # Maximum number of endpoints
- MAX_TIMEOUT = 300 # Maximum timeout in seconds
- MIN_TIMEOUT = 1 # Minimum timeout in seconds
- ALLOWED_SCHEMES = {'https', 'http'} # Allow http for localhost/testing
- BLOCKED_HOSTS = {'localhost', '127.0.0.1', '0.0.0.0', '::1'} # Block local hosts in production
-
- def __init__(self, production_mode: bool = False):
- """
- Initialize security validator.
-
- Args:
- production_mode: If True, applies stricter security checks
- """
- self.production_mode = production_mode
- self.base_validator = MCPEndpointValidator()
-
- # In production mode, only allow HTTPS
- if production_mode:
- self.ALLOWED_SCHEMES = {'https'}
-
- logger.debug(f"MCPSecurityValidator initialized (production_mode={production_mode})")
-
- def validate_configuration_security(self, config_data: Any) -> ValidationResult:
- """
- Comprehensive security validation of MCP configuration.
-
- Args:
- config_data: Configuration data to validate
-
- Returns:
- ValidationResult with security validation outcome
- """
- # Check configuration size
- try:
- config_str = json.dumps(config_data) if not isinstance(config_data, str) else config_data
- if len(config_str.encode('utf-8')) > self.MAX_CONFIG_SIZE:
- return ValidationResult(
- valid=False,
- error_message=f"Configuration exceeds maximum size of {self.MAX_CONFIG_SIZE} bytes"
- )
- except Exception as e:
- return ValidationResult(
- valid=False,
- error_message=f"Failed to serialize configuration for size check: {e}"
- )
-
- # Validate JSON structure for injection attempts
- if isinstance(config_data, str):
- try:
- config_data = json.loads(config_data)
- except json.JSONDecodeError as e:
- return ValidationResult(
- valid=False,
- error_message=f"Invalid JSON configuration: {e}"
- )
-
- # Check if it's a valid dictionary
- if not isinstance(config_data, dict):
- return ValidationResult(
- valid=False,
- error_message="Configuration must be a JSON object"
- )
-
- # Check number of endpoints
- if len(config_data) > self.MAX_ENDPOINTS:
- return ValidationResult(
- valid=False,
- error_message=f"Configuration exceeds maximum of {self.MAX_ENDPOINTS} endpoints"
- )
-
- # Validate each endpoint for security issues
- for capability, endpoint_config in config_data.items():
- # Validate capability name
- security_result = self._validate_capability_name(capability)
- if not security_result.valid:
- return security_result
-
- # Handle both string and dict endpoint configurations
- if isinstance(endpoint_config, str):
- endpoint = endpoint_config
- timeout = 30 # default
- elif isinstance(endpoint_config, dict):
- endpoint = endpoint_config.get("endpoint")
- timeout = endpoint_config.get("timeout", 30)
-
- # Validate timeout bounds
- if not isinstance(timeout, (int, float)) or timeout < self.MIN_TIMEOUT or timeout > self.MAX_TIMEOUT:
- return ValidationResult(
- valid=False,
- error_message=f"Timeout for '{capability}' must be between {self.MIN_TIMEOUT} and {self.MAX_TIMEOUT} seconds"
- )
- else:
- return ValidationResult(
- valid=False,
- error_message=f"Invalid endpoint configuration for '{capability}': must be string or object"
- )
-
- # Validate endpoint security
- security_result = self._validate_endpoint_security(capability, endpoint)
- if not security_result.valid:
- return security_result
-
- return ValidationResult(
- valid=True,
- parsed_info={
- "endpoints_validated": len(config_data),
- "security_checks_passed": True,
- "production_mode": self.production_mode
- }
- )
-
- def _validate_capability_name(self, capability: str) -> ValidationResult:
- """
- Validate capability name for security issues.
-
- Args:
- capability: Capability name to validate
-
- Returns:
- ValidationResult
- """
- # Check for dangerous characters
- if not isinstance(capability, str):
- return ValidationResult(
- valid=False,
- error_message="Capability name must be a string"
- )
-
- # Check length
- if len(capability) > 100:
- return ValidationResult(
- valid=False,
- error_message="Capability name too long (max 100 characters)"
- )
-
- # Check for injection patterns
- dangerous_patterns = [
- r'[<>"\'\`]', # HTML/script injection
- r'[;|&]', # Command injection
- r'\$\{', # Variable expansion
- r'\.\./', # Path traversal
- r'__.*__', # Python internals
- ]
-
- for pattern in dangerous_patterns:
- if re.search(pattern, capability):
- return ValidationResult(
- valid=False,
- error_message=f"Capability name contains potentially dangerous characters: {capability}"
- )
-
- return ValidationResult(valid=True)
-
- def _validate_endpoint_security(self, capability: str, endpoint: str) -> ValidationResult:
- """
- Validate endpoint URL for security issues.
-
- Args:
- capability: Capability name
- endpoint: Endpoint URL to validate
-
- Returns:
- ValidationResult
- """
- if not isinstance(endpoint, str):
- return ValidationResult(
- valid=False,
- error_message=f"Endpoint for '{capability}' must be a string"
- )
-
- # Parse URL - handle cluster service format which doesn't have scheme
- try:
- # Check if this is a cluster service format first
- if '.svc.cluster.local' in endpoint and not endpoint.startswith(('http://', 'https://')):
- # This is a cluster service, skip URL parsing for scheme validation
- parsed = None
- else:
- parsed = urlparse(endpoint)
- except Exception as e:
- return ValidationResult(
- valid=False,
- error_message=f"Invalid URL format for '{capability}': {e}"
- )
-
- # Check scheme only for URLs that have schemes
- if parsed and parsed.scheme and parsed.scheme.lower() not in self.ALLOWED_SCHEMES:
- return ValidationResult(
- valid=False,
- error_message=f"Unsupported URL scheme '{parsed.scheme}' for '{capability}'. Allowed: {', '.join(self.ALLOWED_SCHEMES)}"
- )
-
- # In production mode, check for blocked hosts (only for URLs with schemes)
- if self.production_mode and parsed and parsed.hostname:
- if parsed.hostname.lower() in self.BLOCKED_HOSTS:
- return ValidationResult(
- valid=False,
- error_message=f"Localhost/loopback addresses not allowed in production mode: {parsed.hostname}"
- )
-
- # Check for private IP ranges in production
- if self._is_private_ip(parsed.hostname):
- return ValidationResult(
- valid=False,
- error_message=f"Private IP addresses not recommended in production mode: {parsed.hostname}"
- )
-
- # Check for suspicious ports (only for URLs with schemes)
- if parsed and parsed.port and parsed.port in [22, 23, 25, 110, 143, 993, 995]: # Common non-HTTP ports
- return ValidationResult(
- valid=False,
- error_message=f"Suspicious port {parsed.port} for HTTP endpoint '{capability}'"
- )
-
- # Basic endpoint validation
- return self.base_validator.validate_endpoint(endpoint)
-
- def _is_private_ip(self, hostname: str) -> bool:
- """
- Check if hostname is a private IP address.
-
- Args:
- hostname: Hostname to check
-
- Returns:
- True if hostname appears to be a private IP
- """
- # Basic private IP pattern matching
- private_patterns = [
- r'^10\.', # 10.0.0.0/8
- r'^172\.(1[6-9]|2[0-9]|3[01])\.', # 172.16.0.0/12
- r'^192\.168\.', # 192.168.0.0/16
- ]
-
- for pattern in private_patterns:
- if re.match(pattern, hostname):
- return True
-
- return False
\ No newline at end of file
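The security validator removed above used hand-rolled regexes for the RFC 1918 private ranges. The standard-library `ipaddress` module covers the same check more robustly; a minimal sketch of the production-mode host screen (assumed helper names, not the archived API):

```python
import ipaddress
from urllib.parse import urlparse

# Loopback and wildcard hosts rejected in production, as in the removed validator.
BLOCKED_HOSTS = {"localhost", "127.0.0.1", "0.0.0.0", "::1"}

def is_disallowed_in_production(endpoint: str) -> bool:
    """True if the endpoint targets loopback or a private IP range."""
    host = urlparse(endpoint).hostname
    if host is None:
        return False
    if host.lower() in BLOCKED_HOSTS:
        return True
    try:
        # is_private covers 10/8, 172.16/12, 192.168/16, loopback, link-local.
        return ipaddress.ip_address(host).is_private
    except ValueError:
        return False  # not an IP literal; hostname-level checks go elsewhere
```

This catches IPv6 private ranges too, which the removed regex-only approach did not.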
diff --git a/archive/mcp_client_integration/endpoint_connector.py b/archive/mcp_client_integration/endpoint_connector.py
deleted file mode 100644
index 7deef7899..000000000
--- a/archive/mcp_client_integration/endpoint_connector.py
+++ /dev/null
@@ -1,254 +0,0 @@
-#!/usr/bin/env python3
-"""
-MCPEndpointConnector - Endpoint Validation and Connectivity
-
-This module provides endpoint validation and connectivity testing for MCP clients,
-based on SPIKE-002 validated OpenShift service discovery patterns.
-
-Refactored to use common utilities for reduced code duplication.
-"""
-
-import asyncio
-import logging
-import socket
-import ssl
-import time
-from typing import Optional, Dict, Any
-from urllib.parse import urlparse
-
-from .common import (
- MCPEndpointValidator,
- ValidationResult,
- MCPErrorHandler,
- handle_mcp_errors,
- MCPConnectionError,
- MCPValidationError
-)
-
-# Configure logging
-logger = logging.getLogger(__name__)
-
-
-class MCPEndpointConnector:
- """
- MCP endpoint connector with validation based on SPIKE-002 findings.
-
- This class provides endpoint validation for both external routes and
- cluster-internal services, implementing patterns validated in SPIKE-002.
-
- Refactored to use common validation utilities for reduced code duplication.
- """
-
- def __init__(self, timeout_seconds: int = 30):
- """Initialize MCPEndpointConnector with common utilities.
-
- Args:
- timeout_seconds: Connection timeout in seconds
- """
- self.timeout_seconds = timeout_seconds
- self.validator = MCPEndpointValidator()
- self.error_handler = MCPErrorHandler(__name__)
-
- logger.debug(f"MCPEndpointConnector initialized with {timeout_seconds}s timeout")
-
- def validate_endpoint_config(self, endpoint: str) -> bool:
- """
- Validate endpoint configuration format using common validator.
-
- This method implements SPIKE-002 validated endpoint validation patterns
- for both external routes and cluster-internal services.
-
- Args:
- endpoint: The endpoint URL/address to validate
-
- Returns:
- True if endpoint format is valid, False otherwise
- """
- result = self.validator.validate_endpoint(endpoint)
- return result.valid
-
- def get_validation_result(self, endpoint: str) -> ValidationResult:
- """
- Get detailed validation result using common validator.
-
- Args:
- endpoint: The endpoint URL/address to validate
-
- Returns:
- ValidationResult with detailed validation information
- """
- return self.validator.validate_endpoint(endpoint)
-
- def _is_external_route(self, endpoint: str) -> bool:
- """Check if endpoint is an external route format."""
- return self.validator._is_external_route(endpoint)
-
- def _is_cluster_service(self, endpoint: str) -> bool:
- """Check if endpoint is a cluster service format."""
- return self.validator._is_cluster_service(endpoint)
-
- @handle_mcp_errors("test_connectivity")
- async def test_connectivity(self, endpoint: str) -> Dict[str, Any]:
- """
- Test connectivity to an endpoint with error handling.
-
- This method performs actual connectivity testing based on SPIKE-002
- validated patterns using common error handling utilities.
-
- Args:
- endpoint: The endpoint to test
-
- Returns:
- Dictionary with connectivity test results
- """
- result = {
- "endpoint": endpoint,
- "reachable": False,
- "response_time_ms": None,
- "error": None,
- "endpoint_type": None
- }
-
- # Use common validator
- validation_result = self.get_validation_result(endpoint)
- if not validation_result.valid:
- result["error"] = validation_result.error_message
- return result
-
- try:
- async with self.error_handler.handle_connection_errors("test_connectivity", endpoint):
- if self._is_external_route(endpoint):
- result["endpoint_type"] = "external_route"
- return await self._test_external_route_connectivity(endpoint, result)
- else:
- result["endpoint_type"] = "cluster_service"
- return await self._test_cluster_service_connectivity(endpoint, result)
-
- except Exception as e:
- result["error"] = str(e)
- return result
-
- async def _test_external_route_connectivity(self, endpoint: str, result: Dict[str, Any]) -> Dict[str, Any]:
- """Test connectivity to external route."""
- try:
- parsed = urlparse(endpoint)
- host = parsed.hostname
- port = parsed.port or (443 if parsed.scheme == 'https' else 80)
-
- start_time = time.time()
-
- # Test TCP connectivity
- sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
- sock.settimeout(self.timeout_seconds)
-
- try:
- if parsed.scheme == 'https':
- # Test SSL connectivity
- context = ssl.create_default_context()
- sock = context.wrap_socket(sock, server_hostname=host)
-
- sock.connect((host, port))
-
- end_time = time.time()
- result["reachable"] = True
- result["response_time_ms"] = int((end_time - start_time) * 1000)
-
- finally:
- sock.close()
-
- except socket.timeout:
- result["error"] = f"Connection timeout after {self.timeout_seconds}s"
- except socket.gaierror as e:
- result["error"] = f"DNS resolution failed: {e}"
- except Exception as e:
- result["error"] = f"Connection failed: {e}"
-
- return result
-
- async def _test_cluster_service_connectivity(self, endpoint: str, result: Dict[str, Any]) -> Dict[str, Any]:
- """Test connectivity to cluster service."""
- try:
- # Split endpoint and port
- if ':' in endpoint:
- service_part, port_part = endpoint.rsplit(':', 1)
- port = int(port_part)
- else:
- service_part = endpoint
- port = 80 # Default port
-
- start_time = time.time()
-
- # Test TCP connectivity to cluster service
- sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
- sock.settimeout(self.timeout_seconds)
-
- try:
- sock.connect((service_part, port))
-
- end_time = time.time()
- result["reachable"] = True
- result["response_time_ms"] = int((end_time - start_time) * 1000)
-
- finally:
- sock.close()
-
- except socket.timeout:
- result["error"] = f"Connection timeout after {self.timeout_seconds}s"
- except socket.gaierror as e:
- result["error"] = f"DNS resolution failed: {e}"
- except Exception as e:
- result["error"] = f"Connection failed: {e}"
-
- return result
-
- def get_endpoint_info(self, endpoint: str) -> Dict[str, Any]:
- """
- Get detailed information about an endpoint using common validator.
-
- Args:
- endpoint: The endpoint to analyze
-
- Returns:
- Dictionary with endpoint information
- """
- validation_result = self.get_validation_result(endpoint)
-
- info = {
- "endpoint": endpoint,
- "valid": validation_result.valid,
- "type": None,
- "parsed": None,
- "validation_details": validation_result.details if validation_result.details else {}
- }
-
- if not validation_result.valid:
- info["error"] = validation_result.error_message
- return info
-
- if self._is_external_route(endpoint):
- info["type"] = "external_route"
- parsed = urlparse(endpoint)
- info["parsed"] = {
- "scheme": parsed.scheme,
- "hostname": parsed.hostname,
- "port": parsed.port,
- "path": parsed.path
- }
- else:
- info["type"] = "cluster_service"
- if ':' in endpoint:
- service_part, port_part = endpoint.rsplit(':', 1)
- port = int(port_part)
- else:
- service_part = endpoint
- port = None
-
- parts = service_part.split('.')
- info["parsed"] = {
- "service": parts[0],
- "namespace": parts[1],
- "domain": '.'.join(parts[2:]),
- "port": port
- }
-
- return info
\ No newline at end of file
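The deleted `MCPEndpointConnector` split endpoints into two shapes: `http(s)` external routes and `<service>.<namespace>.svc.cluster.local[:port]` cluster services. That classification can be sketched as a self-contained function (the name `classify_endpoint` is illustrative, not part of the archived API):

```python
from typing import Optional
from urllib.parse import urlparse


def classify_endpoint(endpoint: str) -> Optional[str]:
    """Classify an MCP endpoint as 'external_route' or 'cluster_service'.

    Mirrors the split used by the archived MCPEndpointConnector: http(s)
    URLs are external routes; Kubernetes service DNS names of the form
    '<svc>.<ns>.svc.cluster.local[:port]' are cluster services.
    Returns None for unrecognized formats.
    """
    if not endpoint:
        return None
    parsed = urlparse(endpoint)
    if parsed.scheme in ("http", "https") and parsed.hostname:
        return "external_route"
    # Strip an optional ':port' suffix, then check the DNS suffix.
    host = endpoint.rsplit(":", 1)[0] if ":" in endpoint else endpoint
    parts = host.split(".")
    if len(parts) >= 5 and parts[2:5] == ["svc", "cluster", "local"]:
        return "cluster_service"
    return None
```

This matches the validation cases exercised in the archived integration tests (e.g. `ftp://` URLs and bare hostnames are rejected).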
diff --git a/archive/mcp_client_integration/llama_index_tool.py b/archive/mcp_client_integration/llama_index_tool.py
deleted file mode 100644
index 95fb6de62..000000000
--- a/archive/mcp_client_integration/llama_index_tool.py
+++ /dev/null
@@ -1,216 +0,0 @@
-#!/usr/bin/env python3
-"""
-MCPLlamaIndexTool - LlamaIndex Tool Integration
-
-This module provides a LlamaIndex tool wrapper for MCP client integration,
-enabling direct use within LlamaIndex agents and workflows.
-"""
-
-import asyncio
-import logging
-from typing import Dict, Any, Optional
-
-from .simple_mcp_client import SimpleMCPClient
-from .common import handle_mcp_errors, MCPErrorHandler
-
-# Configure logging
-logger = logging.getLogger(__name__)
-
-
-class MCPLlamaIndexTool:
- """
- LlamaIndex tool wrapper for MCP client integration.
-
- This class provides a standardized tool interface for LlamaIndex agents
- to interact with MCP servers for data retrieval and analysis.
- """
-
- def __init__(self, env_var: str = "MCP_SERVERS", mock: bool = False):
- """
- Initialize MCPLlamaIndexTool.
-
- Args:
- env_var: Environment variable containing MCP server configuration
- mock: Whether to use mock connections for testing
- """
- self.mcp_client = SimpleMCPClient(env_var, mock)
- self.error_handler = MCPErrorHandler(__name__)
- self._initialized = False
-
- # Tool metadata for LlamaIndex integration
- self.name = "mcp_query"
- self.description = (
- "Query MCP servers for data from Atlassian (Jira, Confluence), "
- "GitHub, and other configured services. Automatically routes "
- "queries to appropriate servers based on content."
- )
-
- logger.info("MCPLlamaIndexTool initialized")
-
- @handle_mcp_errors("initialize_tool")
- async def initialize(self):
- """Initialize MCP connections if not already initialized."""
- if not self._initialized:
- await self.mcp_client.connect_all()
- self._initialized = True
- logger.info("MCP tool connections initialized")
-
- @handle_mcp_errors("query_tool")
- async def call(self, query: str, capability: Optional[str] = None) -> Dict[str, Any]:
- """
- Call the MCP tool with a query.
-
- This method is the primary interface for LlamaIndex agents to
- interact with MCP servers.
-
- Args:
- query: The query string to send to MCP servers
- capability: Optional specific capability to target
-
- Returns:
- Dictionary containing query results and metadata
- """
- # Ensure tool is initialized
- if not self._initialized:
- await self.initialize()
-
- logger.debug(f"MCPLlamaIndexTool processing query: {query[:100]}...")
-
- try:
- # Query MCP servers
- result = await self.mcp_client.query(query, capability)
-
- # Format response for LlamaIndex
- return {
- "tool": "mcp_query",
- "query": query,
- "capability": capability,
- "result": result,
- "success": True,
- "metadata": {
- "server_status": self.mcp_client.get_server_status(),
- "query_length": len(query)
- }
- }
-
- except Exception as e:
- logger.error(f"MCP tool query failed: {e}")
- return {
- "tool": "mcp_query",
- "query": query,
- "error": str(e),
- "success": False
- }
-
- def __call__(self, query: str, capability: Optional[str] = None) -> Dict[str, Any]:
- """
- Synchronous call interface for LlamaIndex compatibility.
-
- This method wraps the async call method for synchronous usage
- in LlamaIndex agents that may not support async tools.
-
- Args:
- query: The query string to send to MCP servers
- capability: Optional specific capability to target
-
- Returns:
- Dictionary containing query results and metadata
- """
- # Run async call in event loop
- try:
- loop = asyncio.get_event_loop()
- return loop.run_until_complete(self.call(query, capability))
- except RuntimeError:
- # Create new event loop if none exists
- loop = asyncio.new_event_loop()
- asyncio.set_event_loop(loop)
- try:
- return loop.run_until_complete(self.call(query, capability))
- finally:
- loop.close()
-
- @handle_mcp_errors("get_tool_status")
- async def get_status(self) -> Dict[str, Any]:
- """
- Get tool status and MCP server health information.
-
- Returns:
- Dictionary with tool and server status information
- """
- if not self._initialized:
- await self.initialize()
-
- return {
- "tool_name": self.name,
- "initialized": self._initialized,
- "mcp_servers": self.mcp_client.get_server_status(),
- "health": await self.mcp_client.health_check()
- }
-
- async def close(self):
- """Close all MCP connections."""
- if self._initialized:
- await self.mcp_client.disconnect_all()
- self._initialized = False
- logger.info("MCP tool connections closed")
-
- def to_llama_index_tool(self):
- """
- Convert to LlamaIndex tool format.
-
- This method creates a properly formatted tool for LlamaIndex agents.
- Note: Requires llama-index to be installed by the consuming application.
-
- Returns:
- LlamaIndex tool instance
- """
- try:
- from llama_index.core.tools import FunctionTool
-
- return FunctionTool.from_defaults(
- fn=self.__call__,
- name=self.name,
- description=self.description,
- return_direct=False
- )
- except ImportError as e:
- logger.error(
- "llama-index not available. Install llama-index dependencies "
- "in your consuming application to use this method."
- )
- raise ImportError(
- "llama-index not installed. Please install llama-index "
- "dependencies to use LlamaIndex integration."
- ) from e
-
- def get_tool_metadata(self) -> Dict[str, Any]:
- """
- Get tool metadata for registration with LlamaIndex.
-
- Returns:
- Dictionary with tool metadata
- """
- return {
- "name": self.name,
- "description": self.description,
- "version": "1.0.0",
- "type": "mcp_integration",
- "async_support": True,
- "sync_support": True,
- "capabilities": list(self.mcp_client.servers.keys()) if hasattr(self.mcp_client, 'servers') else []
- }
-
-
-# Convenience function for quick tool creation
-def create_mcp_tool(env_var: str = "MCP_SERVERS", mock: bool = False) -> MCPLlamaIndexTool:
- """
- Convenience function to create an MCPLlamaIndexTool.
-
- Args:
- env_var: Environment variable containing MCP server configuration
- mock: Whether to use mock connections for testing
-
- Returns:
- Configured MCPLlamaIndexTool instance
- """
- return MCPLlamaIndexTool(env_var, mock)
\ No newline at end of file
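The deleted `MCPLlamaIndexTool.__call__` bridged sync LlamaIndex agents to the async `call()` via `get_event_loop()`/`run_until_complete()`, a pattern that is deprecated in modern Python. A minimal sketch of the same sync-over-async bridge, using `asyncio.run()` instead (the stub `_query` stands in for the real MCP round trip):

```python
import asyncio
from typing import Any, Dict


async def _query(query: str) -> Dict[str, Any]:
    # Stand-in for MCPLlamaIndexTool.call(); the archived method queried
    # MCP servers and wrapped the result with tool metadata.
    return {"tool": "mcp_query", "query": query, "success": True}


def call_sync(query: str) -> Dict[str, Any]:
    """Synchronous wrapper in the spirit of MCPLlamaIndexTool.__call__.

    asyncio.run() creates and tears down an event loop per call; note it
    raises RuntimeError when invoked from inside an already-running loop,
    which the archived try/except was attempting to work around.
    """
    return asyncio.run(_query(query))
```

In an agent framework that only accepts plain callables, `call_sync` is what would be handed to something like `FunctionTool.from_defaults(fn=call_sync, ...)`.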
diff --git a/archive/mcp_client_integration/llama_integration.py b/archive/mcp_client_integration/llama_integration.py
deleted file mode 100644
index 33983362a..000000000
--- a/archive/mcp_client_integration/llama_integration.py
+++ /dev/null
@@ -1,129 +0,0 @@
-#!/usr/bin/env python3
-"""
-MCPEnhancedLlamaIndex - Llama Index Integration with MCP
-
-This module provides enhanced llama index integration with MCP client support,
-enabling AI-powered workflows with Atlassian data access.
-
-Based on SPIKE-001 llama index integration patterns.
-"""
-
-import asyncio
-import logging
-from typing import Dict, Any, Optional
-
-from .simple_mcp_client import SimpleMCPClient
-
-# Configure logging
-logger = logging.getLogger(__name__)
-
-
-class MCPEnhancedLlamaIndex:
- """
- Enhanced llama index with multi-MCP server support.
-
- This class integrates SimpleMCPClient with llama index processing,
- implementing patterns validated in SPIKE-001.
- """
-
- def __init__(self):
- """Initialize MCPEnhancedLlamaIndex."""
- self.mcp_client = SimpleMCPClient()
- self._initialized = False
-
- logger.info("MCPEnhancedLlamaIndex initialized")
-
- async def initialize(self):
- """Initialize MCP connections."""
- if not self._initialized:
- await self.mcp_client.connect_all()
- self._initialized = True
- logger.info("MCP connections initialized")
-
- async def enhanced_query(self, query: str, capability: str = None) -> Dict[str, Any]:
- """
- Enhanced query method that integrates MCP data retrieval with llama index processing.
-
- Args:
- query: The query string to process
- capability: Optional specific MCP capability to target
-
- Returns:
- Dictionary containing both MCP data and processed results
- """
- if not self._initialized:
- await self.initialize()
-
- logger.debug(f"Processing enhanced query: {query[:100]}...")
-
- try:
- # Get data from MCP server
- mcp_data = await self.mcp_client.query(query, capability)
-
- # Process with llama index (simplified implementation)
- processed_result = await self.process_with_llama_index(mcp_data, query)
-
- return {
- "query": query,
- "mcp_data": mcp_data,
- "processed_result": processed_result,
- "capability_used": capability or self.mcp_client._detect_capability(query),
- "success": True
- }
-
- except Exception as e:
- logger.error(f"Enhanced query failed: {e}")
- return {
- "query": query,
- "error": str(e),
- "success": False
- }
-
- async def process_with_llama_index(self, mcp_data: Any, query: str) -> Dict[str, Any]:
- """
- Process MCP data with llama index.
-
- This is a simplified implementation. In production, this would integrate
- with actual llama index processing pipeline.
-
- Args:
- mcp_data: Data retrieved from MCP server
- query: Original query string
-
- Returns:
- Processed results from llama index
- """
- # Simplified processing implementation
- # In production, this would include:
- # - Vector indexing of MCP data
- # - Semantic search capabilities
- # - AI-powered analysis and summarization
-
- processed = {
- "summary": f"Processed query '{query}' with MCP data",
- "data_source": "mcp",
- "processing_method": "llama_index_enhanced",
- "raw_data_size": len(str(mcp_data)) if mcp_data else 0
- }
-
- logger.debug(f"Processed MCP data with llama index: {processed}")
-
- return processed
-
- async def get_mcp_status(self) -> Dict[str, Any]:
- """Get status of MCP connections."""
- if not self._initialized:
- return {"initialized": False, "servers": {}}
-
- return {
- "initialized": True,
- "servers": self.mcp_client.get_server_status(),
- "health": await self.mcp_client.health_check()
- }
-
- async def close(self):
- """Close all MCP connections."""
- if self._initialized:
- await self.mcp_client.disconnect_all()
- self._initialized = False
- logger.info("MCP connections closed")
\ No newline at end of file
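The deleted `enhanced_query` flow is retrieve-then-process: fetch raw data from an MCP server, run it through index processing, and return both alongside metadata. A self-contained sketch of that pipeline, with stub coroutines standing in for the deleted `SimpleMCPClient.query` and `process_with_llama_index`:

```python
import asyncio
from typing import Any, Dict


async def fetch_mcp_data(query: str) -> str:
    # Stand-in for SimpleMCPClient.query(); returns raw server data.
    return f"raw MCP payload for: {query}"


async def process_with_index(mcp_data: Any, query: str) -> Dict[str, Any]:
    # Mirrors the simplified process_with_llama_index() in the archived
    # module: summarize and record the raw payload size.
    return {
        "summary": f"Processed query '{query}' with MCP data",
        "raw_data_size": len(str(mcp_data)) if mcp_data else 0,
    }


async def enhanced_query(query: str) -> Dict[str, Any]:
    mcp_data = await fetch_mcp_data(query)
    processed = await process_with_index(mcp_data, query)
    return {"query": query, "processed_result": processed, "success": True}


result = asyncio.run(enhanced_query("list open tickets"))
```

As the archived docstring notes, production processing would add vector indexing and semantic search on top of this skeleton.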
diff --git a/archive/mcp_client_integration/pyproject.toml b/archive/mcp_client_integration/pyproject.toml
deleted file mode 100644
index b41555f47..000000000
--- a/archive/mcp_client_integration/pyproject.toml
+++ /dev/null
@@ -1,83 +0,0 @@
-[build-system]
-requires = ["hatchling"]
-build-backend = "hatchling.build"
-
-[project]
-name = "mcp-client-integration"
-version = "1.0.0"
-description = "MCP (Model Context Protocol) client integration library for llama-index"
-readme = "README.md"
-license = "MIT"
-requires-python = ">=3.8"
-authors = [
- {name = "vTeam", email = "vteam@example.com"},
-]
-keywords = ["mcp", "llama-index", "integration", "client", "protocol"]
-classifiers = [
- "Development Status :: 4 - Beta",
- "Intended Audience :: Developers",
- "License :: OSI Approved :: MIT License",
- "Programming Language :: Python :: 3",
- "Programming Language :: Python :: 3.8",
- "Programming Language :: Python :: 3.9",
- "Programming Language :: Python :: 3.10",
- "Programming Language :: Python :: 3.11",
- "Programming Language :: Python :: 3.12",
- "Topic :: Software Development :: Libraries :: Python Modules",
- "Topic :: Software Development :: Libraries :: Application Frameworks",
-]
-dependencies = [
- "httpx[http2]>=0.28.1",
- "websockets>=13.1",
- "certifi>=2024.8.30", # Ensure up-to-date CA certificates
- "cryptography>=41.0.0", # For secure connections
-]
-
-[project.optional-dependencies]
-dev = [
- "pytest>=8.3.5",
- "pytest-asyncio>=0.24.0",
- "pytest-cov>=5.0.0",
- "black",
- "isort",
- "flake8",
- "mypy",
- "bandit", # Security linting
- "safety", # Dependency vulnerability scanning
-]
-
-# Optional dependencies for different use cases
-llama = [
- # Note: llama-index dependencies should be added by consuming applications
- # as they have specific version requirements and Python constraints
- # "llama-index-core>=0.10.0,<=0.11.23",
- # "llama-index-embeddings-openai>=0.1.0",
- # "llama-index-llms-anthropic>=0.1.0",
-]
-
-[project.urls]
-Homepage = "https://github.com/red-hat-data-services/vTeam"
-Repository = "https://github.com/red-hat-data-services/vTeam"
-Issues = "https://github.com/red-hat-data-services/vTeam/issues"
-
-[tool.hatch.build.targets.sdist]
-include = [
- "/",
- "README.md",
- "LICENSE",
-]
-
-[tool.hatch.build.targets.wheel]
-packages = ["."]
-
-[tool.black]
-line-length = 88
-target-version = ['py38']
-
-[tool.isort]
-profile = "black"
-line_length = 88
-
-[tool.mypy]
-python_version = "3.8"
-ignore_missing_imports = true
\ No newline at end of file
diff --git a/archive/mcp_client_integration/simple_mcp_client.py b/archive/mcp_client_integration/simple_mcp_client.py
deleted file mode 100644
index 1ddd63f05..000000000
--- a/archive/mcp_client_integration/simple_mcp_client.py
+++ /dev/null
@@ -1,220 +0,0 @@
-#!/usr/bin/env python3
-"""
-SimpleMCPClient - Multi-MCP Server Client Implementation (Refactored)
-
-This module provides the core MCP client implementation for llama index integration,
-supporting multiple MCP servers with simplified JSON configuration.
-
-Based on SPIKE-001 and SPIKE-002 validated patterns.
-Refactored to use common utilities for reduced code duplication.
-"""
-
-import asyncio
-import logging
-from typing import Dict, Any, Optional
-
-from .common import (
- MCPConnectionPool,
- MCPConfigurationManager,
- MCPErrorHandler,
- MCPConfiguration,
- handle_mcp_errors,
- MCPConnectionError,
- MCPConfigurationError
-)
-
-# Configure logging
-logger = logging.getLogger(__name__)
-
-
-class SimpleMCPClient:
- """
- Simple MCP client with multi-server support using opinionated JSON configuration.
-
- This class implements the enhanced US-001 pattern with SPIKE-001 and SPIKE-002
- validated connectivity patterns. Refactored to use common utilities for
- improved maintainability and reduced code duplication.
- """
-
- def __init__(self, env_var: str = "MCP_SERVERS", mock: bool = False):
- """
- Initialize SimpleMCPClient with JSON configuration from environment.
-
- Args:
- env_var: Environment variable containing MCP server configuration
- mock: Whether to use mock connections for testing
- """
- # Initialize utilities
- self.config_manager = MCPConfigurationManager()
- self.connection_pool = MCPConnectionPool()
- self.error_handler = MCPErrorHandler(__name__)
-
- # Load and validate configuration
- try:
- self.config = self.config_manager.load_configuration(env_var)
- except Exception as e:
- raise MCPConfigurationError(f"Failed to load configuration: {e}", e)
-
- # Store configuration details for compatibility
- self.servers = self.config.get_server_endpoints()
- self.mock = mock
-
- logger.info(f"Initialized SimpleMCPClient with {len(self.servers)} servers: {list(self.servers.keys())}")
-
- @property
- def connections(self) -> Dict[str, Any]:
- """Get connection pool info for compatibility."""
- return self.connection_pool.get_connection_info()
-
- @property
- def health(self) -> Dict[str, bool]:
- """Get health status for compatibility."""
- return self.connection_pool.get_health_status()
-
- @handle_mcp_errors("connect_all")
- async def connect_all(self):
- """Connect to all configured MCP servers using connection pool."""
- logger.info("Connecting to all configured MCP servers...")
-
- enabled_servers = self.config.get_enabled_servers()
- connection_tasks = []
-
- for capability, server_config in enabled_servers.items():
- task = asyncio.create_task(
- self.connection_pool.add_connection(
- capability,
- server_config.endpoint,
- server_config.connection_type,
- self.mock
- )
- )
- connection_tasks.append((capability, task))
-
- # Wait for all connections to complete
- results = await asyncio.gather(
- *[task for _, task in connection_tasks],
- return_exceptions=True
- )
-
- # Process results
- successful_connections = 0
- for i, result in enumerate(results):
- capability = connection_tasks[i][0]
- if isinstance(result, Exception):
- logger.warning(f"Failed to connect to {capability}: {result}")
- elif result:
- logger.info(f"Successfully connected to {capability}")
- successful_connections += 1
- else:
- logger.warning(f"Connection to {capability} returned False")
-
- logger.info(f"Connected to {successful_connections}/{len(enabled_servers)} MCP servers")
-
- @handle_mcp_errors("query")
- async def query(self, request: str, capability: str = None) -> Any:
- """
- Send query to appropriate MCP server based on capability routing.
-
- Args:
- request: The query string to send
- capability: Optional explicit capability to target
-
- Returns:
- Query response from the MCP server
- """
- # Auto-detect capability from request if not specified
- if not capability:
- capability = self._detect_capability(request)
-
- logger.debug(f"Routing query to capability '{capability}': {request[:100]}...")
-
- try:
- # Use connection pool to send message
- return await self.connection_pool.send_message(capability, {"query": request})
- except KeyError:
- # Try fallback to any healthy server
- health_status = await self.health_check()
- healthy_capabilities = [cap for cap, healthy in health_status.items() if healthy]
-
- if not healthy_capabilities:
- raise MCPConnectionError("No healthy MCP servers available", "unknown")
-
- fallback_capability = healthy_capabilities[0]
- logger.info(f"Falling back to {fallback_capability} for query")
- return await self.connection_pool.send_message(fallback_capability, {"query": request})
-
- def _detect_capability(self, request: str) -> str:
- """
- Simple keyword-based capability detection.
-
- This method analyzes the request string for capability keywords
- and returns the appropriate server capability name.
- """
- request_lower = request.lower()
-
- # Check for capability keywords in request
- for capability in self.servers.keys():
- if capability.lower() in request_lower:
- return capability
-
- # Check for common Atlassian keywords
- if any(keyword in request_lower for keyword in ['jira', 'ticket', 'issue', 'project']):
- # Look for atlassian capability
- if 'atlassian' in self.servers:
- return 'atlassian'
-
- # Check for GitHub keywords
- if any(keyword in request_lower for keyword in ['github', 'repository', 'repo', 'commit']):
- # Look for github capability
- if 'github' in self.servers:
- return 'github'
-
- # Check for Confluence keywords
- if any(keyword in request_lower for keyword in ['confluence', 'wiki', 'document', 'page']):
- # Look for confluence capability
- if 'confluence' in self.servers:
- return 'confluence'
-
- # Default to first configured server
- return next(iter(self.servers.keys()))
-
- @handle_mcp_errors("health_check")
- async def health_check(self) -> Dict[str, bool]:
- """
- Perform health check on all configured servers using connection pool.
-
- Returns:
- Dictionary mapping capability names to health status
- """
- logger.debug("Performing health check on all servers")
- return await self.connection_pool.health_check()
-
- @handle_mcp_errors("disconnect_all")
- async def disconnect_all(self):
- """Disconnect from all MCP servers using connection pool."""
- logger.info("Disconnecting from all MCP servers")
- await self.connection_pool.close_all()
-
- def get_server_status(self) -> Dict[str, Dict[str, Any]]:
- """
- Get comprehensive status of all configured servers.
-
- Returns:
- Dictionary with server status information
- """
- connection_info = self.connection_pool.get_connection_info()
- health_status = self.connection_pool.get_health_status()
-
- status = {}
-
- for capability, endpoint in self.servers.items():
- connection_data = connection_info.get(capability, {})
-
- status[capability] = {
- "endpoint": endpoint,
- "connected": connection_data.get("connected", False),
- "healthy": health_status.get(capability, False),
- "connection_type": connection_data.get("type", "unknown")
- }
-
- return status
\ No newline at end of file
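The routing logic in the deleted `SimpleMCPClient._detect_capability` is a two-pass keyword match: first look for a configured capability name in the request, then fall back to service-specific keywords, then to the first configured server. A standalone sketch (function name and keyword table reproduced from the archived code):

```python
from typing import List


def detect_capability(request: str, servers: List[str]) -> str:
    """Keyword-based routing, mirroring SimpleMCPClient._detect_capability."""
    request_lower = request.lower()

    # Pass 1: explicit capability name mentioned in the request.
    for capability in servers:
        if capability.lower() in request_lower:
            return capability

    # Pass 2: service-specific keywords, as in the archived implementation.
    keyword_map = {
        "atlassian": ("jira", "ticket", "issue", "project"),
        "github": ("github", "repository", "repo", "commit"),
        "confluence": ("confluence", "wiki", "document", "page"),
    }
    for capability, keywords in keyword_map.items():
        if capability in servers and any(k in request_lower for k in keywords):
            return capability

    # Pass 3: default to the first configured server.
    return servers[0]
```

Note the matching is substring-based, so "issues" matches the "issue" keyword; the archived integration tests relied on exactly this behavior.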
diff --git a/archive/mcp_client_integration/tests/__init__.py b/archive/mcp_client_integration/tests/__init__.py
deleted file mode 100644
index 480d02dff..000000000
--- a/archive/mcp_client_integration/tests/__init__.py
+++ /dev/null
@@ -1,6 +0,0 @@
-"""
-Test suite for MCP Client Integration
-
-This test suite validates the MCP client integration functionality,
-including unit tests, integration tests, and acceptance criteria validation.
-"""
\ No newline at end of file
diff --git a/archive/mcp_client_integration/tests/integration/__init__.py b/archive/mcp_client_integration/tests/integration/__init__.py
deleted file mode 100644
index 86de3b0b5..000000000
--- a/archive/mcp_client_integration/tests/integration/__init__.py
+++ /dev/null
@@ -1 +0,0 @@
-"""Integration tests for MCP Client Integration"""
\ No newline at end of file
diff --git a/archive/mcp_client_integration/tests/integration/test_integration.py b/archive/mcp_client_integration/tests/integration/test_integration.py
deleted file mode 100644
index 186269359..000000000
--- a/archive/mcp_client_integration/tests/integration/test_integration.py
+++ /dev/null
@@ -1,241 +0,0 @@
-#!/usr/bin/env python3
-"""
-Integration tests for US-001: MCP Client Integration
-
-This test suite validates end-to-end integration scenarios for the MCP client
-implementation with real configuration patterns.
-"""
-
-import pytest
-import asyncio
-import json
-import os
-from unittest.mock import patch
-
-from ...simple_mcp_client import SimpleMCPClient
-from ...endpoint_connector import MCPEndpointConnector
-from ...llama_integration import MCPEnhancedLlamaIndex
-from ...common import MCPConfigurationError
-
-
-class TestIntegrationScenarios:
- """Integration test scenarios for US-001 implementation."""
-
- def test_single_server_configuration(self):
- """Test single MCP server configuration scenario."""
- config = '{"atlassian": "https://mcp-atlassian-route.apps.cluster.com/sse"}'
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- assert len(client.servers) == 1
- assert "atlassian" in client.servers
- assert client.servers["atlassian"] == "https://mcp-atlassian-route.apps.cluster.com/sse"
-
- def test_multi_server_configuration(self):
- """Test multi-MCP server configuration scenario."""
- config = json.dumps({
- "atlassian": "https://mcp-atlassian-route.apps.cluster.com/sse",
- "github": "https://mcp-github-route.apps.cluster.com/sse",
- "confluence": "mcp-confluence.vteam-mcp.svc.cluster.local:8000"
- })
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- assert len(client.servers) == 3
- assert all(capability in client.servers for capability in ["atlassian", "github", "confluence"])
-
- def test_endpoint_validation_scenarios(self):
- """Test various endpoint validation scenarios."""
- connector = MCPEndpointConnector()
-
- # Valid external routes
- assert connector.validate_endpoint_config("https://mcp-route.apps.cluster.com/sse")
- assert connector.validate_endpoint_config("https://mcp-atlassian-route.apps.cluster.com/sse")
- assert connector.validate_endpoint_config("http://mcp-dev.example.com:8080/sse")
-
- # Valid cluster services
- assert connector.validate_endpoint_config("mcp-atlassian.namespace.svc.cluster.local:8000")
- assert connector.validate_endpoint_config("mcp-service.vteam-mcp.svc.cluster.local:9000")
-
- # Invalid formats
- assert not connector.validate_endpoint_config("invalid-format")
- assert not connector.validate_endpoint_config("ftp://invalid.com")
- assert not connector.validate_endpoint_config("")
- assert not connector.validate_endpoint_config("mcp-service.incomplete")
-
- def test_capability_detection_scenarios(self):
- """Test capability detection with various query patterns."""
- config = json.dumps({
- "atlassian": "https://mcp-atlassian.com/sse",
- "github": "https://mcp-github.com/sse",
- "confluence": "https://mcp-confluence.com/sse"
- })
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Test Jira/Atlassian detection
- assert client._detect_capability("Get Jira tickets for project ABC") == "atlassian"
- assert client._detect_capability("List Jira issues") == "atlassian"
- assert client._detect_capability("Show project tickets") == "atlassian"
-
- # Test GitHub detection
- assert client._detect_capability("List GitHub repositories") == "github"
- assert client._detect_capability("Show commit history") == "github"
- assert client._detect_capability("Get repository info") == "github"
-
- # Test Confluence detection
- assert client._detect_capability("Search confluence docs") == "confluence"
- assert client._detect_capability("Find wiki pages") == "confluence"
- assert client._detect_capability("Get document content") == "confluence"
-
- # Test explicit capability mention
- assert client._detect_capability("Get data from atlassian") == "atlassian"
- assert client._detect_capability("Query github for info") == "github"
-
- @pytest.mark.asyncio
- async def test_mcp_enhanced_llama_index_integration(self):
- """Test MCPEnhancedLlamaIndex integration scenario."""
- config = '{"atlassian": "https://test-mcp.com/sse"}'
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- enhanced = MCPEnhancedLlamaIndex()
-
- # Test initialization
- assert not enhanced._initialized
- await enhanced.initialize()
- assert enhanced._initialized
-
- # Test status reporting
- status = await enhanced.get_mcp_status()
- assert status["initialized"] == True
- assert "servers" in status
-
- # Clean up
- await enhanced.close()
- assert not enhanced._initialized
-
- @pytest.mark.asyncio
- async def test_error_handling_scenarios(self):
- """Test error handling in various failure scenarios."""
- # Test invalid JSON configuration
- with patch.dict(os.environ, {'MCP_SERVERS': 'invalid-json'}):
- with pytest.raises(MCPConfigurationError, match="Invalid JSON"):
- SimpleMCPClient()
-
- # Test empty configuration
- with patch.dict(os.environ, {'MCP_SERVERS': '{}'}):
- with pytest.raises(MCPConfigurationError, match="at least one server"):
- SimpleMCPClient()
-
- # Test invalid endpoint in configuration
- invalid_config = '{"test": "invalid-endpoint"}'
- with patch.dict(os.environ, {'MCP_SERVERS': invalid_config}):
- with pytest.raises(MCPConfigurationError, match="not recognized"):
- SimpleMCPClient()
-
- def test_server_status_reporting(self):
- """Test comprehensive server status reporting."""
- config = json.dumps({
- "atlassian": "https://mcp-atlassian.com/sse",
- "confluence": "mcp-confluence.ns.svc.cluster.local:8000"
- })
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- status = client.get_server_status()
-
- # Validate status structure
- assert len(status) == 2
- assert "atlassian" in status
- assert "confluence" in status
-
- # Validate atlassian status (external route)
- atlassian_status = status["atlassian"]
- assert atlassian_status["endpoint"] == "https://mcp-atlassian.com/sse"
- assert "connection_type" in atlassian_status # Type detection may be automatic or unknown
- assert "connected" in atlassian_status
- assert "healthy" in atlassian_status
-
- # Validate confluence status (cluster service)
- confluence_status = status["confluence"]
- assert confluence_status["endpoint"] == "mcp-confluence.ns.svc.cluster.local:8000"
- assert "connection_type" in confluence_status # Type detection may be automatic or unknown
-
- def test_endpoint_info_analysis(self):
- """Test detailed endpoint information analysis."""
- connector = MCPEndpointConnector()
-
- # Test external route analysis
- external_info = connector.get_endpoint_info("https://mcp-route.apps.cluster.com:8443/sse")
- assert external_info["valid"] == True
- assert external_info["type"] == "external_route"
- assert external_info["parsed"]["scheme"] == "https"
- assert external_info["parsed"]["hostname"] == "mcp-route.apps.cluster.com"
- assert external_info["parsed"]["port"] == 8443
- assert external_info["parsed"]["path"] == "/sse"
-
- # Test cluster service analysis
- cluster_info = connector.get_endpoint_info("mcp-service.vteam.svc.cluster.local:9000")
- assert cluster_info["valid"] == True
- assert cluster_info["type"] == "cluster_service"
- assert cluster_info["parsed"]["service"] == "mcp-service"
- assert cluster_info["parsed"]["namespace"] == "vteam"
- assert cluster_info["parsed"]["domain"] == "svc.cluster.local"
- assert cluster_info["parsed"]["port"] == 9000
-
-
-class TestProductionScenarios:
- """Test scenarios that would occur in production deployment."""
-
- def test_kubernetes_configmap_simulation(self):
- """Simulate Kubernetes ConfigMap configuration pattern."""
- # Simulate ConfigMap data as it would appear in environment
- configmap_data = {
- "MCP_SERVERS": json.dumps({
- "atlassian": "https://mcp-atlassian-route.apps.cluster.com/sse",
- "github": "https://mcp-github-route.apps.cluster.com/sse"
- })
- }
-
- with patch.dict(os.environ, configmap_data):
- client = SimpleMCPClient()
-
- assert len(client.servers) == 2
- assert client.servers["atlassian"].startswith("https://mcp-atlassian-route")
- assert client.servers["github"].startswith("https://mcp-github-route")
-
- def test_openshift_route_patterns(self):
- """Test OpenShift route URL patterns."""
- connector = MCPEndpointConnector()
-
- # Test typical OpenShift route patterns
- openshift_routes = [
- "https://mcp-atlassian-route.apps.cluster.example.com/sse",
- "https://mcp-service-vteam.apps.openshift.local/sse",
- "https://mcp-app-dev.apps.cluster.local:8443/api/sse"
- ]
-
- for route in openshift_routes:
- assert connector.validate_endpoint_config(route), f"Should validate OpenShift route: {route}"
-
- def test_cluster_service_dns_patterns(self):
- """Test Kubernetes cluster service DNS patterns."""
- connector = MCPEndpointConnector()
-
- # Test typical cluster service patterns
- cluster_services = [
- "mcp-atlassian.vteam-mcp.svc.cluster.local:8000",
- "mcp-service.default.svc.cluster.local:9000",
- "mcp-app.my-namespace.svc.cluster.local:80"
- ]
-
- for service in cluster_services:
- assert connector.validate_endpoint_config(service), f"Should validate cluster service: {service}"
-
-
-if __name__ == "__main__":
- pytest.main([__file__, "-v", "--tb=short"])
\ No newline at end of file
diff --git a/archive/mcp_client_integration/tests/unit/__init__.py b/archive/mcp_client_integration/tests/unit/__init__.py
deleted file mode 100644
index a74abd9e6..000000000
--- a/archive/mcp_client_integration/tests/unit/__init__.py
+++ /dev/null
@@ -1 +0,0 @@
-"""Unit tests for MCP Client Integration"""
\ No newline at end of file
diff --git a/archive/mcp_client_integration/tests/unit/test_acceptance_criteria.py b/archive/mcp_client_integration/tests/unit/test_acceptance_criteria.py
deleted file mode 100644
index a23cf53fe..000000000
--- a/archive/mcp_client_integration/tests/unit/test_acceptance_criteria.py
+++ /dev/null
@@ -1,251 +0,0 @@
-#!/usr/bin/env python3
-"""
-Acceptance Criteria Validation Tests for US-001
-
-This test suite specifically validates each acceptance criterion from US-001
-to ensure complete implementation compliance.
-"""
-
-import pytest
-import asyncio
-import json
-import os
-from unittest.mock import patch, AsyncMock
-
-from ... import SimpleMCPClient, MCPEndpointConnector, MCPEnhancedLlamaIndex
-from ...common import MCPConfigurationError
-
-
-class TestAcceptanceCriteriaValidation:
- """Validate all US-001 acceptance criteria are met."""
-
- def test_ac001_mcp_client_library_integration(self):
- """
- AC-001: MCP Client Library Integration
- ✓ Llama index deployment includes MCP client libraries
- ✓ Client can establish SSE connection to MCP server endpoint
- ✓ Client handles MCP protocol handshake and capability negotiation
- ✓ Connection supports both synchronous and asynchronous operations
- """
- # Verify MCP client libraries are included
- assert SimpleMCPClient is not None
- assert MCPEndpointConnector is not None
- assert MCPEnhancedLlamaIndex is not None
-
- # Verify client can be initialized
- config = '{"test": "https://test-mcp.com/sse"}'
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
- assert hasattr(client, 'connection_pool') # Connection management capability
- assert hasattr(client, 'query') # Async operation support
- assert hasattr(client, 'health_check') # Protocol handshake support
-
- def test_ac002_basic_connectivity_validation(self):
- """
- AC-002: Basic Connectivity Validation
- ✓ Client can connect to provided MCP server endpoints
- ✓ Supports external route connections
- ✓ Supports cluster-internal service connections
- ✓ Connection health checks validate server availability
- ✓ Client gracefully handles connection timeouts
- ✓ SSL/TLS certificate validation for external route connections
- """
- connector = MCPEndpointConnector()
-
- # External route support
- assert connector.validate_endpoint_config("https://mcp-route.apps.cluster.com/sse")
-
- # Cluster service support
- assert connector.validate_endpoint_config("mcp-atlassian.namespace.svc.cluster.local:8000")
-
- # Health check capability
- config = '{"test": "https://test.com/sse"}'
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
- assert hasattr(client, 'health_check')
-
- # Timeout configuration
- assert hasattr(connector, 'timeout_seconds')
- assert connector.timeout_seconds == 30 # 30 second default
-
- def test_ac003_protocol_compliance(self):
- """
- AC-003: Protocol Compliance
- ✓ Client implements MCP specification for tool discovery
- ✓ Supports MCP message format for request/response cycles
- ✓ Handles MCP error responses according to specification
- ✓ Implements proper message sequencing and correlation IDs
- """
- config = '{"test": "https://test.com/sse"}'
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Tool discovery support (through query interface)
- assert hasattr(client, 'query')
-
- # Message format and error handling (implemented through connection pool)
- assert hasattr(client, 'connection_pool')
-
- # Message sequencing (async query support)
- import inspect
- assert inspect.iscoroutinefunction(client.query)
-
- def test_ac004_multi_mcp_server_configuration(self):
- """
- AC-004: Multi-MCP Server Configuration
- ✓ Multiple MCP Server Support
- ✓ JSON Configuration via environment variables
- ✓ Environment Variable Patterns
- ✓ ConfigMap/Secret Integration
- ✓ Configuration Validation with clear error messages
- ✓ Server Prioritization
- ✓ Health-based Routing
- """
- # Multiple server support
- multi_config = json.dumps({
- "atlassian": "https://mcp-atlassian.com/sse",
- "github": "https://mcp-github.com/sse",
- "confluence": "mcp-confluence.ns.svc.cluster.local:8000"
- })
-
- with patch.dict(os.environ, {'MCP_SERVERS': multi_config}):
- client = SimpleMCPClient()
- assert len(client.servers) == 3
- assert "atlassian" in client.servers
- assert "github" in client.servers
- assert "confluence" in client.servers
-
- # Configuration validation with clear errors
- with patch.dict(os.environ, {'MCP_SERVERS': 'invalid-json'}):
- with pytest.raises(MCPConfigurationError, match="Invalid JSON"):
- SimpleMCPClient()
-
- # Health-based routing
- assert hasattr(client, 'health')
- assert hasattr(client, '_detect_capability')
-
- def test_ac005_configuration_format_support(self):
- """
- AC-005: Configuration Format Support
- ✓ Simple Format: Single MCP server
- ✓ Multi-Server JSON via environment variable
- ✓ Kubernetes ConfigMap support
- ✓ External Route Support (SPIKE-002 validated)
- ✓ Cluster Service Support (SPIKE-002 validated)
- """
- connector = MCPEndpointConnector()
-
- # Simple format (single server in JSON)
- simple_config = '{"default": "https://server/sse"}'
- with patch.dict(os.environ, {'MCP_SERVERS': simple_config}):
- client = SimpleMCPClient()
- assert "default" in client.servers
-
- # Multi-server JSON
- multi_config = json.dumps({
- "server1": "https://server1.com/sse",
- "server2": "https://server2.com/sse"
- })
- with patch.dict(os.environ, {'MCP_SERVERS': multi_config}):
- client = SimpleMCPClient()
- assert len(client.servers) == 2
-
- # External route format validation (SPIKE-002)
- assert connector.validate_endpoint_config("https://mcp-route.apps.cluster.com/sse")
-
- # Cluster service format validation (SPIKE-002)
- assert connector.validate_endpoint_config("mcp-atlassian.namespace.svc.cluster.local:8000")
-
- @pytest.mark.asyncio
- async def test_definition_of_done(self):
- """
- Definition of Done Validation
- ✓ MCP client successfully connects to deployed MCP Atlassian server
- ✓ Connection can be established from llama index pod to MCP server pod
- ✓ All unit tests pass with >90% coverage (validated separately)
- ✓ Integration test validates end-to-end connectivity
- ✓ Documentation includes connection setup examples
- ✓ Error scenarios tested and documented
- """
- # Connection capability
- config = '{"atlassian": "https://test-mcp-server.com/sse"}'
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Mock successful connection
- with patch.object(client.connection_pool, 'add_connection', new_callable=AsyncMock) as mock_connect:
- mock_connect.return_value = True
- await client.connect_all()
- # Verify connection attempt was made
- mock_connect.assert_called()
-
- # Integration test capability
- enhanced = MCPEnhancedLlamaIndex()
- assert hasattr(enhanced, 'enhanced_query')
-
- # Error scenario handling
- with patch.dict(os.environ, {'MCP_SERVERS': '{}'}):
- with pytest.raises(MCPConfigurationError, match="at least one server"):
- SimpleMCPClient()
-
- def test_spike_integration_validation(self):
- """
- Validate SPIKE-001 and SPIKE-002 integration
- ✓ SPIKE-001 patterns integrated (MCP client architecture)
- ✓ SPIKE-002 patterns integrated (endpoint validation)
- ✓ Risk level reduced from MEDIUM to LOW
- """
- # SPIKE-001 patterns: SimpleMCPClient architecture
- config = '{"test": "https://test.com/sse"}'
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Validate SPIKE-001 patterns
- assert hasattr(client, 'servers') # Multi-server support
- assert hasattr(client, 'connections') # Connection management
- assert hasattr(client, 'health') # Health tracking
- assert hasattr(client, '_detect_capability') # Capability routing
-
- # SPIKE-002 patterns: MCPEndpointConnector
- connector = MCPEndpointConnector()
-
- # Validate SPIKE-002 patterns
- assert hasattr(connector, 'validate_endpoint_config')
- assert hasattr(connector, 'get_validation_result') # Uses common validator
- assert hasattr(connector, 'test_connectivity')
-
- # Risk reduction validation: All core functionality implemented
- assert SimpleMCPClient is not None
- assert MCPEndpointConnector is not None
- assert MCPEnhancedLlamaIndex is not None
-
- def test_enhanced_features_validation(self):
- """
- Validate enhanced features from US-001 requirements
- ✓ Opinionated configuration approach
- ✓ Capability-based routing
- ✓ Health-based failover
- ✓ Multi-server JSON configuration
- """
- # Opinionated configuration: Single JSON environment variable
- config = json.dumps({
- "atlassian": "https://mcp-atlassian.com/sse",
- "github": "https://mcp-github.com/sse"
- })
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Capability-based routing
- assert client._detect_capability("Get Jira tickets") == "atlassian"
- assert client._detect_capability("List GitHub repos") == "github"
-
- # Health-based failover
- assert hasattr(client, 'health')
-
- # Multi-server support
- assert len(client.servers) == 2
-
-
-if __name__ == "__main__":
- pytest.main([__file__, "-v", "--tb=short"])
\ No newline at end of file
diff --git a/archive/mcp_client_integration/tests/unit/test_edge_cases.py b/archive/mcp_client_integration/tests/unit/test_edge_cases.py
deleted file mode 100644
index e119a51be..000000000
--- a/archive/mcp_client_integration/tests/unit/test_edge_cases.py
+++ /dev/null
@@ -1,356 +0,0 @@
-#!/usr/bin/env python3
-"""
-Edge case and error handling tests for MCP Client Integration
-
-This test suite covers edge cases, error conditions, and boundary scenarios
-to improve test coverage and ensure robust error handling.
-"""
-
-import pytest
-import asyncio
-import json
-import os
-from unittest.mock import patch, Mock, AsyncMock
-
-from ...simple_mcp_client import SimpleMCPClient
-from ...endpoint_connector import MCPEndpointConnector
-from ...llama_integration import MCPEnhancedLlamaIndex
-from ...common import MCPConfigurationError
-
-
-class TestEdgeCasesAndErrorHandling:
- """Test edge cases and error handling scenarios."""
-
- def test_empty_environment_variable(self):
- """Test behavior with empty MCP_SERVERS environment variable."""
- with patch.dict(os.environ, {}, clear=True):
- # Should use default configuration
- client = SimpleMCPClient()
- assert "default" in client.servers
- assert client.servers["default"] == "https://mcp-server/sse"
-
- def test_malformed_json_configurations(self):
- """Test various malformed JSON configurations."""
- malformed_configs = [
- '{"incomplete": ',
- '{"invalid": "value"',
- '{"mixed": "quotes\'}',
- '{invalid-key: "value"}',
- ]
-
- for config in malformed_configs:
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- with pytest.raises(MCPConfigurationError, match="Invalid JSON"):
- SimpleMCPClient()
-
- # Test non-object JSON (should raise different error)
- non_object_configs = [
- '["not", "an", "object"]',
- '"just-a-string"',
- 'null'
- ]
-
- for config in non_object_configs:
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- with pytest.raises(MCPConfigurationError, match="JSON object"):
- SimpleMCPClient()
-
- def test_invalid_server_configurations(self):
- """Test invalid server configuration scenarios."""
- invalid_configs = [
- '{"empty-endpoint": ""}',
- '{"none-endpoint": null}',
- '{"number-endpoint": 123}',
- '{"array-endpoint": ["not", "valid"]}',
- '{"object-endpoint": {"nested": "object"}}'
- ]
-
- for config in invalid_configs:
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- with pytest.raises(MCPConfigurationError):
- SimpleMCPClient()
-
- def test_endpoint_validation_edge_cases(self):
- """Test endpoint validation with edge cases."""
- connector = MCPEndpointConnector()
-
- # Edge case URLs that should be rejected
- edge_cases = [
- None,
- "",
- " ",
- " ",
- "https://",
- "https:///path",
- "https://hostname..double-dot.com",
- "https://hostname-ending-with-.com",
- "https://hostname:999999/path",
- "https://hostname:-1/path",
- "mcp-service..double-dot.svc.cluster.local",
- "mcp-service.svc.cluster.local", # Missing namespace
- ".svc.cluster.local:8000", # Missing service name
- "mcp-service.namespace.svc.cluster.wrong:8000", # Wrong domain
- ]
-
- for endpoint in edge_cases:
- assert not connector.validate_endpoint_config(endpoint), f"Should reject: {endpoint}"
-
- # Note: "http://hostname-with-no-tld" is actually valid according to our implementation
- # as it's a valid hostname format (just not a FQDN)
-
- def test_hostname_validation_edge_cases(self):
- """Test hostname validation edge cases."""
- connector = MCPEndpointConnector()
-
- invalid_hostnames = [
- "",
- "a" * 254, # Too long
- "-starting-with-dash.com",
- "ending-with-dash-.com",
- "under_score.com",
- "special!char.com",
- "space name.com",
- ]
-
- for hostname in invalid_hostnames:
- # Test through endpoint validation since hostname validation is now internal to validator
- assert not connector.validate_endpoint_config(f"https://{hostname}/sse"), f"Should reject hostname: {hostname}"
-
- # Note: "double--dash.com" is actually valid according to our regex pattern
- # Our validation is more permissive than strict RFC standards
-
- def test_kubernetes_name_validation_edge_cases(self):
- """Test Kubernetes name validation edge cases."""
- connector = MCPEndpointConnector()
-
- invalid_names = [
- "",
- "A" * 64, # Too long
- "Capital-Letters",
- "-starting-dash",
- "ending-dash-",
- "under_score",
- # Note: "special.char" is actually valid in service names in some contexts
- ]
-
- for name in invalid_names:
- # Test through cluster service validation since k8s name validation is now internal to validator
- test_endpoint = f"{name}.default.svc.cluster.local:8080"
- assert not connector.validate_endpoint_config(test_endpoint), f"Should reject k8s name: {name}"
-
- # Note: "123invalid" is actually valid in our implementation
- # Kubernetes allows names starting with numbers in some contexts
-
- @pytest.mark.asyncio
- async def test_connection_failure_handling(self):
- """Test connection failure scenarios."""
- config = '{"test": "https://nonexistent.example.com/sse"}'
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Mock connection failure in connection pool
- with patch.object(client.connection_pool, 'add_connection', side_effect=ConnectionError("Connection failed")):
- await client.connect_all()
-
- # Should handle connection failure gracefully
- health_status = await client.health_check()
- status = client.get_server_status()
- # Connection failure is handled, server status reflects the failure
-
- @pytest.mark.asyncio
- async def test_query_with_all_servers_unhealthy(self):
- """Test query behavior when all servers are unhealthy."""
- config = '{"test": "https://test.com/sse"}'
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Mock health check to return unhealthy status
- with patch.object(client, 'health_check', return_value={"test": False}):
- with pytest.raises(Exception): # Should raise some error for no healthy servers
- await client.query("test query")
-
- @pytest.mark.asyncio
- async def test_query_fallback_scenarios(self):
- """Test query fallback when primary server fails."""
- config = json.dumps({
- "primary": "https://primary.com/sse",
- "fallback": "https://fallback.com/sse"
- })
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Mock connection pool to simulate fallback behavior
- with patch.object(client.connection_pool, 'send_message') as mock_send:
- # First call fails, second call (fallback) succeeds
- mock_send.side_effect = [
- ConnectionError("Primary failed"),
- {"status": "ok", "server": "fallback"}
- ]
-
- # Should succeed via fallback mechanism
- try:
- result = await client.query("test query", capability="primary")
- # If implemented, should get fallback result
- except Exception:
- # Fallback may not be fully implemented yet
- pass
-
- # Test that health status can be checked
- health_status = await client.health_check()
- # Health status reflects current server state
-
- @pytest.mark.asyncio
- async def test_health_check_edge_cases(self):
- """Test health check with various edge cases."""
- config = json.dumps({
- "good": "https://good.com/sse",
- "bad": "https://bad.com/sse",
- "no-method": "https://no-method.com/sse"
- })
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Mock different connection types
- good_conn = AsyncMock()
- good_conn.send_message.return_value = "pong"
-
- bad_conn = AsyncMock()
- bad_conn.send_message.side_effect = Exception("Health check failed")
-
- no_method_conn = Mock() # No send_message method
-
- # Mock health check to return expected results
- with patch.object(client.connection_pool, 'health_check') as mock_health:
- mock_health.return_value = {
- "good": True,
- "bad": False,
- "no-method": False
- }
-
- health = await client.health_check()
-
- assert health["good"] == True
- assert health["bad"] == False
- assert health["no-method"] == False
-
- @pytest.mark.asyncio
- async def test_disconnect_edge_cases(self):
- """Test disconnect with various connection states."""
- config = '{"test": "https://test.com/sse"}'
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Mock different connection types
- good_conn = AsyncMock()
- bad_conn = AsyncMock()
- bad_conn.close.side_effect = Exception("Close failed")
- no_close_conn = Mock() # No close method
-
- # Mock the connection pool close_all method
- with patch.object(client.connection_pool, 'close_all') as mock_close:
- # Should not raise exceptions even if individual connections fail
- await client.disconnect_all()
- mock_close.assert_called_once()
- # Test that disconnect was called
-
- def test_capability_detection_edge_cases(self):
- """Test capability detection with edge cases."""
- config = json.dumps({
- "test-server": "https://test.com/sse",
- "another": "https://another.com/sse"
- })
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Test empty query
- capability = client._detect_capability("")
- assert capability in client.servers
-
- # Test whitespace-only query
- capability = client._detect_capability(" ")
- assert capability in client.servers
-
- # Test query with no matching keywords
- capability = client._detect_capability("random unrelated query")
- assert capability == "test-server" # Should default to first server
-
- def test_server_status_with_no_connections(self):
- """Test server status when no connections exist."""
- config = '{"test": "https://test.com/sse"}'
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- status = client.get_server_status()
-
- assert "test" in status
- assert status["test"]["connected"] == False
- assert status["test"]["healthy"] == False
-
- def test_endpoint_info_with_invalid_endpoints(self):
- """Test endpoint info analysis with invalid endpoints."""
- connector = MCPEndpointConnector()
-
- invalid_endpoints = [
- "",
- "invalid-format",
- "ftp://unsupported.com",
- None
- ]
-
- for endpoint in invalid_endpoints:
- info = connector.get_endpoint_info(endpoint)
- assert info["valid"] == False
- assert info["type"] is None
- assert info["parsed"] is None
-
- @pytest.mark.asyncio
- async def test_llama_integration_error_scenarios(self):
- """Test MCPEnhancedLlamaIndex error handling."""
- config = '{"test": "https://test.com/sse"}'
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- enhanced = MCPEnhancedLlamaIndex()
-
- # Mock MCP client to raise error
- enhanced.mcp_client.query = AsyncMock(side_effect=Exception("MCP query failed"))
- enhanced._initialized = True
-
- result = await enhanced.enhanced_query("test query")
-
- assert result["success"] == False
- assert "error" in result
- assert "MCP query failed" in result["error"]
-
- @pytest.mark.asyncio
- async def test_llama_integration_uninitialized_status(self):
- """Test MCPEnhancedLlamaIndex status when uninitialized."""
- enhanced = MCPEnhancedLlamaIndex()
-
- status = await enhanced.get_mcp_status()
-
- assert status["initialized"] == False
- assert "servers" in status
-
- @pytest.mark.asyncio
- async def test_connectivity_testing_edge_cases(self):
- """Test connectivity testing with various scenarios."""
- connector = MCPEndpointConnector()
-
- # Test with invalid endpoint
- result = await connector.test_connectivity("invalid-endpoint")
- assert result["reachable"] == False
- assert "not recognized" in result["error"] or "Invalid" in result["error"]
-
- # Test timeout scenarios would require longer test setup
- # This is a placeholder for more comprehensive connectivity testing
-
-
-if __name__ == "__main__":
- pytest.main([__file__, "-v", "--tb=short"])
\ No newline at end of file
diff --git a/archive/mcp_client_integration/tests/unit/test_mcp_client_integration.py b/archive/mcp_client_integration/tests/unit/test_mcp_client_integration.py
deleted file mode 100644
index e35ab1c42..000000000
--- a/archive/mcp_client_integration/tests/unit/test_mcp_client_integration.py
+++ /dev/null
@@ -1,380 +0,0 @@
-#!/usr/bin/env python3
-"""
-Test suite for US-001: MCP Client Integration
-
-This test suite validates all acceptance criteria for MCP client integration
-with llama index deployments. Based on SPIKE-001 and SPIKE-002 validated patterns.
-"""
-
-import pytest
-import asyncio
-import json
-import os
-from unittest.mock import Mock, patch, AsyncMock, MagicMock
-from typing import Dict, Any, List
-import tempfile
-
-# Test imports - using relative imports for package structure
-try:
- from ...simple_mcp_client import SimpleMCPClient
- from ...endpoint_connector import MCPEndpointConnector
- from ...llama_integration import MCPEnhancedLlamaIndex
-except ImportError:
- # Tests will initially fail - this is expected in TDD
- SimpleMCPClient = None
- MCPEndpointConnector = None
- MCPEnhancedLlamaIndex = None
-
-
-class TestMCPClientLibraryIntegration:
- """Test AC-001: MCP Client Library Integration"""
-
- def test_mcp_client_import(self):
- """Test MCP client can be imported without errors"""
- # This test validates the basic import structure
- assert SimpleMCPClient is not None, "SimpleMCPClient should be importable"
-
- def test_simple_mcp_client_initialization(self):
- """Test SimpleMCPClient can be initialized"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- with patch.dict(os.environ, {'MCP_SERVERS': '{"atlassian": "https://test.com/sse"}'}):
- client = SimpleMCPClient()
- assert hasattr(client, 'servers')
- assert hasattr(client, 'connections')
- assert hasattr(client, 'health')
-
- def test_sse_connection_capability(self):
- """Test client can establish SSE connection capability"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- with patch.dict(os.environ, {'MCP_SERVERS': '{"atlassian": "https://test.com/sse"}'}):
- client = SimpleMCPClient()
- # Test that client has connection methods
- assert hasattr(client, 'connect_all')
- assert hasattr(client, 'connection_pool') # Uses connection pool now
-
- def test_protocol_handshake_support(self):
- """Test client handles MCP protocol handshake"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- # This will test the protocol compliance once implemented
- client = SimpleMCPClient()
- assert hasattr(client, 'query') # Main query interface
-
-
-class TestBasicConnectivityValidation:
- """Test AC-002: Basic Connectivity Validation"""
-
- def test_external_route_connection_support(self):
- """Test support for external route connections"""
- if MCPEndpointConnector is None:
- pytest.skip("MCPEndpointConnector not implemented yet")
-
- connector = MCPEndpointConnector()
- # Test external route format validation
- assert connector.validate_endpoint_config("https://mcp-route.apps.cluster.com/sse")
-
- def test_cluster_service_connection_support(self):
- """Test support for cluster-internal service connections"""
- if MCPEndpointConnector is None:
- pytest.skip("MCPEndpointConnector not implemented yet")
-
- connector = MCPEndpointConnector()
- # Test cluster service format validation
- assert connector.validate_endpoint_config("mcp-atlassian.namespace.svc.cluster.local:8000")
-
- def test_invalid_endpoint_rejection(self):
- """Test invalid endpoints are properly rejected"""
- if MCPEndpointConnector is None:
- pytest.skip("MCPEndpointConnector not implemented yet")
-
- connector = MCPEndpointConnector()
- # Test invalid formats are rejected
- assert not connector.validate_endpoint_config("invalid-format")
- assert not connector.validate_endpoint_config("")
- assert not connector.validate_endpoint_config("ftp://invalid.com")
-
- def test_connection_timeout_configuration(self):
- """Test connection timeout configuration (30 second default)"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- with patch.dict(os.environ, {'MCP_SERVERS': '{"atlassian": "https://test.com/sse"}'}):
- client = SimpleMCPClient()
- # Verify timeout configuration exists in config
- assert hasattr(client, 'config') # Should have configuration object
- assert client.config.default_timeout == 30 # Default timeout
-
-
-class TestProtocolCompliance:
- """Test AC-003: Protocol Compliance"""
-
- def test_mcp_tool_discovery_implementation(self):
- """Test client implements MCP specification for tool discovery"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- client = SimpleMCPClient()
- # Test tool discovery capability
- assert hasattr(client, 'query') # Should support tool discovery queries
-
- def test_message_format_support(self):
- """Test supports MCP message format for request/response cycles"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- # This will test message format compliance
- with patch.dict(os.environ, {'MCP_SERVERS': '{"atlassian": "https://test.com/sse"}'}):
- client = SimpleMCPClient()
- # Verify message handling exists through connection pool
- assert hasattr(client, 'connection_pool')
-
- def test_error_response_handling(self):
- """Test handles MCP error responses according to specification"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- # Test error handling patterns
- client = SimpleMCPClient()
- # Should have error handling in query method
- assert hasattr(client, 'query')
-
- def test_message_correlation_ids(self):
- """Test implements proper message sequencing and correlation IDs"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- # Test correlation ID handling
- client = SimpleMCPClient()
- # Should support correlation in async operations
- assert hasattr(client, 'query')
-
-
-class TestMultiMCPServerConfiguration:
- """Test AC-004: Multi-MCP Server Configuration"""
-
- def test_multiple_mcp_server_support(self):
- """Test configure multiple MCP servers for llama index integration"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- config = json.dumps({
- "atlassian": "https://mcp-atlassian.com/sse",
- "github": "https://mcp-github.com/sse",
- "confluence": "mcp-confluence.namespace.svc.cluster.local:8000"
- })
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
- assert len(client.servers) == 3
- assert "atlassian" in client.servers
- assert "github" in client.servers
- assert "confluence" in client.servers
-
- def test_json_configuration_parsing(self):
- """Test JSON-based multi-server configuration via environment variables"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- config = '{"atlassian": "https://test.com/sse"}'
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
- assert "atlassian" in client.servers
- assert client.servers["atlassian"] == "https://test.com/sse"
-
- def test_configuration_validation_on_startup(self):
- """Test validate all MCP server configurations on startup with clear error messages"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- # Test invalid JSON configuration
- with patch.dict(os.environ, {'MCP_SERVERS': 'invalid-json'}):
- with pytest.raises(Exception): # Should raise clear error
- SimpleMCPClient()
-
- def test_health_based_routing(self):
- """Test route requests to healthy MCP servers"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- client = SimpleMCPClient()
- # Should have health tracking
- assert hasattr(client, 'health')
- assert hasattr(client, 'query') # Should support health-based routing
-
-
-class TestConfigurationFormatSupport:
- """Test AC-005: Configuration Format Support"""
-
- def test_simple_format_single_server(self):
- """Test single MCP server via MCP_ENDPOINT"""
- # Note: Based on US-001 enhancement, we're using simplified JSON approach
- # This test validates the single server case using JSON format
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- config = '{"default": "https://server/sse"}'
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
- assert "default" in client.servers
-
- def test_multi_server_json_configuration(self):
- """Test multiple servers via MCP_SERVERS JSON environment variable"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- config = json.dumps({
- "atlassian": "https://mcp-atlassian-route.apps.cluster.com/sse",
- "github": "https://mcp-github-route.apps.cluster.com/sse"
- })
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
- assert len(client.servers) == 2
- assert client.servers["atlassian"] == "https://mcp-atlassian-route.apps.cluster.com/sse"
-
- def test_external_route_format_validation(self):
- """Test external route format validation (SPIKE-002 validated)"""
- if MCPEndpointConnector is None:
- pytest.skip("MCPEndpointConnector not implemented yet")
-
- connector = MCPEndpointConnector()
- # Test SPIKE-002 validated external route formats
- assert connector.validate_endpoint_config("https://mcp-route.apps.cluster.com/sse")
- assert connector.validate_endpoint_config("https://mcp-atlassian-route.apps.cluster.com/sse")
-
- def test_cluster_service_format_validation(self):
- """Test cluster service format validation (SPIKE-002 validated)"""
- if MCPEndpointConnector is None:
- pytest.skip("MCPEndpointConnector not implemented yet")
-
- connector = MCPEndpointConnector()
- # Test SPIKE-002 validated cluster service formats
- assert connector.validate_endpoint_config("mcp-atlassian.namespace.svc.cluster.local:8000")
- assert connector.validate_endpoint_config("service.ns.svc.cluster.local:8080")
-
-
-class TestCapabilityRouting:
- """Test enhanced capability routing (from US-001 multi-MCP enhancement)"""
-
- def test_automatic_capability_detection(self):
- """Test keyword-based capability routing"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- client = SimpleMCPClient()
- # Test capability detection method exists
- assert hasattr(client, '_detect_capability')
-
- @pytest.mark.asyncio
- async def test_capability_based_request_routing(self):
- """Test requests route to correct servers based on capabilities"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- # Mock multi-server setup
- config = json.dumps({
- "atlassian": "https://mcp-atlassian.com/sse",
- "github": "https://mcp-github.com/sse"
- })
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Mock connection pool for testing
- with patch.object(client.connection_pool, 'get_connection_info') as mock_connections, \
- patch.object(client.connection_pool, 'get_health_status') as mock_health:
-
- mock_connections.return_value = {
- "atlassian": {"connected": True, "type": "external_route"},
- "github": {"connected": True, "type": "external_route"}
- }
- mock_health.return_value = {"atlassian": True, "github": True}
-
- # Test capability detection
- capability = client._detect_capability("Get Jira tickets")
- assert capability == "atlassian"
-
- capability = client._detect_capability("List GitHub repos")
- assert capability == "github"
-
-
-class TestLlamaIndexIntegration:
-    """Test LlamaIndex integration patterns"""
-
- def test_llama_index_enhanced_class_creation(self):
-        """Test that the enhanced LlamaIndex class can be created"""
- if MCPEnhancedLlamaIndex is None:
- pytest.skip("MCPEnhancedLlamaIndex not implemented yet")
-
- # Test creation without initialization
- assert MCPEnhancedLlamaIndex is not None
-
- @pytest.mark.asyncio
- async def test_enhanced_query_method(self):
-        """Test that the enhanced query method integrates MCP and LlamaIndex"""
- if MCPEnhancedLlamaIndex is None:
- pytest.skip("MCPEnhancedLlamaIndex not implemented yet")
-
- # This will test the integration pattern
- enhanced = MCPEnhancedLlamaIndex()
- assert hasattr(enhanced, 'enhanced_query')
-
-
-class TestDefinitionOfDone:
- """Test Definition of Done criteria"""
-
- @pytest.mark.asyncio
- async def test_mcp_client_connects_to_deployed_server(self):
- """Test MCP client successfully connects to deployed MCP Atlassian server"""
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- # This test will validate real connection capability
- config = '{"atlassian": "https://test-mcp-server.com/sse"}'
-
- with patch.dict(os.environ, {'MCP_SERVERS': config}):
- client = SimpleMCPClient()
-
- # Mock successful connection for test
- with patch.object(client.connection_pool, 'add_connection', new_callable=AsyncMock) as mock_connect:
- mock_connect.return_value = True
-
- await client.connect_all()
-                # Health status should be retrievable via the health_check API
-                assert await client.health_check() is not None
-                mock_connect.assert_called()
-
- def test_unit_test_coverage_requirement(self):
- """Test that unit test coverage is >90%"""
- # This test ensures we have comprehensive coverage
- # Coverage will be validated by external tools
- assert True # Placeholder for coverage validation
-
- def test_integration_test_validation(self):
-        """Test that the integration test validates end-to-end connectivity"""
- # This test will validate integration testing capability
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- # Integration test framework should exist
- assert SimpleMCPClient is not None
-
- def test_error_scenarios_documented(self):
-        """Test that error scenarios are tested and documented"""
- # This test ensures error handling is comprehensive
- if SimpleMCPClient is None:
- pytest.skip("SimpleMCPClient not implemented yet")
-
- # Should have error handling in main methods
- client = SimpleMCPClient()
- assert hasattr(client, 'query') # Should handle errors in query
-
-
-if __name__ == "__main__":
- # Run tests with pytest
- pytest.main([__file__, "-v", "--tb=short"])
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/__init__.py b/archive/vteam_shared_configs/__init__.py
deleted file mode 100644
index 65b8e45e4..000000000
--- a/archive/vteam_shared_configs/__init__.py
+++ /dev/null
@@ -1,12 +0,0 @@
-"""vTeam Shared Claude Code Configuration Package.
-
-Provides shared Claude Code configuration for team development standards.
-"""
-
-__version__ = "1.0.0"
-__author__ = "Jeremy Eder"
-__email__ = "jeremy@example.com"
-
-from .installer import ConfigInstaller
-
-__all__ = ["ConfigInstaller"]
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/cli.py b/archive/vteam_shared_configs/cli.py
deleted file mode 100644
index 4e94f6fa0..000000000
--- a/archive/vteam_shared_configs/cli.py
+++ /dev/null
@@ -1,77 +0,0 @@
-"""Command-line interface for vTeam shared configuration management."""
-
-import click
-from .installer import ConfigInstaller
-
-
-@click.group()
-@click.version_option(version="1.0.0")
-def main():
- """vTeam Shared Claude Code Configuration Manager.
-
- Manage shared Claude Code configuration for team development standards.
- """
- pass
-
-
-@main.command()
-@click.option('--force', is_flag=True, help='Force reinstallation even if already installed')
-def install(force):
- """Install vTeam shared Claude Code configuration.
-
- Sets up symlinks for global configuration and project templates.
- Automatically backs up existing configuration.
- """
- installer = ConfigInstaller()
-
- if installer.install(force_reinstall=force):
- click.echo(click.style("✅ vTeam configuration installed successfully!", fg="green"))
- installer.status()
- else:
- click.echo(click.style("❌ Installation failed", fg="red"))
- exit(1)
-
-
-@main.command()
-def uninstall():
- """Uninstall vTeam shared Claude Code configuration.
-
- Removes symlinks and restores backed up configuration if available.
- """
- installer = ConfigInstaller()
-
- if installer.uninstall():
- click.echo(click.style("✅ vTeam configuration uninstalled successfully!", fg="green"))
- else:
- click.echo(click.style("❌ Uninstallation failed", fg="red"))
- exit(1)
-
-
-@main.command()
-def status():
- """Show current vTeam configuration status.
-
- Displays whether configuration is active and properly linked.
- """
- installer = ConfigInstaller()
- installer.status()
-
-
-@main.command()
-def update():
- """Update to latest vTeam configuration.
-
- Equivalent to reinstalling with --force flag.
- """
- installer = ConfigInstaller()
-
- if installer.install(force_reinstall=True):
- click.echo(click.style("✅ vTeam configuration updated successfully!", fg="green"))
- installer.status()
- else:
- click.echo(click.style("❌ Update failed", fg="red"))
- exit(1)
-
-
-if __name__ == "__main__":
- main()
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/data/.claude/settings.json b/archive/vteam_shared_configs/data/.claude/settings.json
deleted file mode 100644
index 91aba8ee7..000000000
--- a/archive/vteam_shared_configs/data/.claude/settings.json
+++ /dev/null
@@ -1,45 +0,0 @@
-{
- "hooks": {
- "preToolUse": [
- {
- "name": "enforce-vteam-config",
- "description": "Ensure latest vTeam shared configuration is active before Git operations",
- "match": {
- "tool": "Bash",
- "command": "git commit*"
- },
- "command": "vteam-config install --force"
- },
- {
- "name": "enforce-vteam-config-push",
- "description": "Ensure latest vTeam shared configuration is active before Git push",
- "match": {
- "tool": "Bash",
- "command": "git push*"
- },
- "command": "vteam-config install --force"
- }
- ],
- "sessionStart": [
- {
- "name": "vteam-welcome",
- "description": "Display vTeam configuration status on session start",
- "command": "vteam-config status"
- }
- ]
- },
- "permissions": {
- "allow": [
- "Bash(git:*)",
- "Bash(npm:*)",
- "Bash(python:*)",
- "Bash(uv:*)",
- "Bash(black:*)",
- "Bash(isort:*)",
- "Bash(flake8:*)",
- "Bash(pytest:*)",
- "Bash(shellcheck:*)",
- "Bash(markdownlint:*)"
- ]
- }
-}
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/data/README.md b/archive/vteam_shared_configs/data/README.md
deleted file mode 100644
index 135a79d00..000000000
--- a/archive/vteam_shared_configs/data/README.md
+++ /dev/null
@@ -1,275 +0,0 @@
-# vTeam Shared Claude Configuration
-
-> Shared Claude Code configuration templates and standards for team development
-
-## Quick Start
-
-```bash
-# Install the package
-pip install vteam-shared-configs
-
-# Install team configuration
-vteam-config install
-
-# Start using Claude Code with team standards!
-# Configuration is now active and enforced
-```
-
-## How It Works
-
-### Configuration Hierarchy
-
-```mermaid
-flowchart TD
- A[Developer Action] --> B{Claude Settings Hierarchy}
- B --> C[".claude/settings.local.json
-🏆 HIGHEST PRIORITY
-Personal overrides
-(never committed)"]
-    B --> D[".claude/settings.json
-📋 TEAM STANDARDS
-vTeam shared config
-(in git repository)"]
-    B --> E["~/.claude/settings.json
-⚙️ PERSONAL DEFAULTS
-Global user settings
-(copied from team)"]
- C --> F[Final Configuration]
- D --> F
- E --> F
- F --> G[Claude Code Execution]
-```
-
-### Developer Workflow
-
-```mermaid
-sequenceDiagram
- participant Dev as Developer
- participant PyPI as PyPI Package Registry
- participant Claude as Claude Code
- participant Config as vTeam Config
-
- Dev->>PyPI: pip install vteam-shared-configs
- Dev->>Config: vteam-config install
- Config->>Claude: Auto-create ~/.claude/CLAUDE.md symlink
- Config->>Claude: Auto-create ~/.claude/project-templates symlink
- Config->>Claude: Install team hooks configuration
- Config->>Dev: ✅ Team configuration active
-
- Dev->>Claude: Start Claude session
- Config->>Dev: Display configuration status
-
- Note over Dev: All development uses team standards
-```
-
-## Overview
-
-This directory contains shared Claude Code configuration files that provide:
-
-- **Consistent development standards** across team projects
-- **Automatic enforcement** via Claude Code hooks
-- **Reusable project templates** for common technologies
-- **Developer override flexibility** with clear hierarchy
-- **Best practices** for code quality and team collaboration
-
-## What's Included
-
-### Core Configuration Files
-
-- **`.claude/settings.json`** - Team hooks and enforcement rules
-- **`.gitignore`** - Comprehensive ignore rules for development files
-- **`CLAUDE.md`** - Project-specific Claude Code guidance template
-- **`LICENSE`** - MIT license for open source projects
-- **`.github/dependabot.yml`** - Automated dependency management
-
-### Hooks & Enforcement
-
-- **`hooks/enforce-config.sh`** - Pre-commit configuration enforcement
-- **`hooks/status-check.sh`** - Session start status display
-- **Automatic symlink management** - Global config and templates
-- **Pre-commit/push validation** - Ensures team standards
-
-### Global Configuration
-
-- **`claude/global-CLAUDE.md`** - Team-wide Claude Code standards
-- **Development principles** - Git workflow, code quality, testing
-- **File management standards** - Branch verification, linting requirements
-- **Team collaboration guidelines** - GitHub Flow, commit standards
-
-### Project Templates
-
-Pre-configured CLAUDE.md templates for:
-
-- **`python-CLAUDE.md`** - Python projects with uv, black, pytest
-- **`javascript-CLAUDE.md`** - JavaScript/Node.js projects
-- **`shell-CLAUDE.md`** - Shell script projects with ShellCheck
-
-## Installation
-
-### Package Installation (Recommended)
-
-```bash
-# Install package
-pip install vteam-shared-configs
-
-# Set up configuration
-vteam-config install
-
-# Check status
-vteam-config status
-```
-
-### Manual Setup (Development)
-
-If working with the source repository:
-
-```bash
-# Create .claude directory
-mkdir -p ~/.claude
-
-# Link global configuration
-ln -sf "$(pwd)/claude/global-CLAUDE.md" ~/.claude/CLAUDE.md
-
-# Link project templates
-ln -sf "$(pwd)/claude/project-templates" ~/.claude/
-```
-
-## Usage
-
-### For New Projects
-
-1. **Copy project template** to your project root as `CLAUDE.md`:
- ```bash
- cp ~/.claude/project-templates/python-CLAUDE.md myproject/CLAUDE.md
- ```
-
-2. **Customize for your project** - update architecture, commands, etc.
-
-3. **Start development** with Claude Code understanding your project context
-
-### For Existing Projects
-
-1. **Add CLAUDE.md** using appropriate template as starting point
-2. **Update development commands** to match your project's needs
-3. **Follow team standards** from global configuration
-
-## Configuration Management
-
-### Automatic Updates
-- **Hooks enforce latest config** on every Git commit/push
-- **No manual updates needed** - configuration stays current
-- **Session status checks** display current configuration state
-
-### Manual Updates (if needed)
-```bash
-# Pull latest changes
-git pull origin main
-
-# Hooks will automatically update configuration on next Git operation
-# Or run manually:
-./hooks/enforce-config.sh
-```
-
-## Developer Customization
-
-### Configuration Hierarchy (Highest → Lowest Priority)
-1. **Local Project Settings** (`.claude/settings.local.json`) - **Your overrides**
-2. **Shared Project Settings** (`.claude/settings.json`) - **vTeam standards**
-3. **User Global Settings** (`~/.claude/settings.json`) - **Personal defaults**
-
-### Personal Overrides
-
-Create `.claude/settings.local.json` in any project for personal customizations:
-
-```json
-{
- "hooks": {
- "postToolUse": [
- {"name": "my-logging", "command": "echo 'Tool used'"}
- ]
- },
- "permissions": {
- "allow": ["Bash(my-custom-tool:*)"]
- }
-}
-```
-
-### File Structure & Configuration Flow
-
-```mermaid
-graph TB
- subgraph "vTeam Repository"
-        A[".claude/settings.json
-Team hooks & standards"]
-        B["hooks/enforce-config.sh
-Pre-commit enforcement"]
-        C["claude/global-CLAUDE.md
-Team-wide standards"]
-        D["claude/project-templates/
-Language templates"]
- A --> B
- end
-
- subgraph "Developer Machine"
-        E["~/.claude/settings.json
-Personal global config"]
-        F["~/.claude/CLAUDE.md
-Global team standards"]
-        G["~/.claude/project-templates/
-Shared templates"]
-        H[".claude/settings.local.json
-Personal overrides"]
- end
-
- subgraph "Claude Code"
-        I["Final Configuration
-Merged hierarchy"]
-        J["Development Actions
-Git, linting, etc."]
- end
-
- A -.->|copied once| E
- B -.->|creates symlinks| F
- B -.->|creates symlinks| G
- C -.->|symlinked to| F
- D -.->|symlinked to| G
-
- H -->|highest priority| I
- A -->|team standards| I
- E -->|personal defaults| I
-
- I --> J
- J -.->|triggers| B
-```
-
-**What you CAN override:**
-- ✅ Add personal hooks and automation
-- ✅ Extend permissions for custom tools
-- ✅ Personal workflow preferences
-- ✅ Custom aliases and shortcuts
-
-**What gets enforced:**
-- 🔒 Pre-commit/push configuration validation
-- 🔒 Core team development standards
-- 🔒 Quality and linting requirements
-
-## Team Standards
-
-### Git Workflow
-- **Always work in feature branches** unless explicitly told otherwise
-- **Mandatory branch verification** before any file modifications
-- **Squash commits** for clean history
-- **Follow GitHub Flow** for all repositories
-
-### Code Quality
-- **Run linters locally** before every commit/push
-- **Never push** if linters report errors or warnings
-- **Use language-specific tooling** (black for Python, prettier for JS, etc.)
-- **Always run tests** before pushing changes
-
-### Development Environment
-- **Use virtual environments** for Python projects (prefer `uv` over `pip`)
-- **Automate dependency management** with dependabot
-- **Document project-specific commands** in project CLAUDE.md
-
-## Contributing
-
-To update shared configuration:
-
-1. **Create feature branch** from main
-2. **Make changes** to templates, hooks, or global config
-3. **Test changes** locally by copying `.claude/settings.json`
-4. **Create pull request** with clear description of changes
-5. **Team review** before merging
-
-## Support
-
-- **Documentation**: See `claude/INSTALL.md` for detailed setup
-- **Issues**: Report problems via GitHub Issues
-- **Questions**: Reach out to team leads for guidance
-
----
-
-**Latest Update**: Hook-based automatic configuration management
-**Compatibility**: Claude Code with hierarchical settings and project-specific CLAUDE.md support
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/data/claude/INSTALL.md b/archive/vteam_shared_configs/data/claude/INSTALL.md
deleted file mode 100644
index 873b7ec4a..000000000
--- a/archive/vteam_shared_configs/data/claude/INSTALL.md
+++ /dev/null
@@ -1,265 +0,0 @@
-# vTeam Claude Configuration Installation Guide
-
-Comprehensive setup instructions for vTeam shared Claude Code configurations.
-
-## Quick Setup (Recommended)
-
-Use the vteam-config CLI tool:
-
-```bash
-pip install vteam-shared-configs
-vteam-config install
-```
-
-This handles everything automatically including backups and verification.
-
-## Manual Setup
-
-If you prefer manual installation or need to troubleshoot:
-
-### 1. Install Global Configuration
-
-```bash
-# Create .claude directory if needed
-mkdir -p ~/.claude
-
-# Link global configuration
-ln -sf "$(pwd)/claude/global-CLAUDE.md" ~/.claude/CLAUDE.md
-
-# Link project templates
-ln -sf "$(pwd)/claude/project-templates" ~/.claude/
-```
-
-### 2. Add Convenience Aliases (Optional)
-
-Add to your shell configuration (~/.bashrc, ~/.zshrc, etc.):
-
-```bash
-# vTeam Claude template aliases
-alias claude-python="cp ~/.claude/project-templates/python-CLAUDE.md ./CLAUDE.md"
-alias claude-js="cp ~/.claude/project-templates/javascript-CLAUDE.md ./CLAUDE.md"
-alias claude-shell="cp ~/.claude/project-templates/shell-CLAUDE.md ./CLAUDE.md"
-
-# Reload shell configuration
-source ~/.bashrc # or ~/.zshrc
-```
-
-## Usage
-
-### For New Projects
-```bash
-# Use automated installation aliases
-claude-python # Copy Python template
-claude-js # Copy JavaScript template
-claude-shell # Copy Shell template
-
-# Or copy manually
-cp ~/.claude/project-templates/python-CLAUDE.md ./CLAUDE.md
-```
-
-### For Existing Projects
-1. Choose appropriate template from `~/.claude/project-templates/`
-2. Copy to project root as `CLAUDE.md`
-3. Customize development commands and architecture for your project
-
-## Verification
-
-### Check Installation
-```bash
-# Verify global config symlink
-ls -la ~/.claude/CLAUDE.md
-
-# Verify project templates symlink
-ls -la ~/.claude/project-templates
-
-# Check project config exists
-ls -la ./CLAUDE.md
-```
-
-### Test Configuration
-```bash
-# Test with Claude Code
-claude --help
-
-# Check project-specific guidance loads
-claude "What are the development commands for this project?"
-```
-
-## Directory Structure After Setup
-
-```
-~/.claude/
-├── CLAUDE.md -> /path/to/vTeam/shared-configs/claude/global-CLAUDE.md
-└── project-templates/ -> /path/to/vTeam/shared-configs/claude/project-templates/
-
-vTeam/shared-configs/
-├── install.sh # Automated installation
-├── uninstall.sh # Automated removal
-├── update.sh # Update configuration
-├── README.md # Team documentation
-├── .gitignore # Development ignore rules
-├── CLAUDE.md # Project template
-├── LICENSE # MIT license
-├── .github/dependabot.yml # Dependency automation
-└── claude/
- ├── INSTALL.md # This file
- ├── global-CLAUDE.md # Global team standards
- └── project-templates/
- ├── python-CLAUDE.md
- ├── javascript-CLAUDE.md
- └── shell-CLAUDE.md
-```
-
-## Lifecycle Management
-
-### Update Configuration
-```bash
-vteam-config update
-```
-Updates to latest team configuration.
-
-### Uninstall
-```bash
-vteam-config uninstall
-```
-Removes configuration and restores backups.
-
-## Troubleshooting
-
-### Symlinks Not Working
-```bash
-# Check if path is correct
-ls -la ~/.claude/CLAUDE.md
-
-# Recreate symlink
-rm ~/.claude/CLAUDE.md
-ln -sf "/full/path/to/vTeam/shared-configs/claude/global-CLAUDE.md" ~/.claude/CLAUDE.md
-```
-
-### Templates Not Found
-```bash
-# Verify templates directory
-ls ~/.claude/project-templates/
-
-# Recreate templates symlink
-rm ~/.claude/project-templates
-ln -sf "/full/path/to/vTeam/shared-configs/claude/project-templates" ~/.claude/
-```
-
-### Restore Previous Configuration
-```bash
-# Find backup directory
-ls -la ~/.claude-backup-*
-
-# Manually restore if needed
-cp ~/.claude-backup-YYYYMMDD-HHMMSS/CLAUDE.md ~/.claude/
-```
-
-## Detailed Configuration Flow
-
-### Hook Execution Sequence
-
-```mermaid
-sequenceDiagram
- participant Dev as Developer
- participant Git as Git Command
- participant Hook as enforce-config.sh
- participant FS as File System
- participant Claude as Claude Code
-
- Dev->>Git: git commit
- Git->>Hook: Pre-tool hook triggers
- Hook->>FS: Check ~/.claude/CLAUDE.md symlink
-
- alt Symlink missing or incorrect
- Hook->>FS: Backup existing file
- Hook->>FS: Create symlink to vTeam config
- Hook->>FS: Create templates symlink
- Hook->>Dev: ✅ Configuration updated
- else Symlink correct
- Hook->>Dev: ✅ Configuration current
- end
-
- Hook->>Git: Allow commit to proceed
- Git->>Dev: Commit successful
-```
-
-### Configuration Hierarchy Deep Dive
-
-```mermaid
-graph LR
- subgraph "Configuration Sources"
-        A["vTeam Repository
-.claude/settings.json"]
-        B["Developer Machine
-~/.claude/settings.json"]
-        C["Project Override
-.claude/settings.local.json"]
- end
-
- subgraph "Merge Process"
-        D["Base Settings
-(User Global)"]
-        E["Team Standards
-(Shared Project)"]
-        F["Personal Overrides
-(Local Project)"]
- end
-
- subgraph "Final Result"
-        G["Active Configuration
-in Claude Code"]
- end
-
- B --> D
- A --> E
- C --> F
-
- D --> G
- E --> G
- F --> G
-
- style F fill:#e1f5fe
- style E fill:#f3e5f5
- style D fill:#fff3e0
-```
-
-### File System Layout
-
-```mermaid
-graph TD
- subgraph "vTeam Repository Structure"
- A["vTeam/shared-configs/"]
- A --> B[".claude/settings.json"]
- A --> C["hooks/enforce-config.sh"]
- A --> D["hooks/status-check.sh"]
- A --> E["claude/global-CLAUDE.md"]
- A --> F["claude/project-templates/"]
- F --> G["python-CLAUDE.md"]
- F --> H["javascript-CLAUDE.md"]
- F --> I["shell-CLAUDE.md"]
- end
-
- subgraph "Developer Machine After Setup"
- J["~/.claude/"]
- J --> K["settings.json (copied)"]
- J --> L["CLAUDE.md → vTeam/shared-configs/claude/global-CLAUDE.md"]
- J --> M["project-templates/ → vTeam/shared-configs/claude/project-templates/"]
- end
-
- subgraph "Project-Specific Overrides"
- N["project/.claude/"]
- N --> O["settings.local.json (optional)"]
- end
-
- B -.->|copied once| K
- E -.->|symlinked| L
- F -.->|symlinked| M
-
- style L stroke:#2196F3,stroke-width:3px
- style M stroke:#2196F3,stroke-width:3px
- style O stroke:#4CAF50,stroke-width:3px
-```
-
-## Team Benefits
-
-This setup provides:
-- ✅ **Consistent standards** across all team projects
-- ✅ **Automatic enforcement** via pre-commit hooks
-- ✅ **Developer flexibility** through local overrides
-- ✅ **Zero maintenance** - configuration stays current
-- ✅ **Version controlled** team standards
-- ✅ **Visual workflow** documentation for team collaboration
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/data/claude/README.md b/archive/vteam_shared_configs/data/claude/README.md
deleted file mode 100644
index da9a679c9..000000000
--- a/archive/vteam_shared_configs/data/claude/README.md
+++ /dev/null
@@ -1,56 +0,0 @@
-# Claude Configuration Management
-
-This directory contains Claude Code configuration files for managing global and project-specific settings.
-
-## Structure
-
-```
-claude/
-├── README.md # This file
-├── global-CLAUDE.md # Global configuration (symlink to ~/.claude/CLAUDE.md)
-└── project-templates/ # Templates for common project types
- ├── python-CLAUDE.md
- ├── javascript-CLAUDE.md
- └── shell-CLAUDE.md
-```
-
-## Setup Instructions
-
-### Global Configuration
-```bash
-# Create symlink for global Claude configuration
-ln -sf ~/repos/dotfiles/claude/global-CLAUDE.md ~/.claude/CLAUDE.md
-```
-
-### Project-Specific Configuration
-For new projects, copy the appropriate template:
-```bash
-# For Python projects
-cp ~/repos/dotfiles/claude/project-templates/python-CLAUDE.md /path/to/project/CLAUDE.md
-
-# For JavaScript projects
-cp ~/repos/dotfiles/claude/project-templates/javascript-CLAUDE.md /path/to/project/CLAUDE.md
-
-# For shell projects
-cp ~/repos/dotfiles/claude/project-templates/shell-CLAUDE.md /path/to/project/CLAUDE.md
-```
-
-## Best Practices
-
-1. **Global Configuration**: Use `~/.claude/CLAUDE.md` for settings that apply to ALL projects
-2. **Project Configuration**: Use `PROJECT_ROOT/CLAUDE.md` for project-specific commands and context
-3. **Version Control**: Keep both global and project configurations in git
-4. **Symlinks**: Use symlinks to maintain a single source of truth for global config
-5. **Templates**: Use project templates to ensure consistency across similar projects
-
-## Configuration Hierarchy
-
-Claude Code follows this configuration hierarchy (highest to lowest priority):
-1. Project-specific `CLAUDE.md` (in project root)
-2. Global `~/.claude/CLAUDE.md`
-3. Built-in Claude Code defaults
-
-This allows you to:
-- Set organization-wide standards in global config
-- Override with project-specific requirements
-- Maintain consistency across all your projects
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/data/claude/global-CLAUDE.md b/archive/vteam_shared_configs/data/claude/global-CLAUDE.md
deleted file mode 100644
index be4f7585c..000000000
--- a/archive/vteam_shared_configs/data/claude/global-CLAUDE.md
+++ /dev/null
@@ -1,52 +0,0 @@
-# Global Claude Configuration
-
-This file contains global Claude Code configuration that applies to all projects.
-This should be symlinked to ~/.claude/CLAUDE.md
-
-## Global Operating Principles
-
-### Git and Version Control
-- **MANDATORY BRANCH VERIFICATION**: ALWAYS check current git branch with `git branch --show-current` as the FIRST action before ANY task that could modify files
-- When using git, ALWAYS work in feature branches unless told explicitly otherwise
-- **Always squash commits** for clean history
-- **Make sure to commit frequently** with succinct commit messages that are immediately useful to the reader
-
-### Development Standards
-- **Always use python virtual environments** to avoid affecting system python packages
-- **ALWAYS use uv instead of pip** where possible
-- **ALWAYS run markdownlint locally** on any markdown files that you work with
-- **ALWAYS automatically resolve any issues reported by linters**
-- **ALWAYS try to minimize rework**
-
-### GitHub Best Practices
-- **When setting up GitHub projects, ALWAYS use repository-level projects**. NEVER use user-level projects
-- **NEVER change visibility of a github repository** without explicitly being told to do so
-- When working with GitHub repositories, always follow GitHub Flow
-- **ALWAYS setup dependabot automation** when creating a new github repository
-- **Warn if in a GitHub git repo and GitHub Actions integration is not installed** for Claude
-
-### Code Quality
-- **NEVER push if linters report errors or warnings**
-- **NEVER push if tests fail**
-- **ALWAYS fix issues immediately after running linters**
-- **ALWAYS make sure all dates that you use match reality**
-- When creating new python applications, you only need to support versions N and N-1
-
-### File Management
-- **ALWAYS keep your utility/working scripts in git** and well-isolated from the primary codebase
-- **NOTHING may ever depend on these scripts**
-- **NEVER make changes to files unless you are on the correct feature branch** for those changes
-- **NEVER create files unless they're absolutely necessary** for achieving your goal
-- **ALWAYS prefer editing an existing file to creating a new one**
-- **NEVER proactively create documentation files** (*.md) or README files unless explicitly requested
-
-### Linting Workflow
-**MANDATORY: ALWAYS run the complete linting workflow locally before ANY git push or commit**
-
-Check the project's CLAUDE.md file for language-specific linting commands and workflows.
-
-### Testing Best Practices
-- **ALWAYS run tests immediately after making implementation changes**
-- When fixing API bugs, update both implementation AND corresponding tests in the same commit
-- Never assume tests will pass after changing HTTP methods, endpoints, or response formats
-- Implementation changes without test updates = guaranteed CI failures
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/data/claude/project-templates/javascript-CLAUDE.md b/archive/vteam_shared_configs/data/claude/project-templates/javascript-CLAUDE.md
deleted file mode 100644
index 57b2a6b5f..000000000
--- a/archive/vteam_shared_configs/data/claude/project-templates/javascript-CLAUDE.md
+++ /dev/null
@@ -1,90 +0,0 @@
-# CLAUDE.md - JavaScript/Node.js Project
-
-This file provides guidance to Claude Code (claude.ai/code) when working with this JavaScript project.
-
-## Development Commands
-
-### Environment Setup
-```bash
-# Install dependencies
-npm install
-# or
-yarn install
-
-# Install development dependencies
-npm install --only=dev
-```
-
-### Code Quality
-```bash
-# Format code
-npm run format
-# or
-npx prettier --write .
-
-# Lint code
-npm run lint
-# or
-npx eslint .
-
-# Fix auto-fixable lint issues
-npm run lint:fix
-# or
-npx eslint . --fix
-```
-
-### Testing
-```bash
-# Run all tests
-npm test
-
-# Run tests in watch mode
-npm run test:watch
-
-# Run tests with coverage
-npm run test:coverage
-
-# Run specific test file
-npm test -- test-file.test.js
-```
-
-### Build and Development
-```bash
-# Start development server
-npm run dev
-
-# Build for production
-npm run build
-
-# Preview production build
-npm run preview
-```
-
-## Project Architecture
-
-
-
-## Configuration
-
-### Node.js Version
-- Target: Node.js 18+ (LTS and current)
-
-### Code Style
-- Formatter: Prettier
-- Linter: ESLint
-- Line length: 80-100 characters
-- Semicolons: Consistent with project preference
-
-### Testing Framework
-- Test runner: Jest/Vitest/Mocha (specify which)
-- Coverage tool: Built-in coverage
-- Test location: tests/ or __tests__/ directory
-
-## Pre-commit Requirements
-
-Before any commit, ALWAYS run:
-1. `npm run format` (or `npx prettier --write .`)
-2. `npm run lint` (or `npx eslint .`)
-3. `npm test`
-
-All commands must pass without errors or warnings.
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/data/claude/project-templates/python-CLAUDE.md b/archive/vteam_shared_configs/data/claude/project-templates/python-CLAUDE.md
deleted file mode 100644
index b51eeff3f..000000000
--- a/archive/vteam_shared_configs/data/claude/project-templates/python-CLAUDE.md
+++ /dev/null
@@ -1,88 +0,0 @@
-# CLAUDE.md - Python Project
-
-This file provides guidance to Claude Code (claude.ai/code) when working with this Python project.
-
-## Development Commands
-
-### Environment Setup
-```bash
-# Create virtual environment
-uv venv
-source .venv/bin/activate # Linux/macOS
-# .venv\Scripts\activate # Windows
-
-# Install dependencies
-uv pip install -r requirements.txt
-uv pip install -r requirements-dev.txt # If dev requirements exist
-```
-
-### Code Quality
-```bash
-# Format code
-black .
-
-# Sort imports
-isort .
-
-# Lint code
-flake8 .
-
-# Type checking (if using mypy)
-mypy .
-```
-
-### Testing
-```bash
-# Run all tests
-python -m pytest
-
-# Run with coverage
-python -m pytest --cov=.
-
-# Run specific test file
-python -m pytest tests/test_example.py
-
-# Run specific test
-python -m pytest tests/test_example.py::test_function_name
-```
-
-### Package Management
-```bash
-# Add new dependency
-uv add package-name
-
-# Add development dependency
-uv add --dev package-name
-
-# Update dependencies
-uv lock --upgrade
-```
-
-## Project Architecture
-
-
-
-## Configuration
-
-### Python Version
-- Target: Python 3.11+ (latest stable and N-1)
-
-### Code Style
-- Line length: 88 characters (black default)
-- Import sorting: isort with black compatibility
-- Linting: flake8 with project-specific rules
-
-### Testing Framework
-- Test runner: pytest
-- Coverage tool: pytest-cov
-- Test location: tests/ directory
-
-## Pre-commit Requirements
-
-Before any commit, ALWAYS run:
-1. `black .`
-2. `isort .`
-3. `flake8 .`
-4. `python -m pytest`
-
-All commands must pass without errors or warnings.
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/data/claude/project-templates/shell-CLAUDE.md b/archive/vteam_shared_configs/data/claude/project-templates/shell-CLAUDE.md
deleted file mode 100644
index a85ff5cad..000000000
--- a/archive/vteam_shared_configs/data/claude/project-templates/shell-CLAUDE.md
+++ /dev/null
@@ -1,98 +0,0 @@
-# CLAUDE.md - Shell Script Project
-
-This file provides guidance to Claude Code (claude.ai/code) when working with this shell script project.
-
-## Development Commands
-
-### Code Quality
-```bash
-# Lint shell scripts
-shellcheck *.sh
-
-# Lint with specific exclusions (if needed)
-shellcheck -e SC1090 -e SC1091 *.sh
-
-# Format shell scripts (if using shfmt)
-shfmt -w *.sh
-```
-
-### Testing
-```bash
-# Run tests (if using bats)
-bats tests/
-
-# Run specific test file
-bats tests/test-example.bats
-
-# Manual testing
-bash script-name.sh --help
-bash script-name.sh --test-mode
-```
-
-### Execution
-```bash
-# Make scripts executable
-chmod +x *.sh
-
-# Run script
-./script-name.sh
-
-# Source script (for functions/aliases)
-source script-name.sh
-```
-
-## Project Architecture
-
-
-
-## Configuration
-
-### Shell Compatibility
-- Target: Bash 4.0+ (specify if using other shells)
-- Shebang: `#!/usr/bin/env bash` or `#!/bin/bash`
-
-### Code Style
-- Indentation: 2 or 4 spaces (consistent)
-- Variable naming: snake_case for local, UPPER_CASE for globals/exports
-- Function naming: snake_case
-- Error handling: Use `set -euo pipefail` for strict mode
-
-### Testing Framework
-- Test runner: bats-core (or specify alternative)
-- Test location: tests/ directory
-- Test files: *.bats format
-
-## Shell Script Best Practices
-
-### Error Handling
-```bash
-#!/usr/bin/env bash
-set -euo pipefail # Exit on error, undefined vars, pipe failures
-```
-
-### Function Structure
-```bash
-function_name() {
- local arg1="$1"
- local arg2="${2:-default_value}"
-
- # Function body
- echo "Processing: $arg1"
-}
-```
-
-### Variable Quoting
-```bash
-# Always quote variables
-echo "$variable"
-cp "$source_file" "$destination_file"
-```
-
-## Pre-commit Requirements
-
-Before any commit, ALWAYS run:
-1. `shellcheck *.sh` (or with project-specific exclusions)
-2. Manual testing of modified scripts
-3. Verify executable permissions are set correctly
-
-All ShellCheck warnings must be resolved or explicitly excluded with comments.
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/data/hooks/enforce-config.sh b/archive/vteam_shared_configs/data/hooks/enforce-config.sh
deleted file mode 100755
index a91d90b88..000000000
--- a/archive/vteam_shared_configs/data/hooks/enforce-config.sh
+++ /dev/null
@@ -1,115 +0,0 @@
-#!/bin/bash
-set -euo pipefail
-
-# vTeam Shared Configuration Enforcement Script
-# Ensures latest vTeam configuration is active before Git operations
-
-SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-SHARED_CONFIGS_DIR="$(cd "$SCRIPT_DIR/.." && pwd)"
-CLAUDE_DIR="$HOME/.claude"
-
-# Colors for output
-RED='\033[0;31m'
-GREEN='\033[0;32m'
-YELLOW='\033[1;33m'
-BLUE='\033[0;34m'
-NC='\033[0m' # No Color
-
-info() {
- echo -e "${GREEN}[vTeam]${NC} $1"
-}
-
-warn() {
- echo -e "${YELLOW}[vTeam]${NC} $1"
-}
-
-error() {
- echo -e "${RED}[vTeam]${NC} $1"
-}
-
-step() {
- echo -e "${BLUE}[vTeam]${NC} $1"
-}
-
-# Check if we're in a vTeam repository
-if [[ ! -d "$SHARED_CONFIGS_DIR" ]]; then
- error "vTeam shared-configs directory not found"
- exit 1
-fi
-
-# Create .claude directory if it doesn't exist
-if [[ ! -d "$CLAUDE_DIR" ]]; then
- step "Creating ~/.claude directory"
- mkdir -p "$CLAUDE_DIR"
-fi
-
-# Function to check if symlink is up to date
-check_symlink() {
- local target="$1"
- local link="$2"
-
- if [[ ! -L "$link" ]]; then
- return 1 # Not a symlink
- fi
-
- local current_target
- current_target="$(readlink "$link")"
-
- if [[ "$current_target" != "$target" ]]; then
- return 1 # Points to wrong target
- fi
-
- return 0 # Symlink is correct
-}
-
-# Check and update global configuration
-GLOBAL_TARGET="$SHARED_CONFIGS_DIR/claude/global-CLAUDE.md"
-GLOBAL_LINK="$CLAUDE_DIR/CLAUDE.md"
-
-if ! check_symlink "$GLOBAL_TARGET" "$GLOBAL_LINK"; then
- step "Updating global CLAUDE.md configuration"
-
- # Backup existing file if it's not a symlink
- if [[ -f "$GLOBAL_LINK" ]] && [[ ! -L "$GLOBAL_LINK" ]]; then
- BACKUP_NAME="CLAUDE.md.backup-$(date +%Y%m%d-%H%M%S)"
- warn "Backing up existing $GLOBAL_LINK to $BACKUP_NAME"
- mv "$GLOBAL_LINK" "$CLAUDE_DIR/$BACKUP_NAME"
- fi
-
- # Remove existing link if present
- [[ -L "$GLOBAL_LINK" ]] && rm "$GLOBAL_LINK"
-
- # Create new symlink
- ln -sf "$GLOBAL_TARGET" "$GLOBAL_LINK"
- info "✓ Global configuration linked"
-fi
-
-# Check and update project templates
-TEMPLATES_TARGET="$SHARED_CONFIGS_DIR/claude/project-templates"
-TEMPLATES_LINK="$CLAUDE_DIR/project-templates"
-
-if ! check_symlink "$TEMPLATES_TARGET" "$TEMPLATES_LINK"; then
- step "Updating project templates"
-
- # Backup existing directory if it's not a symlink
- if [[ -d "$TEMPLATES_LINK" ]] && [[ ! -L "$TEMPLATES_LINK" ]]; then
- BACKUP_NAME="project-templates.backup-$(date +%Y%m%d-%H%M%S)"
- warn "Backing up existing $TEMPLATES_LINK to $BACKUP_NAME"
- mv "$TEMPLATES_LINK" "$CLAUDE_DIR/$BACKUP_NAME"
- fi
-
- # Remove existing link if present
- [[ -L "$TEMPLATES_LINK" ]] && rm "$TEMPLATES_LINK"
-
- # Create new symlink
- ln -sf "$TEMPLATES_TARGET" "$TEMPLATES_LINK"
- info "✓ Project templates linked"
-fi
-
-# Verify installation
-if [[ -L "$GLOBAL_LINK" ]] && [[ -L "$TEMPLATES_LINK" ]]; then
- info "✓ vTeam configuration is active and up to date"
-else
- error "Configuration verification failed"
- exit 1
-fi
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/data/hooks/status-check.sh b/archive/vteam_shared_configs/data/hooks/status-check.sh
deleted file mode 100755
index e38469e69..000000000
--- a/archive/vteam_shared_configs/data/hooks/status-check.sh
+++ /dev/null
@@ -1,76 +0,0 @@
-#!/bin/bash
-set -euo pipefail
-
-# vTeam Configuration Status Check Script
-# Displays current vTeam configuration status on Claude session start
-
-SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-SHARED_CONFIGS_DIR="$(cd "$SCRIPT_DIR/.." && pwd)"
-CLAUDE_DIR="$HOME/.claude"
-
-# Colors for output
-GREEN='\033[0;32m'
-YELLOW='\033[1;33m'
-BLUE='\033[0;34m'
-NC='\033[0m' # No Color
-
-info() {
- echo -e "${GREEN}[vTeam]${NC} $1"
-}
-
-warn() {
- echo -e "${YELLOW}[vTeam]${NC} $1"
-}
-
-status() {
- echo -e "${BLUE}[vTeam]${NC} $1"
-}
-
-# Only show status if we're in a vTeam repository
-if [[ ! -d "$SHARED_CONFIGS_DIR" ]]; then
- exit 0 # Silent exit if not in vTeam repo
-fi
-
-# Check configuration status
-GLOBAL_LINK="$CLAUDE_DIR/CLAUDE.md"
-TEMPLATES_LINK="$CLAUDE_DIR/project-templates"
-
-echo
-status "=== vTeam Configuration Status ==="
-
-# Check global config
-if [[ -L "$GLOBAL_LINK" ]]; then
- TARGET=$(readlink "$GLOBAL_LINK")
- if [[ "$TARGET" == "$SHARED_CONFIGS_DIR/claude/global-CLAUDE.md" ]]; then
- info "✓ Global configuration: Active"
- else
- warn "⚠ Global configuration: Linked to different source"
- fi
-else
- warn "⚠ Global configuration: Not linked (will be auto-configured on first Git operation)"
-fi
-
-# Check project templates
-if [[ -L "$TEMPLATES_LINK" ]]; then
- TARGET=$(readlink "$TEMPLATES_LINK")
- if [[ "$TARGET" == "$SHARED_CONFIGS_DIR/claude/project-templates" ]]; then
- info "✓ Project templates: Active"
- else
- warn "⚠ Project templates: Linked to different source"
- fi
-else
- warn "⚠ Project templates: Not linked (will be auto-configured on first Git operation)"
-fi
-
-# Check for local overrides
-LOCAL_SETTINGS=".claude/settings.local.json"
-if [[ -f "$LOCAL_SETTINGS" ]]; then
- info "✓ Local overrides: Present in $LOCAL_SETTINGS"
-else
- status "ℹ Local overrides: None (create $LOCAL_SETTINGS for personal customizations)"
-fi
-
-echo
-status "Team standards will be automatically enforced on Git operations"
-status "Use '.claude/settings.local.json' for personal overrides"
-echo
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/installer.py b/archive/vteam_shared_configs/installer.py
deleted file mode 100644
index 7b3326b40..000000000
--- a/archive/vteam_shared_configs/installer.py
+++ /dev/null
@@ -1,249 +0,0 @@
-"""Configuration installer for vTeam shared Claude Code settings."""
-
-import os
-import shutil
-import json
-from pathlib import Path
-from datetime import datetime
-from importlib import resources
-import click
-
-
-class ConfigInstaller:
- """Manages installation and configuration of vTeam shared Claude settings."""
-
- def __init__(self):
- self.claude_dir = Path.home() / ".claude"
- self.global_config_link = self.claude_dir / "CLAUDE.md"
- self.templates_link = self.claude_dir / "project-templates"
- self.settings_file = self.claude_dir / "settings.json"
-
- def _get_package_data_path(self, filename):
- """Get path to package data file."""
- try:
- # Python 3.9+
- with resources.files("vteam_shared_configs.data") as data_path:
- return data_path / filename
- except AttributeError:
- # Python 3.8 fallback
- with resources.path("vteam_shared_configs.data", filename) as data_path:
- return data_path
-
- def _create_backup(self, file_path):
- """Create timestamped backup of existing file."""
- if not file_path.exists():
- return None
-
- timestamp = datetime.now().strftime("%Y%m%d-%H%M%S")
- backup_name = f"{file_path.name}.backup-{timestamp}"
- backup_path = file_path.parent / backup_name
-
- if file_path.is_symlink():
- # Just remove symlinks, no backup needed
- return None
- elif file_path.is_file():
- shutil.copy2(file_path, backup_path)
- click.echo(f"📦 Backed up {file_path.name} to {backup_name}")
- return backup_path
- elif file_path.is_dir():
- shutil.copytree(file_path, backup_path)
- click.echo(f"📦 Backed up {file_path.name}/ to {backup_name}/")
- return backup_path
-
- return None
-
- def _check_symlink(self, link_path, target_path):
- """Check if symlink exists and points to correct target."""
- if not link_path.is_symlink():
- return False
-
- try:
- current_target = link_path.resolve()
- expected_target = target_path.resolve()
- return current_target == expected_target
- except (OSError, RuntimeError):
- return False
-
- def _create_symlink(self, target_path, link_path, description):
- """Create symlink with proper error handling."""
- try:
- # Backup existing file/directory
- if link_path.exists():
- self._create_backup(link_path)
- if link_path.is_symlink():
- link_path.unlink()
- elif link_path.is_file():
- link_path.unlink()
- elif link_path.is_dir():
- shutil.rmtree(link_path)
-
- # Create symlink
- link_path.symlink_to(target_path)
- click.echo(f"🔗 Linked {description}")
- return True
-
- except (OSError, RuntimeError) as e:
- click.echo(f"❌ Failed to link {description}: {e}")
- return False
-
- def install(self, force_reinstall=False):
- """Install vTeam shared configuration."""
- click.echo("🚀 Installing vTeam shared Claude Code configuration...")
-
- # Create .claude directory if needed
- if not self.claude_dir.exists():
- self.claude_dir.mkdir(parents=True)
- click.echo(f"📁 Created {self.claude_dir}")
-
- success = True
-
- # Install global configuration
- try:
- global_config_source = self._get_package_data_path("claude/global-CLAUDE.md")
- if not self._check_symlink(self.global_config_link, global_config_source) or force_reinstall:
- if not self._create_symlink(global_config_source, self.global_config_link, "global configuration"):
- success = False
- else:
- click.echo("✅ Global configuration already linked")
- except Exception as e:
- click.echo(f"❌ Failed to install global configuration: {e}")
- success = False
-
- # Install project templates
- try:
- templates_source = self._get_package_data_path("claude/project-templates")
- if not self._check_symlink(self.templates_link, templates_source) or force_reinstall:
- if not self._create_symlink(templates_source, self.templates_link, "project templates"):
- success = False
- else:
- click.echo("✅ Project templates already linked")
- except Exception as e:
- click.echo(f"❌ Failed to install project templates: {e}")
- success = False
-
- # Install team hooks (copy, not symlink, so user can modify)
- try:
- hooks_source = self._get_package_data_path(".claude/settings.json")
- if not self.settings_file.exists() or force_reinstall:
- self._create_backup(self.settings_file)
- shutil.copy2(hooks_source, self.settings_file)
- click.echo("⚙️ Installed team hooks configuration")
- else:
- click.echo("✅ Team hooks configuration already exists")
- except Exception as e:
- click.echo(f"❌ Failed to install team hooks: {e}")
- success = False
-
- return success
-
- def uninstall(self):
- """Uninstall vTeam shared configuration."""
- click.echo("🗑️ Uninstalling vTeam shared Claude Code configuration...")
-
- success = True
-
- # Remove symlinks
- for link_path, description in [
- (self.global_config_link, "global configuration"),
- (self.templates_link, "project templates")
- ]:
- if link_path.is_symlink():
- try:
- link_path.unlink()
- click.echo(f"🔓 Removed {description} symlink")
- except OSError as e:
- click.echo(f"❌ Failed to remove {description}: {e}")
- success = False
- elif link_path.exists():
- click.echo(f"⚠️ {description} exists but is not a symlink - leaving unchanged")
-
- # Find and offer to restore backups
- backup_pattern = "*.backup-*"
- backups = list(self.claude_dir.glob(backup_pattern))
-
- if backups:
- click.echo(f"📦 Found {len(backups)} backup(s)")
- for backup in sorted(backups, reverse=True): # Most recent first
- original_name = backup.name.split('.backup-')[0]
- original_path = self.claude_dir / original_name
-
- if not original_path.exists():
- if click.confirm(f"Restore {original_name} from {backup.name}?"):
- try:
- if backup.is_file():
- shutil.copy2(backup, original_path)
- else:
- shutil.copytree(backup, original_path)
- click.echo(f"✅ Restored {original_name}")
-
- if click.confirm(f"Remove backup {backup.name}?"):
- if backup.is_file():
- backup.unlink()
- else:
- shutil.rmtree(backup)
- except Exception as e:
- click.echo(f"❌ Failed to restore {original_name}: {e}")
- success = False
-
- return success
-
- def status(self):
- """Display current configuration status."""
- click.echo("\n📊 vTeam Configuration Status")
- click.echo("=" * 35)
-
- # Check global configuration
- if self.global_config_link.is_symlink():
- try:
- target = self.global_config_link.resolve()
- if "vteam_shared_configs" in str(target):
- click.echo("✅ Global configuration: Active (vTeam)")
- else:
-                click.echo("⚠️ Global configuration: Linked to different source")
- except (OSError, RuntimeError):
- click.echo("❌ Global configuration: Broken symlink")
- elif self.global_config_link.exists():
- click.echo("⚠️ Global configuration: File exists (not vTeam managed)")
- else:
- click.echo("❌ Global configuration: Not found")
-
- # Check project templates
- if self.templates_link.is_symlink():
- try:
- target = self.templates_link.resolve()
- if "vteam_shared_configs" in str(target):
- click.echo("✅ Project templates: Active (vTeam)")
- else:
- click.echo("⚠️ Project templates: Linked to different source")
- except (OSError, RuntimeError):
- click.echo("❌ Project templates: Broken symlink")
- elif self.templates_link.exists():
- click.echo("⚠️ Project templates: Directory exists (not vTeam managed)")
- else:
- click.echo("❌ Project templates: Not found")
-
- # Check team hooks
- if self.settings_file.exists():
- try:
- with open(self.settings_file) as f:
- settings = json.load(f)
- if any("vteam" in str(hook).lower() for hook in settings.get("hooks", {}).values() if isinstance(hook, list)):
- click.echo("✅ Team hooks: Active")
- else:
- click.echo("⚠️ Team hooks: File exists (may not be vTeam)")
- except (json.JSONDecodeError, OSError):
- click.echo("❌ Team hooks: File exists but invalid JSON")
- else:
- click.echo("❌ Team hooks: Not found")
-
- # Check for local overrides
- for project_root in [Path.cwd(), Path.cwd().parent]:
- local_settings = project_root / ".claude" / "settings.local.json"
- if local_settings.exists():
- click.echo(f"ℹ️ Local overrides: Found in {project_root.name}")
- break
- else:
- click.echo("ℹ️ Local overrides: None found")
-
- click.echo("\n💡 Use 'vteam-config install' to set up configuration")
- click.echo("💡 Create '.claude/settings.local.json' for personal overrides")
\ No newline at end of file
diff --git a/archive/vteam_shared_configs/pyproject.toml b/archive/vteam_shared_configs/pyproject.toml
deleted file mode 100644
index 8689d60d5..000000000
--- a/archive/vteam_shared_configs/pyproject.toml
+++ /dev/null
@@ -1,57 +0,0 @@
-[build-system]
-requires = ["hatchling"]
-build-backend = "hatchling.build"
-
-[project]
-name = "vteam-shared-configs"
-version = "1.0.0"
-description = "Shared Claude Code configuration for vTeam development standards"
-readme = "README.md"
-license = "MIT"
-requires-python = ">=3.8"
-authors = [
- {name = "Jeremy Eder", email = "jeremy@example.com"},
-]
-keywords = ["claude", "configuration", "development", "team", "standards"]
-classifiers = [
- "Development Status :: 4 - Beta",
- "Intended Audience :: Developers",
- "License :: OSI Approved :: MIT License",
- "Programming Language :: Python :: 3",
- "Programming Language :: Python :: 3.8",
- "Programming Language :: Python :: 3.9",
- "Programming Language :: Python :: 3.10",
- "Programming Language :: Python :: 3.11",
- "Programming Language :: Python :: 3.12",
- "Topic :: Software Development :: Libraries :: Python Modules",
- "Topic :: Software Development :: Quality Assurance",
-]
-dependencies = [
- "click>=8.0.0",
-]
-
-[project.optional-dependencies]
-dev = [
- "pytest>=6.0",
- "black",
- "isort",
- "flake8",
-]
-
-[project.urls]
-Homepage = "https://github.com/red-hat-data-services/vTeam"
-Repository = "https://github.com/red-hat-data-services/vTeam"
-Issues = "https://github.com/red-hat-data-services/vTeam/issues"
-
-[project.scripts]
-vteam-config = "vteam_shared_configs.cli:main"
-
-[tool.hatch.build.targets.sdist]
-include = [
- "/src",
- "/README.md",
- "/LICENSE",
-]
-
-[tool.hatch.build.targets.wheel]
-packages = ["src/vteam_shared_configs"]
diff --git a/archive/vteam_shared_configs/uv.lock b/archive/vteam_shared_configs/uv.lock
deleted file mode 100644
index 8c7c80e52..000000000
--- a/archive/vteam_shared_configs/uv.lock
+++ /dev/null
@@ -1,1110 +0,0 @@
-version = 1
-revision = 2
-requires-python = ">=3.8"
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-
-[[package]]
-name = "anyio"
-version = "4.5.2"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-dependencies = [
- { name = "exceptiongroup", marker = "python_full_version < '3.9'" },
- { name = "idna", marker = "python_full_version < '3.9'" },
- { name = "sniffio", marker = "python_full_version < '3.9'" },
- { name = "typing-extensions", version = "4.13.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/4d/f9/9a7ce600ebe7804daf90d4d48b1c0510a4561ddce43a596be46676f82343/anyio-4.5.2.tar.gz", hash = "sha256:23009af4ed04ce05991845451e11ef02fc7c5ed29179ac9a420e5ad0ac7ddc5b", size = 171293, upload-time = "2024-10-13T22:18:03.307Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/1b/b4/f7e396030e3b11394436358ca258a81d6010106582422f23443c16ca1873/anyio-4.5.2-py3-none-any.whl", hash = "sha256:c011ee36bc1e8ba40e5a81cb9df91925c218fe9b778554e0b56a21e1b5d4716f", size = 89766, upload-time = "2024-10-13T22:18:01.524Z" },
-]
-
-[[package]]
-name = "anyio"
-version = "4.10.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-dependencies = [
- { name = "exceptiongroup", marker = "python_full_version >= '3.9' and python_full_version < '3.11'" },
- { name = "idna", marker = "python_full_version >= '3.9'" },
- { name = "sniffio", marker = "python_full_version >= '3.9'" },
- { name = "typing-extensions", version = "4.15.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9' and python_full_version < '3.13'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/f1/b4/636b3b65173d3ce9a38ef5f0522789614e590dab6a8d505340a4efe4c567/anyio-4.10.0.tar.gz", hash = "sha256:3f3fae35c96039744587aa5b8371e7e8e603c0702999535961dd336026973ba6", size = 213252, upload-time = "2025-08-04T08:54:26.451Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/6f/12/e5e0282d673bb9746bacfb6e2dba8719989d3660cdb2ea79aee9a9651afb/anyio-4.10.0-py3-none-any.whl", hash = "sha256:60e474ac86736bbfd6f210f7a61218939c318f43f9972497381f1c5e930ed3d1", size = 107213, upload-time = "2025-08-04T08:54:24.882Z" },
-]
-
-[[package]]
-name = "backports-asyncio-runner"
-version = "1.2.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/8e/ff/70dca7d7cb1cbc0edb2c6cc0c38b65cba36cccc491eca64cabd5fe7f8670/backports_asyncio_runner-1.2.0.tar.gz", hash = "sha256:a5aa7b2b7d8f8bfcaa2b57313f70792df84e32a2a746f585213373f900b42162", size = 69893, upload-time = "2025-07-02T02:27:15.685Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/a0/59/76ab57e3fe74484f48a53f8e337171b4a2349e506eabe136d7e01d059086/backports_asyncio_runner-1.2.0-py3-none-any.whl", hash = "sha256:0da0a936a8aeb554eccb426dc55af3ba63bcdc69fa1a600b5bb305413a4477b5", size = 12313, upload-time = "2025-07-02T02:27:14.263Z" },
-]
-
-[[package]]
-name = "black"
-version = "24.8.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-dependencies = [
- { name = "click", version = "8.1.8", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "mypy-extensions", marker = "python_full_version < '3.9'" },
- { name = "packaging", marker = "python_full_version < '3.9'" },
- { name = "pathspec", marker = "python_full_version < '3.9'" },
- { name = "platformdirs", version = "4.3.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "tomli", marker = "python_full_version < '3.9'" },
- { name = "typing-extensions", version = "4.13.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/04/b0/46fb0d4e00372f4a86a6f8efa3cb193c9f64863615e39010b1477e010578/black-24.8.0.tar.gz", hash = "sha256:2500945420b6784c38b9ee885af039f5e7471ef284ab03fa35ecdde4688cd83f", size = 644810, upload-time = "2024-08-02T17:43:18.405Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/47/6e/74e29edf1fba3887ed7066930a87f698ffdcd52c5dbc263eabb06061672d/black-24.8.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:09cdeb74d494ec023ded657f7092ba518e8cf78fa8386155e4a03fdcc44679e6", size = 1632092, upload-time = "2024-08-02T17:47:26.911Z" },
- { url = "https://files.pythonhosted.org/packages/ab/49/575cb6c3faee690b05c9d11ee2e8dba8fbd6d6c134496e644c1feb1b47da/black-24.8.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:81c6742da39f33b08e791da38410f32e27d632260e599df7245cccee2064afeb", size = 1457529, upload-time = "2024-08-02T17:47:29.109Z" },
- { url = "https://files.pythonhosted.org/packages/7a/b4/d34099e95c437b53d01c4aa37cf93944b233066eb034ccf7897fa4e5f286/black-24.8.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:707a1ca89221bc8a1a64fb5e15ef39cd755633daa672a9db7498d1c19de66a42", size = 1757443, upload-time = "2024-08-02T17:46:20.306Z" },
- { url = "https://files.pythonhosted.org/packages/87/a0/6d2e4175ef364b8c4b64f8441ba041ed65c63ea1db2720d61494ac711c15/black-24.8.0-cp310-cp310-win_amd64.whl", hash = "sha256:d6417535d99c37cee4091a2f24eb2b6d5ec42b144d50f1f2e436d9fe1916fe1a", size = 1418012, upload-time = "2024-08-02T17:47:20.33Z" },
- { url = "https://files.pythonhosted.org/packages/08/a6/0a3aa89de9c283556146dc6dbda20cd63a9c94160a6fbdebaf0918e4a3e1/black-24.8.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:fb6e2c0b86bbd43dee042e48059c9ad7830abd5c94b0bc518c0eeec57c3eddc1", size = 1615080, upload-time = "2024-08-02T17:48:05.467Z" },
- { url = "https://files.pythonhosted.org/packages/db/94/b803d810e14588bb297e565821a947c108390a079e21dbdcb9ab6956cd7a/black-24.8.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:837fd281f1908d0076844bc2b801ad2d369c78c45cf800cad7b61686051041af", size = 1438143, upload-time = "2024-08-02T17:47:30.247Z" },
- { url = "https://files.pythonhosted.org/packages/a5/b5/f485e1bbe31f768e2e5210f52ea3f432256201289fd1a3c0afda693776b0/black-24.8.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:62e8730977f0b77998029da7971fa896ceefa2c4c4933fcd593fa599ecbf97a4", size = 1738774, upload-time = "2024-08-02T17:46:17.837Z" },
- { url = "https://files.pythonhosted.org/packages/a8/69/a000fc3736f89d1bdc7f4a879f8aaf516fb03613bb51a0154070383d95d9/black-24.8.0-cp311-cp311-win_amd64.whl", hash = "sha256:72901b4913cbac8972ad911dc4098d5753704d1f3c56e44ae8dce99eecb0e3af", size = 1427503, upload-time = "2024-08-02T17:46:22.654Z" },
- { url = "https://files.pythonhosted.org/packages/a2/a8/05fb14195cfef32b7c8d4585a44b7499c2a4b205e1662c427b941ed87054/black-24.8.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:7c046c1d1eeb7aea9335da62472481d3bbf3fd986e093cffd35f4385c94ae368", size = 1646132, upload-time = "2024-08-02T17:49:52.843Z" },
- { url = "https://files.pythonhosted.org/packages/41/77/8d9ce42673e5cb9988f6df73c1c5c1d4e9e788053cccd7f5fb14ef100982/black-24.8.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:649f6d84ccbae73ab767e206772cc2d7a393a001070a4c814a546afd0d423aed", size = 1448665, upload-time = "2024-08-02T17:47:54.479Z" },
- { url = "https://files.pythonhosted.org/packages/cc/94/eff1ddad2ce1d3cc26c162b3693043c6b6b575f538f602f26fe846dfdc75/black-24.8.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2b59b250fdba5f9a9cd9d0ece6e6d993d91ce877d121d161e4698af3eb9c1018", size = 1762458, upload-time = "2024-08-02T17:46:19.384Z" },
- { url = "https://files.pythonhosted.org/packages/28/ea/18b8d86a9ca19a6942e4e16759b2fa5fc02bbc0eb33c1b866fcd387640ab/black-24.8.0-cp312-cp312-win_amd64.whl", hash = "sha256:6e55d30d44bed36593c3163b9bc63bf58b3b30e4611e4d88a0c3c239930ed5b2", size = 1436109, upload-time = "2024-08-02T17:46:52.97Z" },
- { url = "https://files.pythonhosted.org/packages/9f/d4/ae03761ddecc1a37d7e743b89cccbcf3317479ff4b88cfd8818079f890d0/black-24.8.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:505289f17ceda596658ae81b61ebbe2d9b25aa78067035184ed0a9d855d18afd", size = 1617322, upload-time = "2024-08-02T17:51:20.203Z" },
- { url = "https://files.pythonhosted.org/packages/14/4b/4dfe67eed7f9b1ddca2ec8e4418ea74f0d1dc84d36ea874d618ffa1af7d4/black-24.8.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b19c9ad992c7883ad84c9b22aaa73562a16b819c1d8db7a1a1a49fb7ec13c7d2", size = 1442108, upload-time = "2024-08-02T17:50:40.824Z" },
- { url = "https://files.pythonhosted.org/packages/97/14/95b3f91f857034686cae0e73006b8391d76a8142d339b42970eaaf0416ea/black-24.8.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1f13f7f386f86f8121d76599114bb8c17b69d962137fc70efe56137727c7047e", size = 1745786, upload-time = "2024-08-02T17:46:02.939Z" },
- { url = "https://files.pythonhosted.org/packages/95/54/68b8883c8aa258a6dde958cd5bdfada8382bec47c5162f4a01e66d839af1/black-24.8.0-cp38-cp38-win_amd64.whl", hash = "sha256:f490dbd59680d809ca31efdae20e634f3fae27fba3ce0ba3208333b713bc3920", size = 1426754, upload-time = "2024-08-02T17:46:38.603Z" },
- { url = "https://files.pythonhosted.org/packages/13/b2/b3f24fdbb46f0e7ef6238e131f13572ee8279b70f237f221dd168a9dba1a/black-24.8.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:eab4dd44ce80dea27dc69db40dab62d4ca96112f87996bca68cd75639aeb2e4c", size = 1631706, upload-time = "2024-08-02T17:49:57.606Z" },
- { url = "https://files.pythonhosted.org/packages/d9/35/31010981e4a05202a84a3116423970fd1a59d2eda4ac0b3570fbb7029ddc/black-24.8.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3c4285573d4897a7610054af5a890bde7c65cb466040c5f0c8b732812d7f0e5e", size = 1457429, upload-time = "2024-08-02T17:49:12.764Z" },
- { url = "https://files.pythonhosted.org/packages/27/25/3f706b4f044dd569a20a4835c3b733dedea38d83d2ee0beb8178a6d44945/black-24.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9e84e33b37be070ba135176c123ae52a51f82306def9f7d063ee302ecab2cf47", size = 1756488, upload-time = "2024-08-02T17:46:08.067Z" },
- { url = "https://files.pythonhosted.org/packages/63/72/79375cd8277cbf1c5670914e6bd4c1b15dea2c8f8e906dc21c448d0535f0/black-24.8.0-cp39-cp39-win_amd64.whl", hash = "sha256:73bbf84ed136e45d451a260c6b73ed674652f90a2b3211d6a35e78054563a9bb", size = 1417721, upload-time = "2024-08-02T17:46:42.637Z" },
- { url = "https://files.pythonhosted.org/packages/27/1e/83fa8a787180e1632c3d831f7e58994d7aaf23a0961320d21e84f922f919/black-24.8.0-py3-none-any.whl", hash = "sha256:972085c618ee94f402da1af548a4f218c754ea7e5dc70acb168bfaca4c2542ed", size = 206504, upload-time = "2024-08-02T17:43:15.747Z" },
-]
-
-[[package]]
-name = "black"
-version = "25.1.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-dependencies = [
- { name = "click", version = "8.1.8", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version == '3.9.*'" },
- { name = "click", version = "8.2.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.10'" },
- { name = "mypy-extensions", marker = "python_full_version >= '3.9'" },
- { name = "packaging", marker = "python_full_version >= '3.9'" },
- { name = "pathspec", marker = "python_full_version >= '3.9'" },
- { name = "platformdirs", version = "4.4.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "tomli", marker = "python_full_version >= '3.9' and python_full_version < '3.11'" },
- { name = "typing-extensions", version = "4.15.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9' and python_full_version < '3.11'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/94/49/26a7b0f3f35da4b5a65f081943b7bcd22d7002f5f0fb8098ec1ff21cb6ef/black-25.1.0.tar.gz", hash = "sha256:33496d5cd1222ad73391352b4ae8da15253c5de89b93a80b3e2c8d9a19ec2666", size = 649449, upload-time = "2025-01-29T04:15:40.373Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/4d/3b/4ba3f93ac8d90410423fdd31d7541ada9bcee1df32fb90d26de41ed40e1d/black-25.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:759e7ec1e050a15f89b770cefbf91ebee8917aac5c20483bc2d80a6c3a04df32", size = 1629419, upload-time = "2025-01-29T05:37:06.642Z" },
- { url = "https://files.pythonhosted.org/packages/b4/02/0bde0485146a8a5e694daed47561785e8b77a0466ccc1f3e485d5ef2925e/black-25.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e519ecf93120f34243e6b0054db49c00a35f84f195d5bce7e9f5cfc578fc2da", size = 1461080, upload-time = "2025-01-29T05:37:09.321Z" },
- { url = "https://files.pythonhosted.org/packages/52/0e/abdf75183c830eaca7589144ff96d49bce73d7ec6ad12ef62185cc0f79a2/black-25.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:055e59b198df7ac0b7efca5ad7ff2516bca343276c466be72eb04a3bcc1f82d7", size = 1766886, upload-time = "2025-01-29T04:18:24.432Z" },
- { url = "https://files.pythonhosted.org/packages/dc/a6/97d8bb65b1d8a41f8a6736222ba0a334db7b7b77b8023ab4568288f23973/black-25.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:db8ea9917d6f8fc62abd90d944920d95e73c83a5ee3383493e35d271aca872e9", size = 1419404, upload-time = "2025-01-29T04:19:04.296Z" },
- { url = "https://files.pythonhosted.org/packages/7e/4f/87f596aca05c3ce5b94b8663dbfe242a12843caaa82dd3f85f1ffdc3f177/black-25.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a39337598244de4bae26475f77dda852ea00a93bd4c728e09eacd827ec929df0", size = 1614372, upload-time = "2025-01-29T05:37:11.71Z" },
- { url = "https://files.pythonhosted.org/packages/e7/d0/2c34c36190b741c59c901e56ab7f6e54dad8df05a6272a9747ecef7c6036/black-25.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:96c1c7cd856bba8e20094e36e0f948718dc688dba4a9d78c3adde52b9e6c2299", size = 1442865, upload-time = "2025-01-29T05:37:14.309Z" },
- { url = "https://files.pythonhosted.org/packages/21/d4/7518c72262468430ead45cf22bd86c883a6448b9eb43672765d69a8f1248/black-25.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce2e264d59c91e52d8000d507eb20a9aca4a778731a08cfff7e5ac4a4bb7096", size = 1749699, upload-time = "2025-01-29T04:18:17.688Z" },
- { url = "https://files.pythonhosted.org/packages/58/db/4f5beb989b547f79096e035c4981ceb36ac2b552d0ac5f2620e941501c99/black-25.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:172b1dbff09f86ce6f4eb8edf9dede08b1fce58ba194c87d7a4f1a5aa2f5b3c2", size = 1428028, upload-time = "2025-01-29T04:18:51.711Z" },
- { url = "https://files.pythonhosted.org/packages/83/71/3fe4741df7adf015ad8dfa082dd36c94ca86bb21f25608eb247b4afb15b2/black-25.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4b60580e829091e6f9238c848ea6750efed72140b91b048770b64e74fe04908b", size = 1650988, upload-time = "2025-01-29T05:37:16.707Z" },
- { url = "https://files.pythonhosted.org/packages/13/f3/89aac8a83d73937ccd39bbe8fc6ac8860c11cfa0af5b1c96d081facac844/black-25.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1e2978f6df243b155ef5fa7e558a43037c3079093ed5d10fd84c43900f2d8ecc", size = 1453985, upload-time = "2025-01-29T05:37:18.273Z" },
- { url = "https://files.pythonhosted.org/packages/6f/22/b99efca33f1f3a1d2552c714b1e1b5ae92efac6c43e790ad539a163d1754/black-25.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b48735872ec535027d979e8dcb20bf4f70b5ac75a8ea99f127c106a7d7aba9f", size = 1783816, upload-time = "2025-01-29T04:18:33.823Z" },
- { url = "https://files.pythonhosted.org/packages/18/7e/a27c3ad3822b6f2e0e00d63d58ff6299a99a5b3aee69fa77cd4b0076b261/black-25.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:ea0213189960bda9cf99be5b8c8ce66bb054af5e9e861249cd23471bd7b0b3ba", size = 1440860, upload-time = "2025-01-29T04:19:12.944Z" },
- { url = "https://files.pythonhosted.org/packages/98/87/0edf98916640efa5d0696e1abb0a8357b52e69e82322628f25bf14d263d1/black-25.1.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8f0b18a02996a836cc9c9c78e5babec10930862827b1b724ddfe98ccf2f2fe4f", size = 1650673, upload-time = "2025-01-29T05:37:20.574Z" },
- { url = "https://files.pythonhosted.org/packages/52/e5/f7bf17207cf87fa6e9b676576749c6b6ed0d70f179a3d812c997870291c3/black-25.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:afebb7098bfbc70037a053b91ae8437c3857482d3a690fefc03e9ff7aa9a5fd3", size = 1453190, upload-time = "2025-01-29T05:37:22.106Z" },
- { url = "https://files.pythonhosted.org/packages/e3/ee/adda3d46d4a9120772fae6de454c8495603c37c4c3b9c60f25b1ab6401fe/black-25.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:030b9759066a4ee5e5aca28c3c77f9c64789cdd4de8ac1df642c40b708be6171", size = 1782926, upload-time = "2025-01-29T04:18:58.564Z" },
- { url = "https://files.pythonhosted.org/packages/cc/64/94eb5f45dcb997d2082f097a3944cfc7fe87e071907f677e80788a2d7b7a/black-25.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:a22f402b410566e2d1c950708c77ebf5ebd5d0d88a6a2e87c86d9fb48afa0d18", size = 1442613, upload-time = "2025-01-29T04:19:27.63Z" },
- { url = "https://files.pythonhosted.org/packages/d3/b6/ae7507470a4830dbbfe875c701e84a4a5fb9183d1497834871a715716a92/black-25.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:a1ee0a0c330f7b5130ce0caed9936a904793576ef4d2b98c40835d6a65afa6a0", size = 1628593, upload-time = "2025-01-29T05:37:23.672Z" },
- { url = "https://files.pythonhosted.org/packages/24/c1/ae36fa59a59f9363017ed397750a0cd79a470490860bc7713967d89cdd31/black-25.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f3df5f1bf91d36002b0a75389ca8663510cf0531cca8aa5c1ef695b46d98655f", size = 1460000, upload-time = "2025-01-29T05:37:25.829Z" },
- { url = "https://files.pythonhosted.org/packages/ac/b6/98f832e7a6c49aa3a464760c67c7856363aa644f2f3c74cf7d624168607e/black-25.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d9e6827d563a2c820772b32ce8a42828dc6790f095f441beef18f96aa6f8294e", size = 1765963, upload-time = "2025-01-29T04:18:38.116Z" },
- { url = "https://files.pythonhosted.org/packages/ce/e9/2cb0a017eb7024f70e0d2e9bdb8c5a5b078c5740c7f8816065d06f04c557/black-25.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:bacabb307dca5ebaf9c118d2d2f6903da0d62c9faa82bd21a33eecc319559355", size = 1419419, upload-time = "2025-01-29T04:18:30.191Z" },
- { url = "https://files.pythonhosted.org/packages/09/71/54e999902aed72baf26bca0d50781b01838251a462612966e9fc4891eadd/black-25.1.0-py3-none-any.whl", hash = "sha256:95e8176dae143ba9097f351d174fdaf0ccd29efb414b362ae3fd72bf0f710717", size = 207646, upload-time = "2025-01-29T04:15:38.082Z" },
-]
-
-[[package]]
-name = "certifi"
-version = "2025.8.3"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/dc/67/960ebe6bf230a96cda2e0abcf73af550ec4f090005363542f0765df162e0/certifi-2025.8.3.tar.gz", hash = "sha256:e564105f78ded564e3ae7c923924435e1daa7463faeab5bb932bc53ffae63407", size = 162386, upload-time = "2025-08-03T03:07:47.08Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/e5/48/1549795ba7742c948d2ad169c1c8cdbae65bc450d6cd753d124b17c8cd32/certifi-2025.8.3-py3-none-any.whl", hash = "sha256:f6c12493cfb1b06ba2ff328595af9350c65d6644968e5d3a2ffd78699af217a5", size = 161216, upload-time = "2025-08-03T03:07:45.777Z" },
-]
-
-[[package]]
-name = "click"
-version = "8.1.8"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version == '3.9.*'",
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-dependencies = [
- { name = "colorama", marker = "python_full_version < '3.10' and sys_platform == 'win32'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/b9/2e/0090cbf739cee7d23781ad4b89a9894a41538e4fcf4c31dcdd705b78eb8b/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a", size = 226593, upload-time = "2024-12-21T18:38:44.339Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/7e/d4/7ebdbd03970677812aac39c869717059dbb71a4cfc033ca6e5221787892c/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2", size = 98188, upload-time = "2024-12-21T18:38:41.666Z" },
-]
-
-[[package]]
-name = "click"
-version = "8.2.1"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
-]
-dependencies = [
- { name = "colorama", marker = "python_full_version >= '3.10' and sys_platform == 'win32'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342, upload-time = "2025-05-20T23:19:49.832Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215, upload-time = "2025-05-20T23:19:47.796Z" },
-]
-
-[[package]]
-name = "colorama"
-version = "0.4.6"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
-]
-
-[[package]]
-name = "coverage"
-version = "7.6.1"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/f7/08/7e37f82e4d1aead42a7443ff06a1e406aabf7302c4f00a546e4b320b994c/coverage-7.6.1.tar.gz", hash = "sha256:953510dfb7b12ab69d20135a0662397f077c59b1e6379a768e97c59d852ee51d", size = 798791, upload-time = "2024-08-04T19:45:30.9Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/7e/61/eb7ce5ed62bacf21beca4937a90fe32545c91a3c8a42a30c6616d48fc70d/coverage-7.6.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b06079abebbc0e89e6163b8e8f0e16270124c154dc6e4a47b413dd538859af16", size = 206690, upload-time = "2024-08-04T19:43:07.695Z" },
- { url = "https://files.pythonhosted.org/packages/7d/73/041928e434442bd3afde5584bdc3f932fb4562b1597629f537387cec6f3d/coverage-7.6.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:cf4b19715bccd7ee27b6b120e7e9dd56037b9c0681dcc1adc9ba9db3d417fa36", size = 207127, upload-time = "2024-08-04T19:43:10.15Z" },
- { url = "https://files.pythonhosted.org/packages/c7/c8/6ca52b5147828e45ad0242388477fdb90df2c6cbb9a441701a12b3c71bc8/coverage-7.6.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e61c0abb4c85b095a784ef23fdd4aede7a2628478e7baba7c5e3deba61070a02", size = 235654, upload-time = "2024-08-04T19:43:12.405Z" },
- { url = "https://files.pythonhosted.org/packages/d5/da/9ac2b62557f4340270942011d6efeab9833648380109e897d48ab7c1035d/coverage-7.6.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fd21f6ae3f08b41004dfb433fa895d858f3f5979e7762d052b12aef444e29afc", size = 233598, upload-time = "2024-08-04T19:43:14.078Z" },
- { url = "https://files.pythonhosted.org/packages/53/23/9e2c114d0178abc42b6d8d5281f651a8e6519abfa0ef460a00a91f80879d/coverage-7.6.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f59d57baca39b32db42b83b2a7ba6f47ad9c394ec2076b084c3f029b7afca23", size = 234732, upload-time = "2024-08-04T19:43:16.632Z" },
- { url = "https://files.pythonhosted.org/packages/0f/7e/a0230756fb133343a52716e8b855045f13342b70e48e8ad41d8a0d60ab98/coverage-7.6.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:a1ac0ae2b8bd743b88ed0502544847c3053d7171a3cff9228af618a068ed9c34", size = 233816, upload-time = "2024-08-04T19:43:19.049Z" },
- { url = "https://files.pythonhosted.org/packages/28/7c/3753c8b40d232b1e5eeaed798c875537cf3cb183fb5041017c1fdb7ec14e/coverage-7.6.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e6a08c0be454c3b3beb105c0596ebdc2371fab6bb90c0c0297f4e58fd7e1012c", size = 232325, upload-time = "2024-08-04T19:43:21.246Z" },
- { url = "https://files.pythonhosted.org/packages/57/e3/818a2b2af5b7573b4b82cf3e9f137ab158c90ea750a8f053716a32f20f06/coverage-7.6.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:f5796e664fe802da4f57a168c85359a8fbf3eab5e55cd4e4569fbacecc903959", size = 233418, upload-time = "2024-08-04T19:43:22.945Z" },
- { url = "https://files.pythonhosted.org/packages/c8/fb/4532b0b0cefb3f06d201648715e03b0feb822907edab3935112b61b885e2/coverage-7.6.1-cp310-cp310-win32.whl", hash = "sha256:7bb65125fcbef8d989fa1dd0e8a060999497629ca5b0efbca209588a73356232", size = 209343, upload-time = "2024-08-04T19:43:25.121Z" },
- { url = "https://files.pythonhosted.org/packages/5a/25/af337cc7421eca1c187cc9c315f0a755d48e755d2853715bfe8c418a45fa/coverage-7.6.1-cp310-cp310-win_amd64.whl", hash = "sha256:3115a95daa9bdba70aea750db7b96b37259a81a709223c8448fa97727d546fe0", size = 210136, upload-time = "2024-08-04T19:43:26.851Z" },
- { url = "https://files.pythonhosted.org/packages/ad/5f/67af7d60d7e8ce61a4e2ddcd1bd5fb787180c8d0ae0fbd073f903b3dd95d/coverage-7.6.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:7dea0889685db8550f839fa202744652e87c60015029ce3f60e006f8c4462c93", size = 206796, upload-time = "2024-08-04T19:43:29.115Z" },
- { url = "https://files.pythonhosted.org/packages/e1/0e/e52332389e057daa2e03be1fbfef25bb4d626b37d12ed42ae6281d0a274c/coverage-7.6.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ed37bd3c3b063412f7620464a9ac1314d33100329f39799255fb8d3027da50d3", size = 207244, upload-time = "2024-08-04T19:43:31.285Z" },
- { url = "https://files.pythonhosted.org/packages/aa/cd/766b45fb6e090f20f8927d9c7cb34237d41c73a939358bc881883fd3a40d/coverage-7.6.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d85f5e9a5f8b73e2350097c3756ef7e785f55bd71205defa0bfdaf96c31616ff", size = 239279, upload-time = "2024-08-04T19:43:33.581Z" },
- { url = "https://files.pythonhosted.org/packages/70/6c/a9ccd6fe50ddaf13442a1e2dd519ca805cbe0f1fcd377fba6d8339b98ccb/coverage-7.6.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9bc572be474cafb617672c43fe989d6e48d3c83af02ce8de73fff1c6bb3c198d", size = 236859, upload-time = "2024-08-04T19:43:35.301Z" },
- { url = "https://files.pythonhosted.org/packages/14/6f/8351b465febb4dbc1ca9929505202db909c5a635c6fdf33e089bbc3d7d85/coverage-7.6.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0c0420b573964c760df9e9e86d1a9a622d0d27f417e1a949a8a66dd7bcee7bc6", size = 238549, upload-time = "2024-08-04T19:43:37.578Z" },
- { url = "https://files.pythonhosted.org/packages/68/3c/289b81fa18ad72138e6d78c4c11a82b5378a312c0e467e2f6b495c260907/coverage-7.6.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1f4aa8219db826ce6be7099d559f8ec311549bfc4046f7f9fe9b5cea5c581c56", size = 237477, upload-time = "2024-08-04T19:43:39.92Z" },
- { url = "https://files.pythonhosted.org/packages/ed/1c/aa1efa6459d822bd72c4abc0b9418cf268de3f60eeccd65dc4988553bd8d/coverage-7.6.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:fc5a77d0c516700ebad189b587de289a20a78324bc54baee03dd486f0855d234", size = 236134, upload-time = "2024-08-04T19:43:41.453Z" },
- { url = "https://files.pythonhosted.org/packages/fb/c8/521c698f2d2796565fe9c789c2ee1ccdae610b3aa20b9b2ef980cc253640/coverage-7.6.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:b48f312cca9621272ae49008c7f613337c53fadca647d6384cc129d2996d1133", size = 236910, upload-time = "2024-08-04T19:43:43.037Z" },
- { url = "https://files.pythonhosted.org/packages/7d/30/033e663399ff17dca90d793ee8a2ea2890e7fdf085da58d82468b4220bf7/coverage-7.6.1-cp311-cp311-win32.whl", hash = "sha256:1125ca0e5fd475cbbba3bb67ae20bd2c23a98fac4e32412883f9bcbaa81c314c", size = 209348, upload-time = "2024-08-04T19:43:44.787Z" },
- { url = "https://files.pythonhosted.org/packages/20/05/0d1ccbb52727ccdadaa3ff37e4d2dc1cd4d47f0c3df9eb58d9ec8508ca88/coverage-7.6.1-cp311-cp311-win_amd64.whl", hash = "sha256:8ae539519c4c040c5ffd0632784e21b2f03fc1340752af711f33e5be83a9d6c6", size = 210230, upload-time = "2024-08-04T19:43:46.707Z" },
- { url = "https://files.pythonhosted.org/packages/7e/d4/300fc921dff243cd518c7db3a4c614b7e4b2431b0d1145c1e274fd99bd70/coverage-7.6.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:95cae0efeb032af8458fc27d191f85d1717b1d4e49f7cb226cf526ff28179778", size = 206983, upload-time = "2024-08-04T19:43:49.082Z" },
- { url = "https://files.pythonhosted.org/packages/e1/ab/6bf00de5327ecb8db205f9ae596885417a31535eeda6e7b99463108782e1/coverage-7.6.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5621a9175cf9d0b0c84c2ef2b12e9f5f5071357c4d2ea6ca1cf01814f45d2391", size = 207221, upload-time = "2024-08-04T19:43:52.15Z" },
- { url = "https://files.pythonhosted.org/packages/92/8f/2ead05e735022d1a7f3a0a683ac7f737de14850395a826192f0288703472/coverage-7.6.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:260933720fdcd75340e7dbe9060655aff3af1f0c5d20f46b57f262ab6c86a5e8", size = 240342, upload-time = "2024-08-04T19:43:53.746Z" },
- { url = "https://files.pythonhosted.org/packages/0f/ef/94043e478201ffa85b8ae2d2c79b4081e5a1b73438aafafccf3e9bafb6b5/coverage-7.6.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:07e2ca0ad381b91350c0ed49d52699b625aab2b44b65e1b4e02fa9df0e92ad2d", size = 237371, upload-time = "2024-08-04T19:43:55.993Z" },
- { url = "https://files.pythonhosted.org/packages/1f/0f/c890339dd605f3ebc269543247bdd43b703cce6825b5ed42ff5f2d6122c7/coverage-7.6.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c44fee9975f04b33331cb8eb272827111efc8930cfd582e0320613263ca849ca", size = 239455, upload-time = "2024-08-04T19:43:57.618Z" },
- { url = "https://files.pythonhosted.org/packages/d1/04/7fd7b39ec7372a04efb0f70c70e35857a99b6a9188b5205efb4c77d6a57a/coverage-7.6.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:877abb17e6339d96bf08e7a622d05095e72b71f8afd8a9fefc82cf30ed944163", size = 238924, upload-time = "2024-08-04T19:44:00.012Z" },
- { url = "https://files.pythonhosted.org/packages/ed/bf/73ce346a9d32a09cf369f14d2a06651329c984e106f5992c89579d25b27e/coverage-7.6.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:3e0cadcf6733c09154b461f1ca72d5416635e5e4ec4e536192180d34ec160f8a", size = 237252, upload-time = "2024-08-04T19:44:01.713Z" },
- { url = "https://files.pythonhosted.org/packages/86/74/1dc7a20969725e917b1e07fe71a955eb34bc606b938316bcc799f228374b/coverage-7.6.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:c3c02d12f837d9683e5ab2f3d9844dc57655b92c74e286c262e0fc54213c216d", size = 238897, upload-time = "2024-08-04T19:44:03.898Z" },
- { url = "https://files.pythonhosted.org/packages/b6/e9/d9cc3deceb361c491b81005c668578b0dfa51eed02cd081620e9a62f24ec/coverage-7.6.1-cp312-cp312-win32.whl", hash = "sha256:e05882b70b87a18d937ca6768ff33cc3f72847cbc4de4491c8e73880766718e5", size = 209606, upload-time = "2024-08-04T19:44:05.532Z" },
- { url = "https://files.pythonhosted.org/packages/47/c8/5a2e41922ea6740f77d555c4d47544acd7dc3f251fe14199c09c0f5958d3/coverage-7.6.1-cp312-cp312-win_amd64.whl", hash = "sha256:b5d7b556859dd85f3a541db6a4e0167b86e7273e1cdc973e5b175166bb634fdb", size = 210373, upload-time = "2024-08-04T19:44:07.079Z" },
- { url = "https://files.pythonhosted.org/packages/8c/f9/9aa4dfb751cb01c949c990d136a0f92027fbcc5781c6e921df1cb1563f20/coverage-7.6.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:a4acd025ecc06185ba2b801f2de85546e0b8ac787cf9d3b06e7e2a69f925b106", size = 207007, upload-time = "2024-08-04T19:44:09.453Z" },
- { url = "https://files.pythonhosted.org/packages/b9/67/e1413d5a8591622a46dd04ff80873b04c849268831ed5c304c16433e7e30/coverage-7.6.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a6d3adcf24b624a7b778533480e32434a39ad8fa30c315208f6d3e5542aeb6e9", size = 207269, upload-time = "2024-08-04T19:44:11.045Z" },
- { url = "https://files.pythonhosted.org/packages/14/5b/9dec847b305e44a5634d0fb8498d135ab1d88330482b74065fcec0622224/coverage-7.6.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d0c212c49b6c10e6951362f7c6df3329f04c2b1c28499563d4035d964ab8e08c", size = 239886, upload-time = "2024-08-04T19:44:12.83Z" },
- { url = "https://files.pythonhosted.org/packages/7b/b7/35760a67c168e29f454928f51f970342d23cf75a2bb0323e0f07334c85f3/coverage-7.6.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6e81d7a3e58882450ec4186ca59a3f20a5d4440f25b1cff6f0902ad890e6748a", size = 237037, upload-time = "2024-08-04T19:44:15.393Z" },
- { url = "https://files.pythonhosted.org/packages/f7/95/d2fd31f1d638df806cae59d7daea5abf2b15b5234016a5ebb502c2f3f7ee/coverage-7.6.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:78b260de9790fd81e69401c2dc8b17da47c8038176a79092a89cb2b7d945d060", size = 239038, upload-time = "2024-08-04T19:44:17.466Z" },
- { url = "https://files.pythonhosted.org/packages/6e/bd/110689ff5752b67924efd5e2aedf5190cbbe245fc81b8dec1abaffba619d/coverage-7.6.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a78d169acd38300060b28d600344a803628c3fd585c912cacc9ea8790fe96862", size = 238690, upload-time = "2024-08-04T19:44:19.336Z" },
- { url = "https://files.pythonhosted.org/packages/d3/a8/08d7b38e6ff8df52331c83130d0ab92d9c9a8b5462f9e99c9f051a4ae206/coverage-7.6.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2c09f4ce52cb99dd7505cd0fc8e0e37c77b87f46bc9c1eb03fe3bc9991085388", size = 236765, upload-time = "2024-08-04T19:44:20.994Z" },
- { url = "https://files.pythonhosted.org/packages/d6/6a/9cf96839d3147d55ae713eb2d877f4d777e7dc5ba2bce227167d0118dfe8/coverage-7.6.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6878ef48d4227aace338d88c48738a4258213cd7b74fd9a3d4d7582bb1d8a155", size = 238611, upload-time = "2024-08-04T19:44:22.616Z" },
- { url = "https://files.pythonhosted.org/packages/74/e4/7ff20d6a0b59eeaab40b3140a71e38cf52547ba21dbcf1d79c5a32bba61b/coverage-7.6.1-cp313-cp313-win32.whl", hash = "sha256:44df346d5215a8c0e360307d46ffaabe0f5d3502c8a1cefd700b34baf31d411a", size = 209671, upload-time = "2024-08-04T19:44:24.418Z" },
- { url = "https://files.pythonhosted.org/packages/35/59/1812f08a85b57c9fdb6d0b383d779e47b6f643bc278ed682859512517e83/coverage-7.6.1-cp313-cp313-win_amd64.whl", hash = "sha256:8284cf8c0dd272a247bc154eb6c95548722dce90d098c17a883ed36e67cdb129", size = 210368, upload-time = "2024-08-04T19:44:26.276Z" },
- { url = "https://files.pythonhosted.org/packages/9c/15/08913be1c59d7562a3e39fce20661a98c0a3f59d5754312899acc6cb8a2d/coverage-7.6.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:d3296782ca4eab572a1a4eca686d8bfb00226300dcefdf43faa25b5242ab8a3e", size = 207758, upload-time = "2024-08-04T19:44:29.028Z" },
- { url = "https://files.pythonhosted.org/packages/c4/ae/b5d58dff26cade02ada6ca612a76447acd69dccdbb3a478e9e088eb3d4b9/coverage-7.6.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:502753043567491d3ff6d08629270127e0c31d4184c4c8d98f92c26f65019962", size = 208035, upload-time = "2024-08-04T19:44:30.673Z" },
- { url = "https://files.pythonhosted.org/packages/b8/d7/62095e355ec0613b08dfb19206ce3033a0eedb6f4a67af5ed267a8800642/coverage-7.6.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6a89ecca80709d4076b95f89f308544ec8f7b4727e8a547913a35f16717856cb", size = 250839, upload-time = "2024-08-04T19:44:32.412Z" },
- { url = "https://files.pythonhosted.org/packages/7c/1e/c2967cb7991b112ba3766df0d9c21de46b476d103e32bb401b1b2adf3380/coverage-7.6.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a318d68e92e80af8b00fa99609796fdbcdfef3629c77c6283566c6f02c6d6704", size = 246569, upload-time = "2024-08-04T19:44:34.547Z" },
- { url = "https://files.pythonhosted.org/packages/8b/61/a7a6a55dd266007ed3b1df7a3386a0d760d014542d72f7c2c6938483b7bd/coverage-7.6.1-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:13b0a73a0896988f053e4fbb7de6d93388e6dd292b0d87ee51d106f2c11b465b", size = 248927, upload-time = "2024-08-04T19:44:36.313Z" },
- { url = "https://files.pythonhosted.org/packages/c8/fa/13a6f56d72b429f56ef612eb3bc5ce1b75b7ee12864b3bd12526ab794847/coverage-7.6.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4421712dbfc5562150f7554f13dde997a2e932a6b5f352edcce948a815efee6f", size = 248401, upload-time = "2024-08-04T19:44:38.155Z" },
- { url = "https://files.pythonhosted.org/packages/75/06/0429c652aa0fb761fc60e8c6b291338c9173c6aa0f4e40e1902345b42830/coverage-7.6.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:166811d20dfea725e2e4baa71fffd6c968a958577848d2131f39b60043400223", size = 246301, upload-time = "2024-08-04T19:44:39.883Z" },
- { url = "https://files.pythonhosted.org/packages/52/76/1766bb8b803a88f93c3a2d07e30ffa359467810e5cbc68e375ebe6906efb/coverage-7.6.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:225667980479a17db1048cb2bf8bfb39b8e5be8f164b8f6628b64f78a72cf9d3", size = 247598, upload-time = "2024-08-04T19:44:41.59Z" },
- { url = "https://files.pythonhosted.org/packages/66/8b/f54f8db2ae17188be9566e8166ac6df105c1c611e25da755738025708d54/coverage-7.6.1-cp313-cp313t-win32.whl", hash = "sha256:170d444ab405852903b7d04ea9ae9b98f98ab6d7e63e1115e82620807519797f", size = 210307, upload-time = "2024-08-04T19:44:43.301Z" },
- { url = "https://files.pythonhosted.org/packages/9f/b0/e0dca6da9170aefc07515cce067b97178cefafb512d00a87a1c717d2efd5/coverage-7.6.1-cp313-cp313t-win_amd64.whl", hash = "sha256:b9f222de8cded79c49bf184bdbc06630d4c58eec9459b939b4a690c82ed05657", size = 211453, upload-time = "2024-08-04T19:44:45.677Z" },
- { url = "https://files.pythonhosted.org/packages/81/d0/d9e3d554e38beea5a2e22178ddb16587dbcbe9a1ef3211f55733924bf7fa/coverage-7.6.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6db04803b6c7291985a761004e9060b2bca08da6d04f26a7f2294b8623a0c1a0", size = 206674, upload-time = "2024-08-04T19:44:47.694Z" },
- { url = "https://files.pythonhosted.org/packages/38/ea/cab2dc248d9f45b2b7f9f1f596a4d75a435cb364437c61b51d2eb33ceb0e/coverage-7.6.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f1adfc8ac319e1a348af294106bc6a8458a0f1633cc62a1446aebc30c5fa186a", size = 207101, upload-time = "2024-08-04T19:44:49.32Z" },
- { url = "https://files.pythonhosted.org/packages/ca/6f/f82f9a500c7c5722368978a5390c418d2a4d083ef955309a8748ecaa8920/coverage-7.6.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a95324a9de9650a729239daea117df21f4b9868ce32e63f8b650ebe6cef5595b", size = 236554, upload-time = "2024-08-04T19:44:51.631Z" },
- { url = "https://files.pythonhosted.org/packages/a6/94/d3055aa33d4e7e733d8fa309d9adf147b4b06a82c1346366fc15a2b1d5fa/coverage-7.6.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b43c03669dc4618ec25270b06ecd3ee4fa94c7f9b3c14bae6571ca00ef98b0d3", size = 234440, upload-time = "2024-08-04T19:44:53.464Z" },
- { url = "https://files.pythonhosted.org/packages/e4/6e/885bcd787d9dd674de4a7d8ec83faf729534c63d05d51d45d4fa168f7102/coverage-7.6.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8929543a7192c13d177b770008bc4e8119f2e1f881d563fc6b6305d2d0ebe9de", size = 235889, upload-time = "2024-08-04T19:44:55.165Z" },
- { url = "https://files.pythonhosted.org/packages/f4/63/df50120a7744492710854860783d6819ff23e482dee15462c9a833cc428a/coverage-7.6.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:a09ece4a69cf399510c8ab25e0950d9cf2b42f7b3cb0374f95d2e2ff594478a6", size = 235142, upload-time = "2024-08-04T19:44:57.269Z" },
- { url = "https://files.pythonhosted.org/packages/3a/5d/9d0acfcded2b3e9ce1c7923ca52ccc00c78a74e112fc2aee661125b7843b/coverage-7.6.1-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:9054a0754de38d9dbd01a46621636689124d666bad1936d76c0341f7d71bf569", size = 233805, upload-time = "2024-08-04T19:44:59.033Z" },
- { url = "https://files.pythonhosted.org/packages/c4/56/50abf070cb3cd9b1dd32f2c88f083aab561ecbffbcd783275cb51c17f11d/coverage-7.6.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:0dbde0f4aa9a16fa4d754356a8f2e36296ff4d83994b2c9d8398aa32f222f989", size = 234655, upload-time = "2024-08-04T19:45:01.398Z" },
- { url = "https://files.pythonhosted.org/packages/25/ee/b4c246048b8485f85a2426ef4abab88e48c6e80c74e964bea5cd4cd4b115/coverage-7.6.1-cp38-cp38-win32.whl", hash = "sha256:da511e6ad4f7323ee5702e6633085fb76c2f893aaf8ce4c51a0ba4fc07580ea7", size = 209296, upload-time = "2024-08-04T19:45:03.819Z" },
- { url = "https://files.pythonhosted.org/packages/5c/1c/96cf86b70b69ea2b12924cdf7cabb8ad10e6130eab8d767a1099fbd2a44f/coverage-7.6.1-cp38-cp38-win_amd64.whl", hash = "sha256:3f1156e3e8f2872197af3840d8ad307a9dd18e615dc64d9ee41696f287c57ad8", size = 210137, upload-time = "2024-08-04T19:45:06.25Z" },
- { url = "https://files.pythonhosted.org/packages/19/d3/d54c5aa83268779d54c86deb39c1c4566e5d45c155369ca152765f8db413/coverage-7.6.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:abd5fd0db5f4dc9289408aaf34908072f805ff7792632250dcb36dc591d24255", size = 206688, upload-time = "2024-08-04T19:45:08.358Z" },
- { url = "https://files.pythonhosted.org/packages/a5/fe/137d5dca72e4a258b1bc17bb04f2e0196898fe495843402ce826a7419fe3/coverage-7.6.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:547f45fa1a93154bd82050a7f3cddbc1a7a4dd2a9bf5cb7d06f4ae29fe94eaf8", size = 207120, upload-time = "2024-08-04T19:45:11.526Z" },
- { url = "https://files.pythonhosted.org/packages/78/5b/a0a796983f3201ff5485323b225d7c8b74ce30c11f456017e23d8e8d1945/coverage-7.6.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:645786266c8f18a931b65bfcefdbf6952dd0dea98feee39bd188607a9d307ed2", size = 235249, upload-time = "2024-08-04T19:45:13.202Z" },
- { url = "https://files.pythonhosted.org/packages/4e/e1/76089d6a5ef9d68f018f65411fcdaaeb0141b504587b901d74e8587606ad/coverage-7.6.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9e0b2df163b8ed01d515807af24f63de04bebcecbd6c3bfeff88385789fdf75a", size = 233237, upload-time = "2024-08-04T19:45:14.961Z" },
- { url = "https://files.pythonhosted.org/packages/9a/6f/eef79b779a540326fee9520e5542a8b428cc3bfa8b7c8f1022c1ee4fc66c/coverage-7.6.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:609b06f178fe8e9f89ef676532760ec0b4deea15e9969bf754b37f7c40326dbc", size = 234311, upload-time = "2024-08-04T19:45:16.924Z" },
- { url = "https://files.pythonhosted.org/packages/75/e1/656d65fb126c29a494ef964005702b012f3498db1a30dd562958e85a4049/coverage-7.6.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:702855feff378050ae4f741045e19a32d57d19f3e0676d589df0575008ea5004", size = 233453, upload-time = "2024-08-04T19:45:18.672Z" },
- { url = "https://files.pythonhosted.org/packages/68/6a/45f108f137941a4a1238c85f28fd9d048cc46b5466d6b8dda3aba1bb9d4f/coverage-7.6.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:2bdb062ea438f22d99cba0d7829c2ef0af1d768d1e4a4f528087224c90b132cb", size = 231958, upload-time = "2024-08-04T19:45:20.63Z" },
- { url = "https://files.pythonhosted.org/packages/9b/e7/47b809099168b8b8c72ae311efc3e88c8d8a1162b3ba4b8da3cfcdb85743/coverage-7.6.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:9c56863d44bd1c4fe2abb8a4d6f5371d197f1ac0ebdee542f07f35895fc07f36", size = 232938, upload-time = "2024-08-04T19:45:23.062Z" },
- { url = "https://files.pythonhosted.org/packages/52/80/052222ba7058071f905435bad0ba392cc12006380731c37afaf3fe749b88/coverage-7.6.1-cp39-cp39-win32.whl", hash = "sha256:6e2cd258d7d927d09493c8df1ce9174ad01b381d4729a9d8d4e38670ca24774c", size = 209352, upload-time = "2024-08-04T19:45:25.042Z" },
- { url = "https://files.pythonhosted.org/packages/b8/d8/1b92e0b3adcf384e98770a00ca095da1b5f7b483e6563ae4eb5e935d24a1/coverage-7.6.1-cp39-cp39-win_amd64.whl", hash = "sha256:06a737c882bd26d0d6ee7269b20b12f14a8704807a01056c80bb881a4b2ce6ca", size = 210153, upload-time = "2024-08-04T19:45:27.079Z" },
- { url = "https://files.pythonhosted.org/packages/a5/2b/0354ed096bca64dc8e32a7cbcae28b34cb5ad0b1fe2125d6d99583313ac0/coverage-7.6.1-pp38.pp39.pp310-none-any.whl", hash = "sha256:e9a6e0eb86070e8ccaedfbd9d38fec54864f3125ab95419970575b42af7541df", size = 198926, upload-time = "2024-08-04T19:45:28.875Z" },
-]
-
-[package.optional-dependencies]
-toml = [
- { name = "tomli", marker = "python_full_version < '3.9'" },
-]
-
-[[package]]
-name = "coverage"
-version = "7.10.6"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/14/70/025b179c993f019105b79575ac6edb5e084fb0f0e63f15cdebef4e454fb5/coverage-7.10.6.tar.gz", hash = "sha256:f644a3ae5933a552a29dbb9aa2f90c677a875f80ebea028e5a52a4f429044b90", size = 823736, upload-time = "2025-08-29T15:35:16.668Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/a8/1d/2e64b43d978b5bd184e0756a41415597dfef30fcbd90b747474bd749d45f/coverage-7.10.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:70e7bfbd57126b5554aa482691145f798d7df77489a177a6bef80de78860a356", size = 217025, upload-time = "2025-08-29T15:32:57.169Z" },
- { url = "https://files.pythonhosted.org/packages/23/62/b1e0f513417c02cc10ef735c3ee5186df55f190f70498b3702d516aad06f/coverage-7.10.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e41be6f0f19da64af13403e52f2dec38bbc2937af54df8ecef10850ff8d35301", size = 217419, upload-time = "2025-08-29T15:32:59.908Z" },
- { url = "https://files.pythonhosted.org/packages/e7/16/b800640b7a43e7c538429e4d7223e0a94fd72453a1a048f70bf766f12e96/coverage-7.10.6-cp310-cp310-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:c61fc91ab80b23f5fddbee342d19662f3d3328173229caded831aa0bd7595460", size = 244180, upload-time = "2025-08-29T15:33:01.608Z" },
- { url = "https://files.pythonhosted.org/packages/fb/6f/5e03631c3305cad187eaf76af0b559fff88af9a0b0c180d006fb02413d7a/coverage-7.10.6-cp310-cp310-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:10356fdd33a7cc06e8051413140bbdc6f972137508a3572e3f59f805cd2832fd", size = 245992, upload-time = "2025-08-29T15:33:03.239Z" },
- { url = "https://files.pythonhosted.org/packages/eb/a1/f30ea0fb400b080730125b490771ec62b3375789f90af0bb68bfb8a921d7/coverage-7.10.6-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:80b1695cf7c5ebe7b44bf2521221b9bb8cdf69b1f24231149a7e3eb1ae5fa2fb", size = 247851, upload-time = "2025-08-29T15:33:04.603Z" },
- { url = "https://files.pythonhosted.org/packages/02/8e/cfa8fee8e8ef9a6bb76c7bef039f3302f44e615d2194161a21d3d83ac2e9/coverage-7.10.6-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:2e4c33e6378b9d52d3454bd08847a8651f4ed23ddbb4a0520227bd346382bbc6", size = 245891, upload-time = "2025-08-29T15:33:06.176Z" },
- { url = "https://files.pythonhosted.org/packages/93/a9/51be09b75c55c4f6c16d8d73a6a1d46ad764acca0eab48fa2ffaef5958fe/coverage-7.10.6-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:c8a3ec16e34ef980a46f60dc6ad86ec60f763c3f2fa0db6d261e6e754f72e945", size = 243909, upload-time = "2025-08-29T15:33:07.74Z" },
- { url = "https://files.pythonhosted.org/packages/e9/a6/ba188b376529ce36483b2d585ca7bdac64aacbe5aa10da5978029a9c94db/coverage-7.10.6-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:7d79dabc0a56f5af990cc6da9ad1e40766e82773c075f09cc571e2076fef882e", size = 244786, upload-time = "2025-08-29T15:33:08.965Z" },
- { url = "https://files.pythonhosted.org/packages/d0/4c/37ed872374a21813e0d3215256180c9a382c3f5ced6f2e5da0102fc2fd3e/coverage-7.10.6-cp310-cp310-win32.whl", hash = "sha256:86b9b59f2b16e981906e9d6383eb6446d5b46c278460ae2c36487667717eccf1", size = 219521, upload-time = "2025-08-29T15:33:10.599Z" },
- { url = "https://files.pythonhosted.org/packages/8e/36/9311352fdc551dec5b973b61f4e453227ce482985a9368305880af4f85dd/coverage-7.10.6-cp310-cp310-win_amd64.whl", hash = "sha256:e132b9152749bd33534e5bd8565c7576f135f157b4029b975e15ee184325f528", size = 220417, upload-time = "2025-08-29T15:33:11.907Z" },
- { url = "https://files.pythonhosted.org/packages/d4/16/2bea27e212c4980753d6d563a0803c150edeaaddb0771a50d2afc410a261/coverage-7.10.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c706db3cabb7ceef779de68270150665e710b46d56372455cd741184f3868d8f", size = 217129, upload-time = "2025-08-29T15:33:13.575Z" },
- { url = "https://files.pythonhosted.org/packages/2a/51/e7159e068831ab37e31aac0969d47b8c5ee25b7d307b51e310ec34869315/coverage-7.10.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8e0c38dc289e0508ef68ec95834cb5d2e96fdbe792eaccaa1bccac3966bbadcc", size = 217532, upload-time = "2025-08-29T15:33:14.872Z" },
- { url = "https://files.pythonhosted.org/packages/e7/c0/246ccbea53d6099325d25cd208df94ea435cd55f0db38099dd721efc7a1f/coverage-7.10.6-cp311-cp311-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:752a3005a1ded28f2f3a6e8787e24f28d6abe176ca64677bcd8d53d6fe2ec08a", size = 247931, upload-time = "2025-08-29T15:33:16.142Z" },
- { url = "https://files.pythonhosted.org/packages/7d/fb/7435ef8ab9b2594a6e3f58505cc30e98ae8b33265d844007737946c59389/coverage-7.10.6-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:689920ecfd60f992cafca4f5477d55720466ad2c7fa29bb56ac8d44a1ac2b47a", size = 249864, upload-time = "2025-08-29T15:33:17.434Z" },
- { url = "https://files.pythonhosted.org/packages/51/f8/d9d64e8da7bcddb094d511154824038833c81e3a039020a9d6539bf303e9/coverage-7.10.6-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ec98435796d2624d6905820a42f82149ee9fc4f2d45c2c5bc5a44481cc50db62", size = 251969, upload-time = "2025-08-29T15:33:18.822Z" },
- { url = "https://files.pythonhosted.org/packages/43/28/c43ba0ef19f446d6463c751315140d8f2a521e04c3e79e5c5fe211bfa430/coverage-7.10.6-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b37201ce4a458c7a758ecc4efa92fa8ed783c66e0fa3c42ae19fc454a0792153", size = 249659, upload-time = "2025-08-29T15:33:20.407Z" },
- { url = "https://files.pythonhosted.org/packages/79/3e/53635bd0b72beaacf265784508a0b386defc9ab7fad99ff95f79ce9db555/coverage-7.10.6-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:2904271c80898663c810a6b067920a61dd8d38341244a3605bd31ab55250dad5", size = 247714, upload-time = "2025-08-29T15:33:21.751Z" },
- { url = "https://files.pythonhosted.org/packages/4c/55/0964aa87126624e8c159e32b0bc4e84edef78c89a1a4b924d28dd8265625/coverage-7.10.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5aea98383463d6e1fa4e95416d8de66f2d0cb588774ee20ae1b28df826bcb619", size = 248351, upload-time = "2025-08-29T15:33:23.105Z" },
- { url = "https://files.pythonhosted.org/packages/eb/ab/6cfa9dc518c6c8e14a691c54e53a9433ba67336c760607e299bfcf520cb1/coverage-7.10.6-cp311-cp311-win32.whl", hash = "sha256:e3fb1fa01d3598002777dd259c0c2e6d9d5e10e7222976fc8e03992f972a2cba", size = 219562, upload-time = "2025-08-29T15:33:24.717Z" },
- { url = "https://files.pythonhosted.org/packages/5b/18/99b25346690cbc55922e7cfef06d755d4abee803ef335baff0014268eff4/coverage-7.10.6-cp311-cp311-win_amd64.whl", hash = "sha256:f35ed9d945bece26553d5b4c8630453169672bea0050a564456eb88bdffd927e", size = 220453, upload-time = "2025-08-29T15:33:26.482Z" },
- { url = "https://files.pythonhosted.org/packages/d8/ed/81d86648a07ccb124a5cf1f1a7788712b8d7216b593562683cd5c9b0d2c1/coverage-7.10.6-cp311-cp311-win_arm64.whl", hash = "sha256:99e1a305c7765631d74b98bf7dbf54eeea931f975e80f115437d23848ee8c27c", size = 219127, upload-time = "2025-08-29T15:33:27.777Z" },
- { url = "https://files.pythonhosted.org/packages/26/06/263f3305c97ad78aab066d116b52250dd316e74fcc20c197b61e07eb391a/coverage-7.10.6-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:5b2dd6059938063a2c9fee1af729d4f2af28fd1a545e9b7652861f0d752ebcea", size = 217324, upload-time = "2025-08-29T15:33:29.06Z" },
- { url = "https://files.pythonhosted.org/packages/e9/60/1e1ded9a4fe80d843d7d53b3e395c1db3ff32d6c301e501f393b2e6c1c1f/coverage-7.10.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:388d80e56191bf846c485c14ae2bc8898aa3124d9d35903fef7d907780477634", size = 217560, upload-time = "2025-08-29T15:33:30.748Z" },
- { url = "https://files.pythonhosted.org/packages/b8/25/52136173c14e26dfed8b106ed725811bb53c30b896d04d28d74cb64318b3/coverage-7.10.6-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:90cb5b1a4670662719591aa92d0095bb41714970c0b065b02a2610172dbf0af6", size = 249053, upload-time = "2025-08-29T15:33:32.041Z" },
- { url = "https://files.pythonhosted.org/packages/cb/1d/ae25a7dc58fcce8b172d42ffe5313fc267afe61c97fa872b80ee72d9515a/coverage-7.10.6-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:961834e2f2b863a0e14260a9a273aff07ff7818ab6e66d2addf5628590c628f9", size = 251802, upload-time = "2025-08-29T15:33:33.625Z" },
- { url = "https://files.pythonhosted.org/packages/f5/7a/1f561d47743710fe996957ed7c124b421320f150f1d38523d8d9102d3e2a/coverage-7.10.6-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:bf9a19f5012dab774628491659646335b1928cfc931bf8d97b0d5918dd58033c", size = 252935, upload-time = "2025-08-29T15:33:34.909Z" },
- { url = "https://files.pythonhosted.org/packages/6c/ad/8b97cd5d28aecdfde792dcbf646bac141167a5cacae2cd775998b45fabb5/coverage-7.10.6-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:99c4283e2a0e147b9c9cc6bc9c96124de9419d6044837e9799763a0e29a7321a", size = 250855, upload-time = "2025-08-29T15:33:36.922Z" },
- { url = "https://files.pythonhosted.org/packages/33/6a/95c32b558d9a61858ff9d79580d3877df3eb5bc9eed0941b1f187c89e143/coverage-7.10.6-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:282b1b20f45df57cc508c1e033403f02283adfb67d4c9c35a90281d81e5c52c5", size = 248974, upload-time = "2025-08-29T15:33:38.175Z" },
- { url = "https://files.pythonhosted.org/packages/0d/9c/8ce95dee640a38e760d5b747c10913e7a06554704d60b41e73fdea6a1ffd/coverage-7.10.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:8cdbe264f11afd69841bd8c0d83ca10b5b32853263ee62e6ac6a0ab63895f972", size = 250409, upload-time = "2025-08-29T15:33:39.447Z" },
- { url = "https://files.pythonhosted.org/packages/04/12/7a55b0bdde78a98e2eb2356771fd2dcddb96579e8342bb52aa5bc52e96f0/coverage-7.10.6-cp312-cp312-win32.whl", hash = "sha256:a517feaf3a0a3eca1ee985d8373135cfdedfbba3882a5eab4362bda7c7cf518d", size = 219724, upload-time = "2025-08-29T15:33:41.172Z" },
- { url = "https://files.pythonhosted.org/packages/36/4a/32b185b8b8e327802c9efce3d3108d2fe2d9d31f153a0f7ecfd59c773705/coverage-7.10.6-cp312-cp312-win_amd64.whl", hash = "sha256:856986eadf41f52b214176d894a7de05331117f6035a28ac0016c0f63d887629", size = 220536, upload-time = "2025-08-29T15:33:42.524Z" },
- { url = "https://files.pythonhosted.org/packages/08/3a/d5d8dc703e4998038c3099eaf77adddb00536a3cec08c8dcd556a36a3eb4/coverage-7.10.6-cp312-cp312-win_arm64.whl", hash = "sha256:acf36b8268785aad739443fa2780c16260ee3fa09d12b3a70f772ef100939d80", size = 219171, upload-time = "2025-08-29T15:33:43.974Z" },
- { url = "https://files.pythonhosted.org/packages/bd/e7/917e5953ea29a28c1057729c1d5af9084ab6d9c66217523fd0e10f14d8f6/coverage-7.10.6-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ffea0575345e9ee0144dfe5701aa17f3ba546f8c3bb48db62ae101afb740e7d6", size = 217351, upload-time = "2025-08-29T15:33:45.438Z" },
- { url = "https://files.pythonhosted.org/packages/eb/86/2e161b93a4f11d0ea93f9bebb6a53f113d5d6e416d7561ca41bb0a29996b/coverage-7.10.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:95d91d7317cde40a1c249d6b7382750b7e6d86fad9d8eaf4fa3f8f44cf171e80", size = 217600, upload-time = "2025-08-29T15:33:47.269Z" },
- { url = "https://files.pythonhosted.org/packages/0e/66/d03348fdd8df262b3a7fb4ee5727e6e4936e39e2f3a842e803196946f200/coverage-7.10.6-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:3e23dd5408fe71a356b41baa82892772a4cefcf758f2ca3383d2aa39e1b7a003", size = 248600, upload-time = "2025-08-29T15:33:48.953Z" },
- { url = "https://files.pythonhosted.org/packages/73/dd/508420fb47d09d904d962f123221bc249f64b5e56aa93d5f5f7603be475f/coverage-7.10.6-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:0f3f56e4cb573755e96a16501a98bf211f100463d70275759e73f3cbc00d4f27", size = 251206, upload-time = "2025-08-29T15:33:50.697Z" },
- { url = "https://files.pythonhosted.org/packages/e9/1f/9020135734184f439da85c70ea78194c2730e56c2d18aee6e8ff1719d50d/coverage-7.10.6-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:db4a1d897bbbe7339946ffa2fe60c10cc81c43fab8b062d3fcb84188688174a4", size = 252478, upload-time = "2025-08-29T15:33:52.303Z" },
- { url = "https://files.pythonhosted.org/packages/a4/a4/3d228f3942bb5a2051fde28c136eea23a761177dc4ff4ef54533164ce255/coverage-7.10.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d8fd7879082953c156d5b13c74aa6cca37f6a6f4747b39538504c3f9c63d043d", size = 250637, upload-time = "2025-08-29T15:33:53.67Z" },
- { url = "https://files.pythonhosted.org/packages/36/e3/293dce8cdb9a83de971637afc59b7190faad60603b40e32635cbd15fbf61/coverage-7.10.6-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:28395ca3f71cd103b8c116333fa9db867f3a3e1ad6a084aa3725ae002b6583bc", size = 248529, upload-time = "2025-08-29T15:33:55.022Z" },
- { url = "https://files.pythonhosted.org/packages/90/26/64eecfa214e80dd1d101e420cab2901827de0e49631d666543d0e53cf597/coverage-7.10.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:61c950fc33d29c91b9e18540e1aed7d9f6787cc870a3e4032493bbbe641d12fc", size = 250143, upload-time = "2025-08-29T15:33:56.386Z" },
- { url = "https://files.pythonhosted.org/packages/3e/70/bd80588338f65ea5b0d97e424b820fb4068b9cfb9597fbd91963086e004b/coverage-7.10.6-cp313-cp313-win32.whl", hash = "sha256:160c00a5e6b6bdf4e5984b0ef21fc860bc94416c41b7df4d63f536d17c38902e", size = 219770, upload-time = "2025-08-29T15:33:58.063Z" },
- { url = "https://files.pythonhosted.org/packages/a7/14/0b831122305abcc1060c008f6c97bbdc0a913ab47d65070a01dc50293c2b/coverage-7.10.6-cp313-cp313-win_amd64.whl", hash = "sha256:628055297f3e2aa181464c3808402887643405573eb3d9de060d81531fa79d32", size = 220566, upload-time = "2025-08-29T15:33:59.766Z" },
- { url = "https://files.pythonhosted.org/packages/83/c6/81a83778c1f83f1a4a168ed6673eeedc205afb562d8500175292ca64b94e/coverage-7.10.6-cp313-cp313-win_arm64.whl", hash = "sha256:df4ec1f8540b0bcbe26ca7dd0f541847cc8a108b35596f9f91f59f0c060bfdd2", size = 219195, upload-time = "2025-08-29T15:34:01.191Z" },
- { url = "https://files.pythonhosted.org/packages/d7/1c/ccccf4bf116f9517275fa85047495515add43e41dfe8e0bef6e333c6b344/coverage-7.10.6-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:c9a8b7a34a4de3ed987f636f71881cd3b8339f61118b1aa311fbda12741bff0b", size = 218059, upload-time = "2025-08-29T15:34:02.91Z" },
- { url = "https://files.pythonhosted.org/packages/92/97/8a3ceff833d27c7492af4f39d5da6761e9ff624831db9e9f25b3886ddbca/coverage-7.10.6-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:8dd5af36092430c2b075cee966719898f2ae87b636cefb85a653f1d0ba5d5393", size = 218287, upload-time = "2025-08-29T15:34:05.106Z" },
- { url = "https://files.pythonhosted.org/packages/92/d8/50b4a32580cf41ff0423777a2791aaf3269ab60c840b62009aec12d3970d/coverage-7.10.6-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:b0353b0f0850d49ada66fdd7d0c7cdb0f86b900bb9e367024fd14a60cecc1e27", size = 259625, upload-time = "2025-08-29T15:34:06.575Z" },
- { url = "https://files.pythonhosted.org/packages/7e/7e/6a7df5a6fb440a0179d94a348eb6616ed4745e7df26bf2a02bc4db72c421/coverage-7.10.6-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:d6b9ae13d5d3e8aeca9ca94198aa7b3ebbc5acfada557d724f2a1f03d2c0b0df", size = 261801, upload-time = "2025-08-29T15:34:08.006Z" },
- { url = "https://files.pythonhosted.org/packages/3a/4c/a270a414f4ed5d196b9d3d67922968e768cd971d1b251e1b4f75e9362f75/coverage-7.10.6-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:675824a363cc05781b1527b39dc2587b8984965834a748177ee3c37b64ffeafb", size = 264027, upload-time = "2025-08-29T15:34:09.806Z" },
- { url = "https://files.pythonhosted.org/packages/9c/8b/3210d663d594926c12f373c5370bf1e7c5c3a427519a8afa65b561b9a55c/coverage-7.10.6-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:692d70ea725f471a547c305f0d0fc6a73480c62fb0da726370c088ab21aed282", size = 261576, upload-time = "2025-08-29T15:34:11.585Z" },
- { url = "https://files.pythonhosted.org/packages/72/d0/e1961eff67e9e1dba3fc5eb7a4caf726b35a5b03776892da8d79ec895775/coverage-7.10.6-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:851430a9a361c7a8484a36126d1d0ff8d529d97385eacc8dfdc9bfc8c2d2cbe4", size = 259341, upload-time = "2025-08-29T15:34:13.159Z" },
- { url = "https://files.pythonhosted.org/packages/3a/06/d6478d152cd189b33eac691cba27a40704990ba95de49771285f34a5861e/coverage-7.10.6-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:d9369a23186d189b2fc95cc08b8160ba242057e887d766864f7adf3c46b2df21", size = 260468, upload-time = "2025-08-29T15:34:14.571Z" },
- { url = "https://files.pythonhosted.org/packages/ed/73/737440247c914a332f0b47f7598535b29965bf305e19bbc22d4c39615d2b/coverage-7.10.6-cp313-cp313t-win32.whl", hash = "sha256:92be86fcb125e9bda0da7806afd29a3fd33fdf58fba5d60318399adf40bf37d0", size = 220429, upload-time = "2025-08-29T15:34:16.394Z" },
- { url = "https://files.pythonhosted.org/packages/bd/76/b92d3214740f2357ef4a27c75a526eb6c28f79c402e9f20a922c295c05e2/coverage-7.10.6-cp313-cp313t-win_amd64.whl", hash = "sha256:6b3039e2ca459a70c79523d39347d83b73f2f06af5624905eba7ec34d64d80b5", size = 221493, upload-time = "2025-08-29T15:34:17.835Z" },
- { url = "https://files.pythonhosted.org/packages/fc/8e/6dcb29c599c8a1f654ec6cb68d76644fe635513af16e932d2d4ad1e5ac6e/coverage-7.10.6-cp313-cp313t-win_arm64.whl", hash = "sha256:3fb99d0786fe17b228eab663d16bee2288e8724d26a199c29325aac4b0319b9b", size = 219757, upload-time = "2025-08-29T15:34:19.248Z" },
- { url = "https://files.pythonhosted.org/packages/d3/aa/76cf0b5ec00619ef208da4689281d48b57f2c7fde883d14bf9441b74d59f/coverage-7.10.6-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:6008a021907be8c4c02f37cdc3ffb258493bdebfeaf9a839f9e71dfdc47b018e", size = 217331, upload-time = "2025-08-29T15:34:20.846Z" },
- { url = "https://files.pythonhosted.org/packages/65/91/8e41b8c7c505d398d7730206f3cbb4a875a35ca1041efc518051bfce0f6b/coverage-7.10.6-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:5e75e37f23eb144e78940b40395b42f2321951206a4f50e23cfd6e8a198d3ceb", size = 217607, upload-time = "2025-08-29T15:34:22.433Z" },
- { url = "https://files.pythonhosted.org/packages/87/7f/f718e732a423d442e6616580a951b8d1ec3575ea48bcd0e2228386805e79/coverage-7.10.6-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:0f7cb359a448e043c576f0da00aa8bfd796a01b06aa610ca453d4dde09cc1034", size = 248663, upload-time = "2025-08-29T15:34:24.425Z" },
- { url = "https://files.pythonhosted.org/packages/e6/52/c1106120e6d801ac03e12b5285e971e758e925b6f82ee9b86db3aa10045d/coverage-7.10.6-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:c68018e4fc4e14b5668f1353b41ccf4bc83ba355f0e1b3836861c6f042d89ac1", size = 251197, upload-time = "2025-08-29T15:34:25.906Z" },
- { url = "https://files.pythonhosted.org/packages/3d/ec/3a8645b1bb40e36acde9c0609f08942852a4af91a937fe2c129a38f2d3f5/coverage-7.10.6-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:cd4b2b0707fc55afa160cd5fc33b27ccbf75ca11d81f4ec9863d5793fc6df56a", size = 252551, upload-time = "2025-08-29T15:34:27.337Z" },
- { url = "https://files.pythonhosted.org/packages/a1/70/09ecb68eeb1155b28a1d16525fd3a9b65fbe75337311a99830df935d62b6/coverage-7.10.6-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:4cec13817a651f8804a86e4f79d815b3b28472c910e099e4d5a0e8a3b6a1d4cb", size = 250553, upload-time = "2025-08-29T15:34:29.065Z" },
- { url = "https://files.pythonhosted.org/packages/c6/80/47df374b893fa812e953b5bc93dcb1427a7b3d7a1a7d2db33043d17f74b9/coverage-7.10.6-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:f2a6a8e06bbda06f78739f40bfb56c45d14eb8249d0f0ea6d4b3d48e1f7c695d", size = 248486, upload-time = "2025-08-29T15:34:30.897Z" },
- { url = "https://files.pythonhosted.org/packages/4a/65/9f98640979ecee1b0d1a7164b589de720ddf8100d1747d9bbdb84be0c0fb/coverage-7.10.6-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:081b98395ced0d9bcf60ada7661a0b75f36b78b9d7e39ea0790bb4ed8da14747", size = 249981, upload-time = "2025-08-29T15:34:32.365Z" },
- { url = "https://files.pythonhosted.org/packages/1f/55/eeb6603371e6629037f47bd25bef300387257ed53a3c5fdb159b7ac8c651/coverage-7.10.6-cp314-cp314-win32.whl", hash = "sha256:6937347c5d7d069ee776b2bf4e1212f912a9f1f141a429c475e6089462fcecc5", size = 220054, upload-time = "2025-08-29T15:34:34.124Z" },
- { url = "https://files.pythonhosted.org/packages/15/d1/a0912b7611bc35412e919a2cd59ae98e7ea3b475e562668040a43fb27897/coverage-7.10.6-cp314-cp314-win_amd64.whl", hash = "sha256:adec1d980fa07e60b6ef865f9e5410ba760e4e1d26f60f7e5772c73b9a5b0713", size = 220851, upload-time = "2025-08-29T15:34:35.651Z" },
- { url = "https://files.pythonhosted.org/packages/ef/2d/11880bb8ef80a45338e0b3e0725e4c2d73ffbb4822c29d987078224fd6a5/coverage-7.10.6-cp314-cp314-win_arm64.whl", hash = "sha256:a80f7aef9535442bdcf562e5a0d5a5538ce8abe6bb209cfbf170c462ac2c2a32", size = 219429, upload-time = "2025-08-29T15:34:37.16Z" },
- { url = "https://files.pythonhosted.org/packages/83/c0/1f00caad775c03a700146f55536ecd097a881ff08d310a58b353a1421be0/coverage-7.10.6-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:0de434f4fbbe5af4fa7989521c655c8c779afb61c53ab561b64dcee6149e4c65", size = 218080, upload-time = "2025-08-29T15:34:38.919Z" },
- { url = "https://files.pythonhosted.org/packages/a9/c4/b1c5d2bd7cc412cbeb035e257fd06ed4e3e139ac871d16a07434e145d18d/coverage-7.10.6-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:6e31b8155150c57e5ac43ccd289d079eb3f825187d7c66e755a055d2c85794c6", size = 218293, upload-time = "2025-08-29T15:34:40.425Z" },
- { url = "https://files.pythonhosted.org/packages/3f/07/4468d37c94724bf6ec354e4ec2f205fda194343e3e85fd2e59cec57e6a54/coverage-7.10.6-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:98cede73eb83c31e2118ae8d379c12e3e42736903a8afcca92a7218e1f2903b0", size = 259800, upload-time = "2025-08-29T15:34:41.996Z" },
- { url = "https://files.pythonhosted.org/packages/82/d8/f8fb351be5fee31690cd8da768fd62f1cfab33c31d9f7baba6cd8960f6b8/coverage-7.10.6-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:f863c08f4ff6b64fa8045b1e3da480f5374779ef187f07b82e0538c68cb4ff8e", size = 261965, upload-time = "2025-08-29T15:34:43.61Z" },
- { url = "https://files.pythonhosted.org/packages/e8/70/65d4d7cfc75c5c6eb2fed3ee5cdf420fd8ae09c4808723a89a81d5b1b9c3/coverage-7.10.6-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2b38261034fda87be356f2c3f42221fdb4171c3ce7658066ae449241485390d5", size = 264220, upload-time = "2025-08-29T15:34:45.387Z" },
- { url = "https://files.pythonhosted.org/packages/98/3c/069df106d19024324cde10e4ec379fe2fb978017d25e97ebee23002fbadf/coverage-7.10.6-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:0e93b1476b79eae849dc3872faeb0bf7948fd9ea34869590bc16a2a00b9c82a7", size = 261660, upload-time = "2025-08-29T15:34:47.288Z" },
- { url = "https://files.pythonhosted.org/packages/fc/8a/2974d53904080c5dc91af798b3a54a4ccb99a45595cc0dcec6eb9616a57d/coverage-7.10.6-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:ff8a991f70f4c0cf53088abf1e3886edcc87d53004c7bb94e78650b4d3dac3b5", size = 259417, upload-time = "2025-08-29T15:34:48.779Z" },
- { url = "https://files.pythonhosted.org/packages/30/38/9616a6b49c686394b318974d7f6e08f38b8af2270ce7488e879888d1e5db/coverage-7.10.6-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ac765b026c9f33044419cbba1da913cfb82cca1b60598ac1c7a5ed6aac4621a0", size = 260567, upload-time = "2025-08-29T15:34:50.718Z" },
- { url = "https://files.pythonhosted.org/packages/76/16/3ed2d6312b371a8cf804abf4e14895b70e4c3491c6e53536d63fd0958a8d/coverage-7.10.6-cp314-cp314t-win32.whl", hash = "sha256:441c357d55f4936875636ef2cfb3bee36e466dcf50df9afbd398ce79dba1ebb7", size = 220831, upload-time = "2025-08-29T15:34:52.653Z" },
- { url = "https://files.pythonhosted.org/packages/d5/e5/d38d0cb830abede2adb8b147770d2a3d0e7fecc7228245b9b1ae6c24930a/coverage-7.10.6-cp314-cp314t-win_amd64.whl", hash = "sha256:073711de3181b2e204e4870ac83a7c4853115b42e9cd4d145f2231e12d670930", size = 221950, upload-time = "2025-08-29T15:34:54.212Z" },
- { url = "https://files.pythonhosted.org/packages/f4/51/e48e550f6279349895b0ffcd6d2a690e3131ba3a7f4eafccc141966d4dea/coverage-7.10.6-cp314-cp314t-win_arm64.whl", hash = "sha256:137921f2bac5559334ba66122b753db6dc5d1cf01eb7b64eb412bb0d064ef35b", size = 219969, upload-time = "2025-08-29T15:34:55.83Z" },
- { url = "https://files.pythonhosted.org/packages/91/70/f73ad83b1d2fd2d5825ac58c8f551193433a7deaf9b0d00a8b69ef61cd9a/coverage-7.10.6-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:90558c35af64971d65fbd935c32010f9a2f52776103a259f1dee865fe8259352", size = 217009, upload-time = "2025-08-29T15:34:57.381Z" },
- { url = "https://files.pythonhosted.org/packages/01/e8/099b55cd48922abbd4b01ddd9ffa352408614413ebfc965501e981aced6b/coverage-7.10.6-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8953746d371e5695405806c46d705a3cd170b9cc2b9f93953ad838f6c1e58612", size = 217400, upload-time = "2025-08-29T15:34:58.985Z" },
- { url = "https://files.pythonhosted.org/packages/ee/d1/c6bac7c9e1003110a318636fef3b5c039df57ab44abcc41d43262a163c28/coverage-7.10.6-cp39-cp39-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:c83f6afb480eae0313114297d29d7c295670a41c11b274e6bca0c64540c1ce7b", size = 243835, upload-time = "2025-08-29T15:35:00.541Z" },
- { url = "https://files.pythonhosted.org/packages/01/f9/82c6c061838afbd2172e773156c0aa84a901d59211b4975a4e93accf5c89/coverage-7.10.6-cp39-cp39-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:7eb68d356ba0cc158ca535ce1381dbf2037fa8cb5b1ae5ddfc302e7317d04144", size = 245658, upload-time = "2025-08-29T15:35:02.135Z" },
- { url = "https://files.pythonhosted.org/packages/81/6a/35674445b1d38161148558a3ff51b0aa7f0b54b1def3abe3fbd34efe05bc/coverage-7.10.6-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5b15a87265e96307482746d86995f4bff282f14b027db75469c446da6127433b", size = 247433, upload-time = "2025-08-29T15:35:03.777Z" },
- { url = "https://files.pythonhosted.org/packages/18/27/98c99e7cafb288730a93535092eb433b5503d529869791681c4f2e2012a8/coverage-7.10.6-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:fc53ba868875bfbb66ee447d64d6413c2db91fddcfca57025a0e7ab5b07d5862", size = 245315, upload-time = "2025-08-29T15:35:05.629Z" },
- { url = "https://files.pythonhosted.org/packages/09/05/123e0dba812408c719c319dea05782433246f7aa7b67e60402d90e847545/coverage-7.10.6-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:efeda443000aa23f276f4df973cb82beca682fd800bb119d19e80504ffe53ec2", size = 243385, upload-time = "2025-08-29T15:35:07.494Z" },
- { url = "https://files.pythonhosted.org/packages/67/52/d57a42502aef05c6325f28e2e81216c2d9b489040132c18725b7a04d1448/coverage-7.10.6-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:9702b59d582ff1e184945d8b501ffdd08d2cee38d93a2206aa5f1365ce0b8d78", size = 244343, upload-time = "2025-08-29T15:35:09.55Z" },
- { url = "https://files.pythonhosted.org/packages/6b/22/7f6fad7dbb37cf99b542c5e157d463bd96b797078b1ec506691bc836f476/coverage-7.10.6-cp39-cp39-win32.whl", hash = "sha256:2195f8e16ba1a44651ca684db2ea2b2d4b5345da12f07d9c22a395202a05b23c", size = 219530, upload-time = "2025-08-29T15:35:11.167Z" },
- { url = "https://files.pythonhosted.org/packages/62/30/e2fda29bfe335026027e11e6a5e57a764c9df13127b5cf42af4c3e99b937/coverage-7.10.6-cp39-cp39-win_amd64.whl", hash = "sha256:f32ff80e7ef6a5b5b606ea69a36e97b219cd9dc799bcf2963018a4d8f788cfbf", size = 220432, upload-time = "2025-08-29T15:35:12.902Z" },
- { url = "https://files.pythonhosted.org/packages/44/0c/50db5379b615854b5cf89146f8f5bd1d5a9693d7f3a987e269693521c404/coverage-7.10.6-py3-none-any.whl", hash = "sha256:92c4ecf6bf11b2e85fd4d8204814dc26e6a19f0c9d938c207c5cb0eadfcabbe3", size = 208986, upload-time = "2025-08-29T15:35:14.506Z" },
-]
-
-[package.optional-dependencies]
-toml = [
- { name = "tomli", marker = "python_full_version >= '3.9' and python_full_version <= '3.11'" },
-]
-
-[[package]]
-name = "exceptiongroup"
-version = "1.3.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "typing-extensions", version = "4.13.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "typing-extensions", version = "4.15.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9' and python_full_version < '3.13'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/0b/9f/a65090624ecf468cdca03533906e7c69ed7588582240cfe7cc9e770b50eb/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88", size = 29749, upload-time = "2025-05-10T17:42:51.123Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/36/f4/c6e662dade71f56cd2f3735141b265c3c79293c109549c1e6933b0651ffc/exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10", size = 16674, upload-time = "2025-05-10T17:42:49.33Z" },
-]
-
-[[package]]
-name = "flake8"
-version = "5.0.4"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version < '3.8.1'",
-]
-dependencies = [
- { name = "mccabe", marker = "python_full_version < '3.8.1'" },
- { name = "pycodestyle", version = "2.9.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.8.1'" },
- { name = "pyflakes", version = "2.5.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.8.1'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/ad/00/9808c62b2d529cefc69ce4e4a1ea42c0f855effa55817b7327ec5b75e60a/flake8-5.0.4.tar.gz", hash = "sha256:6fbe320aad8d6b95cec8b8e47bc933004678dc63095be98528b7bdd2a9f510db", size = 145862, upload-time = "2022-08-03T23:21:27.108Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/cf/a0/b881b63a17a59d9d07f5c0cc91a29182c8e8a9aa2bde5b3b2b16519c02f4/flake8-5.0.4-py2.py3-none-any.whl", hash = "sha256:7a1cf6b73744f5806ab95e526f6f0d8c01c66d7bbe349562d22dfca20610b248", size = 61897, upload-time = "2022-08-03T23:21:25.027Z" },
-]
-
-[[package]]
-name = "flake8"
-version = "7.1.2"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
-]
-dependencies = [
- { name = "mccabe", marker = "python_full_version >= '3.8.1' and python_full_version < '3.9'" },
- { name = "pycodestyle", version = "2.12.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.8.1' and python_full_version < '3.9'" },
- { name = "pyflakes", version = "3.2.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.8.1' and python_full_version < '3.9'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/58/16/3f2a0bb700ad65ac9663262905a025917c020a3f92f014d2ba8964b4602c/flake8-7.1.2.tar.gz", hash = "sha256:c586ffd0b41540951ae41af572e6790dbd49fc12b3aa2541685d253d9bd504bd", size = 48119, upload-time = "2025-02-16T18:45:44.296Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/35/f8/08d37b2cd89da306e3520bd27f8a85692122b42b56c0c2c3784ff09c022f/flake8-7.1.2-py2.py3-none-any.whl", hash = "sha256:1cbc62e65536f65e6d754dfe6f1bada7f5cf392d6f5db3c2b85892466c3e7c1a", size = 57745, upload-time = "2025-02-16T18:45:42.351Z" },
-]
-
-[[package]]
-name = "flake8"
-version = "7.3.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-dependencies = [
- { name = "mccabe", marker = "python_full_version >= '3.9'" },
- { name = "pycodestyle", version = "2.14.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "pyflakes", version = "3.4.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/9b/af/fbfe3c4b5a657d79e5c47a2827a362f9e1b763336a52f926126aa6dc7123/flake8-7.3.0.tar.gz", hash = "sha256:fe044858146b9fc69b551a4b490d69cf960fcb78ad1edcb84e7fbb1b4a8e3872", size = 48326, upload-time = "2025-06-20T19:31:35.838Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/9f/56/13ab06b4f93ca7cac71078fbe37fcea175d3216f31f85c3168a6bbd0bb9a/flake8-7.3.0-py2.py3-none-any.whl", hash = "sha256:b9696257b9ce8beb888cdbe31cf885c90d31928fe202be0889a7cdafad32f01e", size = 57922, upload-time = "2025-06-20T19:31:34.425Z" },
-]
-
-[[package]]
-name = "h11"
-version = "0.16.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" },
-]
-
-[[package]]
-name = "httpcore"
-version = "1.0.9"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "certifi" },
- { name = "h11" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" },
-]
-
-[[package]]
-name = "httpx"
-version = "0.28.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "anyio", version = "4.5.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "anyio", version = "4.10.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "certifi" },
- { name = "httpcore" },
- { name = "idna" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" },
-]
-
-[[package]]
-name = "idna"
-version = "3.10"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490, upload-time = "2024-09-15T18:07:39.745Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442, upload-time = "2024-09-15T18:07:37.964Z" },
-]
-
-[[package]]
-name = "iniconfig"
-version = "2.1.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/f2/97/ebf4da567aa6827c909642694d71c9fcf53e5b504f2d96afea02718862f3/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7", size = 4793, upload-time = "2025-03-19T20:09:59.721Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" },
-]
-
-[[package]]
-name = "isort"
-version = "5.13.2"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/87/f9/c1eb8635a24e87ade2efce21e3ce8cd6b8630bb685ddc9cdaca1349b2eb5/isort-5.13.2.tar.gz", hash = "sha256:48fdfcb9face5d58a4f6dde2e72a1fb8dcaf8ab26f95ab49fab84c2ddefb0109", size = 175303, upload-time = "2023-12-13T20:37:26.124Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/d1/b3/8def84f539e7d2289a02f0524b944b15d7c75dab7628bedf1c4f0992029c/isort-5.13.2-py3-none-any.whl", hash = "sha256:8ca5e72a8d85860d5a3fa69b8745237f2939afe12dbf656afbcb47fe72d947a6", size = 92310, upload-time = "2023-12-13T20:37:23.244Z" },
-]
-
-[[package]]
-name = "isort"
-version = "6.0.1"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/b8/21/1e2a441f74a653a144224d7d21afe8f4169e6c7c20bb13aec3a2dc3815e0/isort-6.0.1.tar.gz", hash = "sha256:1cb5df28dfbc742e490c5e41bad6da41b805b0a8be7bc93cd0fb2a8a890ac450", size = 821955, upload-time = "2025-02-26T21:13:16.955Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/c1/11/114d0a5f4dabbdcedc1125dee0888514c3c3b16d3e9facad87ed96fad97c/isort-6.0.1-py3-none-any.whl", hash = "sha256:2dc5d7f65c9678d94c88dfc29161a320eec67328bc97aad576874cb4be1e9615", size = 94186, upload-time = "2025-02-26T21:13:14.911Z" },
-]
-
-[[package]]
-name = "mccabe"
-version = "0.7.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/e7/ff/0ffefdcac38932a54d2b5eed4e0ba8a408f215002cd178ad1df0f2806ff8/mccabe-0.7.0.tar.gz", hash = "sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325", size = 9658, upload-time = "2022-01-24T01:14:51.113Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/27/1a/1f68f9ba0c207934b35b86a8ca3aad8395a3d6dd7921c0686e23853ff5a9/mccabe-0.7.0-py2.py3-none-any.whl", hash = "sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e", size = 7350, upload-time = "2022-01-24T01:14:49.62Z" },
-]
-
-[[package]]
-name = "mypy-extensions"
-version = "1.1.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/a2/6e/371856a3fb9d31ca8dac321cda606860fa4548858c0cc45d9d1d4ca2628b/mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558", size = 6343, upload-time = "2025-04-22T14:54:24.164Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963, upload-time = "2025-04-22T14:54:22.983Z" },
-]
-
-[[package]]
-name = "packaging"
-version = "25.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
-]
-
-[[package]]
-name = "pathspec"
-version = "0.12.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043, upload-time = "2023-12-10T22:30:45Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191, upload-time = "2023-12-10T22:30:43.14Z" },
-]
-
-[[package]]
-name = "platformdirs"
-version = "4.3.6"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/13/fc/128cc9cb8f03208bdbf93d3aa862e16d376844a14f9a0ce5cf4507372de4/platformdirs-4.3.6.tar.gz", hash = "sha256:357fb2acbc885b0419afd3ce3ed34564c13c9b95c89360cd9563f73aa5e2b907", size = 21302, upload-time = "2024-09-17T19:06:50.688Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/3c/a6/bc1012356d8ece4d66dd75c4b9fc6c1f6650ddd5991e421177d9f8f671be/platformdirs-4.3.6-py3-none-any.whl", hash = "sha256:73e575e1408ab8103900836b97580d5307456908a03e92031bab39e4554cc3fb", size = 18439, upload-time = "2024-09-17T19:06:49.212Z" },
-]
-
-[[package]]
-name = "platformdirs"
-version = "4.4.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/23/e8/21db9c9987b0e728855bd57bff6984f67952bea55d6f75e055c46b5383e8/platformdirs-4.4.0.tar.gz", hash = "sha256:ca753cf4d81dc309bc67b0ea38fd15dc97bc30ce419a7f58d13eb3bf14c4febf", size = 21634, upload-time = "2025-08-26T14:32:04.268Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/40/4b/2028861e724d3bd36227adfa20d3fd24c3fc6d52032f4a93c133be5d17ce/platformdirs-4.4.0-py3-none-any.whl", hash = "sha256:abd01743f24e5287cd7a5db3752faf1a2d65353f38ec26d98e25a6db65958c85", size = 18654, upload-time = "2025-08-26T14:32:02.735Z" },
-]
-
-[[package]]
-name = "pluggy"
-version = "1.5.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/96/2d/02d4312c973c6050a18b314a5ad0b3210edb65a906f868e31c111dede4a6/pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1", size = 67955, upload-time = "2024-04-20T21:34:42.531Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669", size = 20556, upload-time = "2024-04-20T21:34:40.434Z" },
-]
-
-[[package]]
-name = "pluggy"
-version = "1.6.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" },
-]
-
-[[package]]
-name = "pycodestyle"
-version = "2.9.1"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version < '3.8.1'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/b6/83/5bcaedba1f47200f0665ceb07bcb00e2be123192742ee0edfb66b600e5fd/pycodestyle-2.9.1.tar.gz", hash = "sha256:2c9607871d58c76354b697b42f5d57e1ada7d261c261efac224b664affdc5785", size = 102127, upload-time = "2022-08-03T23:13:29.715Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/67/e4/fc77f1039c34b3612c4867b69cbb2b8a4e569720b1f19b0637002ee03aff/pycodestyle-2.9.1-py2.py3-none-any.whl", hash = "sha256:d1735fc58b418fd7c5f658d28d943854f8a849b01a5d0a1e6f3f3fdd0166804b", size = 41493, upload-time = "2022-08-03T23:13:27.416Z" },
-]
-
-[[package]]
-name = "pycodestyle"
-version = "2.12.1"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/43/aa/210b2c9aedd8c1cbeea31a50e42050ad56187754b34eb214c46709445801/pycodestyle-2.12.1.tar.gz", hash = "sha256:6838eae08bbce4f6accd5d5572075c63626a15ee3e6f842df996bf62f6d73521", size = 39232, upload-time = "2024-08-04T20:26:54.576Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/3a/d8/a211b3f85e99a0daa2ddec96c949cac6824bd305b040571b82a03dd62636/pycodestyle-2.12.1-py2.py3-none-any.whl", hash = "sha256:46f0fb92069a7c28ab7bb558f05bfc0110dac69a0cd23c61ea0040283a9d78b3", size = 31284, upload-time = "2024-08-04T20:26:53.173Z" },
-]
-
-[[package]]
-name = "pycodestyle"
-version = "2.14.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/11/e0/abfd2a0d2efe47670df87f3e3a0e2edda42f055053c85361f19c0e2c1ca8/pycodestyle-2.14.0.tar.gz", hash = "sha256:c4b5b517d278089ff9d0abdec919cd97262a3367449ea1c8b49b91529167b783", size = 39472, upload-time = "2025-06-20T18:49:48.75Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/d7/27/a58ddaf8c588a3ef080db9d0b7e0b97215cee3a45df74f3a94dbbf5c893a/pycodestyle-2.14.0-py2.py3-none-any.whl", hash = "sha256:dd6bf7cb4ee77f8e016f9c8e74a35ddd9f67e1d5fd4184d86c3b98e07099f42d", size = 31594, upload-time = "2025-06-20T18:49:47.491Z" },
-]
-
-[[package]]
-name = "pyflakes"
-version = "2.5.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version < '3.8.1'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/07/92/f0cb5381f752e89a598dd2850941e7f570ac3cb8ea4a344854de486db152/pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3", size = 66388, upload-time = "2022-07-30T17:29:05.816Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/dc/13/63178f59f74e53acc2165aee4b002619a3cfa7eeaeac989a9eb41edf364e/pyflakes-2.5.0-py2.py3-none-any.whl", hash = "sha256:4579f67d887f804e67edb544428f264b7b24f435b263c4614f384135cea553d2", size = 66116, upload-time = "2022-07-30T17:29:04.179Z" },
-]
-
-[[package]]
-name = "pyflakes"
-version = "3.2.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/57/f9/669d8c9c86613c9d568757c7f5824bd3197d7b1c6c27553bc5618a27cce2/pyflakes-3.2.0.tar.gz", hash = "sha256:1c61603ff154621fb2a9172037d84dca3500def8c8b630657d1701f026f8af3f", size = 63788, upload-time = "2024-01-05T00:28:47.703Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/d4/d7/f1b7db88d8e4417c5d47adad627a93547f44bdc9028372dbd2313f34a855/pyflakes-3.2.0-py2.py3-none-any.whl", hash = "sha256:84b5be138a2dfbb40689ca07e2152deb896a65c3a3e24c251c5c62489568074a", size = 62725, upload-time = "2024-01-05T00:28:45.903Z" },
-]
-
-[[package]]
-name = "pyflakes"
-version = "3.4.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/45/dc/fd034dc20b4b264b3d015808458391acbf9df40b1e54750ef175d39180b1/pyflakes-3.4.0.tar.gz", hash = "sha256:b24f96fafb7d2ab0ec5075b7350b3d2d2218eab42003821c06344973d3ea2f58", size = 64669, upload-time = "2025-06-20T18:45:27.834Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/c2/2f/81d580a0fb83baeb066698975cb14a618bdbed7720678566f1b046a95fe8/pyflakes-3.4.0-py2.py3-none-any.whl", hash = "sha256:f742a7dbd0d9cb9ea41e9a24a918996e8170c799fa528688d40dd582c8265f4f", size = 63551, upload-time = "2025-06-20T18:45:26.937Z" },
-]
-
-[[package]]
-name = "pygments"
-version = "2.19.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" },
-]
-
-[[package]]
-name = "pytest"
-version = "8.3.5"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-dependencies = [
- { name = "colorama", marker = "python_full_version < '3.9' and sys_platform == 'win32'" },
- { name = "exceptiongroup", marker = "python_full_version < '3.9'" },
- { name = "iniconfig", marker = "python_full_version < '3.9'" },
- { name = "packaging", marker = "python_full_version < '3.9'" },
- { name = "pluggy", version = "1.5.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "tomli", marker = "python_full_version < '3.9'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/ae/3c/c9d525a414d506893f0cd8a8d0de7706446213181570cdbd766691164e40/pytest-8.3.5.tar.gz", hash = "sha256:f4efe70cc14e511565ac476b57c279e12a855b11f48f212af1080ef2263d3845", size = 1450891, upload-time = "2025-03-02T12:54:54.503Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/30/3d/64ad57c803f1fa1e963a7946b6e0fea4a70df53c1a7fed304586539c2bac/pytest-8.3.5-py3-none-any.whl", hash = "sha256:c69214aa47deac29fad6c2a4f590b9c4a9fdb16a403176fe154b79c0b4d4d820", size = 343634, upload-time = "2025-03-02T12:54:52.069Z" },
-]
-
-[[package]]
-name = "pytest"
-version = "8.4.2"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-dependencies = [
- { name = "colorama", marker = "python_full_version >= '3.9' and sys_platform == 'win32'" },
- { name = "exceptiongroup", marker = "python_full_version >= '3.9' and python_full_version < '3.11'" },
- { name = "iniconfig", marker = "python_full_version >= '3.9'" },
- { name = "packaging", marker = "python_full_version >= '3.9'" },
- { name = "pluggy", version = "1.6.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "pygments", marker = "python_full_version >= '3.9'" },
- { name = "tomli", marker = "python_full_version >= '3.9' and python_full_version < '3.11'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/a3/5c/00a0e072241553e1a7496d638deababa67c5058571567b92a7eaa258397c/pytest-8.4.2.tar.gz", hash = "sha256:86c0d0b93306b961d58d62a4db4879f27fe25513d4b969df351abdddb3c30e01", size = 1519618, upload-time = "2025-09-04T14:34:22.711Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/a8/a4/20da314d277121d6534b3a980b29035dcd51e6744bd79075a6ce8fa4eb8d/pytest-8.4.2-py3-none-any.whl", hash = "sha256:872f880de3fc3a5bdc88a11b39c9710c3497a547cfa9320bc3c5e62fbf272e79", size = 365750, upload-time = "2025-09-04T14:34:20.226Z" },
-]
-
-[[package]]
-name = "pytest-asyncio"
-version = "0.24.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-dependencies = [
- { name = "pytest", version = "8.3.5", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/52/6d/c6cf50ce320cf8611df7a1254d86233b3df7cc07f9b5f5cbcb82e08aa534/pytest_asyncio-0.24.0.tar.gz", hash = "sha256:d081d828e576d85f875399194281e92bf8a68d60d72d1a2faf2feddb6c46b276", size = 49855, upload-time = "2024-08-22T08:03:18.145Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/96/31/6607dab48616902f76885dfcf62c08d929796fc3b2d2318faf9fd54dbed9/pytest_asyncio-0.24.0-py3-none-any.whl", hash = "sha256:a811296ed596b69bf0b6f3dc40f83bcaf341b155a269052d82efa2b25ac7037b", size = 18024, upload-time = "2024-08-22T08:03:15.536Z" },
-]
-
-[[package]]
-name = "pytest-asyncio"
-version = "1.1.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-dependencies = [
- { name = "backports-asyncio-runner", marker = "python_full_version >= '3.9' and python_full_version < '3.11'" },
- { name = "pytest", version = "8.4.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "typing-extensions", version = "4.15.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version == '3.9.*'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/4e/51/f8794af39eeb870e87a8c8068642fc07bce0c854d6865d7dd0f2a9d338c2/pytest_asyncio-1.1.0.tar.gz", hash = "sha256:796aa822981e01b68c12e4827b8697108f7205020f24b5793b3c41555dab68ea", size = 46652, upload-time = "2025-07-16T04:29:26.393Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/c7/9d/bf86eddabf8c6c9cb1ea9a869d6873b46f105a5d292d3a6f7071f5b07935/pytest_asyncio-1.1.0-py3-none-any.whl", hash = "sha256:5fe2d69607b0bd75c656d1211f969cadba035030156745ee09e7d71740e58ecf", size = 15157, upload-time = "2025-07-16T04:29:24.929Z" },
-]
-
-[[package]]
-name = "pytest-cov"
-version = "5.0.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-dependencies = [
- { name = "coverage", version = "7.6.1", source = { registry = "https://pypi.org/simple" }, extra = ["toml"], marker = "python_full_version < '3.9'" },
- { name = "pytest", version = "8.3.5", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/74/67/00efc8d11b630c56f15f4ad9c7f9223f1e5ec275aaae3fa9118c6a223ad2/pytest-cov-5.0.0.tar.gz", hash = "sha256:5837b58e9f6ebd335b0f8060eecce69b662415b16dc503883a02f45dfeb14857", size = 63042, upload-time = "2024-03-24T20:16:34.856Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/78/3a/af5b4fa5961d9a1e6237b530eb87dd04aea6eb83da09d2a4073d81b54ccf/pytest_cov-5.0.0-py3-none-any.whl", hash = "sha256:4f0764a1219df53214206bf1feea4633c3b558a2925c8b59f144f682861ce652", size = 21990, upload-time = "2024-03-24T20:16:32.444Z" },
-]
-
-[[package]]
-name = "pytest-cov"
-version = "7.0.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-dependencies = [
- { name = "coverage", version = "7.10.6", source = { registry = "https://pypi.org/simple" }, extra = ["toml"], marker = "python_full_version >= '3.9'" },
- { name = "pluggy", version = "1.6.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "pytest", version = "8.4.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/5e/f7/c933acc76f5208b3b00089573cf6a2bc26dc80a8aece8f52bb7d6b1855ca/pytest_cov-7.0.0.tar.gz", hash = "sha256:33c97eda2e049a0c5298e91f519302a1334c26ac65c1a483d6206fd458361af1", size = 54328, upload-time = "2025-09-09T10:57:02.113Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/ee/49/1377b49de7d0c1ce41292161ea0f721913fa8722c19fb9c1e3aa0367eecb/pytest_cov-7.0.0-py3-none-any.whl", hash = "sha256:3b8e9558b16cc1479da72058bdecf8073661c7f57f7d3c5f22a1c23507f2d861", size = 22424, upload-time = "2025-09-09T10:57:00.695Z" },
-]
-
-[[package]]
-name = "sniffio"
-version = "1.3.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" },
-]
-
-[[package]]
-name = "tomli"
-version = "2.2.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/18/87/302344fed471e44a87289cf4967697d07e532f2421fdaf868a303cbae4ff/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff", size = 17175, upload-time = "2024-11-27T22:38:36.873Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/43/ca/75707e6efa2b37c77dadb324ae7d9571cb424e61ea73fad7c56c2d14527f/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249", size = 131077, upload-time = "2024-11-27T22:37:54.956Z" },
- { url = "https://files.pythonhosted.org/packages/c7/16/51ae563a8615d472fdbffc43a3f3d46588c264ac4f024f63f01283becfbb/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6", size = 123429, upload-time = "2024-11-27T22:37:56.698Z" },
- { url = "https://files.pythonhosted.org/packages/f1/dd/4f6cd1e7b160041db83c694abc78e100473c15d54620083dbd5aae7b990e/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a", size = 226067, upload-time = "2024-11-27T22:37:57.63Z" },
- { url = "https://files.pythonhosted.org/packages/a9/6b/c54ede5dc70d648cc6361eaf429304b02f2871a345bbdd51e993d6cdf550/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee", size = 236030, upload-time = "2024-11-27T22:37:59.344Z" },
- { url = "https://files.pythonhosted.org/packages/1f/47/999514fa49cfaf7a92c805a86c3c43f4215621855d151b61c602abb38091/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e", size = 240898, upload-time = "2024-11-27T22:38:00.429Z" },
- { url = "https://files.pythonhosted.org/packages/73/41/0a01279a7ae09ee1573b423318e7934674ce06eb33f50936655071d81a24/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4", size = 229894, upload-time = "2024-11-27T22:38:02.094Z" },
- { url = "https://files.pythonhosted.org/packages/55/18/5d8bc5b0a0362311ce4d18830a5d28943667599a60d20118074ea1b01bb7/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106", size = 245319, upload-time = "2024-11-27T22:38:03.206Z" },
- { url = "https://files.pythonhosted.org/packages/92/a3/7ade0576d17f3cdf5ff44d61390d4b3febb8a9fc2b480c75c47ea048c646/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8", size = 238273, upload-time = "2024-11-27T22:38:04.217Z" },
- { url = "https://files.pythonhosted.org/packages/72/6f/fa64ef058ac1446a1e51110c375339b3ec6be245af9d14c87c4a6412dd32/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff", size = 98310, upload-time = "2024-11-27T22:38:05.908Z" },
- { url = "https://files.pythonhosted.org/packages/6a/1c/4a2dcde4a51b81be3530565e92eda625d94dafb46dbeb15069df4caffc34/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b", size = 108309, upload-time = "2024-11-27T22:38:06.812Z" },
- { url = "https://files.pythonhosted.org/packages/52/e1/f8af4c2fcde17500422858155aeb0d7e93477a0d59a98e56cbfe75070fd0/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea", size = 132762, upload-time = "2024-11-27T22:38:07.731Z" },
- { url = "https://files.pythonhosted.org/packages/03/b8/152c68bb84fc00396b83e7bbddd5ec0bd3dd409db4195e2a9b3e398ad2e3/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8", size = 123453, upload-time = "2024-11-27T22:38:09.384Z" },
- { url = "https://files.pythonhosted.org/packages/c8/d6/fc9267af9166f79ac528ff7e8c55c8181ded34eb4b0e93daa767b8841573/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192", size = 233486, upload-time = "2024-11-27T22:38:10.329Z" },
- { url = "https://files.pythonhosted.org/packages/5c/51/51c3f2884d7bab89af25f678447ea7d297b53b5a3b5730a7cb2ef6069f07/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222", size = 242349, upload-time = "2024-11-27T22:38:11.443Z" },
- { url = "https://files.pythonhosted.org/packages/ab/df/bfa89627d13a5cc22402e441e8a931ef2108403db390ff3345c05253935e/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77", size = 252159, upload-time = "2024-11-27T22:38:13.099Z" },
- { url = "https://files.pythonhosted.org/packages/9e/6e/fa2b916dced65763a5168c6ccb91066f7639bdc88b48adda990db10c8c0b/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6", size = 237243, upload-time = "2024-11-27T22:38:14.766Z" },
- { url = "https://files.pythonhosted.org/packages/b4/04/885d3b1f650e1153cbb93a6a9782c58a972b94ea4483ae4ac5cedd5e4a09/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd", size = 259645, upload-time = "2024-11-27T22:38:15.843Z" },
- { url = "https://files.pythonhosted.org/packages/9c/de/6b432d66e986e501586da298e28ebeefd3edc2c780f3ad73d22566034239/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e", size = 244584, upload-time = "2024-11-27T22:38:17.645Z" },
- { url = "https://files.pythonhosted.org/packages/1c/9a/47c0449b98e6e7d1be6cbac02f93dd79003234ddc4aaab6ba07a9a7482e2/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98", size = 98875, upload-time = "2024-11-27T22:38:19.159Z" },
- { url = "https://files.pythonhosted.org/packages/ef/60/9b9638f081c6f1261e2688bd487625cd1e660d0a85bd469e91d8db969734/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4", size = 109418, upload-time = "2024-11-27T22:38:20.064Z" },
- { url = "https://files.pythonhosted.org/packages/04/90/2ee5f2e0362cb8a0b6499dc44f4d7d48f8fff06d28ba46e6f1eaa61a1388/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7", size = 132708, upload-time = "2024-11-27T22:38:21.659Z" },
- { url = "https://files.pythonhosted.org/packages/c0/ec/46b4108816de6b385141f082ba99e315501ccd0a2ea23db4a100dd3990ea/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c", size = 123582, upload-time = "2024-11-27T22:38:22.693Z" },
- { url = "https://files.pythonhosted.org/packages/a0/bd/b470466d0137b37b68d24556c38a0cc819e8febe392d5b199dcd7f578365/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13", size = 232543, upload-time = "2024-11-27T22:38:24.367Z" },
- { url = "https://files.pythonhosted.org/packages/d9/e5/82e80ff3b751373f7cead2815bcbe2d51c895b3c990686741a8e56ec42ab/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281", size = 241691, upload-time = "2024-11-27T22:38:26.081Z" },
- { url = "https://files.pythonhosted.org/packages/05/7e/2a110bc2713557d6a1bfb06af23dd01e7dde52b6ee7dadc589868f9abfac/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272", size = 251170, upload-time = "2024-11-27T22:38:27.921Z" },
- { url = "https://files.pythonhosted.org/packages/64/7b/22d713946efe00e0adbcdfd6d1aa119ae03fd0b60ebed51ebb3fa9f5a2e5/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140", size = 236530, upload-time = "2024-11-27T22:38:29.591Z" },
- { url = "https://files.pythonhosted.org/packages/38/31/3a76f67da4b0cf37b742ca76beaf819dca0ebef26d78fc794a576e08accf/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2", size = 258666, upload-time = "2024-11-27T22:38:30.639Z" },
- { url = "https://files.pythonhosted.org/packages/07/10/5af1293da642aded87e8a988753945d0cf7e00a9452d3911dd3bb354c9e2/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744", size = 243954, upload-time = "2024-11-27T22:38:31.702Z" },
- { url = "https://files.pythonhosted.org/packages/5b/b9/1ed31d167be802da0fc95020d04cd27b7d7065cc6fbefdd2f9186f60d7bd/tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec", size = 98724, upload-time = "2024-11-27T22:38:32.837Z" },
- { url = "https://files.pythonhosted.org/packages/c7/32/b0963458706accd9afcfeb867c0f9175a741bf7b19cd424230714d722198/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69", size = 109383, upload-time = "2024-11-27T22:38:34.455Z" },
- { url = "https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257, upload-time = "2024-11-27T22:38:35.385Z" },
-]
-
-[[package]]
-name = "typing-extensions"
-version = "4.13.2"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/f6/37/23083fcd6e35492953e8d2aaaa68b860eb422b34627b13f2ce3eb6106061/typing_extensions-4.13.2.tar.gz", hash = "sha256:e6c81219bd689f51865d9e372991c540bda33a0379d5573cddb9a3a23f7caaef", size = 106967, upload-time = "2025-04-10T14:19:05.416Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/8b/54/b1ae86c0973cc6f0210b53d508ca3641fb6d0c56823f288d108bc7ab3cc8/typing_extensions-4.13.2-py3-none-any.whl", hash = "sha256:a439e7c04b49fec3e5d3e2beaa21755cadbbdc391694e28ccdd36ca4a1408f8c", size = 45806, upload-time = "2025-04-10T14:19:03.967Z" },
-]
-
-[[package]]
-name = "typing-extensions"
-version = "4.15.0"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" },
-]
-
-[[package]]
-name = "vteam-shared-configs"
-version = "1.0.0"
-source = { editable = "." }
-dependencies = [
- { name = "click", version = "8.1.8", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.10'" },
- { name = "click", version = "8.2.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.10'" },
- { name = "httpx" },
- { name = "pytest", version = "8.3.5", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "pytest", version = "8.4.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "pytest-asyncio", version = "0.24.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "pytest-asyncio", version = "1.1.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "pytest-cov", version = "5.0.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "pytest-cov", version = "7.0.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "websockets", version = "13.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "websockets", version = "15.0.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
-]
-
-[package.optional-dependencies]
-dev = [
- { name = "black", version = "24.8.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "black", version = "25.1.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "flake8", version = "5.0.4", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.8.1'" },
- { name = "flake8", version = "7.1.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.8.1' and python_full_version < '3.9'" },
- { name = "flake8", version = "7.3.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "isort", version = "5.13.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "isort", version = "6.0.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
- { name = "pytest", version = "8.3.5", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.9'" },
- { name = "pytest", version = "8.4.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.9'" },
-]
-
-[package.metadata]
-requires-dist = [
- { name = "black", marker = "extra == 'dev'" },
- { name = "click", specifier = ">=8.0.0" },
- { name = "flake8", marker = "extra == 'dev'" },
- { name = "httpx", specifier = ">=0.28.1" },
- { name = "isort", marker = "extra == 'dev'" },
- { name = "pytest", specifier = ">=8.3.5" },
- { name = "pytest", marker = "extra == 'dev'", specifier = ">=6.0" },
- { name = "pytest-asyncio", specifier = ">=0.24.0" },
- { name = "pytest-cov", specifier = ">=5.0.0" },
- { name = "websockets", specifier = ">=13.1" },
-]
-provides-extras = ["dev"]
-
-[[package]]
-name = "websockets"
-version = "13.1"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.8.1' and python_full_version < '3.9'",
- "python_full_version < '3.8.1'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/e2/73/9223dbc7be3dcaf2a7bbf756c351ec8da04b1fa573edaf545b95f6b0c7fd/websockets-13.1.tar.gz", hash = "sha256:a3b3366087c1bc0a2795111edcadddb8b3b59509d5db5d7ea3fdd69f954a8878", size = 158549, upload-time = "2024-09-21T17:34:21.54Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/0a/94/d15dbfc6a5eb636dbc754303fba18208f2e88cf97e733e1d64fb9cb5c89e/websockets-13.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f48c749857f8fb598fb890a75f540e3221d0976ed0bf879cf3c7eef34151acee", size = 157815, upload-time = "2024-09-21T17:32:27.107Z" },
- { url = "https://files.pythonhosted.org/packages/30/02/c04af33f4663945a26f5e8cf561eb140c35452b50af47a83c3fbcfe62ae1/websockets-13.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c7e72ce6bda6fb9409cc1e8164dd41d7c91466fb599eb047cfda72fe758a34a7", size = 155466, upload-time = "2024-09-21T17:32:28.428Z" },
- { url = "https://files.pythonhosted.org/packages/35/e8/719f08d12303ea643655e52d9e9851b2dadbb1991d4926d9ce8862efa2f5/websockets-13.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f779498eeec470295a2b1a5d97aa1bc9814ecd25e1eb637bd9d1c73a327387f6", size = 155716, upload-time = "2024-09-21T17:32:29.905Z" },
- { url = "https://files.pythonhosted.org/packages/91/e1/14963ae0252a8925f7434065d25dcd4701d5e281a0b4b460a3b5963d2594/websockets-13.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4676df3fe46956fbb0437d8800cd5f2b6d41143b6e7e842e60554398432cf29b", size = 164806, upload-time = "2024-09-21T17:32:31.384Z" },
- { url = "https://files.pythonhosted.org/packages/ec/fa/ab28441bae5e682a0f7ddf3d03440c0c352f930da419301f4a717f675ef3/websockets-13.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a7affedeb43a70351bb811dadf49493c9cfd1ed94c9c70095fd177e9cc1541fa", size = 163810, upload-time = "2024-09-21T17:32:32.384Z" },
- { url = "https://files.pythonhosted.org/packages/44/77/dea187bd9d16d4b91566a2832be31f99a40d0f5bfa55eeb638eb2c3bc33d/websockets-13.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1971e62d2caa443e57588e1d82d15f663b29ff9dfe7446d9964a4b6f12c1e700", size = 164125, upload-time = "2024-09-21T17:32:33.398Z" },
- { url = "https://files.pythonhosted.org/packages/cf/d9/3af14544e83f1437eb684b399e6ba0fa769438e869bf5d83d74bc197fae8/websockets-13.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:5f2e75431f8dc4a47f31565a6e1355fb4f2ecaa99d6b89737527ea917066e26c", size = 164532, upload-time = "2024-09-21T17:32:35.109Z" },
- { url = "https://files.pythonhosted.org/packages/1c/8a/6d332eabe7d59dfefe4b8ba6f46c8c5fabb15b71c8a8bc3d2b65de19a7b6/websockets-13.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:58cf7e75dbf7e566088b07e36ea2e3e2bd5676e22216e4cad108d4df4a7402a0", size = 163948, upload-time = "2024-09-21T17:32:36.214Z" },
- { url = "https://files.pythonhosted.org/packages/1a/91/a0aeadbaf3017467a1ee03f8fb67accdae233fe2d5ad4b038c0a84e357b0/websockets-13.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:c90d6dec6be2c7d03378a574de87af9b1efea77d0c52a8301dd831ece938452f", size = 163898, upload-time = "2024-09-21T17:32:37.277Z" },
- { url = "https://files.pythonhosted.org/packages/71/31/a90fb47c63e0ae605be914b0b969d7c6e6ffe2038cd744798e4b3fbce53b/websockets-13.1-cp310-cp310-win32.whl", hash = "sha256:730f42125ccb14602f455155084f978bd9e8e57e89b569b4d7f0f0c17a448ffe", size = 158706, upload-time = "2024-09-21T17:32:38.755Z" },
- { url = "https://files.pythonhosted.org/packages/93/ca/9540a9ba80da04dc7f36d790c30cae4252589dbd52ccdc92e75b0be22437/websockets-13.1-cp310-cp310-win_amd64.whl", hash = "sha256:5993260f483d05a9737073be197371940c01b257cc45ae3f1d5d7adb371b266a", size = 159141, upload-time = "2024-09-21T17:32:40.495Z" },
- { url = "https://files.pythonhosted.org/packages/b2/f0/cf0b8a30d86b49e267ac84addbebbc7a48a6e7bb7c19db80f62411452311/websockets-13.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:61fc0dfcda609cda0fc9fe7977694c0c59cf9d749fbb17f4e9483929e3c48a19", size = 157813, upload-time = "2024-09-21T17:32:42.188Z" },
- { url = "https://files.pythonhosted.org/packages/bf/e7/22285852502e33071a8cf0ac814f8988480ec6db4754e067b8b9d0e92498/websockets-13.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ceec59f59d092c5007e815def4ebb80c2de330e9588e101cf8bd94c143ec78a5", size = 155469, upload-time = "2024-09-21T17:32:43.858Z" },
- { url = "https://files.pythonhosted.org/packages/68/d4/c8c7c1e5b40ee03c5cc235955b0fb1ec90e7e37685a5f69229ad4708dcde/websockets-13.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:c1dca61c6db1166c48b95198c0b7d9c990b30c756fc2923cc66f68d17dc558fd", size = 155717, upload-time = "2024-09-21T17:32:44.914Z" },
- { url = "https://files.pythonhosted.org/packages/c9/e4/c50999b9b848b1332b07c7fd8886179ac395cb766fda62725d1539e7bc6c/websockets-13.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:308e20f22c2c77f3f39caca508e765f8725020b84aa963474e18c59accbf4c02", size = 165379, upload-time = "2024-09-21T17:32:45.933Z" },
- { url = "https://files.pythonhosted.org/packages/bc/49/4a4ad8c072f18fd79ab127650e47b160571aacfc30b110ee305ba25fffc9/websockets-13.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:62d516c325e6540e8a57b94abefc3459d7dab8ce52ac75c96cad5549e187e3a7", size = 164376, upload-time = "2024-09-21T17:32:46.987Z" },
- { url = "https://files.pythonhosted.org/packages/af/9b/8c06d425a1d5a74fd764dd793edd02be18cf6fc3b1ccd1f29244ba132dc0/websockets-13.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87c6e35319b46b99e168eb98472d6c7d8634ee37750d7693656dc766395df096", size = 164753, upload-time = "2024-09-21T17:32:48.046Z" },
- { url = "https://files.pythonhosted.org/packages/d5/5b/0acb5815095ff800b579ffc38b13ab1b915b317915023748812d24e0c1ac/websockets-13.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:5f9fee94ebafbc3117c30be1844ed01a3b177bb6e39088bc6b2fa1dc15572084", size = 165051, upload-time = "2024-09-21T17:32:49.271Z" },
- { url = "https://files.pythonhosted.org/packages/30/93/c3891c20114eacb1af09dedfcc620c65c397f4fd80a7009cd12d9457f7f5/websockets-13.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:7c1e90228c2f5cdde263253fa5db63e6653f1c00e7ec64108065a0b9713fa1b3", size = 164489, upload-time = "2024-09-21T17:32:50.392Z" },
- { url = "https://files.pythonhosted.org/packages/28/09/af9e19885539759efa2e2cd29b8b3f9eecef7ecefea40d46612f12138b36/websockets-13.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:6548f29b0e401eea2b967b2fdc1c7c7b5ebb3eeb470ed23a54cd45ef078a0db9", size = 164438, upload-time = "2024-09-21T17:32:52.223Z" },
- { url = "https://files.pythonhosted.org/packages/b6/08/6f38b8e625b3d93de731f1d248cc1493327f16cb45b9645b3e791782cff0/websockets-13.1-cp311-cp311-win32.whl", hash = "sha256:c11d4d16e133f6df8916cc5b7e3e96ee4c44c936717d684a94f48f82edb7c92f", size = 158710, upload-time = "2024-09-21T17:32:53.244Z" },
- { url = "https://files.pythonhosted.org/packages/fb/39/ec8832ecb9bb04a8d318149005ed8cee0ba4e0205835da99e0aa497a091f/websockets-13.1-cp311-cp311-win_amd64.whl", hash = "sha256:d04f13a1d75cb2b8382bdc16ae6fa58c97337253826dfe136195b7f89f661557", size = 159137, upload-time = "2024-09-21T17:32:54.721Z" },
- { url = "https://files.pythonhosted.org/packages/df/46/c426282f543b3c0296cf964aa5a7bb17e984f58dde23460c3d39b3148fcf/websockets-13.1-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:9d75baf00138f80b48f1eac72ad1535aac0b6461265a0bcad391fc5aba875cfc", size = 157821, upload-time = "2024-09-21T17:32:56.442Z" },
- { url = "https://files.pythonhosted.org/packages/aa/85/22529867010baac258da7c45848f9415e6cf37fef00a43856627806ffd04/websockets-13.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:9b6f347deb3dcfbfde1c20baa21c2ac0751afaa73e64e5b693bb2b848efeaa49", size = 155480, upload-time = "2024-09-21T17:32:57.698Z" },
- { url = "https://files.pythonhosted.org/packages/29/2c/bdb339bfbde0119a6e84af43ebf6275278698a2241c2719afc0d8b0bdbf2/websockets-13.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:de58647e3f9c42f13f90ac7e5f58900c80a39019848c5547bc691693098ae1bd", size = 155715, upload-time = "2024-09-21T17:32:59.429Z" },
- { url = "https://files.pythonhosted.org/packages/9f/d0/8612029ea04c5c22bf7af2fd3d63876c4eaeef9b97e86c11972a43aa0e6c/websockets-13.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1b54689e38d1279a51d11e3467dd2f3a50f5f2e879012ce8f2d6943f00e83f0", size = 165647, upload-time = "2024-09-21T17:33:00.495Z" },
- { url = "https://files.pythonhosted.org/packages/56/04/1681ed516fa19ca9083f26d3f3a302257e0911ba75009533ed60fbb7b8d1/websockets-13.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cf1781ef73c073e6b0f90af841aaf98501f975d306bbf6221683dd594ccc52b6", size = 164592, upload-time = "2024-09-21T17:33:02.223Z" },
- { url = "https://files.pythonhosted.org/packages/38/6f/a96417a49c0ed132bb6087e8e39a37db851c70974f5c724a4b2a70066996/websockets-13.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d23b88b9388ed85c6faf0e74d8dec4f4d3baf3ecf20a65a47b836d56260d4b9", size = 165012, upload-time = "2024-09-21T17:33:03.288Z" },
- { url = "https://files.pythonhosted.org/packages/40/8b/fccf294919a1b37d190e86042e1a907b8f66cff2b61e9befdbce03783e25/websockets-13.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3c78383585f47ccb0fcf186dcb8a43f5438bd7d8f47d69e0b56f71bf431a0a68", size = 165311, upload-time = "2024-09-21T17:33:04.728Z" },
- { url = "https://files.pythonhosted.org/packages/c1/61/f8615cf7ce5fe538476ab6b4defff52beb7262ff8a73d5ef386322d9761d/websockets-13.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:d6d300f8ec35c24025ceb9b9019ae9040c1ab2f01cddc2bcc0b518af31c75c14", size = 164692, upload-time = "2024-09-21T17:33:05.829Z" },
- { url = "https://files.pythonhosted.org/packages/5c/f1/a29dd6046d3a722d26f182b783a7997d25298873a14028c4760347974ea3/websockets-13.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a9dcaf8b0cc72a392760bb8755922c03e17a5a54e08cca58e8b74f6902b433cf", size = 164686, upload-time = "2024-09-21T17:33:06.823Z" },
- { url = "https://files.pythonhosted.org/packages/0f/99/ab1cdb282f7e595391226f03f9b498f52109d25a2ba03832e21614967dfa/websockets-13.1-cp312-cp312-win32.whl", hash = "sha256:2f85cf4f2a1ba8f602298a853cec8526c2ca42a9a4b947ec236eaedb8f2dc80c", size = 158712, upload-time = "2024-09-21T17:33:07.877Z" },
- { url = "https://files.pythonhosted.org/packages/46/93/e19160db48b5581feac8468330aa11b7292880a94a37d7030478596cc14e/websockets-13.1-cp312-cp312-win_amd64.whl", hash = "sha256:38377f8b0cdeee97c552d20cf1865695fcd56aba155ad1b4ca8779a5b6ef4ac3", size = 159145, upload-time = "2024-09-21T17:33:09.202Z" },
- { url = "https://files.pythonhosted.org/packages/51/20/2b99ca918e1cbd33c53db2cace5f0c0cd8296fc77558e1908799c712e1cd/websockets-13.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a9ab1e71d3d2e54a0aa646ab6d4eebfaa5f416fe78dfe4da2839525dc5d765c6", size = 157828, upload-time = "2024-09-21T17:33:10.987Z" },
- { url = "https://files.pythonhosted.org/packages/b8/47/0932a71d3d9c0e9483174f60713c84cee58d62839a143f21a2bcdbd2d205/websockets-13.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b9d7439d7fab4dce00570bb906875734df13d9faa4b48e261c440a5fec6d9708", size = 155487, upload-time = "2024-09-21T17:33:12.153Z" },
- { url = "https://files.pythonhosted.org/packages/a9/60/f1711eb59ac7a6c5e98e5637fef5302f45b6f76a2c9d64fd83bbb341377a/websockets-13.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:327b74e915cf13c5931334c61e1a41040e365d380f812513a255aa804b183418", size = 155721, upload-time = "2024-09-21T17:33:13.909Z" },
- { url = "https://files.pythonhosted.org/packages/6a/e6/ba9a8db7f9d9b0e5f829cf626ff32677f39824968317223605a6b419d445/websockets-13.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:325b1ccdbf5e5725fdcb1b0e9ad4d2545056479d0eee392c291c1bf76206435a", size = 165609, upload-time = "2024-09-21T17:33:14.967Z" },
- { url = "https://files.pythonhosted.org/packages/c1/22/4ec80f1b9c27a0aebd84ccd857252eda8418ab9681eb571b37ca4c5e1305/websockets-13.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:346bee67a65f189e0e33f520f253d5147ab76ae42493804319b5716e46dddf0f", size = 164556, upload-time = "2024-09-21T17:33:17.113Z" },
- { url = "https://files.pythonhosted.org/packages/27/ac/35f423cb6bb15600438db80755609d27eda36d4c0b3c9d745ea12766c45e/websockets-13.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:91a0fa841646320ec0d3accdff5b757b06e2e5c86ba32af2e0815c96c7a603c5", size = 164993, upload-time = "2024-09-21T17:33:18.168Z" },
- { url = "https://files.pythonhosted.org/packages/31/4e/98db4fd267f8be9e52e86b6ee4e9aa7c42b83452ea0ea0672f176224b977/websockets-13.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:18503d2c5f3943e93819238bf20df71982d193f73dcecd26c94514f417f6b135", size = 165360, upload-time = "2024-09-21T17:33:19.233Z" },
- { url = "https://files.pythonhosted.org/packages/3f/15/3f0de7cda70ffc94b7e7024544072bc5b26e2c1eb36545291abb755d8cdb/websockets-13.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:a9cd1af7e18e5221d2878378fbc287a14cd527fdd5939ed56a18df8a31136bb2", size = 164745, upload-time = "2024-09-21T17:33:20.361Z" },
- { url = "https://files.pythonhosted.org/packages/a1/6e/66b6b756aebbd680b934c8bdbb6dcb9ce45aad72cde5f8a7208dbb00dd36/websockets-13.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:70c5be9f416aa72aab7a2a76c90ae0a4fe2755c1816c153c1a2bcc3333ce4ce6", size = 164732, upload-time = "2024-09-21T17:33:23.103Z" },
- { url = "https://files.pythonhosted.org/packages/35/c6/12e3aab52c11aeb289e3dbbc05929e7a9d90d7a9173958477d3ef4f8ce2d/websockets-13.1-cp313-cp313-win32.whl", hash = "sha256:624459daabeb310d3815b276c1adef475b3e6804abaf2d9d2c061c319f7f187d", size = 158709, upload-time = "2024-09-21T17:33:24.196Z" },
- { url = "https://files.pythonhosted.org/packages/41/d8/63d6194aae711d7263df4498200c690a9c39fb437ede10f3e157a6343e0d/websockets-13.1-cp313-cp313-win_amd64.whl", hash = "sha256:c518e84bb59c2baae725accd355c8dc517b4a3ed8db88b4bc93c78dae2974bf2", size = 159144, upload-time = "2024-09-21T17:33:25.96Z" },
- { url = "https://files.pythonhosted.org/packages/83/69/59872420e5bce60db166d6fba39ee24c719d339fb0ae48cb2ce580129882/websockets-13.1-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:c7934fd0e920e70468e676fe7f1b7261c1efa0d6c037c6722278ca0228ad9d0d", size = 157811, upload-time = "2024-09-21T17:33:27.379Z" },
- { url = "https://files.pythonhosted.org/packages/bb/f7/0610032e0d3981758fdd6ee7c68cc02ebf668a762c5178d3d91748228849/websockets-13.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:149e622dc48c10ccc3d2760e5f36753db9cacf3ad7bc7bbbfd7d9c819e286f23", size = 155471, upload-time = "2024-09-21T17:33:28.473Z" },
- { url = "https://files.pythonhosted.org/packages/55/2f/c43173a72ea395263a427a36d25bce2675f41c809424466a13c61a9a2d61/websockets-13.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a569eb1b05d72f9bce2ebd28a1ce2054311b66677fcd46cf36204ad23acead8c", size = 155713, upload-time = "2024-09-21T17:33:29.795Z" },
- { url = "https://files.pythonhosted.org/packages/92/7e/8fa930c6426a56c47910792717787640329e4a0e37cdfda20cf89da67126/websockets-13.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:95df24ca1e1bd93bbca51d94dd049a984609687cb2fb08a7f2c56ac84e9816ea", size = 164995, upload-time = "2024-09-21T17:33:30.802Z" },
- { url = "https://files.pythonhosted.org/packages/27/29/50ed4c68a3f606565a2db4b13948ae7b6f6c53aa9f8f258d92be6698d276/websockets-13.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d8dbb1bf0c0a4ae8b40bdc9be7f644e2f3fb4e8a9aca7145bfa510d4a374eeb7", size = 164057, upload-time = "2024-09-21T17:33:31.862Z" },
- { url = "https://files.pythonhosted.org/packages/3c/0e/60da63b1c53c47f389f79312b3356cb305600ffad1274d7ec473128d4e6b/websockets-13.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:035233b7531fb92a76beefcbf479504db8c72eb3bff41da55aecce3a0f729e54", size = 164340, upload-time = "2024-09-21T17:33:33.022Z" },
- { url = "https://files.pythonhosted.org/packages/20/ef/d87c5fc0aa7fafad1d584b6459ddfe062edf0d0dd64800a02e67e5de048b/websockets-13.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:e4450fc83a3df53dec45922b576e91e94f5578d06436871dce3a6be38e40f5db", size = 164222, upload-time = "2024-09-21T17:33:34.423Z" },
- { url = "https://files.pythonhosted.org/packages/f2/c4/7916e1f6b5252d3dcb9121b67d7fdbb2d9bf5067a6d8c88885ba27a9e69c/websockets-13.1-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:463e1c6ec853202dd3657f156123d6b4dad0c546ea2e2e38be2b3f7c5b8e7295", size = 163647, upload-time = "2024-09-21T17:33:35.841Z" },
- { url = "https://files.pythonhosted.org/packages/de/df/2ebebb807f10993c35c10cbd3628a7944b66bd5fb6632a561f8666f3a68e/websockets-13.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:6d6855bbe70119872c05107e38fbc7f96b1d8cb047d95c2c50869a46c65a8e96", size = 163590, upload-time = "2024-09-21T17:33:37.61Z" },
- { url = "https://files.pythonhosted.org/packages/b5/82/d48911f56bb993c11099a1ff1d4041d9d1481d50271100e8ee62bc28f365/websockets-13.1-cp38-cp38-win32.whl", hash = "sha256:204e5107f43095012b00f1451374693267adbb832d29966a01ecc4ce1db26faf", size = 158701, upload-time = "2024-09-21T17:33:38.695Z" },
- { url = "https://files.pythonhosted.org/packages/8b/b3/945aacb21fc89ad150403cbaa974c9e846f098f16d9f39a3dd6094f9beb1/websockets-13.1-cp38-cp38-win_amd64.whl", hash = "sha256:485307243237328c022bc908b90e4457d0daa8b5cf4b3723fd3c4a8012fce4c6", size = 159146, upload-time = "2024-09-21T17:33:39.855Z" },
- { url = "https://files.pythonhosted.org/packages/61/26/5f7a7fb03efedb4f90ed61968338bfe7c389863b0ceda239b94ae61c5ae4/websockets-13.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:9b37c184f8b976f0c0a231a5f3d6efe10807d41ccbe4488df8c74174805eea7d", size = 157810, upload-time = "2024-09-21T17:33:40.94Z" },
- { url = "https://files.pythonhosted.org/packages/0e/d4/9b4814a07dffaa7a79d71b4944d10836f9adbd527a113f6675734ef3abed/websockets-13.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:163e7277e1a0bd9fb3c8842a71661ad19c6aa7bb3d6678dc7f89b17fbcc4aeb7", size = 155467, upload-time = "2024-09-21T17:33:42.075Z" },
- { url = "https://files.pythonhosted.org/packages/1a/1a/2abdc7ce3b56429ae39d6bfb48d8c791f5a26bbcb6f44aabcf71ffc3fda2/websockets-13.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4b889dbd1342820cc210ba44307cf75ae5f2f96226c0038094455a96e64fb07a", size = 155714, upload-time = "2024-09-21T17:33:43.128Z" },
- { url = "https://files.pythonhosted.org/packages/2a/98/189d7cf232753a719b2726ec55e7922522632248d5d830adf078e3f612be/websockets-13.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:586a356928692c1fed0eca68b4d1c2cbbd1ca2acf2ac7e7ebd3b9052582deefa", size = 164587, upload-time = "2024-09-21T17:33:44.27Z" },
- { url = "https://files.pythonhosted.org/packages/a5/2b/fb77cedf3f9f55ef8605238c801eef6b9a5269b01a396875a86896aea3a6/websockets-13.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7bd6abf1e070a6b72bfeb71049d6ad286852e285f146682bf30d0296f5fbadfa", size = 163588, upload-time = "2024-09-21T17:33:45.38Z" },
- { url = "https://files.pythonhosted.org/packages/a3/b7/070481b83d2d5ac0f19233d9f364294e224e6478b0762f07fa7f060e0619/websockets-13.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6d2aad13a200e5934f5a6767492fb07151e1de1d6079c003ab31e1823733ae79", size = 163894, upload-time = "2024-09-21T17:33:46.651Z" },
- { url = "https://files.pythonhosted.org/packages/eb/be/d6e1cff7d441cfe5eafaacc5935463e5f14c8b1c0d39cb8afde82709b55a/websockets-13.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:df01aea34b6e9e33572c35cd16bae5a47785e7d5c8cb2b54b2acdb9678315a17", size = 164315, upload-time = "2024-09-21T17:33:48.432Z" },
- { url = "https://files.pythonhosted.org/packages/8b/5e/ffa234473e46ab2d3f9fd9858163d5db3ecea1439e4cb52966d78906424b/websockets-13.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:e54affdeb21026329fb0744ad187cf812f7d3c2aa702a5edb562b325191fcab6", size = 163714, upload-time = "2024-09-21T17:33:49.548Z" },
- { url = "https://files.pythonhosted.org/packages/cc/92/cea9eb9d381ca57065a5eb4ec2ce7a291bd96c85ce742915c3c9ffc1069f/websockets-13.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:9ef8aa8bdbac47f4968a5d66462a2a0935d044bf35c0e5a8af152d58516dbeb5", size = 163673, upload-time = "2024-09-21T17:33:51.056Z" },
- { url = "https://files.pythonhosted.org/packages/a4/f1/279104fff239bfd04c12b1e58afea227d72fd1acf431e3eed3f6ac2c96d2/websockets-13.1-cp39-cp39-win32.whl", hash = "sha256:deeb929efe52bed518f6eb2ddc00cc496366a14c726005726ad62c2dd9017a3c", size = 158702, upload-time = "2024-09-21T17:33:52.584Z" },
- { url = "https://files.pythonhosted.org/packages/25/0b/b87370ff141375c41f7dd67941728e4b3682ebb45882591516c792a2ebee/websockets-13.1-cp39-cp39-win_amd64.whl", hash = "sha256:7c65ffa900e7cc958cd088b9a9157a8141c991f8c53d11087e6fb7277a03f81d", size = 159146, upload-time = "2024-09-21T17:33:53.781Z" },
- { url = "https://files.pythonhosted.org/packages/2d/75/6da22cb3ad5b8c606963f9a5f9f88656256fecc29d420b4b2bf9e0c7d56f/websockets-13.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:5dd6da9bec02735931fccec99d97c29f47cc61f644264eb995ad6c0c27667238", size = 155499, upload-time = "2024-09-21T17:33:54.917Z" },
- { url = "https://files.pythonhosted.org/packages/c0/ba/22833d58629088fcb2ccccedfae725ac0bbcd713319629e97125b52ac681/websockets-13.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:2510c09d8e8df777177ee3d40cd35450dc169a81e747455cc4197e63f7e7bfe5", size = 155737, upload-time = "2024-09-21T17:33:56.052Z" },
- { url = "https://files.pythonhosted.org/packages/95/54/61684fe22bdb831e9e1843d972adadf359cf04ab8613285282baea6a24bb/websockets-13.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f1c3cf67185543730888b20682fb186fc8d0fa6f07ccc3ef4390831ab4b388d9", size = 157095, upload-time = "2024-09-21T17:33:57.21Z" },
- { url = "https://files.pythonhosted.org/packages/fc/f5/6652fb82440813822022a9301a30afde85e5ff3fb2aebb77f34aabe2b4e8/websockets-13.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bcc03c8b72267e97b49149e4863d57c2d77f13fae12066622dc78fe322490fe6", size = 156701, upload-time = "2024-09-21T17:33:59.061Z" },
- { url = "https://files.pythonhosted.org/packages/67/33/ae82a7b860fa8a08aba68818bdf7ff61f04598aa5ab96df4cd5a3e418ca4/websockets-13.1-pp310-pypy310_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:004280a140f220c812e65f36944a9ca92d766b6cc4560be652a0a3883a79ed8a", size = 156654, upload-time = "2024-09-21T17:34:00.944Z" },
- { url = "https://files.pythonhosted.org/packages/63/0b/a1b528d36934f833e20f6da1032b995bf093d55cb416b9f2266f229fb237/websockets-13.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:e2620453c075abeb0daa949a292e19f56de518988e079c36478bacf9546ced23", size = 159192, upload-time = "2024-09-21T17:34:02.656Z" },
- { url = "https://files.pythonhosted.org/packages/5e/a1/5ae6d0ef2e61e2b77b3b4678949a634756544186620a728799acdf5c3482/websockets-13.1-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:9156c45750b37337f7b0b00e6248991a047be4aa44554c9886fe6bdd605aab3b", size = 155433, upload-time = "2024-09-21T17:34:03.88Z" },
- { url = "https://files.pythonhosted.org/packages/0d/2f/addd33f85600d210a445f817ff0d79d2b4d0eb6f3c95b9f35531ebf8f57c/websockets-13.1-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:80c421e07973a89fbdd93e6f2003c17d20b69010458d3a8e37fb47874bd67d51", size = 155733, upload-time = "2024-09-21T17:34:05.173Z" },
- { url = "https://files.pythonhosted.org/packages/74/0b/f8ec74ac3b14a983289a1b42dc2c518a0e2030b486d0549d4f51ca11e7c9/websockets-13.1-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:82d0ba76371769d6a4e56f7e83bb8e81846d17a6190971e38b5de108bde9b0d7", size = 157093, upload-time = "2024-09-21T17:34:06.398Z" },
- { url = "https://files.pythonhosted.org/packages/ad/4c/aa5cc2f718ee4d797411202f332c8281f04c42d15f55b02f7713320f7a03/websockets-13.1-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e9875a0143f07d74dc5e1ded1c4581f0d9f7ab86c78994e2ed9e95050073c94d", size = 156701, upload-time = "2024-09-21T17:34:07.582Z" },
- { url = "https://files.pythonhosted.org/packages/1f/4b/7c5b2d0d0f0f1a54f27c60107cf1f201bee1f88c5508f87408b470d09a9c/websockets-13.1-pp38-pypy38_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a11e38ad8922c7961447f35c7b17bffa15de4d17c70abd07bfbe12d6faa3e027", size = 156648, upload-time = "2024-09-21T17:34:08.734Z" },
- { url = "https://files.pythonhosted.org/packages/f3/63/35f3fb073884a9fd1ce5413b2dcdf0d9198b03dac6274197111259cbde06/websockets-13.1-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:4059f790b6ae8768471cddb65d3c4fe4792b0ab48e154c9f0a04cefaabcd5978", size = 159188, upload-time = "2024-09-21T17:34:10.018Z" },
- { url = "https://files.pythonhosted.org/packages/59/fd/e4bf9a7159dba6a16c59ae9e670e3e8ad9dcb6791bc0599eb86de32d50a9/websockets-13.1-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:25c35bf84bf7c7369d247f0b8cfa157f989862c49104c5cf85cb5436a641d93e", size = 155499, upload-time = "2024-09-21T17:34:11.3Z" },
- { url = "https://files.pythonhosted.org/packages/74/42/d48ede93cfe0c343f3b552af08efc60778d234989227b16882eed1b8b189/websockets-13.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:83f91d8a9bb404b8c2c41a707ac7f7f75b9442a0a876df295de27251a856ad09", size = 155731, upload-time = "2024-09-21T17:34:13.151Z" },
- { url = "https://files.pythonhosted.org/packages/f6/f2/2ef6bff1c90a43b80622a17c0852b48c09d3954ab169266ad7b15e17cdcb/websockets-13.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7a43cfdcddd07f4ca2b1afb459824dd3c6d53a51410636a2c7fc97b9a8cf4842", size = 157093, upload-time = "2024-09-21T17:34:14.52Z" },
- { url = "https://files.pythonhosted.org/packages/d1/14/6f20bbaeeb350f155edf599aad949c554216f90e5d4ae7373d1f2e5931fb/websockets-13.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:48a2ef1381632a2f0cb4efeff34efa97901c9fbc118e01951ad7cfc10601a9bb", size = 156701, upload-time = "2024-09-21T17:34:15.692Z" },
- { url = "https://files.pythonhosted.org/packages/c7/86/38279dfefecd035e22b79c38722d4f87c4b6196f1556b7a631d0a3095ca7/websockets-13.1-pp39-pypy39_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:459bf774c754c35dbb487360b12c5727adab887f1622b8aed5755880a21c4a20", size = 156649, upload-time = "2024-09-21T17:34:17.335Z" },
- { url = "https://files.pythonhosted.org/packages/f6/c5/12c6859a2eaa8c53f59a647617a27f1835a226cd7106c601067c53251d98/websockets-13.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:95858ca14a9f6fa8413d29e0a585b31b278388aa775b8a81fa24830123874678", size = 159187, upload-time = "2024-09-21T17:34:18.538Z" },
- { url = "https://files.pythonhosted.org/packages/56/27/96a5cd2626d11c8280656c6c71d8ab50fe006490ef9971ccd154e0c42cd2/websockets-13.1-py3-none-any.whl", hash = "sha256:a9a396a6ad26130cdae92ae10c36af09d9bfe6cafe69670fd3b6da9b07b4044f", size = 152134, upload-time = "2024-09-21T17:34:19.904Z" },
-]
-
-[[package]]
-name = "websockets"
-version = "15.0.1"
-source = { registry = "https://pypi.org/simple" }
-resolution-markers = [
- "python_full_version >= '3.10'",
- "python_full_version == '3.9.*'",
-]
-sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016, upload-time = "2025-03-05T20:03:41.606Z" }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/1e/da/6462a9f510c0c49837bbc9345aca92d767a56c1fb2939e1579df1e1cdcf7/websockets-15.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d63efaa0cd96cf0c5fe4d581521d9fa87744540d4bc999ae6e08595a1014b45b", size = 175423, upload-time = "2025-03-05T20:01:35.363Z" },
- { url = "https://files.pythonhosted.org/packages/1c/9f/9d11c1a4eb046a9e106483b9ff69bce7ac880443f00e5ce64261b47b07e7/websockets-15.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ac60e3b188ec7574cb761b08d50fcedf9d77f1530352db4eef1707fe9dee7205", size = 173080, upload-time = "2025-03-05T20:01:37.304Z" },
- { url = "https://files.pythonhosted.org/packages/d5/4f/b462242432d93ea45f297b6179c7333dd0402b855a912a04e7fc61c0d71f/websockets-15.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5756779642579d902eed757b21b0164cd6fe338506a8083eb58af5c372e39d9a", size = 173329, upload-time = "2025-03-05T20:01:39.668Z" },
- { url = "https://files.pythonhosted.org/packages/6e/0c/6afa1f4644d7ed50284ac59cc70ef8abd44ccf7d45850d989ea7310538d0/websockets-15.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0fdfe3e2a29e4db3659dbd5bbf04560cea53dd9610273917799f1cde46aa725e", size = 182312, upload-time = "2025-03-05T20:01:41.815Z" },
- { url = "https://files.pythonhosted.org/packages/dd/d4/ffc8bd1350b229ca7a4db2a3e1c482cf87cea1baccd0ef3e72bc720caeec/websockets-15.0.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c2529b320eb9e35af0fa3016c187dffb84a3ecc572bcee7c3ce302bfeba52bf", size = 181319, upload-time = "2025-03-05T20:01:43.967Z" },
- { url = "https://files.pythonhosted.org/packages/97/3a/5323a6bb94917af13bbb34009fac01e55c51dfde354f63692bf2533ffbc2/websockets-15.0.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ac1e5c9054fe23226fb11e05a6e630837f074174c4c2f0fe442996112a6de4fb", size = 181631, upload-time = "2025-03-05T20:01:46.104Z" },
- { url = "https://files.pythonhosted.org/packages/a6/cc/1aeb0f7cee59ef065724041bb7ed667b6ab1eeffe5141696cccec2687b66/websockets-15.0.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:5df592cd503496351d6dc14f7cdad49f268d8e618f80dce0cd5a36b93c3fc08d", size = 182016, upload-time = "2025-03-05T20:01:47.603Z" },
- { url = "https://files.pythonhosted.org/packages/79/f9/c86f8f7af208e4161a7f7e02774e9d0a81c632ae76db2ff22549e1718a51/websockets-15.0.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0a34631031a8f05657e8e90903e656959234f3a04552259458aac0b0f9ae6fd9", size = 181426, upload-time = "2025-03-05T20:01:48.949Z" },
- { url = "https://files.pythonhosted.org/packages/c7/b9/828b0bc6753db905b91df6ae477c0b14a141090df64fb17f8a9d7e3516cf/websockets-15.0.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:3d00075aa65772e7ce9e990cab3ff1de702aa09be3940d1dc88d5abf1ab8a09c", size = 181360, upload-time = "2025-03-05T20:01:50.938Z" },
- { url = "https://files.pythonhosted.org/packages/89/fb/250f5533ec468ba6327055b7d98b9df056fb1ce623b8b6aaafb30b55d02e/websockets-15.0.1-cp310-cp310-win32.whl", hash = "sha256:1234d4ef35db82f5446dca8e35a7da7964d02c127b095e172e54397fb6a6c256", size = 176388, upload-time = "2025-03-05T20:01:52.213Z" },
- { url = "https://files.pythonhosted.org/packages/1c/46/aca7082012768bb98e5608f01658ff3ac8437e563eca41cf068bd5849a5e/websockets-15.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:39c1fec2c11dc8d89bba6b2bf1556af381611a173ac2b511cf7231622058af41", size = 176830, upload-time = "2025-03-05T20:01:53.922Z" },
- { url = "https://files.pythonhosted.org/packages/9f/32/18fcd5919c293a398db67443acd33fde142f283853076049824fc58e6f75/websockets-15.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:823c248b690b2fd9303ba00c4f66cd5e2d8c3ba4aa968b2779be9532a4dad431", size = 175423, upload-time = "2025-03-05T20:01:56.276Z" },
- { url = "https://files.pythonhosted.org/packages/76/70/ba1ad96b07869275ef42e2ce21f07a5b0148936688c2baf7e4a1f60d5058/websockets-15.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678999709e68425ae2593acf2e3ebcbcf2e69885a5ee78f9eb80e6e371f1bf57", size = 173082, upload-time = "2025-03-05T20:01:57.563Z" },
- { url = "https://files.pythonhosted.org/packages/86/f2/10b55821dd40eb696ce4704a87d57774696f9451108cff0d2824c97e0f97/websockets-15.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d50fd1ee42388dcfb2b3676132c78116490976f1300da28eb629272d5d93e905", size = 173330, upload-time = "2025-03-05T20:01:59.063Z" },
- { url = "https://files.pythonhosted.org/packages/a5/90/1c37ae8b8a113d3daf1065222b6af61cc44102da95388ac0018fcb7d93d9/websockets-15.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d99e5546bf73dbad5bf3547174cd6cb8ba7273062a23808ffea025ecb1cf8562", size = 182878, upload-time = "2025-03-05T20:02:00.305Z" },
- { url = "https://files.pythonhosted.org/packages/8e/8d/96e8e288b2a41dffafb78e8904ea7367ee4f891dafc2ab8d87e2124cb3d3/websockets-15.0.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:66dd88c918e3287efc22409d426c8f729688d89a0c587c88971a0faa2c2f3792", size = 181883, upload-time = "2025-03-05T20:02:03.148Z" },
- { url = "https://files.pythonhosted.org/packages/93/1f/5d6dbf551766308f6f50f8baf8e9860be6182911e8106da7a7f73785f4c4/websockets-15.0.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8dd8327c795b3e3f219760fa603dcae1dcc148172290a8ab15158cf85a953413", size = 182252, upload-time = "2025-03-05T20:02:05.29Z" },
- { url = "https://files.pythonhosted.org/packages/d4/78/2d4fed9123e6620cbf1706c0de8a1632e1a28e7774d94346d7de1bba2ca3/websockets-15.0.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8fdc51055e6ff4adeb88d58a11042ec9a5eae317a0a53d12c062c8a8865909e8", size = 182521, upload-time = "2025-03-05T20:02:07.458Z" },
- { url = "https://files.pythonhosted.org/packages/e7/3b/66d4c1b444dd1a9823c4a81f50231b921bab54eee2f69e70319b4e21f1ca/websockets-15.0.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:693f0192126df6c2327cce3baa7c06f2a117575e32ab2308f7f8216c29d9e2e3", size = 181958, upload-time = "2025-03-05T20:02:09.842Z" },
- { url = "https://files.pythonhosted.org/packages/08/ff/e9eed2ee5fed6f76fdd6032ca5cd38c57ca9661430bb3d5fb2872dc8703c/websockets-15.0.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:54479983bd5fb469c38f2f5c7e3a24f9a4e70594cd68cd1fa6b9340dadaff7cf", size = 181918, upload-time = "2025-03-05T20:02:11.968Z" },
- { url = "https://files.pythonhosted.org/packages/d8/75/994634a49b7e12532be6a42103597b71098fd25900f7437d6055ed39930a/websockets-15.0.1-cp311-cp311-win32.whl", hash = "sha256:16b6c1b3e57799b9d38427dda63edcbe4926352c47cf88588c0be4ace18dac85", size = 176388, upload-time = "2025-03-05T20:02:13.32Z" },
- { url = "https://files.pythonhosted.org/packages/98/93/e36c73f78400a65f5e236cd376713c34182e6663f6889cd45a4a04d8f203/websockets-15.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:27ccee0071a0e75d22cb35849b1db43f2ecd3e161041ac1ee9d2352ddf72f065", size = 176828, upload-time = "2025-03-05T20:02:14.585Z" },
- { url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437, upload-time = "2025-03-05T20:02:16.706Z" },
- { url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096, upload-time = "2025-03-05T20:02:18.832Z" },
- { url = "https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332, upload-time = "2025-03-05T20:02:20.187Z" },
- { url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152, upload-time = "2025-03-05T20:02:22.286Z" },
- { url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096, upload-time = "2025-03-05T20:02:24.368Z" },
- { url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523, upload-time = "2025-03-05T20:02:25.669Z" },
- { url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790, upload-time = "2025-03-05T20:02:26.99Z" },
- { url = "https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165, upload-time = "2025-03-05T20:02:30.291Z" },
- { url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160, upload-time = "2025-03-05T20:02:31.634Z" },
- { url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = "sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395, upload-time = "2025-03-05T20:02:33.017Z" },
- { url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841, upload-time = "2025-03-05T20:02:34.498Z" },
- { url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440, upload-time = "2025-03-05T20:02:36.695Z" },
- { url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098, upload-time = "2025-03-05T20:02:37.985Z" },
- { url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329, upload-time = "2025-03-05T20:02:39.298Z" },
- { url = "https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111, upload-time = "2025-03-05T20:02:40.595Z" },
- { url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054, upload-time = "2025-03-05T20:02:41.926Z" },
- { url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496, upload-time = "2025-03-05T20:02:43.304Z" },
- { url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829, upload-time = "2025-03-05T20:02:48.812Z" },
- { url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217, upload-time = "2025-03-05T20:02:50.14Z" },
- { url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195, upload-time = "2025-03-05T20:02:51.561Z" },
- { url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393, upload-time = "2025-03-05T20:02:53.814Z" },
- { url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837, upload-time = "2025-03-05T20:02:55.237Z" },
- { url = "https://files.pythonhosted.org/packages/36/db/3fff0bcbe339a6fa6a3b9e3fbc2bfb321ec2f4cd233692272c5a8d6cf801/websockets-15.0.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:5f4c04ead5aed67c8a1a20491d54cdfba5884507a48dd798ecaf13c74c4489f5", size = 175424, upload-time = "2025-03-05T20:02:56.505Z" },
- { url = "https://files.pythonhosted.org/packages/46/e6/519054c2f477def4165b0ec060ad664ed174e140b0d1cbb9fafa4a54f6db/websockets-15.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:abdc0c6c8c648b4805c5eacd131910d2a7f6455dfd3becab248ef108e89ab16a", size = 173077, upload-time = "2025-03-05T20:02:58.37Z" },
- { url = "https://files.pythonhosted.org/packages/1a/21/c0712e382df64c93a0d16449ecbf87b647163485ca1cc3f6cbadb36d2b03/websockets-15.0.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a625e06551975f4b7ea7102bc43895b90742746797e2e14b70ed61c43a90f09b", size = 173324, upload-time = "2025-03-05T20:02:59.773Z" },
- { url = "https://files.pythonhosted.org/packages/1c/cb/51ba82e59b3a664df54beed8ad95517c1b4dc1a913730e7a7db778f21291/websockets-15.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d591f8de75824cbb7acad4e05d2d710484f15f29d4a915092675ad3456f11770", size = 182094, upload-time = "2025-03-05T20:03:01.827Z" },
- { url = "https://files.pythonhosted.org/packages/fb/0f/bf3788c03fec679bcdaef787518dbe60d12fe5615a544a6d4cf82f045193/websockets-15.0.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:47819cea040f31d670cc8d324bb6435c6f133b8c7a19ec3d61634e62f8d8f9eb", size = 181094, upload-time = "2025-03-05T20:03:03.123Z" },
- { url = "https://files.pythonhosted.org/packages/5e/da/9fb8c21edbc719b66763a571afbaf206cb6d3736d28255a46fc2fe20f902/websockets-15.0.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ac017dd64572e5c3bd01939121e4d16cf30e5d7e110a119399cf3133b63ad054", size = 181397, upload-time = "2025-03-05T20:03:04.443Z" },
- { url = "https://files.pythonhosted.org/packages/2e/65/65f379525a2719e91d9d90c38fe8b8bc62bd3c702ac651b7278609b696c4/websockets-15.0.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:4a9fac8e469d04ce6c25bb2610dc535235bd4aa14996b4e6dbebf5e007eba5ee", size = 181794, upload-time = "2025-03-05T20:03:06.708Z" },
- { url = "https://files.pythonhosted.org/packages/d9/26/31ac2d08f8e9304d81a1a7ed2851c0300f636019a57cbaa91342015c72cc/websockets-15.0.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:363c6f671b761efcb30608d24925a382497c12c506b51661883c3e22337265ed", size = 181194, upload-time = "2025-03-05T20:03:08.844Z" },
- { url = "https://files.pythonhosted.org/packages/98/72/1090de20d6c91994cd4b357c3f75a4f25ee231b63e03adea89671cc12a3f/websockets-15.0.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:2034693ad3097d5355bfdacfffcbd3ef5694f9718ab7f29c29689a9eae841880", size = 181164, upload-time = "2025-03-05T20:03:10.242Z" },
- { url = "https://files.pythonhosted.org/packages/2d/37/098f2e1c103ae8ed79b0e77f08d83b0ec0b241cf4b7f2f10edd0126472e1/websockets-15.0.1-cp39-cp39-win32.whl", hash = "sha256:3b1ac0d3e594bf121308112697cf4b32be538fb1444468fb0a6ae4feebc83411", size = 176381, upload-time = "2025-03-05T20:03:12.77Z" },
- { url = "https://files.pythonhosted.org/packages/75/8b/a32978a3ab42cebb2ebdd5b05df0696a09f4d436ce69def11893afa301f0/websockets-15.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:b7643a03db5c95c799b89b31c036d5f27eeb4d259c798e878d6937d71832b1e4", size = 176841, upload-time = "2025-03-05T20:03:14.367Z" },
- { url = "https://files.pythonhosted.org/packages/02/9e/d40f779fa16f74d3468357197af8d6ad07e7c5a27ea1ca74ceb38986f77a/websockets-15.0.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:0c9e74d766f2818bb95f84c25be4dea09841ac0f734d1966f415e4edfc4ef1c3", size = 173109, upload-time = "2025-03-05T20:03:17.769Z" },
- { url = "https://files.pythonhosted.org/packages/bc/cd/5b887b8585a593073fd92f7c23ecd3985cd2c3175025a91b0d69b0551372/websockets-15.0.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1009ee0c7739c08a0cd59de430d6de452a55e42d6b522de7aa15e6f67db0b8e1", size = 173343, upload-time = "2025-03-05T20:03:19.094Z" },
- { url = "https://files.pythonhosted.org/packages/fe/ae/d34f7556890341e900a95acf4886833646306269f899d58ad62f588bf410/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:76d1f20b1c7a2fa82367e04982e708723ba0e7b8d43aa643d3dcd404d74f1475", size = 174599, upload-time = "2025-03-05T20:03:21.1Z" },
- { url = "https://files.pythonhosted.org/packages/71/e6/5fd43993a87db364ec60fc1d608273a1a465c0caba69176dd160e197ce42/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f29d80eb9a9263b8d109135351caf568cc3f80b9928bccde535c235de55c22d9", size = 174207, upload-time = "2025-03-05T20:03:23.221Z" },
- { url = "https://files.pythonhosted.org/packages/2b/fb/c492d6daa5ec067c2988ac80c61359ace5c4c674c532985ac5a123436cec/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b359ed09954d7c18bbc1680f380c7301f92c60bf924171629c5db97febb12f04", size = 174155, upload-time = "2025-03-05T20:03:25.321Z" },
- { url = "https://files.pythonhosted.org/packages/68/a1/dcb68430b1d00b698ae7a7e0194433bce4f07ded185f0ee5fb21e2a2e91e/websockets-15.0.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:cad21560da69f4ce7658ca2cb83138fb4cf695a2ba3e475e0559e05991aa8122", size = 176884, upload-time = "2025-03-05T20:03:27.934Z" },
- { url = "https://files.pythonhosted.org/packages/b7/48/4b67623bac4d79beb3a6bb27b803ba75c1bdedc06bd827e465803690a4b2/websockets-15.0.1-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:7f493881579c90fc262d9cdbaa05a6b54b3811c2f300766748db79f098db9940", size = 173106, upload-time = "2025-03-05T20:03:29.404Z" },
- { url = "https://files.pythonhosted.org/packages/ed/f0/adb07514a49fe5728192764e04295be78859e4a537ab8fcc518a3dbb3281/websockets-15.0.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:47b099e1f4fbc95b701b6e85768e1fcdaf1630f3cbe4765fa216596f12310e2e", size = 173339, upload-time = "2025-03-05T20:03:30.755Z" },
- { url = "https://files.pythonhosted.org/packages/87/28/bd23c6344b18fb43df40d0700f6d3fffcd7cef14a6995b4f976978b52e62/websockets-15.0.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:67f2b6de947f8c757db2db9c71527933ad0019737ec374a8a6be9a956786aaf9", size = 174597, upload-time = "2025-03-05T20:03:32.247Z" },
- { url = "https://files.pythonhosted.org/packages/6d/79/ca288495863d0f23a60f546f0905ae8f3ed467ad87f8b6aceb65f4c013e4/websockets-15.0.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d08eb4c2b7d6c41da6ca0600c077e93f5adcfd979cd777d747e9ee624556da4b", size = 174205, upload-time = "2025-03-05T20:03:33.731Z" },
- { url = "https://files.pythonhosted.org/packages/04/e4/120ff3180b0872b1fe6637f6f995bcb009fb5c87d597c1fc21456f50c848/websockets-15.0.1-pp39-pypy39_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4b826973a4a2ae47ba357e4e82fa44a463b8f168e1ca775ac64521442b19e87f", size = 174150, upload-time = "2025-03-05T20:03:35.757Z" },
- { url = "https://files.pythonhosted.org/packages/cb/c3/30e2f9c539b8da8b1d76f64012f3b19253271a63413b2d3adb94b143407f/websockets-15.0.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:21c1fa28a6a7e3cbdc171c694398b6df4744613ce9b36b1a498e816787e28123", size = 176877, upload-time = "2025-03-05T20:03:37.199Z" },
- { url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743, upload-time = "2025-03-05T20:03:39.41Z" },
-]
diff --git a/docs-plan.md b/docs-plan.md
index ca86fcf1e..44a3db745 100644
--- a/docs-plan.md
+++ b/docs-plan.md
@@ -3,6 +3,7 @@
## Current State Analysis
### Strengths
+
- **Excellent README.md**: Comprehensive overview with clear getting started instructions
- **Detailed ARCHITECTURE.md**: Production-ready LlamaDeploy architecture documentation
- **Comprehensive Agent Framework**: Complete agent personas in `rhoai-ux-agents-vTeam.md`
@@ -11,6 +12,7 @@
- **Component Documentation**: Detailed setup guides for ambient-runner platform
### Gaps Identified
+
- No structured documentation hierarchy for different audiences
- Scattered user guides across multiple files
- Missing hands-on lab exercises for learning
@@ -174,6 +176,7 @@ markdown_extensions:
## Content Migration Strategy
### Phase 1: Foundation (Week 1)
+
1. **Setup MkDocs infrastructure**
- Install MkDocs with Material theme
- Configure `mkdocs.yml` with navigation structure
@@ -185,6 +188,7 @@ markdown_extensions:
- Create section index pages with clear navigation
### Phase 2: User Documentation (Week 2)
+
1. **Getting Started Guide**
- 5-minute quick start (streamlined from README)
- Prerequisites and installation
@@ -196,6 +200,7 @@ markdown_extensions:
- Configuration and customization options
### Phase 3: Developer Documentation (Week 3)
+
1. **Development Setup**
- Environment setup and dependencies
- Development workflow and standards
@@ -207,6 +212,7 @@ markdown_extensions:
- API reference documentation
### Phase 4: Lab Exercises (Week 4)
+
1. **Basic Labs**
- Lab 1: Create your first RFE using the chat interface
- Lab 2: Understand agent interactions and workflows
@@ -225,6 +231,7 @@ markdown_extensions:
## Lab Exercise Design Principles
### Structure
+
- **Objective**: Clear learning goals
- **Prerequisites**: Required knowledge and setup
- **Step-by-Step Instructions**: Detailed procedures
@@ -233,6 +240,7 @@ markdown_extensions:
- **Further Reading**: Links to relevant documentation
### Example Lab Template
+
```markdown
# Lab 1: Create Your First RFE
@@ -258,8 +266,8 @@ Learn to create a Request for Enhancement (RFE) using the conversational AI inte
- Agent workflow initiated
## Troubleshooting
-- If API key errors occur, check `.streamlit/secrets.toml`
-- For agent timeout issues, verify network connectivity
+- If API key errors occur, check the ProjectSettings CR in your namespace
+- For agent timeout issues, verify network connectivity and check operator logs
## Further Reading
- [Agent Framework Guide](../user-guide/agent-framework.md)
@@ -269,18 +277,21 @@ Learn to create a Request for Enhancement (RFE) using the conversational AI inte
## Integration with Existing Tools
### GitHub Integration
+
- **Source**: Documentation lives in `/docs` directory
- **Editing**: Direct GitHub editing links in MkDocs
- **Issues**: Link to GitHub issues for documentation bugs
- **Contributions**: Pull request workflow for doc updates
### CI/CD Pipeline
+
- **Build**: Automated MkDocs builds on push to main
- **Deploy**: GitHub Pages or internal hosting
- **Validation**: Link checking and markdown linting
- **Preview**: PR preview deployments
### Search and Discovery
+
- **Full-text search** via MkDocs search plugin
- **Cross-references** between sections
- **Glossary** for technical terms
@@ -289,18 +300,21 @@ Learn to create a Request for Enhancement (RFE) using the conversational AI inte
## Success Metrics
### User Adoption
+
- **Time to first RFE**: Measure setup to first successful RFE creation
- **Documentation page views**: Track most accessed content
- **Lab completion rates**: Monitor learning engagement
- **Support ticket reduction**: Measure documentation effectiveness
### Developer Experience
+
- **Contribution velocity**: Track PR frequency and merge time
- **Setup time**: Measure development environment setup duration
- **API usage**: Monitor developer API adoption
- **Community engagement**: Track discussions and questions
### Content Quality
+
- **Accuracy**: Regular technical review cycles
- **Freshness**: Automated checks for outdated content
- **Completeness**: Coverage analysis of features vs documentation
@@ -309,15 +323,17 @@ Learn to create a Request for Enhancement (RFE) using the conversational AI inte
## Maintenance Strategy
### Regular Updates
+
- **Monthly reviews** of user feedback and analytics
- **Quarterly content audits** for accuracy and relevance
- **Version alignment** with software releases
- **Link validation** and broken reference cleanup
### Community Contributions
+
- **Clear contribution guidelines** for documentation PRs
- **Template system** for consistent formatting
- **Review process** with subject matter experts
- **Recognition system** for documentation contributors
-This comprehensive documentation strategy transforms vTeam from a technically excellent but scattered documentation system into a professional, discoverable, and learnable platform that serves both end users and developers effectively.
\ No newline at end of file
+This comprehensive documentation strategy transforms vTeam from a technically excellent but scattered documentation system into a professional, discoverable, and learnable platform that serves both end users and developers effectively.
diff --git a/docs/developer-guide/index.md b/docs/developer-guide/index.md
index ede58f542..ca90077ea 100644
--- a/docs/developer-guide/index.md
+++ b/docs/developer-guide/index.md
@@ -7,31 +7,37 @@ Welcome to the vTeam Developer Guide! This section provides comprehensive inform
This guide covers technical implementation details and development workflows:
### 🔧 [Setup](setup.md)
+
- Development environment configuration
- Dependencies and tooling
- Local development workflow
### 🏗️ [Architecture](architecture.md)
+
- System design and component overview
- LlamaDeploy workflow orchestration
- Multi-agent coordination patterns
### 🔌 [Plugin Development](plugin-development.md)
+
- Creating custom agent personas
- Extending workflow capabilities
- Integration patterns and APIs
### 📚 [API Reference](api-reference.md)
+
- REST endpoint documentation
- Python API usage examples
- Response schemas and error codes
### 🤝 [Contributing](contributing.md)
+
- Code standards and review process
- Testing requirements and strategies
- Documentation guidelines
### 🧪 [Testing](testing.md)
+
- Unit testing strategies
- Integration testing with AI services
- Performance testing and benchmarks
@@ -41,18 +47,23 @@ This guide covers technical implementation details and development workflows:
This guide serves different development roles:
### **Backend Engineers**
+
Focus on LlamaDeploy workflows, agent orchestration, and API development.
### **Frontend Engineers**
+
Learn about the TypeScript chat interface and @llamaindex/server integration.
### **DevOps Engineers**
+
Understand deployment architecture, monitoring, and scalability considerations.
### **AI/ML Engineers**
+
Explore agent behavior customization, prompt engineering, and model integration.
### **QA Engineers**
+
Discover testing strategies for AI-powered workflows and integration patterns.
## Technology Stack
@@ -60,52 +71,53 @@ Discover testing strategies for AI-powered workflows and integration patterns.
Understanding our core technologies:
### **Backend (Go)**
+
- **Gin**: HTTP server and routing
- **Kubernetes Client**: Interacts with CRDs and cluster APIs
- **GitHub App Integration**: Installation tokens, repo proxying
### **Operator (Go)**
+
- **Controller Runtime**: Watches CRDs and manages Jobs
- **Runner Orchestration**: Creates per-session runner pods with PVC
### **Frontend (TypeScript/Next.js)**
+
- **Next.js + React**: UI and routing
- **Shadcn UI**: Component library
- **WebSocket**: Real-time session updates
### **Runner (Python)**
-- **Claude Agent SDK**: Executes tasks with allowed tools
-- **runner-shell**: Standardized adapter protocol
+
+- **Claude Code SDK**: Executes agentic sessions
+- **Multi-agent collaboration**: Supports complex workflows
### **AI Integration**
-- **Anthropic Claude**: Primary model via Claude Agent SDK
+
+- **Anthropic Claude**: Primary model via Claude Code SDK
### **Development Tools**
+
- **docker/podman**: Container builds
- **make**: Build and deploy automation
+- **Kubernetes/OpenShift**: Runtime platform
## Architecture Overview
```mermaid
graph TD
- A[Frontend: @llamaindex/server] --> B[LlamaDeploy API Server]
- B --> C[RFE Builder Workflow]
- C --> D[Multi-Agent Manager]
- D --> E[UX Designer Agent]
- D --> F[Product Manager Agent]
- D --> G[Backend Engineer Agent]
- D --> H[Frontend Engineer Agent]
- D --> I[Architect Agent]
- D --> J[Product Owner Agent]
- D --> K[SME/Researcher Agent]
-
- L[Vector Index Generation] --> M[FAISS Vector Stores]
- M --> D
-
- N[External APIs] --> O[Anthropic Claude]
- N --> P[OpenAI Embeddings]
-
- B --> N
+ A[Frontend NextJS] --> B[Backend API]
+ B --> C[Kubernetes CRDs]
+ C --> D[Operator]
+ D --> E[Job Pods]
+ E --> F[Claude Code Runner]
+ F --> G[GitHub Repos]
+
+ H[ProjectSettings CR] --> D
+ I[AgenticSession CR] --> D
+ J[RFEWorkflow CR] --> D
+
+ K[Anthropic Claude API] --> F
```
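
The CR-driven flow above can be made concrete with a sketch of an AgenticSession resource. This is illustrative only — the `apiVersion`, and the `spec` field names shown are assumptions, not the authoritative schema; consult the CRDs under `components/manifests/crds/` for the real definitions:

```yaml
# Hypothetical AgenticSession sketch; field names are assumptions,
# not the actual CRD schema.
apiVersion: vteam.ambient-code.io/v1alpha1
kind: AgenticSession
metadata:
  name: dark-mode-rfe
  namespace: vteam-dev
spec:
  prompt: "Add a dark mode toggle to the settings page"
  repo: https://github.com/example-org/example-app   # target repository
  timeoutSeconds: 900
```

The operator watches resources like this and spawns a per-session runner Job, as shown in the diagram.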
## Development Workflow
@@ -122,35 +134,44 @@ graph TD
```
vTeam/
-├── demos/rfe-builder/ # Main application
-│ ├── src/ # Core Python backend
-│ │ ├── agents/ # Agent YAML configurations
-│ │ ├── rfe_builder_workflow.py # Main LlamaDeploy workflow
-│ │ ├── artifact_editor_workflow.py # Artifact editing workflow
-│ │ └── settings.py # System configuration
-│ ├── ui/ # TypeScript frontend (@llamaindex/server)
-│ ├── deployment.yml # LlamaDeploy deployment configuration
-│ ├── pyproject.toml # Python dependencies and build config
-│ └── data/ # Document sources for RAG
-├── src/vteam_shared_configs/ # Shared configuration package
+├── components/
+│ ├── backend/ # Go REST API
+│ │ ├── handlers.go # HTTP handlers
+│ │ ├── git.go # GitHub integration
+│ │ └── websocket_messaging.go # Real-time updates
+│ ├── frontend/ # Next.js web UI
+│ │ ├── app/ # Next.js app router
+│ │ └── components/ # React components
+│ ├── operator/ # Kubernetes operator (Go)
+│ │ └── controllers/ # CR reconciliation logic
+│ ├── runners/
+│ │ └── claude-code-runner/ # Python Claude Code SDK wrapper
+│ └── manifests/ # Kubernetes deployment YAMLs
+│ ├── crds/ # Custom Resource Definitions
+│ └── deployment/ # Deployment manifests
└── docs/ # Documentation (you are here!)
```
## Key Development Areas
-### **Agent System Development**
-Extend the multi-agent framework with new personas, specialized knowledge, and interaction patterns.
+### **Custom Resource Development**
-### **Workflow Engine Enhancement**
-Improve LlamaDeploy orchestration, add new workflow steps, and optimize performance.
+Extend the Kubernetes CRD system with new workflow types and orchestration patterns.
-### **RAG System Optimization**
-Enhance document ingestion, vector search accuracy, and context relevance.
+### **Operator Enhancement**
+
+Improve reconciliation loops, job management, and error handling in the operator.
+
+### **Runner Capabilities**
+
+Enhance the Claude Code runner with new tools and integration patterns.
### **API & Integration Development**
+
Build new integrations, improve existing APIs, and enhance external service connections.
### **Frontend Experience**
+
Improve the chat interface, add visualization features, and enhance user experience.
## Getting Started
@@ -182,4 +203,4 @@ Connect with other developers:
- **Pull Request Reviews**: Code collaboration and feedback
- **Documentation**: Help improve this developer guide
-Let's build the future of AI-assisted software refinement together! 🚀
\ No newline at end of file
+Let's build the future of AI-assisted software refinement together! 🚀
diff --git a/docs/labs/basic/lab-1-first-rfe.md b/docs/labs/basic/lab-1-first-rfe.md
index 536fb238c..b296737ab 100644
--- a/docs/labs/basic/lab-1-first-rfe.md
+++ b/docs/labs/basic/lab-1-first-rfe.md
@@ -5,6 +5,7 @@
Learn to create a Request for Enhancement (RFE) using vTeam's conversational AI interface and understand how the 7-agent council processes your requirements.
**By the end of this lab, you will:**
+
- Successfully create an RFE using natural language
- Understand the agent workflow stages
- Interpret agent analysis and recommendations
@@ -12,10 +13,10 @@ Learn to create a Request for Enhancement (RFE) using vTeam's conversational AI
## Prerequisites 📋
-- [ ] vTeam installed and running locally ([Getting Started Guide](../../user-guide/getting-started.md))
-- [ ] OpenAI and Anthropic API keys configured in `src/.env`
+- [ ] vTeam installed and running ([Getting Started Guide](../../user-guide/getting-started.md))
+- [ ] Anthropic API key configured in your project settings
- [ ] Basic understanding of software requirements
-- [ ] Web browser for accessing the LlamaIndex chat interface
+- [ ] Web browser for accessing the vTeam interface
## Estimated Time ⏱️
@@ -25,33 +26,30 @@ Learn to create a Request for Enhancement (RFE) using vTeam's conversational AI
You're a Product Manager at a software company. Your development team has been asking for a **dark mode feature** to improve user experience and reduce eye strain during late-night coding sessions. You need to create a comprehensive RFE that the engineering team can implement immediately.
-## Step 1: Access the Chat Interface
+## Step 1: Access the vTeam Interface
+
+1. **Ensure vTeam is running**. For local development with OpenShift Local (CRC):
-1. **Ensure vTeam is running**. If not, start it:
```bash
- cd demos/rfe-builder
-
- # Terminal 1: Start API server
- uv run -m llama_deploy.apiserver
-
- # Terminal 2: Deploy workflow
- uv run llamactl deploy deployment.yml
+ cd vTeam
+ make dev-start
```
-2. **Open your browser** to `http://localhost:4501/deployments/rhoai-ai-feature-sizing/ui`
+2. **Open your browser** to the vTeam frontend URL (displayed after deployment)
-3. **Verify the chat interface**:
- - You should see a modern LlamaIndex chat interface
- - Look for starter questions or prompts
+3. **Verify the interface**:
+ - You should see the vTeam web interface
+ - Look for project creation or session options
- The interface should be ready to accept your input
-**✅ Checkpoint**: Confirm you can see the chat interface and it's responsive to input.
+**✅ Checkpoint**: Confirm you can access the vTeam interface and navigate the UI.
## Step 2: Initiate RFE Creation
Now let's create your first RFE using natural language.
1. **Start with a basic description** in the chat:
+
```
I want to add a dark mode toggle to our application. Users should be able to switch between light and dark themes, and their preference should be saved.
```
@@ -63,23 +61,26 @@ Now let's create your first RFE using natural language.
- Where should the toggle be located?
- Are there any specific design requirements?
-**✅ Checkpoint**: The LlamaDeploy workflow should begin processing and respond within 10-15 seconds.
+**✅ Checkpoint**: The vTeam workflow should begin processing and respond within 10-15 seconds.
## Step 3: Provide Additional Context
The AI will guide you through refining your requirements. Respond to its questions with details like:
**When asked about application type:**
+
```
This is a web-based project management application built with React. We have about 5,000 active users who work in different time zones.
```
**When asked about toggle placement:**
+
```
The toggle should be in the user settings page, but also accessible from the main navigation bar for quick switching.
```
**When asked about design requirements:**
+
```
We want to follow our existing design system. Dark mode should use our brand colors - dark gray (#2D3748) backgrounds with white text, and our signature blue (#3182CE) for accents.
```
@@ -97,6 +98,7 @@ The AI will present a structured RFE with sections like:
- **Technical Considerations**: Implementation notes
**Review the generated content and verify:**
+
- [ ] Title accurately reflects your request
- [ ] Description captures all discussed details
- [ ] Business justification makes sense for your user base
@@ -106,16 +108,17 @@ The AI will present a structured RFE with sections like:
## Step 5: Watch Multi-Agent Analysis
-The LlamaDeploy workflow automatically orchestrates all 7 agents:
+The vTeam workflow automatically orchestrates all 7 agents:
1. **Monitor the progress** in real-time
2. **Observe agent coordination**:
- All 7 agents analyze your RFE simultaneously
- - LlamaDeploy orchestrates the workflow execution
- - Real-time streaming shows agent progress
+ - vTeam orchestrates the workflow execution via Kubernetes
+ - Real-time updates show agent progress
3. **Watch for specialized analysis** from each agent perspective
**The 7-agent process:**
+
1. **Parker (PM)**: Prioritizes business value and stakeholder impact
2. **Archie (Architect)**: Evaluates technical feasibility and design approach
3. **Stella (Staff Engineer)**: Reviews implementation complexity and completeness
@@ -131,18 +134,21 @@ The LlamaDeploy workflow automatically orchestrates all 7 agents:
Each agent provides specialized analysis. Review their outputs:
### **Parker (PM) Analysis**
+
- Business value assessment (1-10 scale)
- User impact evaluation
- Resource requirement estimates
- Stakeholder communication recommendations
### **Archie (Architect) Analysis**
+
- Technical feasibility score
- Architecture impact assessment
- Integration points identified
- Risk factors and mitigation strategies
### **Stella (Staff Engineer) Analysis**
+
- Implementation complexity rating
- Development time estimates
- Required skills and resources
@@ -155,12 +161,14 @@ Each agent provides specialized analysis. Review their outputs:
Based on the agent analysis, you should see:
**Positive Indicators:**
+
- High business value score (7-9/10)
- Low-to-medium technical complexity
- Clear implementation path
- Strong user benefit justification
**Potential Concerns:**
+
- Design system impact
- Testing requirements for multiple themes
- Browser compatibility considerations
@@ -178,6 +186,7 @@ If the RFE is accepted, Derek (Delivery Owner) will create:
- **Implementation Timeline**: Development phases and milestones
**Review these artifacts for:**
+
- [ ] Clear, actionable user stories
- [ ] Testable acceptance criteria
- [ ] Realistic timeline estimates
@@ -198,6 +207,7 @@ If the RFE is accepted, Derek (Delivery Owner) will create:
### Verify the RFE Quality
A well-refined RFE should have:
+
- [ ] **Specific title** that clearly communicates the feature
- [ ] **Detailed description** with user context and motivation
- [ ] **Quantified business justification** with user impact metrics
@@ -208,16 +218,19 @@ A well-refined RFE should have:
## Troubleshooting 🛠️
### Agent Analysis Takes Too Long
+
- **Cause**: High API usage or network issues
- **Solution**: Check your internet connection and Anthropic API status
- **Workaround**: Try during off-peak hours
### Unclear Agent Recommendations
+
- **Cause**: Insufficient initial requirements
- **Solution**: Provide more context about your application, users, and constraints
- **Tip**: Include technical stack, user base size, and business priorities
### RFE Rejected by Agents
+
- **Cause**: Low business value, high complexity, or unclear requirements
- **Solution**: Refine your requirements based on agent feedback
- **Next Step**: Address specific concerns and resubmit
@@ -253,4 +266,4 @@ You've successfully completed Lab 1 when:
---
-**Next**: Ready to understand how agents collaborate? Continue with [Lab 2: Agent Interaction Deep Dive](lab-2-agent-interaction.md)
\ No newline at end of file
+**Next**: Ready to understand how agents collaborate? Continue with [Lab 2: Agent Interaction Deep Dive](lab-2-agent-interaction.md)
diff --git a/docs/labs/index.md b/docs/labs/index.md
index 38286c8b3..0852886a7 100644
--- a/docs/labs/index.md
+++ b/docs/labs/index.md
@@ -7,6 +7,7 @@ Welcome to the vTeam hands-on learning labs! These practical exercises will guid
Our lab curriculum is designed for progressive skill building:
### 🎯 Basic Labs (30-45 minutes each)
+
Perfect for getting started with vTeam fundamentals.
- **[Lab 1: Create Your First RFE](basic/lab-1-first-rfe.md)**
@@ -19,6 +20,7 @@ Perfect for getting started with vTeam fundamentals.
Master workflow states, progress tracking, and result interpretation
### 🔧 Advanced Labs (60-90 minutes each)
+
For users ready to customize and extend vTeam capabilities.
- **[Lab 4: Custom Agent Creation](advanced/lab-4-custom-agents.md)**
@@ -31,6 +33,7 @@ For users ready to customize and extend vTeam capabilities.
Validate custom configurations and ensure system reliability
### 🚀 Production Labs (90-120 minutes each)
+
Enterprise deployment and scaling considerations.
- **[Lab 7: Jira Integration Setup](production/lab-7-jira-integration.md)**
@@ -47,24 +50,31 @@ Enterprise deployment and scaling considerations.
Each lab follows a consistent structure for optimal learning:
### **Objective** 🎯
+
Clear learning goals and expected outcomes
### **Prerequisites** 📋
+
Required knowledge, tools, and setup before starting
### **Estimated Time** ⏱️
+
Realistic time commitment for completion
### **Step-by-Step Instructions** 📝
+
Detailed procedures with code examples and screenshots
### **Validation Checkpoints** ✅
+
Verify your progress at key milestones
### **Troubleshooting** 🛠️
+
Common issues and solutions
### **Further Exploration** 🔍
+
Additional resources and next steps
## Prerequisites
@@ -82,18 +92,12 @@ Before starting the labs, ensure you have:
### Recommended Setup
```bash
-# Clone and set up vTeam
+# Clone vTeam repository
git clone https://github.com/red-hat-data-services/vTeam.git
-cd vTeam/demos/rfe-builder
-
-# Create dedicated lab environment
-python -m venv venv-labs
-source venv-labs/bin/activate
-uv pip install -r requirements.txt
+cd vTeam
-# Copy and configure secrets
-cp .streamlit/secrets.toml.example .streamlit/secrets.toml
-# Edit with your API keys
+# Follow the deployment guide for your environment
+# See: docs/user-guide/getting-started.md for detailed instructions
```
### Lab-Specific Data
@@ -113,17 +117,20 @@ labs/
## Skills You'll Develop
### **Product Management Skills**
+
- Writing clear, actionable requirements
- Collaborating with AI agents for requirement refinement
- Stakeholder communication through agent interactions
### **Technical Skills**
+
- Python development for agent customization
- YAML configuration for agent personas
- REST API integration and testing
- Docker and Kubernetes deployment
### **Process Skills**
+
- Agile refinement best practices
- Workflow optimization and measurement
- Quality assurance for AI-generated content
@@ -179,4 +186,4 @@ Complete solutions are available after you've attempted each lab:
- **Familiar with basics?** → Jump to [Lab 4: Custom Agents](advanced/lab-4-custom-agents.md)
- **Ready for production?** → Begin with [Lab 7: Jira Integration](production/lab-7-jira-integration.md)
-Let's start building your vTeam expertise! 🚀
\ No newline at end of file
+Let's start building your vTeam expertise! 🚀
diff --git a/docs/reference/index.md b/docs/reference/index.md
index 3a2375379..2a948a45d 100644
--- a/docs/reference/index.md
+++ b/docs/reference/index.md
@@ -5,15 +5,19 @@ This section provides comprehensive reference material for the vTeam system, inc
## Quick Reference
### **[Agent Personas](agent-personas.md)** 🤖
+
Complete specifications for all 7 AI agents, including their roles, expertise areas, analysis frameworks, and interaction patterns.
### **[API Endpoints](api-endpoints.md)** 🌐
+
REST API documentation with request/response examples, authentication requirements, and error handling.
### **[Configuration Schema](configuration-schema.md)** ⚙️
+
Detailed configuration file formats, validation rules, and customization options for agents, workflows, and integrations.
### **[Glossary](glossary.md)** 📖
+
Definitions of terms, concepts, and acronyms used throughout the vTeam system and documentation.
## Agent System Reference
@@ -51,10 +55,12 @@ stateDiagram-v2
## API Reference Summary
### Base URLs
+
- **Development**: `http://localhost:8000`
- **Production**: `https://vteam.example.com/api`
### Authentication
+
```http
Authorization: Bearer your-api-key-here
Content-Type: application/json
@@ -102,21 +108,7 @@ outputSchema:
### System Configuration
-```toml
-# .streamlit/secrets.toml
-[anthropic]
-api_key = "sk-ant-api03-..."
-model = "claude-3-5-sonnet-20241022"
-
-[system]
-max_concurrent_agents = 7
-response_timeout_seconds = 120
-enable_caching = true
-
-[integrations]
-jira_enabled = true
-github_enabled = false
-```
+Configuration is managed via Kubernetes Custom Resources (ProjectSettings) and environment variables in deployment manifests.
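+
+As a rough sketch of what replaces the old `secrets.toml` values above (field names here are illustrative assumptions, not the actual ProjectSettings schema — check the CRD under `components/manifests/crds/`):
+
+```yaml
+# Illustrative ProjectSettings sketch; field names are assumptions.
+apiVersion: vteam.ambient-code.io/v1alpha1
+kind: ProjectSettings
+metadata:
+  name: default
+  namespace: vteam-dev
+spec:
+  anthropicApiKeySecretRef:
+    name: anthropic-api-key      # Secret holding ANTHROPIC_API_KEY
+    key: ANTHROPIC_API_KEY
+  maxConcurrentAgents: 7
+  responseTimeoutSeconds: 120
+```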
## Data Models
@@ -162,6 +154,7 @@ class AgentAnalysis(BaseModel):
### Common Issues
#### Agent Timeout
+
```json
{
"error": "AGENT_TIMEOUT",
@@ -172,6 +165,7 @@ class AgentAnalysis(BaseModel):
```
#### Invalid Configuration
+
```json
{
"error": "CONFIG_VALIDATION_ERROR",
@@ -206,12 +200,14 @@ class AgentAnalysis(BaseModel):
### Current Version: v2.0.0
**Major Changes:**
+
- LlamaDeploy workflow orchestration
- TypeScript frontend with @llamaindex/server
- Enhanced agent persona system
- Production-ready deployment architecture
**Breaking Changes:**
+
- API endpoints moved from `/api/v1/` to `/deployments/rhoai/`
- Agent configuration schema updated
- Authentication now requires Bearer tokens
@@ -236,14 +232,14 @@ To help with support requests, gather this information:
# Version info
git describe --tags
-# System details
-python --version
-streamlit --version
+# Kubernetes cluster info
+kubectl version
+kubectl get pods -n ambient-code
-# Environment check
-pip list | grep -E "(anthropic|streamlit|llamaindex)"
+# Component versions
+kubectl get pods -n ambient-code -o jsonpath='{.items[*].spec.containers[*].image}'
```
---
-This reference documentation is maintained alongside the codebase. Found an error or missing information? [Submit a pull request](https://github.com/red-hat-data-services/vTeam/pulls) or [create an issue](https://github.com/red-hat-data-services/vTeam/issues).
\ No newline at end of file
+This reference documentation is maintained alongside the codebase. Found an error or missing information? [Submit a pull request](https://github.com/red-hat-data-services/vTeam/pulls) or [create an issue](https://github.com/red-hat-data-services/vTeam/issues).
diff --git a/docs/user-guide/getting-started.md b/docs/user-guide/getting-started.md
index 79b2d1d9b..a0c604abc 100644
--- a/docs/user-guide/getting-started.md
+++ b/docs/user-guide/getting-started.md
@@ -1,208 +1,184 @@
# Getting Started
-Get vTeam up and running in just 5 minutes! This guide walks you through everything needed to create your first AI-refined RFE.
+Get vTeam up and running quickly! This guide walks you through everything needed to create your first AI-powered agentic session.
## Prerequisites
Before starting, ensure you have:
-- **Python 3.12+** (or Python 3.11+)
+- **Kubernetes or OpenShift cluster** (or OpenShift Local for development)
- **Git** for cloning the repository
-- **uv** package manager ([Installation guide](https://docs.astral.sh/uv/getting-started/installation/))
-- **pnpm** for TypeScript frontend (`npm i -g pnpm`)
-- **OpenAI API key** for embeddings and AI features
-- **Anthropic Claude API key** for conversational AI ([Get one here](https://console.anthropic.com/))
-- **Internet connection** for API calls and package downloads
+- **kubectl** or **oc** CLI tools
+- **Anthropic Claude API key** ([Get one here](https://console.anthropic.com/))
+- **Internet connection** for container image pulls and API calls
-## Installation
+For local development:
-### Step 1: Clone the Repository
+- **OpenShift Local (CRC)** - [Installation guide](https://developers.redhat.com/products/openshift-local/overview)
+- **Make** for running build commands
+- **Docker or Podman** (optional, for building custom images)
-```bash
-git clone https://github.com/red-hat-data-services/vTeam.git
-cd vTeam
-```
+## Quick Start - Local Development
-### Step 2: Set Up Environment
+The fastest way to get started is using OpenShift Local (CRC):
-Navigate to the RFE builder and install dependencies:
+### Step 1: Install OpenShift Local
```bash
-# Navigate to the RFE builder demo
-cd demos/rfe-builder
-
-# Install all dependencies (Python backend + TypeScript frontend)
-uv sync
-```
-
-This will automatically:
-- Create a virtual environment
-- Install Python dependencies from `pyproject.toml`
-- Set up the LlamaDeploy workflow system
+# Install CRC (one-time setup)
+brew install crc
-### Step 3: Configure API Access
+# Get your free Red Hat pull secret from:
+# https://console.redhat.com/openshift/create/local
-Set up your API keys in the environment file:
-
-```bash
-# Create environment file
-cp src/.env.example src/.env # If example exists
-# OR create new file:
-touch src/.env
+# Setup CRC (follow prompts to add pull secret)
+crc setup
```
-Add your API credentials to `src/.env`:
+### Step 2: Clone and Deploy
```bash
-# Required: OpenAI for embeddings
-OPENAI_API_KEY=your-openai-api-key-here
-
-# Required: Anthropic for conversational AI
-ANTHROPIC_API_KEY=sk-ant-api03-your-key-here
-
-# Optional: Vertex AI support
-VERTEX_PROJECT_ID=your-gcp-project-id
-VERTEX_LOCATION=us-central1
+# Clone the repository
+git clone https://github.com/red-hat-data-services/vTeam.git
+cd vTeam
-# Optional: Jira integration
-JIRA_BASE_URL=https://your-domain.atlassian.net
-JIRA_USERNAME=your-email@company.com
-JIRA_API_TOKEN=your-jira-api-token
+# Single command to start everything
+make dev-start
```
-!!! warning "Keep Your Keys Secret"
- Never commit `src/.env` to version control. It's already in `.gitignore`.
+This command will:
+
+- Start OpenShift Local if not running
+- Create the vteam-dev project/namespace
+- Deploy all components (frontend, backend, operator, runner)
+- Configure routes and services
+- Display the frontend URL when ready
-### Step 4: Generate Knowledge Base
+### Step 3: Configure API Key
-First, generate the document embeddings for the RAG system:
+After deployment, you need to configure your Anthropic API key:
```bash
-uv run generate
+# Configure project settings with your API key:
+# 1. Access the vTeam UI (URL shown after dev-start)
+# 2. Navigate to Project Settings
+# 3. Add your ANTHROPIC_API_KEY
```
-This creates vector embeddings from documents in the `./data` directory.
-
-### Step 5: Launch the Application
-
-Start the LlamaDeploy system in two steps:
+Alternatively, create it via CLI:
```bash
-# Terminal 1: Start the API server
-uv run -m llama_deploy.apiserver
-
-# Terminal 2: Deploy the workflow (wait for server to start)
-uv run llamactl deploy deployment.yml
+oc apply -f - < -n vteam-dev
-# Or use a different port in deployment.yml
-# Modify the apiServer.port setting
+# Check pod logs
+oc logs <pod-name> -n vteam-dev
+
+# Verify images are accessible
+oc get pods -n vteam-dev -o jsonpath='{.items[*].spec.containers[*].image}'
```
### Deployment Failures
-**Symptom**: `llamactl deploy` fails or times out
+**Symptom**: `make dev-start` fails or times out
**Solution**:
-1. Ensure the API server is running first (`uv run -m llama_deploy.apiserver`)
-2. Wait a few seconds between starting the server and deploying
-3. Check logs for specific error messages
-4. Verify all agents have valid configurations in `src/agents/`
-### Slow Agent Responses
+1. Check CRC status: `crc status`
+2. Ensure CRC has enough resources (at least 8 GB RAM recommended)
+3. Check deployment logs: `make dev-logs`
+4. Verify all CRDs are installed: `oc get crd | grep vteam`
+
+### Session Job Failures
-**Symptom**: Long wait times for multi-agent analysis
+**Symptom**: AgenticSession jobs fail or timeout
**Solution**:
-1. Check your internet connection
-2. Verify API service status at [Anthropic Status](https://status.anthropic.com/) and [OpenAI Status](https://status.openai.com/)
-3. Monitor the LlamaDeploy logs for bottlenecks
-4. Consider using smaller document sets during development
+
+1. Check job logs: `oc logs job/<job-name> -n vteam-dev`
+2. Verify workspace PVC is accessible
+3. Check operator logs for errors: `make dev-logs-operator`
+4. Ensure sufficient cluster resources for job pods
## What's Next?
Now that vTeam is running, you're ready to:
-1. **Learn RFE best practices** → [Creating RFEs Guide](creating-rfes.md)
-2. **Understand the AI agents** → [Agent Framework](agent-framework.md)
-3. **Try hands-on exercises** → [Lab 1: First RFE](../labs/basic/lab-1-first-rfe.md)
-4. **Customize your setup** → [Configuration Guide](configuration.md)
+1. **Explore the architecture** → [Developer Guide](../developer-guide/index.md)
+2. **Try RFE workflows** → [RFE Workflow Guide](rfe-workflow.md)
+3. **Try hands-on exercises** → [Labs](../labs/index.md)
+4. **Customize your deployment** → [Configuration Guide](configuration.md)
## Getting Help
If you encounter issues not covered here:
-- **Check the troubleshooting guide** → [Troubleshooting](troubleshooting.md)
-- **Search existing issues** → [GitHub Issues](https://github.com/red-hat-data-services/vTeam/issues)
+- **Check CLAUDE.md** → [Project Documentation](../../CLAUDE.md)
+- **Search existing issues** → [GitHub Issues](https://github.com/red-hat-data-services/vTeam/issues)
- **Create a new issue** with your error details and environment info
-Welcome to AI-assisted refinement! 🚀
\ No newline at end of file
+Welcome to Kubernetes-native AI automation! 🚀