diff --git a/.specify/memory/roadmap.md b/.specify/memory/roadmap.md index d493843..4751622 100644 --- a/.specify/memory/roadmap.md +++ b/.specify/memory/roadmap.md @@ -1,6 +1,6 @@ # Product Roadmap: Subtree CLI -**Version:** v1.7.0 +**Version:** v1.8.0 **Last Updated:** 2025-11-30 ## Vision & Goals @@ -34,7 +34,7 @@ Simplify git subtree management through declarative YAML configuration with safe - ✅ Multi-Pattern Extraction (5 user stories, 439 tests) - ✅ Extract Clean Mode (5 user stories, 477 tests) - ✅ **Brace Expansion: Embedded Path Separators** (4 user stories, 526 tests) -- ⏳ **Multi-Destination Extraction** — Fan-out to multiple `--to` paths +- ✅ **Multi-Destination Extraction** (5 user stories, 571 tests) — Fan-out to multiple `--to` paths - ⏳ Lint Command — Configuration integrity validation ## Product-Level Metrics & Success Criteria @@ -80,6 +80,7 @@ Simplify git subtree management through declarative YAML configuration with safe ## Change Log +- **v1.8.0** (2025-11-30): Multi-Destination Extraction complete (012-multi-destination-extraction) with 571 tests; fan-out to multiple `--to` paths, fail-fast validation, clean mode parity, bulk support (MINOR — feature complete) - **v1.7.0** (2025-11-30): Brace Expansion complete (011-brace-expansion) with 526 tests; embedded path separators, cartesian product, bash pass-through semantics (MINOR — feature complete) - **v1.6.0** (2025-11-29): Added Brace Expansion and Multi-Destination Extraction to Phase 3; marked Multi-Pattern Extraction and Extract Clean Mode complete (MINOR — new features) - **v1.5.0** (2025-11-27): Roadmap refactored to multi-file structure; added Multi-Pattern Extraction and Extract Clean Mode to Phase 3 (MINOR — new features, structural improvement) diff --git a/.specify/memory/roadmap/phase-3-advanced-operations.md b/.specify/memory/roadmap/phase-3-advanced-operations.md index 1ad3a89..7d67b24 100644 --- a/.specify/memory/roadmap/phase-3-advanced-operations.md +++ 
b/.specify/memory/roadmap/phase-3-advanced-operations.md @@ -82,7 +82,7 @@ Enable portable configuration validation, selective file extraction with compreh - Nested braces, escaping, numeric ranges deferred to backlog - **Delivered**: All 4 user stories (basic expansion, multiple groups, pass-through, empty alternative errors), 526 tests passing -### 6. Multi-Destination Extraction (Fan-Out) ⏳ PLANNED +### 6. Multi-Destination Extraction (Fan-Out) ✅ COMPLETE - **Purpose & user value**: Allows extracting matched files to multiple destinations simultaneously (e.g., `--to Lib/ --to Vendor/`), enabling distribution of extracted files to multiple locations without repeated commands - **Success metrics**: @@ -95,7 +95,10 @@ Enable portable configuration validation, selective file extraction with compreh - Fan-out semantics: N files × M destinations = N×M copy operations - Directory structure preserved at each destination - YAML schema: `to: ["path1/", "path2/"]` for persisted mappings - - Atomic per-destination: all files to one destination succeed or fail together + - Fail-fast validation: all destinations checked upfront before any writes + - PathNormalizer deduplicates equivalent paths (`Lib/`, `./Lib`, `Lib` → single destination) + - Backward compatible: single `--to` and existing YAML configs unchanged +- **Delivered**: All 5 user stories (CLI multi-dest, persist arrays, clean mode, fail-fast, bulk support), 571 tests passing ### 7. Lint Command ⏳ PLANNED @@ -116,7 +119,7 @@ Enable portable configuration validation, selective file extraction with compreh 3. Multi-Pattern Extraction ✅ 4. Extract Clean Mode ✅ 5. Brace Expansion in Patterns ✅ - 6. Multi-Destination Extraction ⏳ + 6. Multi-Destination Extraction ✅ 7. 
Lint Command ⏳ (final Phase 3 feature) - **Rationale**: Brace Expansion and Multi-Destination extend pattern capabilities before Lint validates all operations - **Cross-phase dependencies**: Requires Phase 2 Add Command for subtrees to exist @@ -127,7 +130,7 @@ This phase is successful when: - All seven features complete and tested - Extract supports multiple patterns and cleanup operations - Lint provides comprehensive integrity validation -- 600+ tests pass on macOS and Ubuntu (currently 526, growing) +- 600+ tests pass on macOS and Ubuntu (currently 571, growing) ## Risks & Assumptions @@ -138,6 +141,7 @@ This phase is successful when: ## Phase Notes +- 2025-11-30: Multi-Destination Extraction complete (012-multi-destination-extraction) with 571 tests; 5 user stories delivered - 2025-11-30: Brace Expansion complete (011-brace-expansion) with 526 tests; 4 user stories delivered - 2025-11-29: Added Brace Expansion and Multi-Destination Extraction features - 2025-11-29: Extract Clean Mode complete (010-extract-clean) with 477 tests; dry-run/preview mode deferred to Phase 5 backlog diff --git a/.specify/templates/spec-template.md b/.specify/templates/spec-template.md index c67d914..a4147e6 100644 --- a/.specify/templates/spec-template.md +++ b/.specify/templates/spec-template.md @@ -113,3 +113,17 @@ - **SC-002**: [Measurable metric, e.g., "System handles 1000 concurrent users without degradation"] - **SC-003**: [User satisfaction metric, e.g., "90% of users successfully complete primary task on first attempt"] - **SC-004**: [Business metric, e.g., "Reduce support tickets related to [X] by 50%"] + +## Validation Steps *(optional)* + + + +- [ ] Step 1: [Manual validation step, e.g., "Run `command --flag` and verify output shows X"] +- [ ] Step 2: [Manual validation step, e.g., "Create config with Y and verify behavior Z"] + +*Or state:* All validation covered by automated tests. 
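The phase-3 notes above describe `PathNormalizer` collapsing equivalent spellings (`Lib/`, `./Lib`, `Lib`) into a single destination. A minimal sketch of that behavior follows; only the `PathNormalizer.deduplicate(_:)` entry point appears in this patch, so the specific normalization rules (stripping `./` and trailing slashes) and the order-preserving, first-spelling-wins choice are assumptions, not the confirmed implementation:

```swift
import Foundation

// Sketch only: the real PathNormalizer may normalize differently.
enum PathNormalizer {
    // Reduce a path to a canonical comparison key.
    static func normalize(_ path: String) -> String {
        var p = path
        if p.hasPrefix("./") { p.removeFirst(2) }   // drop leading "./"
        while p.hasSuffix("/") { p.removeLast() }   // drop trailing "/"
        return p
    }

    /// Order-preserving dedup by normalized key; keeps the first spelling seen.
    static func deduplicate(_ paths: [String]) -> [String] {
        var seen = Set<String>()
        return paths.filter { seen.insert(normalize($0)).inserted }
    }
}

print(PathNormalizer.deduplicate(["Lib/", "./Lib", "Lib", "Vendor/"]))
// ["Lib/", "Vendor/"]
```

Keeping the user's original spelling for the surviving entry matters when the result is persisted to `subtree.yaml`, which matches the patch's note about preserving user formatting.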
diff --git a/.specify/templates/tasks-template.md b/.specify/templates/tasks-template.md
index 60f9be4..458fba7 100644
--- a/.specify/templates/tasks-template.md
+++ b/.specify/templates/tasks-template.md
@@ -249,3 +249,18 @@ With multiple developers:
 - Commit after each task or logical group
 - Stop at any checkpoint to validate story independently
 - Avoid: vague tasks, same file conflicts, cross-story dependencies that break independence
+
+---
+
+## Type Change Checklist
+
+When changing a field's type (e.g., `String?` → `[String]`, `Int` → `Int?`), verify:
+
+- [ ] All `!= nil` checks updated to appropriate emptiness checks (e.g., `.isEmpty`)
+- [ ] All `guard let` bindings updated to new type
+- [ ] All comparisons updated (e.g., `== nil` → `.isEmpty`)
+- [ ] All initializers updated to match new type
+- [ ] Run `swift build` (or equivalent) to catch remaining issues
+- [ ] Update tests to use new type format
+
+**Rationale**: Type changes propagate through the codebase. Missing updates cause compile errors mid-implementation.
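The checklist above can be illustrated with a small before/after sketch of the `String?` → `[String]` migration this patch performs on `--to`. The names below (`Mapping`, `hasAdHocArgs`) are illustrative stand-ins, not the actual `ExtractionMapping` implementation:

```swift
import Foundation

// Before (single destination):  struct Mapping { var to: String? }
// After (multi-destination) — "absent" is now modeled as "empty":
struct Mapping {
    var from: [String]
    var to: [String]   // was `String?`
}

let m = Mapping(from: ["**/*.h"], to: ["Lib/", "Vendor/"])

// Checklist item 1: `to != nil` becomes an emptiness check.
let hasAdHocArgs = !m.from.isEmpty && !m.to.isEmpty

// Checklist item 2: `guard let dest = to else { … }` becomes:
guard !m.to.isEmpty else { fatalError("Missing --to") }

print(hasAdHocArgs)   // true
print(m.to.count)     // 2
```

This mirrors the actual change in `ExtractCommand.swift`, where `to != nil` tests were rewritten as `!to.isEmpty` and the `guard let destinationValue = to` bindings became guards on a non-empty deduplicated array.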
diff --git a/README.md b/README.md index 8b5d0d1..2e8da30 100644 --- a/README.md +++ b/README.md @@ -165,6 +165,17 @@ subtree extract --name mylib \ --from "src/**/*.c" \ --to vendor/ +# Multi-destination extraction (012) - fan-out to multiple locations +subtree extract --name mylib \ + --from "**/*.h" \ + --to Lib/include/ \ + --to Vendor/headers/ + +# Combined: multi-pattern + multi-destination (cartesian product) +subtree extract --name mylib \ + --from "*.h" --from "*.c" \ + --to Lib/ --to Vendor/ + # Brace expansion (011) - compact patterns with {alternatives} subtree extract --name mylib --from "*.{h,c,cpp}" --to Sources/ subtree extract --name mylib --from "{src,test}/*.swift" --to Sources/ @@ -177,8 +188,8 @@ subtree extract --name crypto-lib \ # With exclusions (applies to all patterns) subtree extract --name mylib --from "src/**/*.c" --to Sources/ --exclude "**/test/**" -# Save multi-pattern mapping for future use -subtree extract --name mylib --from "include/**/*.h" --from "src/**/*.c" --to vendor/ --persist +# Save multi-destination mapping for future use +subtree extract --name mylib --from "**/*.h" --to Lib/ --to Vendor/ --persist # Execute saved mappings from subtree.yaml subtree extract --name example-lib @@ -193,6 +204,9 @@ Remove previously extracted files with checksum validation: # Clean specific files (validates checksums before deletion) subtree extract --clean --name mylib --from "src/**/*.c" --to Sources/ +# Clean from multiple destinations (012) +subtree extract --clean --name mylib --from "**/*.h" --to Lib/ --to Vendor/ + # Clean all saved mappings for a subtree subtree extract --clean --name mylib @@ -258,6 +272,7 @@ subtree validate --with-remote - `--persist` - Save mapping to subtree.yaml - `--force` - Overwrite git-tracked files / force delete modified files - `--clean` - Remove extracted files (validates checksums first) + - Multi-destination: Use `--to` multiple times for fan-out extraction - **`validate`** - Verify subtree 
integrity - `--name ` - Validate specific subtree @@ -287,16 +302,28 @@ subtrees: squash: true # Default: true commit: 0123456789abcdef... # Latest known commit extractions: # File extraction mappings - # Single pattern (legacy format) + # Single pattern, single destination (legacy format) - from: "docs/**/*.md" to: Docs/ - # Multi-pattern (009) - array format + # Multi-pattern (009) - union extraction - from: - "include/**/*.h" - "src/**/*.c" to: vendor/ exclude: - "**/test/**" + # Multi-destination (012) - fan-out to multiple locations + - from: "**/*.h" + to: + - Lib/include/ + - Vendor/headers/ + # Combined: multi-pattern + multi-destination + - from: + - "*.h" + - "*.c" + to: + - Lib/ + - Vendor/ ``` ## Platform Compatibility diff --git a/Sources/SubtreeLib/Commands/ExtractCommand.swift b/Sources/SubtreeLib/Commands/ExtractCommand.swift index 1a1978d..b301f5f 100644 --- a/Sources/SubtreeLib/Commands/ExtractCommand.swift +++ b/Sources/SubtreeLib/Commands/ExtractCommand.swift @@ -34,15 +34,18 @@ private func writeStderr(_ message: String) { FileHandle.standardError.write(Data(message.utf8)) } -/// Extract files from a subtree using glob patterns (008-extract-command + 009-multi-pattern) +/// Extract files from a subtree using glob patterns (008 + 009 + 012) /// /// This command supports two modes: -/// 1. Ad-hoc extraction: Extract files using command-line patterns (supports multiple `--from` flags) +/// 1. Ad-hoc extraction: Extract files using command-line patterns /// 2. Saved mappings: Extract files using saved extraction mappings from subtree.yaml /// /// Multi-pattern extraction (009): Use multiple `--from` flags to extract files from several /// directories in a single command. Files are deduplicated by relative path. /// +/// Multi-destination extraction (012): Use multiple `--to` flags to copy files to several +/// destinations simultaneously (fan-out). Destinations are deduplicated by normalized path. 
+/// /// Examples: /// ``` /// # Ad-hoc: Extract markdown docs from docs subtree @@ -51,6 +54,9 @@ private func writeStderr(_ message: String) { /// # Multi-pattern: Extract headers AND sources in one command /// subtree extract --name mylib --from "include/**/*.h" --from "src/**/*.c" --to vendor/ /// +/// # Multi-destination: Fan-out to multiple locations +/// subtree extract --name mylib --from "**/*.h" --to Lib/include/ --to Vendor/headers/ +/// /// # With exclusions (applies to all patterns) /// subtree extract --name mylib --from "src/**/*.c" --to Sources/ --exclude "**/test/**" /// ``` @@ -80,11 +86,17 @@ public struct ExtractCommand: AsyncParsableCommand { # Multi-pattern: Extract headers AND sources together subtree extract --name mylib --from "include/**/*.h" --from "src/**/*.c" --to vendor/ + # Multi-destination: Fan-out to multiple locations (012) + subtree extract --name mylib --from "**/*.h" --to Lib/ --to Vendor/ + + # Combined: Multi-pattern + multi-destination + subtree extract --name mylib --from "*.h" --from "*.c" --to Lib/ --to Vendor/ + # With exclusions (applies to all patterns) subtree extract --name mylib --from "src/**/*.c" --to Sources/ --exclude "**/test/**" - # Save mapping for future use - subtree extract --name mylib --from "src/**/*.c" --to Sources/ --persist + # Save mapping for future use (supports multi-destination) + subtree extract --name mylib --from "*.h" --to Lib/ --to Vendor/ --persist # Execute saved mappings subtree extract --name mylib @@ -94,6 +106,9 @@ public struct ExtractCommand: AsyncParsableCommand { # Clean extracted files (checksum validated) subtree extract --clean --name mylib --from "src/**/*.c" --to Sources/ + # Clean from multiple destinations + subtree extract --clean --name mylib --from "*.h" --to Lib/ --to Vendor/ + # Force clean modified files subtree extract --clean --force --name mylib --from "*.c" --to Sources/ @@ -130,9 +145,9 @@ public struct ExtractCommand: AsyncParsableCommand { @Option(name: .long, 
help: "Glob pattern to match files (can be repeated for multi-pattern extraction)") var from: [String] = [] - // T022: Destination option - @Option(name: .long, help: "Destination path relative to repository root (e.g., 'docs/', 'Sources/MyLib/')") - var to: String? + // T022 + T035: Destination option (repeatable for multi-destination extraction) + @Option(name: .long, help: "Destination path (can be repeated for multi-destination extraction)") + var to: [String] = [] // T066: --exclude repeatable flag for exclusion patterns @Option(name: .long, help: "Glob pattern to exclude files (can be repeated)") @@ -167,7 +182,7 @@ public struct ExtractCommand: AsyncParsableCommand { } // T111: Mode selection based on --from/--to options - let hasAdHocArgs = !from.isEmpty && to != nil + let hasAdHocArgs = !from.isEmpty && !to.isEmpty if hasAdHocArgs { // AD-HOC MODE: Extract specific pattern @@ -186,7 +201,7 @@ public struct ExtractCommand: AsyncParsableCommand { try await runAdHocExtraction(subtreeName: subtreeName) } else { // BULK MODE: Execute saved mappings - if !from.isEmpty || to != nil { + if !from.isEmpty || !to.isEmpty { writeStderr("❌ Error: --from and --to must both be provided or both omitted\n") Foundation.exit(1) } @@ -205,11 +220,19 @@ public struct ExtractCommand: AsyncParsableCommand { // MARK: - Ad-Hoc Extraction Mode private func runAdHocExtraction(subtreeName: String) async throws { - guard let destinationValue = to else { + // T036: Deduplicate destinations using PathNormalizer + let deduplicatedDestinations = PathNormalizer.deduplicate(to) + + guard !deduplicatedDestinations.isEmpty else { writeStderr("❌ Internal error: Missing --to in ad-hoc mode\n") Foundation.exit(2) } + // T039: Soft limit warning for >10 destinations (FR-011) + if deduplicatedDestinations.count > 10 { + print("⚠️ Warning: \(deduplicatedDestinations.count) destinations specified (>10)") + } + // T068: Subtree validation let gitRoot = try await validateGitRepository() let configPath 
= ConfigFileManager.configPath(gitRoot: gitRoot) @@ -217,8 +240,12 @@ public struct ExtractCommand: AsyncParsableCommand { let subtree = try validateSubtreeExists(name: subtreeName, in: config) try await validateSubtreePrefix(subtree.prefix, gitRoot: gitRoot) - // T069: Destination path validation - let normalizedDest = try validateDestination(destinationValue, gitRoot: gitRoot) + // T069: Validate all destination paths + var normalizedDestinations: [String] = [] + for dest in deduplicatedDestinations { + let normalizedDest = try validateDestination(dest, gitRoot: gitRoot) + normalizedDestinations.append(normalizedDest) + } // T039: Expand brace patterns in --from before matching (011-brace-expansion) let expandedFromPatterns = expandBracePatterns(from) @@ -272,47 +299,55 @@ public struct ExtractCommand: AsyncParsableCommand { print("⚠️ Pattern '\(zeroMatch.pattern)' matched 0 files") } - // T074: Destination directory creation - let fullDestPath = gitRoot + "/" + normalizedDest - try createDestinationDirectory(at: fullDestPath) - - // T132-T133: Check for git-tracked files before copying (unless --force) + // T047-T049: Fail-fast - validate ALL destinations upfront BEFORE copying to ANY if !force { - let trackedFiles = try await checkForTrackedFiles( - matchedFiles: allMatchedFiles, - fullDestPath: fullDestPath, - gitRoot: gitRoot - ) + var allTrackedFiles: [String] = [] + for normalizedDest in normalizedDestinations { + let fullDestPath = gitRoot + "/" + normalizedDest + let trackedFiles = try await checkForTrackedFiles( + matchedFiles: allMatchedFiles, + fullDestPath: fullDestPath, + gitRoot: gitRoot + ) + allTrackedFiles.append(contentsOf: trackedFiles) + } - if !trackedFiles.isEmpty { - // T135-T136: Show error and exit with code 2 - try handleOverwriteProtection(trackedFiles: trackedFiles) + if !allTrackedFiles.isEmpty { + // T049: Show all conflicts across all destinations + try handleOverwriteProtection(trackedFiles: allTrackedFiles) } } - // T072: File 
copying with FileManager - // T073: Directory structure preservation - var copiedCount = 0 - for (sourcePath, relativePath) in allMatchedFiles { - let destFilePath = fullDestPath + "/" + relativePath - try copyFilePreservingStructure(from: sourcePath, to: destFilePath) - copiedCount += 1 + // T037: Fan-out extraction to all destinations (after validation passes) + for normalizedDest in normalizedDestinations { + // T074: Destination directory creation + let fullDestPath = gitRoot + "/" + normalizedDest + try createDestinationDirectory(at: fullDestPath) + + // T072: File copying with FileManager + // T073: Directory structure preservation + var copiedCount = 0 + for (sourcePath, relativePath) in allMatchedFiles { + let destFilePath = fullDestPath + "/" + relativePath + try copyFilePreservingStructure(from: sourcePath, to: destFilePath) + copiedCount += 1 + } + + // T038: Per-destination success output (FR-017) + print("✅ Extracted \(copiedCount) file(s) to '\(normalizedDest)'") } - // T092: Save mapping if --persist flag is set + // T092 + T041: Save mapping if --persist flag is set var mappingSaved = false if persist { mappingSaved = try await saveMappingToConfig( patterns: from, - destination: destinationValue, // Use original destination to preserve user's formatting + destinations: deduplicatedDestinations, // T041: Use deduplicated destinations array excludePatterns: exclude, subtreeName: subtreeName, configPath: configPath ) } - - // T095: Contextual success messages - print("✅ Extracted \(copiedCount) file(s) from '\(subtreeName)' to '\(normalizedDest)'") if persist { if mappingSaved { print("📝 Saved extraction mapping to subtree.yaml") @@ -380,10 +415,10 @@ public struct ExtractCommand: AsyncParsableCommand { mappingNum: mappingNum, totalMappings: mappings.count ) - print(" ✅ [\(mappingNum)/\(mappings.count)] \(mapping.from.joined(separator: ", ")) → \(mapping.to) (\(count) file\(count == 1 ? 
"" : "s"))") + print(" ✅ [\(mappingNum)/\(mappings.count)] \(mapping.from.joined(separator: ", ")) → \(mapping.to.joined(separator: ", ")) (\(count) file\(count == 1 ? "" : "s"))") successfulMappings += 1 } catch let error as GlobMatcherError { - print(" ❌ [\(mappingNum)/\(mappings.count)] \(mapping.from.joined(separator: ", ")) → \(mapping.to) (invalid pattern)") + print(" ❌ [\(mappingNum)/\(mappings.count)] \(mapping.from.joined(separator: ", ")) → \(mapping.to.joined(separator: ", ")) (invalid pattern)") failedMappings.append(( subtreeName: subtree.name, mappingIndex: mappingNum, @@ -391,7 +426,7 @@ public struct ExtractCommand: AsyncParsableCommand { exitCode: 1 // User error )) } catch let error as LocalizedError where error.errorDescription?.contains("git-tracked") == true { - print(" ❌ [\(mappingNum)/\(mappings.count)] \(mapping.from.joined(separator: ", ")) → \(mapping.to) (blocked: git-tracked files)") + print(" ❌ [\(mappingNum)/\(mappings.count)] \(mapping.from.joined(separator: ", ")) → \(mapping.to.joined(separator: ", ")) (blocked: git-tracked files)") failedMappings.append(( subtreeName: subtree.name, mappingIndex: mappingNum, @@ -399,7 +434,7 @@ public struct ExtractCommand: AsyncParsableCommand { exitCode: 2 // System error (overwrite protection) )) } catch { - print(" ❌ [\(mappingNum)/\(mappings.count)] \(mapping.from.joined(separator: ", ")) → \(mapping.to) (failed)") + print(" ❌ [\(mappingNum)/\(mappings.count)] \(mapping.from.joined(separator: ", ")) → \(mapping.to.joined(separator: ", ")) (failed)") failedMappings.append(( subtreeName: subtree.name, mappingIndex: mappingNum, @@ -431,7 +466,7 @@ public struct ExtractCommand: AsyncParsableCommand { /// T023: Route clean mode based on arguments (ad-hoc vs bulk) private func runCleanMode() async throws { - let hasAdHocArgs = !from.isEmpty && to != nil + let hasAdHocArgs = !from.isEmpty && !to.isEmpty if hasAdHocArgs { // AD-HOC CLEAN MODE @@ -448,7 +483,7 @@ public struct ExtractCommand: 
AsyncParsableCommand { try await runAdHocClean(subtreeName: subtreeName) } else { // BULK CLEAN MODE - if !from.isEmpty || to != nil { + if !from.isEmpty || !to.isEmpty { writeStderr("❌ Error: --from and --to must both be provided or both omitted\n") Foundation.exit(1) } @@ -562,7 +597,7 @@ public struct ExtractCommand: AsyncParsableCommand { let message: String } - /// Clean files for a single extraction mapping + /// Clean files for a single extraction mapping (multi-destination support) private func cleanSingleMapping( mapping: ExtractionMapping, subtree: SubtreeEntry, @@ -571,90 +606,108 @@ public struct ExtractCommand: AsyncParsableCommand { mappingNum: Int, totalMappings: Int ) async throws -> Int { - // Normalize destination - let normalizedDest = try validateDestination(mapping.to, gitRoot: gitRoot) - let fullDestPath = gitRoot + "/" + normalizedDest + // T044 + T052: Deduplicate destinations + let deduplicatedDestinations = PathNormalizer.deduplicate(mapping.to) + + // Validate all destinations + var normalizedDestinations: [String] = [] + for dest in deduplicatedDestinations { + let normalizedDest = try validateDestination(dest, gitRoot: gitRoot) + normalizedDestinations.append(normalizedDest) + } // 011-brace-expansion: Expand brace patterns before matching let expandedFromPatterns = expandBracePatterns(mapping.from) let expandedExcludePatterns = expandBracePatterns(mapping.exclude ?? 
[]) - // Find files to clean - let filesToClean = try await findFilesToClean( - patterns: expandedFromPatterns, - excludePatterns: expandedExcludePatterns, - subtreePrefix: subtree.prefix, - destinationPath: fullDestPath, - gitRoot: gitRoot - ) - - // Zero files = success for this mapping - guard !filesToClean.isEmpty else { - print(" [\(mappingNum)/\(totalMappings)] → '\(normalizedDest)': 0 files (no matches)") - return 0 - } - - // Validate checksums (unless --force) - var validatedFiles: [CleanFileEntry] = [] - var skippedCount = 0 + // T044 + T052: Fan-out clean to all destinations + var totalDeletedCount = 0 - for file in filesToClean { - let validationResult = await validateChecksumForClean(file: file, force: force) + for normalizedDest in normalizedDestinations { + let fullDestPath = gitRoot + "/" + normalizedDest - switch validationResult { - case .valid: - validatedFiles.append(file) - case .modified(let sourceHash, let destHash): - // In bulk mode, report error but throw to be caught by continue-on-error - throw CleanMappingError( - exitCode: 1, - message: "File '\(file.relativePath)' modified (src: \(sourceHash.prefix(8))..., dst: \(destHash.prefix(8))...)" - ) - case .sourceMissing: - if force { + // Find files to clean + let filesToClean = try await findFilesToClean( + patterns: expandedFromPatterns, + excludePatterns: expandedExcludePatterns, + subtreePrefix: subtree.prefix, + destinationPath: fullDestPath, + gitRoot: gitRoot + ) + + // Zero files = success for this destination + guard !filesToClean.isEmpty else { + print(" [\(mappingNum)/\(totalMappings)] → '\(normalizedDest)': 0 files (no matches)") + continue + } + + // Validate checksums (unless --force) + var validatedFiles: [CleanFileEntry] = [] + var skippedCount = 0 + + for file in filesToClean { + let validationResult = await validateChecksumForClean(file: file, force: force) + + switch validationResult { + case .valid: validatedFiles.append(file) - } else { - skippedCount += 1 + case 
.modified(let sourceHash, let destHash): + // In bulk mode, report error but throw to be caught by continue-on-error + throw CleanMappingError( + exitCode: 1, + message: "File '\(file.relativePath)' in '\(normalizedDest)' modified (src: \(sourceHash.prefix(8))..., dst: \(destHash.prefix(8))...)" + ) + case .sourceMissing: + if force { + validatedFiles.append(file) + } else { + skippedCount += 1 + } } } - } - - // Delete validated files - var pruner = DirectoryPruner(boundary: fullDestPath) - var deletedCount = 0 - - for file in validatedFiles { - do { - try FileManager.default.removeItem(atPath: file.destinationPath) - pruner.add(parentOf: file.destinationPath) - deletedCount += 1 - } catch { - throw CleanMappingError( - exitCode: 3, - message: "Failed to delete '\(file.relativePath)': \(error.localizedDescription)" - ) + + // Delete validated files + var pruner = DirectoryPruner(boundary: fullDestPath) + var deletedCount = 0 + + for file in validatedFiles { + do { + try FileManager.default.removeItem(atPath: file.destinationPath) + pruner.add(parentOf: file.destinationPath) + deletedCount += 1 + } catch { + throw CleanMappingError( + exitCode: 3, + message: "Failed to delete '\(file.relativePath)': \(error.localizedDescription)" + ) + } } + + totalDeletedCount += deletedCount + + // Prune empty directories + let prunedDirs = try pruner.pruneEmpty() + + // Report progress for this destination + var statusParts: [String] = ["\(deletedCount) file(s)"] + if prunedDirs > 0 { + statusParts.append("\(prunedDirs) dir(s) pruned") + } + if skippedCount > 0 { + statusParts.append("\(skippedCount) skipped") + } + print(" [\(mappingNum)/\(totalMappings)] → '\(normalizedDest)': \(statusParts.joined(separator: ", "))") } - // Prune empty directories - let prunedDirs = try pruner.pruneEmpty() - - // Report progress - var statusParts: [String] = ["\(deletedCount) file(s)"] - if prunedDirs > 0 { - statusParts.append("\(prunedDirs) dir(s) pruned") - } - if skippedCount > 0 { - 
statusParts.append("\(skippedCount) skipped") - } - print(" [\(mappingNum)/\(totalMappings)] → '\(normalizedDest)': \(statusParts.joined(separator: ", "))") - - return deletedCount + return totalDeletedCount } - /// T024: Ad-hoc clean with pattern arguments + /// T024 + T043-T046: Ad-hoc clean with pattern arguments (multi-destination) private func runAdHocClean(subtreeName: String) async throws { - guard let destinationValue = to else { + // T043: Deduplicate destinations using PathNormalizer + let deduplicatedDestinations = PathNormalizer.deduplicate(to) + + guard !deduplicatedDestinations.isEmpty else { writeStderr("❌ Internal error: Missing --to in ad-hoc clean mode\n") Foundation.exit(2) } @@ -670,83 +723,92 @@ public struct ExtractCommand: AsyncParsableCommand { try await validateSubtreePrefix(subtree.prefix, gitRoot: gitRoot) } - // Normalize destination - let normalizedDest = try validateDestination(destinationValue, gitRoot: gitRoot) - let fullDestPath = gitRoot + "/" + normalizedDest + // Validate all destination paths + var normalizedDestinations: [String] = [] + for dest in deduplicatedDestinations { + let normalizedDest = try validateDestination(dest, gitRoot: gitRoot) + normalizedDestinations.append(normalizedDest) + } // 011-brace-expansion: Expand brace patterns before matching let expandedFromPatterns = expandBracePatterns(from) let expandedExcludePatterns = expandBracePatterns(exclude) - // T025: Find files to clean in destination - let filesToClean = try await findFilesToClean( - patterns: expandedFromPatterns, - excludePatterns: expandedExcludePatterns, - subtreePrefix: subtree.prefix, - destinationPath: fullDestPath, - gitRoot: gitRoot - ) - - // BC-007: Zero files matched = success - guard !filesToClean.isEmpty else { - print("✅ Cleaned 0 file(s) from '\(subtreeName)' destination '\(normalizedDest)'") - print(" ℹ️ No files matched the pattern(s)") - return - } - - // T026-T028: Validate checksums and handle missing sources - var 
validatedFiles: [CleanFileEntry] = [] - var skippedCount = 0 - - for file in filesToClean { - let validationResult = await validateChecksumForClean(file: file, force: force) + // T043-T046: Fan-out clean to all destinations + for normalizedDest in normalizedDestinations { + let fullDestPath = gitRoot + "/" + normalizedDest - switch validationResult { - case .valid: - validatedFiles.append(file) - case .modified(let sourceHash, let destHash): - // T027: Fail fast on checksum mismatch (unless --force) - writeStderr("❌ Error: File '\(file.relativePath)' has been modified\n\n") - writeStderr(" Source hash: \(sourceHash)\n") - writeStderr(" Dest hash: \(destHash)\n\n") - writeStderr("Suggestion: Use --force to delete modified files, or restore original content.\n") - Foundation.exit(1) - case .sourceMissing: - // T028: Skip with warning for missing source (unless --force) - if force { + // T025: Find files to clean in destination + let filesToClean = try await findFilesToClean( + patterns: expandedFromPatterns, + excludePatterns: expandedExcludePatterns, + subtreePrefix: subtree.prefix, + destinationPath: fullDestPath, + gitRoot: gitRoot + ) + + // BC-007: Zero files matched = success for this destination + guard !filesToClean.isEmpty else { + print("✅ Cleaned 0 file(s) from '\(subtreeName)' destination '\(normalizedDest)'") + continue + } + + // T026-T028: Validate checksums and handle missing sources + var validatedFiles: [CleanFileEntry] = [] + var skippedCount = 0 + + for file in filesToClean { + let validationResult = await validateChecksumForClean(file: file, force: force) + + switch validationResult { + case .valid: validatedFiles.append(file) - } else { - print("⚠️ Skipping '\(file.relativePath)': source file not found in subtree") - skippedCount += 1 + case .modified(let sourceHash, let destHash): + // T045: Fail fast on checksum mismatch (unless --force) + writeStderr("❌ Error: File '\(file.relativePath)' in '\(normalizedDest)' has been modified\n\n") + 
writeStderr(" Source hash: \(sourceHash)\n") + writeStderr(" Dest hash: \(destHash)\n\n") + writeStderr("Suggestion: Use --force to delete modified files, or restore original content.\n") + Foundation.exit(1) + case .sourceMissing: + // T028: Skip with warning for missing source (unless --force) + if force { + validatedFiles.append(file) + } else { + print("⚠️ Skipping '\(file.relativePath)': source file not found in subtree") + skippedCount += 1 + } } } - } - - // T029: Delete validated files - var pruner = DirectoryPruner(boundary: fullDestPath) - var deletedCount = 0 - - for file in validatedFiles { - do { - try FileManager.default.removeItem(atPath: file.destinationPath) - pruner.add(parentOf: file.destinationPath) - deletedCount += 1 - } catch { - writeStderr("❌ Error: Failed to delete '\(file.relativePath)': \(error.localizedDescription)\n") - Foundation.exit(3) + + // T029: Delete validated files + var pruner = DirectoryPruner(boundary: fullDestPath) + var deletedCount = 0 + + for file in validatedFiles { + do { + try FileManager.default.removeItem(atPath: file.destinationPath) + pruner.add(parentOf: file.destinationPath) + deletedCount += 1 + } catch { + writeStderr("❌ Error: Failed to delete '\(file.relativePath)': \(error.localizedDescription)\n") + Foundation.exit(3) + } } - } - - // T030: Prune empty directories - let prunedDirs = try pruner.pruneEmpty() - - // T031: Success output - print("✅ Cleaned \(deletedCount) file(s) from '\(subtreeName)' destination '\(normalizedDest)'") - if prunedDirs > 0 { - print(" 📁 Pruned \(prunedDirs) empty director\(prunedDirs == 1 ? 
"y" : "ies")") - } - if skippedCount > 0 { - print(" ⚠️ Skipped \(skippedCount) file(s) with missing source") + + // T030: Prune empty directories + let prunedDirs = try pruner.pruneEmpty() + + // T046: Per-destination success output + var statusParts: [String] = [] + if prunedDirs > 0 { + statusParts.append("\(prunedDirs) dir(s) pruned") + } + if skippedCount > 0 { + statusParts.append("\(skippedCount) skipped") + } + let suffix = statusParts.isEmpty ? "" : " (\(statusParts.joined(separator: ", ")))" + print("✅ Cleaned \(deletedCount) file(s) from '\(normalizedDest)'\(suffix)") } } @@ -839,8 +901,15 @@ public struct ExtractCommand: AsyncParsableCommand { mappingNum: Int, totalMappings: Int ) async throws -> Int { - // Validate destination - let normalizedDest = try validateDestination(mapping.to, gitRoot: gitRoot) + // T040: Deduplicate destinations using PathNormalizer (012-multi-destination) + let deduplicatedDestinations = PathNormalizer.deduplicate(mapping.to) + + // Validate all destinations + var normalizedDestinations: [String] = [] + for dest in deduplicatedDestinations { + let normalizedDest = try validateDestination(dest, gitRoot: gitRoot) + normalizedDestinations.append(normalizedDest) + } // 011-brace-expansion: Expand brace patterns before matching let expandedFromPatterns = expandBracePatterns(mapping.from) @@ -871,20 +940,20 @@ public struct ExtractCommand: AsyncParsableCommand { return 0 // No files matched, but not an error } - // Create destination directory - let fullDestPath = gitRoot + "/" + normalizedDest - try createDestinationDirectory(at: fullDestPath) - - // Check for tracked files (unless --force) - // In bulk mode, this will be caught as an error for this specific mapping + // T047-T049 + T053: Fail-fast - validate ALL destinations upfront BEFORE copying to ANY if !force { - let trackedFiles = try await checkForTrackedFiles( - matchedFiles: allMatchedFiles, - fullDestPath: fullDestPath, - gitRoot: gitRoot - ) + var allTrackedFiles: 
[String] = [] + for normalizedDest in normalizedDestinations { + let fullDestPath = gitRoot + "/" + normalizedDest + let trackedFiles = try await checkForTrackedFiles( + matchedFiles: allMatchedFiles, + fullDestPath: fullDestPath, + gitRoot: gitRoot + ) + allTrackedFiles.append(contentsOf: trackedFiles) + } - if !trackedFiles.isEmpty { + if !allTrackedFiles.isEmpty { // For bulk mode, throw an error that will be caught and reported struct OverwriteProtectionError: Error, LocalizedError { let trackedFiles: [String] @@ -892,19 +961,26 @@ public struct ExtractCommand: AsyncParsableCommand { "Would overwrite \(trackedFiles.count) git-tracked file(s)" } } - throw OverwriteProtectionError(trackedFiles: trackedFiles) + throw OverwriteProtectionError(trackedFiles: allTrackedFiles) } } - // Copy files - var copiedCount = 0 - for (sourcePath, relativePath) in allMatchedFiles { - let destFilePath = fullDestPath + "/" + relativePath - try copyFilePreservingStructure(from: sourcePath, to: destFilePath) - copiedCount += 1 + // T040: Fan-out to all destinations (after validation passes) + var totalCopiedCount = 0 + for normalizedDest in normalizedDestinations { + // Create destination directory + let fullDestPath = gitRoot + "/" + normalizedDest + try createDestinationDirectory(at: fullDestPath) + + // Copy files to this destination + for (sourcePath, relativePath) in allMatchedFiles { + let destFilePath = fullDestPath + "/" + relativePath + try copyFilePreservingStructure(from: sourcePath, to: destFilePath) + totalCopiedCount += 1 + } } - return copiedCount + return totalCopiedCount } // MARK: - T068: Subtree Validation @@ -1204,7 +1280,7 @@ public struct ExtractCommand: AsyncParsableCommand { /// /// - Parameters: /// - patterns: Array of glob patterns (from field) - /// - destination: Destination path (to field) + /// - destinations: Destination paths (to field) - T041: now supports array /// - excludePatterns: Exclusion patterns (exclude field) /// - subtreeName: Name of 
subtree to save mapping to /// - configPath: Path to subtree.yaml @@ -1212,25 +1288,43 @@ public struct ExtractCommand: AsyncParsableCommand { /// - Throws: I/O errors or config errors private func saveMappingToConfig( patterns: [String], - destination: String, + destinations: [String], excludePatterns: [String], subtreeName: String, configPath: String ) async throws -> Bool { - // T093: Construct ExtractionMapping from CLI flags - // Use single-pattern init for single pattern, multi-pattern for multiple + // T093 + T041: Construct ExtractionMapping from CLI flags + // Use appropriate initializer based on single vs multiple patterns/destinations let mapping: ExtractionMapping - if patterns.count == 1 { + let excludeValue = excludePatterns.isEmpty ? nil : excludePatterns + + if patterns.count == 1 && destinations.count == 1 { + // Single pattern, single destination mapping = ExtractionMapping( from: patterns[0], - to: destination, - exclude: excludePatterns.isEmpty ? nil : excludePatterns + to: destinations[0], + exclude: excludeValue + ) + } else if patterns.count == 1 { + // Single pattern, multiple destinations + mapping = ExtractionMapping( + from: patterns[0], + toDestinations: destinations, + exclude: excludeValue + ) + } else if destinations.count == 1 { + // Multiple patterns, single destination + mapping = ExtractionMapping( + fromPatterns: patterns, + to: destinations[0], + exclude: excludeValue ) } else { + // Multiple patterns, multiple destinations mapping = ExtractionMapping( fromPatterns: patterns, - to: destination, - exclude: excludePatterns.isEmpty ? 
nil : excludePatterns + toDestinations: destinations, + exclude: excludeValue ) } diff --git a/Sources/SubtreeLib/Configuration/ExtractionMapping.swift b/Sources/SubtreeLib/Configuration/ExtractionMapping.swift index eb4c2c7..1db1c3b 100644 --- a/Sources/SubtreeLib/Configuration/ExtractionMapping.swift +++ b/Sources/SubtreeLib/Configuration/ExtractionMapping.swift @@ -3,18 +3,21 @@ /// Defines how to extract files from a subtree to the project structure using glob patterns. /// Stored in subtree.yaml under each subtree's `extractions` array. /// -/// The `from` field supports both legacy string format and new array format: -/// - Legacy: `from: "pattern"` (single pattern as string) -/// - New: `from: ["p1", "p2"]` (multiple patterns as array) +/// Both `from` and `to` fields support legacy string format and new array format: +/// - Legacy: `from: "pattern"`, `to: "path/"` (single value as string) +/// - New: `from: ["p1", "p2"]`, `to: ["Lib/", "Vendor/"]` (multiple values as array) /// -/// Internally, patterns are always stored as an array for uniform processing. +/// Internally, both fields are always stored as arrays for uniform processing. +/// Multi-destination extraction (012) enables fan-out: N files × M destinations. public struct ExtractionMapping: Equatable, Sendable { /// Source glob patterns for matching files within the subtree /// Always stored as array internally; single patterns are wrapped public let from: [String] - /// Destination path (relative to repository root) where files are copied - public let to: String + /// Destination paths (relative to repository root) where files are copied + /// Always stored as array internally; single destinations are wrapped + /// Fan-out: files are copied to EVERY destination in this array + public let to: [String] /// Optional array of glob patterns to exclude from matches public let exclude: [String]? 
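Aside: the "array first, fall back to single string" decoding strategy that this diff applies to the `to` field can be exercised in isolation. A minimal standalone sketch, using Foundation's `JSONDecoder` and a hypothetical `Mapping` type purely to stay dependency-free (the real project decodes `subtree.yaml`, not JSON):

```swift
import Foundation

// Sketch of the dual-format `to` field: legacy single string vs. new array.
struct Mapping: Codable {
    let to: [String]  // always an array internally, mirroring ExtractionMapping

    enum CodingKeys: String, CodingKey { case to }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        if let many = try? container.decode([String].self, forKey: .to) {
            to = many                                             // to: ["Lib/", "Vendor/"]
        } else {
            to = [try container.decode(String.self, forKey: .to)] // legacy to: "Lib/"
        }
    }
}

let legacy = try! JSONDecoder().decode(Mapping.self, from: Data(#"{"to": "Lib/"}"#.utf8))
let modern = try! JSONDecoder().decode(Mapping.self, from: Data(#"{"to": ["Lib/", "Vendor/"]}"#.utf8))
print(legacy.to)  // ["Lib/"]
print(modern.to)  // ["Lib/", "Vendor/"]
```

Encoding takes the symmetric choice (string when `count == 1`, array otherwise), so a config written by one version round-trips cleanly through the other.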
@@ -29,21 +32,21 @@ public struct ExtractionMapping: Equatable, Sendable { // MARK: - Initializers - /// Initialize an extraction mapping with a single pattern (convenience) + /// Initialize an extraction mapping with a single pattern and single destination (common case) /// /// - Parameters: /// - from: Single glob pattern matching source files (e.g., "docs/**/*.md") - /// - to: Destination path for copied files (e.g., "project-docs/") + /// - to: Single destination path for copied files (e.g., "project-docs/") /// - exclude: Optional array of glob patterns to exclude (e.g., ["docs/internal/**"]) public init(from: String, to: String, exclude: [String]? = nil) { self.from = [from] - self.to = to + self.to = [to] self.exclude = exclude } - /// Initialize an extraction mapping with multiple patterns (009-multi-pattern-extraction) + /// Initialize an extraction mapping with multiple patterns and single destination /// - /// Use this initializer when extracting files from multiple directories in a single mapping. + /// Use this initializer when extracting files from multiple directories to one destination. /// Files matching ANY pattern are included (union behavior), and duplicates are removed. /// /// Example: @@ -56,11 +59,57 @@ public struct ExtractionMapping: Equatable, Sendable { /// /// - Parameters: /// - fromPatterns: Array of glob patterns matching source files (processed as union) - /// - to: Destination path for copied files (relative to repository root) + /// - to: Single destination path for copied files (relative to repository root) /// - exclude: Optional array of glob patterns to exclude (applies to all patterns) public init(fromPatterns: [String], to: String, exclude: [String]? 
= nil) { self.from = fromPatterns - self.to = to + self.to = [to] + self.exclude = exclude + } + + /// Initialize an extraction mapping with a single pattern and multiple destinations (012-multi-destination) + /// + /// Use this initializer for fan-out extraction: one pattern → multiple destinations. + /// Files matching the pattern are copied to EVERY destination. + /// + /// Example: + /// ```swift + /// let mapping = ExtractionMapping( + /// from: "include/**/*.h", + /// toDestinations: ["Lib/", "Vendor/"] + /// ) + /// ``` + /// + /// - Parameters: + /// - from: Single glob pattern matching source files + /// - toDestinations: Array of destination paths (each receives all matched files) + /// - exclude: Optional array of glob patterns to exclude + public init(from: String, toDestinations: [String], exclude: [String]? = nil) { + self.from = [from] + self.to = toDestinations + self.exclude = exclude + } + + /// Initialize an extraction mapping with multiple patterns and multiple destinations (012-multi-destination) + /// + /// Use this initializer for combined fan-out: union of patterns → multiple destinations. + /// Files matching ANY pattern are copied to EVERY destination (N files × M destinations). + /// + /// Example: + /// ```swift + /// let mapping = ExtractionMapping( + /// fromPatterns: ["include/**/*.h", "src/**/*.c"], + /// toDestinations: ["Lib/", "Vendor/"] + /// ) + /// ``` + /// + /// - Parameters: + /// - fromPatterns: Array of glob patterns (processed as union) + /// - toDestinations: Array of destination paths (each receives all matched files) + /// - exclude: Optional array of glob patterns to exclude + public init(fromPatterns: [String], toDestinations: [String], exclude: [String]? 
= nil) { + self.from = fromPatterns + self.to = toDestinations self.exclude = exclude } } @@ -69,11 +118,11 @@ public struct ExtractionMapping: Equatable, Sendable { extension ExtractionMapping: Codable { - /// Custom decoder that handles both string and array formats for `from` field + /// Custom decoder that handles both string and array formats for `from` and `to` fields public init(from decoder: Decoder) throws { let container = try decoder.container(keyedBy: CodingKeys.self) - // Try decoding as array first, then fall back to single string + // Decode `from`: try array first, fall back to single string if let patterns = try? container.decode([String].self, forKey: .from) { // Validate: reject empty arrays guard !patterns.isEmpty else { @@ -90,22 +139,44 @@ extension ExtractionMapping: Codable { self.from = [single] } - self.to = try container.decode(String.self, forKey: .to) + // Decode `to`: try array first, fall back to single string (012-multi-destination) + if let destinations = try? 
container.decode([String].self, forKey: .to) { + // Validate: reject empty arrays + guard !destinations.isEmpty else { + throw DecodingError.dataCorruptedError( + forKey: .to, + in: container, + debugDescription: "to destinations cannot be empty" + ) + } + self.to = destinations + } else { + // Fall back to single string (legacy format) + let single = try container.decode(String.self, forKey: .to) + self.to = [single] + } + self.exclude = try container.decodeIfPresent([String].self, forKey: .exclude) } - /// Custom encoder that outputs string for single pattern, array for multiple + /// Custom encoder that outputs string for single value, array for multiple public func encode(to encoder: Encoder) throws { var container = encoder.container(keyedBy: CodingKeys.self) - // Serialize as string if single pattern, array if multiple + // Serialize `from` as string if single, array if multiple if from.count == 1 { try container.encode(from[0], forKey: .from) } else { try container.encode(from, forKey: .from) } - try container.encode(to, forKey: .to) + // Serialize `to` as string if single, array if multiple (012-multi-destination) + if to.count == 1 { + try container.encode(to[0], forKey: .to) + } else { + try container.encode(to, forKey: .to) + } + try container.encodeIfPresent(exclude, forKey: .exclude) } } diff --git a/Sources/SubtreeLib/Utilities/PathNormalizer.swift b/Sources/SubtreeLib/Utilities/PathNormalizer.swift new file mode 100644 index 0000000..e0f9611 --- /dev/null +++ b/Sources/SubtreeLib/Utilities/PathNormalizer.swift @@ -0,0 +1,76 @@ +/// Normalizes paths for deduplication (012-multi-destination-extraction) +/// +/// Handles common path variations that users might provide for the same destination: +/// - Trailing slashes: `Lib/` → `Lib` +/// - Leading `./`: `./Lib` → `Lib` +/// - Combinations: `./Lib/` → `Lib` +/// +/// Used by `ExtractCommand` to deduplicate multiple `--to` destinations before extraction. 
+/// +/// Example: +/// ```swift +/// // All these normalize to "Lib": +/// PathNormalizer.normalize("Lib") // "Lib" +/// PathNormalizer.normalize("Lib/") // "Lib" +/// PathNormalizer.normalize("./Lib") // "Lib" +/// PathNormalizer.normalize("./Lib/") // "Lib" +/// +/// // Deduplicate equivalent paths: +/// PathNormalizer.deduplicate(["Lib/", "Lib", "./Lib"]) // ["Lib/"] +/// ``` +public enum PathNormalizer { + + /// Normalize a single path by removing leading `./` and trailing `/` + /// + /// - Parameter path: The path to normalize + /// - Returns: Normalized path with leading `./` and trailing `/` removed + /// + /// Edge cases: + /// - Empty string returns empty string + /// - Single `.` returns `.` (current directory) + /// - Single `/` returns `/` (root) + public static func normalize(_ path: String) -> String { + var result = path + + // Remove leading ./ (can be repeated: ././path → path) + while result.hasPrefix("./") { + result = String(result.dropFirst(2)) + } + + // Remove trailing / (except for root "/") + while result.hasSuffix("/") && result.count > 1 { + result = String(result.dropLast()) + } + + return result + } + + /// Deduplicate paths after normalization, preserving order and original form + /// + /// Returns the first occurrence of each normalized path, keeping the user's + /// original formatting. This allows users to write `--to Lib/ --to Lib` and + /// have it deduplicated to a single copy operation. 
+    ///
+    /// - Parameter paths: Array of paths to deduplicate
+    /// - Returns: Array with duplicates removed, preserving order and original form
+    ///
+    /// Example:
+    /// ```swift
+    /// deduplicate(["Lib/", "Vendor", "./Lib"])
+    /// // Returns: ["Lib/", "Vendor"] — "Lib/" is kept, "./Lib" removed as duplicate
+    /// ```
+    public static func deduplicate(_ paths: [String]) -> [String] {
+        var seen = Set<String>()
+        var unique: [String] = []
+
+        for path in paths {
+            let normalized = normalize(path)
+            if !seen.contains(normalized) {
+                seen.insert(normalized)
+                unique.append(path)  // Keep original form
+            }
+        }
+
+        return unique
+    }
+}
diff --git a/Tests/IntegrationTests/ExtractMultiDestTests.swift b/Tests/IntegrationTests/ExtractMultiDestTests.swift
new file mode 100644
index 0000000..82b03a6
--- /dev/null
+++ b/Tests/IntegrationTests/ExtractMultiDestTests.swift
@@ -0,0 +1,888 @@
+import Testing
+import Foundation
+
+/// Integration tests for multi-destination extraction (012-multi-destination-extraction)
+///
+/// Tests the complete workflow of extracting files to multiple destinations using
+/// multiple `--to` flags (fan-out semantics).
+///
+/// **Purist Approach**: No library imports. Tests execute CLI commands only and validate
+/// via file system checks, stdout/stderr output, and YAML string matching.
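The normalization and dedup rules above are small enough to sanity-check in a scratch file. A standalone sketch that mirrors (does not import) the `PathNormalizer` logic from this diff, condensing deduplication with `Set.insert(_:)`'s `inserted` result:

```swift
// Re-implementation sketch of PathNormalizer's rules, for illustration only.
func normalize(_ path: String) -> String {
    var result = path
    // Strip leading "./" (repeatedly, so "././Lib" also normalizes)
    while result.hasPrefix("./") { result = String(result.dropFirst(2)) }
    // Strip trailing "/" except for the bare root "/"
    while result.hasSuffix("/") && result.count > 1 { result = String(result.dropLast()) }
    return result
}

func deduplicate(_ paths: [String]) -> [String] {
    var seen = Set<String>()
    // insert(_:) returns (inserted, memberAfterInsert); keeping only first
    // insertions preserves order and the user's original formatting.
    return paths.filter { seen.insert(normalize($0)).inserted }
}

print(normalize("./Lib/"))                       // Lib
print(deduplicate(["Lib/", "Vendor", "./Lib"]))  // ["Lib/", "Vendor"]
```

The filter-based variant behaves the same as the loop in the diff: `--to Lib/ --to ./Lib` collapses to one copy operation while keeping the `Lib/` spelling the user typed first.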
+@Suite("Extract Multi-Destination Integration Tests") +struct ExtractMultiDestTests { + + // MARK: - Helper Functions + + /// Create a subtree.yaml config file with a single subtree + private func writeSubtreeConfig( + name: String, + remote: String, + prefix: String, + commit: String, + to path: String + ) throws { + let yaml = """ + subtrees: + - name: \(name) + remote: \(remote) + prefix: \(prefix) + commit: \(commit) + """ + try yaml.write(toFile: path, atomically: true, encoding: .utf8) + } + + /// Create test files in a directory structure + private func createTestFiles( + in directory: String, + files: [(path: String, content: String)] + ) throws { + let fm = FileManager.default + for (path, content) in files { + let fullPath = directory + "/" + path + let dirPath = (fullPath as NSString).deletingLastPathComponent + try fm.createDirectory(atPath: dirPath, withIntermediateDirectories: true) + try content.write(toFile: fullPath, atomically: true, encoding: .utf8) + } + } + + // MARK: - Phase 3: P1 User Stories (US1 + US2) + + // T027: Multiple --to flags extract to all destinations + @Test("Multiple --to flags extract files to all destinations") + func testMultipleToFlagsExtractToAllDestinations() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + // Create subtree directory with test files + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("include/header.h", "// header"), + ("src/impl.c", "// impl") + ]) + + // Create subtree config + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // Extract to two destinations + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Dest1/", "--to", "Dest2/"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode == 0, "Should succeed. stderr: \(result.stderr)") + + // Verify files exist in both destinations + let fm = FileManager.default + #expect(fm.fileExists(atPath: fixture.path.string + "/Dest1/include/header.h")) + #expect(fm.fileExists(atPath: fixture.path.string + "/Dest1/src/impl.c")) + #expect(fm.fileExists(atPath: fixture.path.string + "/Dest2/include/header.h")) + #expect(fm.fileExists(atPath: fixture.path.string + "/Dest2/src/impl.c")) + } + + // T028: Same files appear in every destination + @Test("Same files appear in every destination with identical content") + func testSameFilesInEveryDestination() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + let content = "// unique content \(UUID())" + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", content) + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "A/", "--to", "B/", "--to", "C/"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode == 0) + + // All destinations should have identical content + let contentA = try String(contentsOfFile: fixture.path.string + "/A/file.txt", encoding: .utf8) + let contentB = try String(contentsOfFile: fixture.path.string + "/B/file.txt", encoding: .utf8) + let contentC = try String(contentsOfFile: fixture.path.string + "/C/file.txt", encoding: .utf8) + + #expect(contentA == content) + #expect(contentB == content) + #expect(contentC == content) + } + + // T029: Directory structure preserved identically + @Test("Directory structure preserved identically at each destination") + func testDirectoryStructurePreserved() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("a/b/c/deep.txt", "deep"), + ("x/shallow.txt", "shallow") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Out1/", "--to", "Out2/"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode == 0) + + let fm = FileManager.default + // Same structure in both destinations + #expect(fm.fileExists(atPath: fixture.path.string + "/Out1/a/b/c/deep.txt")) + #expect(fm.fileExists(atPath: fixture.path.string + "/Out1/x/shallow.txt")) + #expect(fm.fileExists(atPath: fixture.path.string + "/Out2/a/b/c/deep.txt")) + #expect(fm.fileExists(atPath: fixture.path.string + "/Out2/x/shallow.txt")) + } + + // T030: Duplicate destinations deduplicated (./Lib = Lib/ = Lib) + @Test("Duplicate destinations are deduplicated") + func testDuplicateDestinationsDeduplicated() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "content") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // Use equivalent paths that should deduplicate + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Lib/", "--to", "./Lib", "--to", "Lib"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode == 0) + + // Should only show one destination in output (deduplicated) + let outputLines = result.stdout.components(separatedBy: "\n") + .filter { $0.contains("Extracted") } + #expect(outputLines.count == 1, "Should deduplicate to single destination. Got: \(result.stdout)") + } + + // T031: Per-destination success output shown + @Test("Per-destination success output is shown") + func testPerDestinationOutput() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "content") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Dest1/", "--to", "Dest2/"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode == 0) + + // Should show output for each destination + #expect(result.stdout.contains("Dest1") || result.stdout.contains("Dest1/"), "Should mention Dest1") + #expect(result.stdout.contains("Dest2") || result.stdout.contains("Dest2/"), "Should mention Dest2") + } + + // T031b: Overlapping destinations both receive files + @Test("Overlapping destinations both receive files") + func testOverlappingDestinations() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "content") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // Overlapping: Lib/ and Lib/Sub/ + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Lib/", "--to", "Lib/Sub/"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode == 0) + + let fm = FileManager.default + // Both destinations should have the file + #expect(fm.fileExists(atPath: fixture.path.string + "/Lib/file.txt")) + #expect(fm.fileExists(atPath: fixture.path.string + "/Lib/Sub/file.txt")) + } + + // T032: Legacy string `to` format still works + @Test("Legacy single --to flag still works") + func testLegacySingleToWorks() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "content") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // Single --to (backward compatible) + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Output/"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode == 0) + + let fm = FileManager.default + #expect(fm.fileExists(atPath: fixture.path.string + "/Output/file.txt")) + } + + // T033: --persist stores destinations as array + @Test("Persist stores multiple destinations as array") + func testPersistStoresDestinationsAsArray() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "content") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // Extract with persist and multiple destinations + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*.txt", "--to", "Lib/", "--to", "Vendor/", "--persist"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode == 0, "Should succeed. 
stderr: \(result.stderr)") + + // Check YAML has array format for to + let configContent = try String(contentsOfFile: fixture.path.string + "/subtree.yaml", encoding: .utf8) + #expect(configContent.contains("- Lib/") || configContent.contains("- \"Lib/\""), + "Should have array with Lib/. Config: \(configContent)") + #expect(configContent.contains("- Vendor/") || configContent.contains("- \"Vendor/\""), + "Should have array with Vendor/. Config: \(configContent)") + } + + // T034: Bulk extract with persisted array destinations works + @Test("Bulk extract with persisted multi-destination mappings") + func testBulkExtractWithPersistedMultiDest() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "content") + ]) + + // Create config with multi-destination mapping already saved + let yaml = """ + subtrees: + - name: lib + remote: https://example.com/lib.git + prefix: vendor/lib + commit: \(try await fixture.getCurrentCommit()) + extractions: + - from: "**/*.txt" + to: + - "BulkDest1/" + - "BulkDest2/" + """ + try yaml.write(toFile: fixture.path.string + "/subtree.yaml", atomically: true, encoding: .utf8) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // Run bulk extraction + let result = try await harness.run( + arguments: ["extract", "--name", "lib"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode == 0, "Should succeed. 
stderr: \(result.stderr)") + + // Both destinations should have the file + let fm = FileManager.default + #expect(fm.fileExists(atPath: fixture.path.string + "/BulkDest1/file.txt"), + "BulkDest1 should have file") + #expect(fm.fileExists(atPath: fixture.path.string + "/BulkDest2/file.txt"), + "BulkDest2 should have file") + } + + // MARK: - Phase 4: US3 (Clean Mode) + + // T043: --clean removes files from all destinations + @Test("Clean removes files from all destinations") + func testCleanRemovesFromAllDestinations() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "content") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // First extract to multiple destinations + _ = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Dest1/", "--to", "Dest2/"], + workingDirectory: fixture.path + ) + + let fm = FileManager.default + #expect(fm.fileExists(atPath: fixture.path.string + "/Dest1/file.txt")) + #expect(fm.fileExists(atPath: fixture.path.string + "/Dest2/file.txt")) + + // Now clean from all destinations + let cleanResult = try await harness.run( + arguments: ["extract", "--clean", "--name", "lib", "--from", "**/*", "--to", "Dest1/", "--to", "Dest2/"], + workingDirectory: fixture.path + ) + + #expect(cleanResult.exitCode == 0, "Clean should succeed. 
stderr: \(cleanResult.stderr)") + #expect(!fm.fileExists(atPath: fixture.path.string + "/Dest1/file.txt"), "Dest1 file should be removed") + #expect(!fm.fileExists(atPath: fixture.path.string + "/Dest2/file.txt"), "Dest2 file should be removed") + } + + // T044: Clean with persisted multi-dest mapping works + @Test("Clean with persisted multi-destination mapping") + func testCleanWithPersistedMultiDestMapping() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "content") + ]) + + // Create config with multi-destination mapping + let yaml = """ + subtrees: + - name: lib + remote: https://example.com/lib.git + prefix: vendor/lib + commit: \(try await fixture.getCurrentCommit()) + extractions: + - from: "**/*.txt" + to: + - "CleanDest1/" + - "CleanDest2/" + """ + try yaml.write(toFile: fixture.path.string + "/subtree.yaml", atomically: true, encoding: .utf8) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // Extract using bulk mode + _ = try await harness.run( + arguments: ["extract", "--name", "lib"], + workingDirectory: fixture.path + ) + + let fm = FileManager.default + #expect(fm.fileExists(atPath: fixture.path.string + "/CleanDest1/file.txt")) + #expect(fm.fileExists(atPath: fixture.path.string + "/CleanDest2/file.txt")) + + // Clean using bulk mode + let cleanResult = try await harness.run( + arguments: ["extract", "--clean", "--name", "lib"], + workingDirectory: fixture.path + ) + + #expect(cleanResult.exitCode == 0, "Clean should succeed. 
stderr: \(cleanResult.stderr)") + #expect(!fm.fileExists(atPath: fixture.path.string + "/CleanDest1/file.txt")) + #expect(!fm.fileExists(atPath: fixture.path.string + "/CleanDest2/file.txt")) + } + + // T045: Clean fails-fast if checksum mismatch in any destination + @Test("Clean fails if checksum mismatch in any destination") + func testCleanFailsOnChecksumMismatch() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "original") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // Extract to multiple destinations + _ = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Dest1/", "--to", "Dest2/"], + workingDirectory: fixture.path + ) + + // Modify file in one destination + try "modified".write(toFile: fixture.path.string + "/Dest2/file.txt", atomically: true, encoding: .utf8) + + // Clean should fail due to checksum mismatch + let cleanResult = try await harness.run( + arguments: ["extract", "--clean", "--name", "lib", "--from", "**/*", "--to", "Dest1/", "--to", "Dest2/"], + workingDirectory: fixture.path + ) + + #expect(cleanResult.exitCode != 0, "Clean should fail on checksum mismatch") + #expect(cleanResult.stderr.contains("modified") || cleanResult.stderr.contains("checksum"), + "Should mention modification. stderr: \(cleanResult.stderr)") + } + + // T046: Per-destination clean output shown + @Test("Per-destination clean output is shown") + func testPerDestinationCleanOutput() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "content") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // Extract to multiple destinations + _ = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Out1/", "--to", "Out2/"], + workingDirectory: fixture.path + ) + + // Clean + let cleanResult = try await harness.run( + arguments: ["extract", "--clean", "--name", "lib", "--from", "**/*", "--to", "Out1/", "--to", "Out2/"], + workingDirectory: fixture.path + ) + + #expect(cleanResult.exitCode == 0) + // Output should mention both destinations + #expect(cleanResult.stdout.contains("Out1") || cleanResult.stdout.contains("Out1/"), + "Should mention Out1. stdout: \(cleanResult.stdout)") + #expect(cleanResult.stdout.contains("Out2") || cleanResult.stdout.contains("Out2/"), + "Should mention Out2. stdout: \(cleanResult.stdout)") + } + + // MARK: - Phase 4: US4 (Fail-Fast Overwrite Protection) + + // T047: Overwrite protection validates ALL destinations upfront + @Test("Overwrite protection validates all destinations upfront") + func testOverwriteProtectionValidatesAllDestinations() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "content") + ]) + + // Create tracked file in second destination only + try createTestFiles(in: fixture.path.string + "/Dest2", files: [ + ("file.txt", "tracked") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup with tracked file"]) + + // Extract should fail because Dest2 has tracked file + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Dest1/", "--to", "Dest2/"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode != 0, "Should fail due to tracked file conflict") + #expect(result.stderr.contains("git-tracked") || result.stderr.contains("Dest2"), + "Should mention conflict. stderr: \(result.stderr)") + } + + // T048: No files copied if any destination has conflicts + @Test("No files copied if any destination has conflicts") + func testNoFilesCopiedOnConflict() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("newfile.txt", "new content") + ]) + + // Create tracked file in second destination + try createTestFiles(in: fixture.path.string + "/Dest2", files: [ + ("newfile.txt", "tracked") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // Extract should fail + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Dest1/", "--to", "Dest2/"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode != 0) + + // Verify Dest1 was NOT populated (fail-fast) + let fm = FileManager.default + #expect(!fm.fileExists(atPath: fixture.path.string + "/Dest1/newfile.txt"), + "Dest1 should not have file when Dest2 has conflict") + } + + // T049: Error lists conflicts across all destinations + @Test("Error lists conflicts across all destinations") + func testErrorListsAllConflicts() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file1.txt", "content1"), + ("file2.txt", "content2") + ]) + + // Create tracked files in BOTH destinations + try createTestFiles(in: fixture.path.string + "/Dest1", files: [ + ("file1.txt", "tracked1") + ]) + try createTestFiles(in: fixture.path.string + "/Dest2", files: [ + ("file2.txt", "tracked2") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup with conflicts in both dests"]) + + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Dest1/", "--to", "Dest2/"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode != 0) + // Should mention files from both destinations in error + #expect(result.stderr.contains("file1.txt") || result.stderr.contains("file2.txt"), + "Should list conflicting files. stderr: \(result.stderr)") + } + + // T050: --force bypasses protection for all destinations + @Test("Force flag bypasses protection for all destinations") + func testForceBypasses() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "new content") + ]) + + // Create tracked files in both destinations + try createTestFiles(in: fixture.path.string + "/Dest1", files: [ + ("file.txt", "tracked1") + ]) + try createTestFiles(in: fixture.path.string + "/Dest2", files: [ + ("file.txt", "tracked2") + ]) + + try writeSubtreeConfig( + name: "lib", + remote: "https://example.com/lib.git", + prefix: "vendor/lib", + commit: try await fixture.getCurrentCommit(), + to: fixture.path.string + "/subtree.yaml" + ) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // With --force, should succeed + let result = try await harness.run( + arguments: ["extract", "--name", "lib", "--from", "**/*", "--to", "Dest1/", "--to", "Dest2/", "--force"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode == 0, "Should succeed with --force. stderr: \(result.stderr)") + + // Both destinations should have new content + let content1 = try String(contentsOfFile: fixture.path.string + "/Dest1/file.txt", encoding: .utf8) + let content2 = try String(contentsOfFile: fixture.path.string + "/Dest2/file.txt", encoding: .utf8) + #expect(content1 == "new content") + #expect(content2 == "new content") + } + + // MARK: - Phase 4: US5 (Bulk Mode) + + // T051: --all processes multi-dest mappings correctly + @Test("Bulk --all processes multi-destination mappings") + func testBulkAllMultiDest() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib1", files: [ + ("file1.txt", "lib1 content") + ]) + try createTestFiles(in: fixture.path.string + "/vendor/lib2", files: [ + ("file2.txt", "lib2 content") + ]) + + // Config with two subtrees, each with multi-dest mappings + let yaml = """ + subtrees: + - name: lib1 + remote: https://example.com/lib1.git + prefix: vendor/lib1 + commit: \(try await fixture.getCurrentCommit()) + extractions: + - from: "**/*.txt" + to: + - "Lib1A/" + - "Lib1B/" + - name: lib2 + remote: https://example.com/lib2.git + prefix: vendor/lib2 + commit: \(try await fixture.getCurrentCommit()) + extractions: + - from: "**/*.txt" + to: + - "Lib2A/" + - "Lib2B/" + """ + try yaml.write(toFile: fixture.path.string + "/subtree.yaml", atomically: true, encoding: .utf8) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + let result = try await harness.run( + arguments: ["extract", "--all"], + workingDirectory: fixture.path + ) + + #expect(result.exitCode == 0, "Should succeed. stderr: \(result.stderr)") + + let fm = FileManager.default + #expect(fm.fileExists(atPath: fixture.path.string + "/Lib1A/file1.txt")) + #expect(fm.fileExists(atPath: fixture.path.string + "/Lib1B/file1.txt")) + #expect(fm.fileExists(atPath: fixture.path.string + "/Lib2A/file2.txt")) + #expect(fm.fileExists(atPath: fixture.path.string + "/Lib2B/file2.txt")) + } + + // T052: --clean --all removes from all destinations + @Test("Clean --all removes from all multi-destination mappings") + func testCleanAllMultiDest() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib", files: [ + ("file.txt", "content") + ]) + + let yaml = """ + subtrees: + - name: lib + remote: https://example.com/lib.git + prefix: vendor/lib + commit: \(try await fixture.getCurrentCommit()) + extractions: + - from: "**/*.txt" + to: + - "AllDest1/" + - "AllDest2/" + """ + try yaml.write(toFile: fixture.path.string + "/subtree.yaml", atomically: true, encoding: .utf8) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup"]) + + // Extract first + _ = try await harness.run( + arguments: ["extract", "--all"], + workingDirectory: fixture.path + ) + + let fm = FileManager.default + #expect(fm.fileExists(atPath: fixture.path.string + "/AllDest1/file.txt")) + #expect(fm.fileExists(atPath: fixture.path.string + "/AllDest2/file.txt")) + + // Clean all + let cleanResult = try await harness.run( + arguments: ["extract", "--clean", "--all"], + workingDirectory: fixture.path + ) + + #expect(cleanResult.exitCode == 0, "Should succeed. stderr: \(cleanResult.stderr)") + #expect(!fm.fileExists(atPath: fixture.path.string + "/AllDest1/file.txt")) + #expect(!fm.fileExists(atPath: fixture.path.string + "/AllDest2/file.txt")) + } + + // T053: Continue-on-error per subtree (not per destination) + @Test("Continue-on-error applies per subtree not per destination") + func testContinueOnErrorPerSubtree() async throws { + let harness = TestHarness() + let fixture = try await GitRepositoryFixture() + defer { try? 
fixture.tearDown() } + + try createTestFiles(in: fixture.path.string + "/vendor/lib1", files: [ + ("file1.txt", "lib1") + ]) + try createTestFiles(in: fixture.path.string + "/vendor/lib2", files: [ + ("file2.txt", "lib2") + ]) + + // Create conflict in lib1's second destination + try createTestFiles(in: fixture.path.string + "/Lib1B", files: [ + ("file1.txt", "conflict") + ]) + + let yaml = """ + subtrees: + - name: lib1 + remote: https://example.com/lib1.git + prefix: vendor/lib1 + commit: \(try await fixture.getCurrentCommit()) + extractions: + - from: "**/*.txt" + to: + - "Lib1A/" + - "Lib1B/" + - name: lib2 + remote: https://example.com/lib2.git + prefix: vendor/lib2 + commit: \(try await fixture.getCurrentCommit()) + extractions: + - from: "**/*.txt" + to: + - "Lib2A/" + - "Lib2B/" + """ + try yaml.write(toFile: fixture.path.string + "/subtree.yaml", atomically: true, encoding: .utf8) + try await fixture.runGit(["add", "."]) + try await fixture.runGit(["commit", "-m", "Setup with conflict in lib1"]) + + let result = try await harness.run( + arguments: ["extract", "--all"], + workingDirectory: fixture.path + ) + + // Should have non-zero exit (lib1 failed), but lib2 should still complete + #expect(result.exitCode != 0, "Should fail due to lib1 conflict") + + let fm = FileManager.default + // lib1 destinations should NOT be populated (failed) + #expect(!fm.fileExists(atPath: fixture.path.string + "/Lib1A/file1.txt"), + "Lib1A should not have file when Lib1B has conflict") + + // lib2 destinations SHOULD be populated (continue-on-error) + #expect(fm.fileExists(atPath: fixture.path.string + "/Lib2A/file2.txt"), + "Lib2 should still succeed despite lib1 failure") + #expect(fm.fileExists(atPath: fixture.path.string + "/Lib2B/file2.txt")) + } +} diff --git a/Tests/SubtreeLibTests/Commands/ExtractCommandTests.swift b/Tests/SubtreeLibTests/Commands/ExtractCommandTests.swift index 54908ed..edd400c 100644 --- a/Tests/SubtreeLibTests/Commands/ExtractCommandTests.swift +++ 
b/Tests/SubtreeLibTests/Commands/ExtractCommandTests.swift @@ -208,20 +208,20 @@ struct ExtractCommandTests { // Test with no exclusions let mapping1 = ExtractionMapping(from: "**/*.md", to: "docs/", exclude: nil) #expect(mapping1.from == ["**/*.md"]) - #expect(mapping1.to == "docs/") + #expect(mapping1.to == ["docs/"]) #expect(mapping1.exclude == nil) // Test with empty exclusions let mapping2 = ExtractionMapping(from: "**/*.c", to: "src/", exclude: []) #expect(mapping2.from == ["**/*.c"]) - #expect(mapping2.to == "src/") + #expect(mapping2.to == ["src/"]) #expect(mapping2.exclude?.isEmpty == true) // Test with multiple exclusions let excludes = ["**/test/**", "**/bench/**"] let mapping3 = ExtractionMapping(from: "src/**/*.c", to: "Sources/", exclude: excludes) #expect(mapping3.from == ["src/**/*.c"]) - #expect(mapping3.to == "Sources/") + #expect(mapping3.to == ["Sources/"]) #expect(mapping3.exclude?.count == 2) #expect(mapping3.exclude?.contains("**/test/**") == true) #expect(mapping3.exclude?.contains("**/bench/**") == true) diff --git a/Tests/SubtreeLibTests/ConfigurationTests/ExtractionMappingTests.swift b/Tests/SubtreeLibTests/ConfigurationTests/ExtractionMappingTests.swift index 05c1d24..8832d9e 100644 --- a/Tests/SubtreeLibTests/ConfigurationTests/ExtractionMappingTests.swift +++ b/Tests/SubtreeLibTests/ConfigurationTests/ExtractionMappingTests.swift @@ -18,7 +18,7 @@ struct ExtractionMappingTests { func testExtractionMappingInit() { let mapping = ExtractionMapping(from: "src/**/*.h", to: "include/") #expect(mapping.from == ["src/**/*.h"], "Single pattern should be wrapped in array") - #expect(mapping.to == "include/") + #expect(mapping.to == ["include/"]) #expect(mapping.exclude == nil) } @@ -39,7 +39,7 @@ struct ExtractionMappingTests { #expect(decoded == original) #expect(decoded.from == ["docs/**/*.md"]) - #expect(decoded.to == "project-docs/") + #expect(decoded.to == ["project-docs/"]) #expect(decoded.exclude == ["docs/internal/**"]) } @@ -114,7 
+114,7 @@ struct ExtractionMappingTests { let mapping = try decoder.decode(ExtractionMapping.self, from: yaml) #expect(mapping.from == ["docs/**/*.md"], "String should be wrapped in array") - #expect(mapping.to == "project-docs/") + #expect(mapping.to == ["project-docs/"]) #expect(mapping.exclude == ["docs/internal/**", "docs/draft*.md"]) } @@ -130,7 +130,7 @@ struct ExtractionMappingTests { let mapping = try decoder.decode(ExtractionMapping.self, from: yaml) #expect(mapping.from == ["templates/**"]) - #expect(mapping.to == ".templates/") + #expect(mapping.to == [".templates/"]) #expect(mapping.exclude == nil) } @@ -148,7 +148,7 @@ struct ExtractionMappingTests { let mapping = try decoder.decode(ExtractionMapping.self, from: yaml) #expect(mapping.from == ["include/**/*.h"], "Single string should be wrapped in array") - #expect(mapping.to == "vendor/headers/") + #expect(mapping.to == ["vendor/headers/"]) #expect(mapping.exclude == nil) } @@ -166,7 +166,7 @@ struct ExtractionMappingTests { let mapping = try decoder.decode(ExtractionMapping.self, from: yaml) #expect(mapping.from == ["include/**/*.h", "src/**/*.c"], "Array should be preserved") - #expect(mapping.to == "vendor/source/") + #expect(mapping.to == ["vendor/source/"]) } // T005: Encode single pattern as string @@ -239,7 +239,7 @@ struct ExtractionMappingTests { let mapping = ExtractionMapping(from: "docs/**/*.md", to: "output/", exclude: ["**/internal/**"]) #expect(mapping.from == ["docs/**/*.md"], "Single pattern should be wrapped in array") - #expect(mapping.to == "output/") + #expect(mapping.to == ["output/"]) #expect(mapping.exclude == ["**/internal/**"]) } @@ -250,7 +250,7 @@ struct ExtractionMappingTests { let mapping = ExtractionMapping(fromPatterns: patterns, to: "vendor/", exclude: nil) #expect(mapping.from == patterns, "Patterns should be preserved") - #expect(mapping.to == "vendor/") + #expect(mapping.to == ["vendor/"]) #expect(mapping.exclude == nil) } @@ -270,7 +270,157 @@ struct 
ExtractionMappingTests { let mapping = try decoder.decode(ExtractionMapping.self, from: yaml) #expect(mapping.from == ["src/**/*.c"]) - #expect(mapping.to == "vendor/") + #expect(mapping.to == ["vendor/"]) #expect(mapping.exclude == ["**/test_*", "**/internal/**"]) } + + // MARK: - Multi-Destination (012-multi-destination-extraction) + + // T012: Decode single string format `to: "path/"` + @Test("Decode single string format to: path/") + func testDecodeSingleStringToFormat() throws { + let yaml = """ + from: "include/**/*.h" + to: "vendor/headers/" + """ + + let decoder = YAMLDecoder() + let mapping = try decoder.decode(ExtractionMapping.self, from: yaml) + + #expect(mapping.from == ["include/**/*.h"]) + #expect(mapping.to == ["vendor/headers/"], "Single string should be wrapped in array") + } + + // T013: Decode array format `to: ["p1/", "p2/"]` + @Test("Decode array format to: [p1/, p2/]") + func testDecodeArrayToFormat() throws { + let yaml = """ + from: "include/**/*.h" + to: + - "Lib/headers/" + - "Vendor/headers/" + """ + + let decoder = YAMLDecoder() + let mapping = try decoder.decode(ExtractionMapping.self, from: yaml) + + #expect(mapping.from == ["include/**/*.h"]) + #expect(mapping.to == ["Lib/headers/", "Vendor/headers/"], "Array should be preserved") + } + + // T014: Encode single destination as string + @Test("Encode single destination as string format") + func testEncodeSingleDestinationAsString() throws { + let mapping = ExtractionMapping(from: "include/**/*.h", to: "vendor/") + + let encoder = YAMLEncoder() + let yaml = try encoder.encode(mapping) + + // Single destination should encode as string, not array + #expect(yaml.contains("to: vendor/") || yaml.contains("to: \"vendor/\""), + "Single destination should encode as string, not array. 
Got: \(yaml)") + #expect(!yaml.contains("- vendor"), "Should not encode as array") + } + + // T015: Encode multiple destinations as array + @Test("Encode multiple destinations as array format") + func testEncodeMultipleDestinationsAsArray() throws { + let mapping = ExtractionMapping(from: "include/**/*.h", toDestinations: ["Lib/", "Vendor/"]) + + let encoder = YAMLEncoder() + let yaml = try encoder.encode(mapping) + + // Multiple destinations should encode as array + #expect(yaml.contains("- Lib/") || yaml.contains("- \"Lib/\""), + "Multiple destinations should encode as array. Got: \(yaml)") + #expect(yaml.contains("- Vendor/") || yaml.contains("- \"Vendor/\""), + "Multiple destinations should encode as array. Got: \(yaml)") + } + + // T016: Reject empty array `to: []` + @Test("Reject empty array to: []") + func testRejectEmptyToArray() throws { + let yaml = """ + from: "include/**/*.h" + to: [] + """ + + let decoder = YAMLDecoder() + + #expect(throws: Error.self) { + _ = try decoder.decode(ExtractionMapping.self, from: yaml) + } + } + + // T017: Single-destination initializer works + @Test("Single-destination initializer wraps string in array") + func testSingleDestinationInitializerWrapsInArray() { + let mapping = ExtractionMapping(from: "docs/**/*.md", to: "output/", exclude: ["**/internal/**"]) + + #expect(mapping.from == ["docs/**/*.md"]) + #expect(mapping.to == ["output/"], "Single destination should be wrapped in array") + #expect(mapping.exclude == ["**/internal/**"]) + } + + // T018: Multi-destination initializer works + @Test("Multi-destination initializer preserves array") + func testMultiDestinationInitializer() { + let destinations = ["Lib/", "Vendor/", "Backup/"] + let mapping = ExtractionMapping(from: "include/**/*.h", toDestinations: destinations, exclude: nil) + + #expect(mapping.from == ["include/**/*.h"]) + #expect(mapping.to == destinations, "Destinations should be preserved") + #expect(mapping.exclude == nil) + } + + // T018b: Verify Yams 
coercion of non-string elements in `to` + @Test("Array with mixed types in to is handled by Yams coercion") + func testMixedTypesToHandledByYams() throws { + // YAML with integer in array - Yams coerces to string + let yaml = """ + from: "valid/pattern" + to: + - "Lib/" + - 123 + """ + + let decoder = YAMLDecoder() + // Yams coerces 123 to "123", so this actually succeeds + let mapping = try decoder.decode(ExtractionMapping.self, from: yaml) + #expect(mapping.to == ["Lib/", "123"], "Yams coerces integers to strings") + } + + // Additional: Combined multi-pattern + multi-destination + @Test("Combined multi-pattern and multi-destination") + func testCombinedMultiPatternMultiDestination() throws { + let yaml = """ + from: + - "include/**/*.h" + - "src/**/*.c" + to: + - "Lib/" + - "Vendor/" + exclude: + - "**/test/**" + """ + + let decoder = YAMLDecoder() + let mapping = try decoder.decode(ExtractionMapping.self, from: yaml) + + #expect(mapping.from == ["include/**/*.h", "src/**/*.c"]) + #expect(mapping.to == ["Lib/", "Vendor/"]) + #expect(mapping.exclude == ["**/test/**"]) + } + + // Additional: Multi-pattern + multi-destination initializer + @Test("Multi-pattern multi-destination initializer") + func testMultiPatternMultiDestinationInitializer() { + let patterns = ["include/**/*.h", "src/**/*.c"] + let destinations = ["Lib/", "Vendor/"] + let mapping = ExtractionMapping(fromPatterns: patterns, toDestinations: destinations, exclude: ["**/internal/**"]) + + #expect(mapping.from == patterns) + #expect(mapping.to == destinations) + #expect(mapping.exclude == ["**/internal/**"]) + } } diff --git a/Tests/SubtreeLibTests/ConfigurationTests/Models/SubtreeEntryTests.swift b/Tests/SubtreeLibTests/ConfigurationTests/Models/SubtreeEntryTests.swift index 80e7c59..2b6704c 100644 --- a/Tests/SubtreeLibTests/ConfigurationTests/Models/SubtreeEntryTests.swift +++ b/Tests/SubtreeLibTests/ConfigurationTests/Models/SubtreeEntryTests.swift @@ -100,7 +100,7 @@ struct SubtreeEntryTests { 
#expect(entry.extractions?.count == 1) #expect(entry.extractions?[0].from == ["docs/**/*.md"]) - #expect(entry.extractions?[0].to == "project-docs/") + #expect(entry.extractions?[0].to == ["project-docs/"]) } // T015: Test for SubtreeEntry backward compatibility @@ -161,9 +161,9 @@ struct SubtreeEntryTests { #expect(entry.extractions?.count == 2) #expect(entry.extractions?[0].from == ["src/**/*.{h,c}"]) - #expect(entry.extractions?[0].to == "Sources/libsecp256k1/src/") + #expect(entry.extractions?[0].to == ["Sources/libsecp256k1/src/"]) #expect(entry.extractions?[0].exclude?.count == 2) #expect(entry.extractions?[1].from == ["include/**/*.h"]) - #expect(entry.extractions?[1].to == "Sources/libsecp256k1/include/") + #expect(entry.extractions?[1].to == ["Sources/libsecp256k1/include/"]) } } diff --git a/Tests/SubtreeLibTests/Utilities/PathNormalizerTests.swift b/Tests/SubtreeLibTests/Utilities/PathNormalizerTests.swift new file mode 100644 index 0000000..2c7c742 --- /dev/null +++ b/Tests/SubtreeLibTests/Utilities/PathNormalizerTests.swift @@ -0,0 +1,125 @@ +import Testing +import Foundation +@testable import SubtreeLib + +/// Unit tests for PathNormalizer (012-multi-destination-extraction) +/// +/// PathNormalizer handles destination path normalization and deduplication: +/// - Removes trailing slashes: `Lib/` → `Lib` +/// - Removes leading `./`: `./Lib` → `Lib` +/// - Deduplicates equivalent paths while preserving order and original form +@Suite("PathNormalizer Tests") +struct PathNormalizerTests { + + // MARK: - T004: Normalize removes trailing slash + + @Test("Normalize removes single trailing slash") + func testNormalizeRemovesTrailingSlash() { + #expect(PathNormalizer.normalize("Lib/") == "Lib") + #expect(PathNormalizer.normalize("vendor/headers/") == "vendor/headers") + } + + @Test("Normalize removes multiple trailing slashes") + func testNormalizeRemovesMultipleTrailingSlashes() { + #expect(PathNormalizer.normalize("Lib//") == "Lib") + 
#expect(PathNormalizer.normalize("path///") == "path") + } + + @Test("Normalize preserves path without trailing slash") + func testNormalizePreservesPathWithoutTrailingSlash() { + #expect(PathNormalizer.normalize("Lib") == "Lib") + #expect(PathNormalizer.normalize("vendor/headers") == "vendor/headers") + } + + // MARK: - T005: Normalize removes leading `./` + + @Test("Normalize removes single leading ./") + func testNormalizeRemovesLeadingDotSlash() { + #expect(PathNormalizer.normalize("./Lib") == "Lib") + #expect(PathNormalizer.normalize("./vendor/headers") == "vendor/headers") + } + + @Test("Normalize removes multiple leading ./") + func testNormalizeRemovesMultipleLeadingDotSlash() { + #expect(PathNormalizer.normalize("././Lib") == "Lib") + #expect(PathNormalizer.normalize("./././path") == "path") + } + + @Test("Normalize preserves path without leading ./") + func testNormalizePreservesPathWithoutLeadingDotSlash() { + #expect(PathNormalizer.normalize("Lib") == "Lib") + #expect(PathNormalizer.normalize("vendor/headers") == "vendor/headers") + } + + // MARK: - T006: Normalize handles combined `./path/` + + @Test("Normalize handles combined leading ./ and trailing /") + func testNormalizeHandlesCombined() { + #expect(PathNormalizer.normalize("./Lib/") == "Lib") + #expect(PathNormalizer.normalize("./vendor/headers/") == "vendor/headers") + #expect(PathNormalizer.normalize("././path//") == "path") + } + + @Test("Normalize handles edge cases") + func testNormalizeEdgeCases() { + // Single dot should remain (current directory) + #expect(PathNormalizer.normalize(".") == ".") + // Root slash should remain + #expect(PathNormalizer.normalize("/") == "/") + // Empty string edge case + #expect(PathNormalizer.normalize("") == "") + } + + // MARK: - T007: Deduplicate removes equivalent paths + + @Test("Deduplicate removes paths that normalize to same value") + func testDeduplicateRemovesEquivalentPaths() { + let paths = ["Lib", "Lib/", "./Lib", "./Lib/"] + let result = 
PathNormalizer.deduplicate(paths) + #expect(result.count == 1) + } + + @Test("Deduplicate keeps distinct paths") + func testDeduplicateKeepsDistinctPaths() { + let paths = ["Lib", "Vendor", "Headers"] + let result = PathNormalizer.deduplicate(paths) + #expect(result == ["Lib", "Vendor", "Headers"]) + } + + @Test("Deduplicate handles mixed equivalent and distinct") + func testDeduplicateMixedPaths() { + let paths = ["Lib/", "Vendor", "./Lib", "Vendor/", "Headers"] + let result = PathNormalizer.deduplicate(paths) + #expect(result.count == 3) + // Should contain one form of Lib, Vendor, Headers + } + + // MARK: - T008: Deduplicate preserves order and original form + + @Test("Deduplicate preserves original form (first occurrence)") + func testDeduplicatePreservesOriginalForm() { + // First occurrence "Lib/" should be kept, not "Lib" or "./Lib" + let paths = ["Lib/", "Lib", "./Lib"] + let result = PathNormalizer.deduplicate(paths) + #expect(result == ["Lib/"]) + } + + @Test("Deduplicate preserves input order") + func testDeduplicatePreservesOrder() { + let paths = ["Vendor", "Lib", "Headers"] + let result = PathNormalizer.deduplicate(paths) + #expect(result == ["Vendor", "Lib", "Headers"]) + } + + @Test("Deduplicate handles empty array") + func testDeduplicateEmptyArray() { + let result = PathNormalizer.deduplicate([]) + #expect(result.isEmpty) + } + + @Test("Deduplicate handles single element") + func testDeduplicateSingleElement() { + let result = PathNormalizer.deduplicate(["Lib/"]) + #expect(result == ["Lib/"]) + } +} diff --git a/specs/012-multi-destination-extraction/checklists/requirements.md b/specs/012-multi-destination-extraction/checklists/requirements.md new file mode 100644 index 0000000..5cd0672 --- /dev/null +++ b/specs/012-multi-destination-extraction/checklists/requirements.md @@ -0,0 +1,40 @@ +# Specification Quality Checklist: Multi-Destination Extraction (Fan-Out) + +**Purpose**: Validate specification completeness and quality before proceeding to 
planning +**Created**: 2025-11-30 +**Feature**: [spec.md](../spec.md) + +## Content Quality + +- [x] No implementation details (languages, frameworks, APIs) +- [x] Focused on user value and business needs +- [x] Written for non-technical stakeholders +- [x] All mandatory sections completed + +## Requirement Completeness + +- [x] No [NEEDS CLARIFICATION] markers remain +- [x] Requirements are testable and unambiguous +- [x] Success criteria are measurable +- [x] Success criteria are technology-agnostic (no implementation details) +- [x] All acceptance scenarios are defined +- [x] Edge cases are identified +- [x] Scope is clearly bounded +- [x] Dependencies and assumptions identified + +## Feature Readiness + +- [x] All functional requirements have clear acceptance criteria +- [x] User scenarios cover primary flows +- [x] Feature meets measurable outcomes defined in Success Criteria +- [x] No implementation details leak into specification + +## Notes + +- Spec complete with 5 user stories covering CLI, persistence, clean mode, fail-fast protection, and bulk mode +- 17 functional requirements defined (FR-001 through FR-017) +- 5 success criteria with measurable outcomes +- 6 clarification questions answered and documented (3 initial + 3 from /speckit.clarify) +- 7 edge cases identified +- Backward compatibility explicitly addressed in US2 and FR-008 +- Clarification session 2025-11-30 resolved: destination limits, path normalization, output format diff --git a/specs/012-multi-destination-extraction/contracts/cli-contract.md b/specs/012-multi-destination-extraction/contracts/cli-contract.md new file mode 100644 index 0000000..8bbe9ba --- /dev/null +++ b/specs/012-multi-destination-extraction/contracts/cli-contract.md @@ -0,0 +1,120 @@ +# CLI Contract: Multi-Destination Extraction + +**Feature**: 012-multi-destination-extraction +**Date**: 2025-11-30 + +## Command Changes + +### `subtree extract` (Modified) + +**Current `--to` option**: +``` +--to Destination path 
(single) +``` + +**New `--to` option**: +``` +--to Destination path (can be repeated for fan-out) +``` + +## Usage Examples + +### Ad-Hoc Multi-Destination Extraction +```bash +# Extract to two destinations +subtree extract --name mylib --from "**/*.h" --to Lib/ --to Vendor/ + +# Combined with multi-pattern (from 009) +subtree extract --name mylib \ + --from "include/**/*.h" \ + --from "src/**/*.c" \ + --to Lib/ \ + --to Vendor/ +``` + +### Persist Multi-Destination Mapping +```bash +# Save mapping with multiple destinations +subtree extract --name mylib --from "**/*.h" --to Lib/ --to Vendor/ --persist +``` + +### Clean Multi-Destination +```bash +# Ad-hoc clean from multiple destinations +subtree extract --clean --name mylib --from "**/*.h" --to Lib/ --to Vendor/ + +# Bulk clean (uses persisted mappings) +subtree extract --clean --name mylib +subtree extract --clean --all +``` + +## Output Format + +### Extraction Success (Per-Destination) +``` +✅ Extracted 5 file(s) to 'Lib/' +✅ Extracted 5 file(s) to 'Vendor/' +``` + +### With Persist +``` +✅ Extracted 5 file(s) to 'Lib/' +✅ Extracted 5 file(s) to 'Vendor/' +📝 Saved extraction mapping to subtree.yaml +``` + +### Soft Limit Warning (>10 destinations) +``` +⚠️ Warning: 15 destinations specified (>10) +✅ Extracted 5 file(s) to 'Dest1/' +✅ Extracted 5 file(s) to 'Dest2/' +... +``` + +### Fail-Fast Error (Overwrite Protection) +``` +❌ Error: Git-tracked files would be overwritten + +Conflicts in 'Lib/': + • include/foo.h + +Conflicts in 'Vendor/': + • include/foo.h + • include/bar.h + +Use --force to override protection. 
+``` + +### Clean Success (Per-Destination) +``` +✅ Cleaned 5 file(s) from 'Lib/' + 📁 Pruned 2 empty directories +✅ Cleaned 5 file(s) from 'Vendor/' + 📁 Pruned 1 empty directory +``` + +## Exit Codes + +| Code | Meaning | When | +|------|---------|------| +| 0 | Success | All destinations processed successfully | +| 1 | Validation error | Empty destinations, checksum mismatch | +| 2 | User error | Invalid flag combination, overwrite protection | +| 3 | I/O error | Permission denied, filesystem error | + +## Flag Interactions + +| Flags | Behavior | +|-------|----------| +| `--to X --to X` | Deduplicated to single destination | +| `--to ./Lib --to Lib/` | Deduplicated after normalization | +| `--to` × >10 | Warning printed, operation continues | +| `--to` + `--persist` | Saves array to config | +| `--to` + `--clean` | Cleans all specified destinations | +| `--to` + `--force` | Bypasses protection for all destinations | + +## Backward Compatibility + +- Single `--to` works unchanged +- Existing bulk mode (`--name` without `--to`) works unchanged +- Existing persisted mappings with string `to` work unchanged diff --git a/specs/012-multi-destination-extraction/data-model.md b/specs/012-multi-destination-extraction/data-model.md new file mode 100644 index 0000000..0bc4b85 --- /dev/null +++ b/specs/012-multi-destination-extraction/data-model.md @@ -0,0 +1,238 @@ +# Data Model: Multi-Destination Extraction (Fan-Out) + +**Feature**: 012-multi-destination-extraction +**Date**: 2025-11-30 + +## Entity Changes + +### ExtractionMapping (Modified) + +**Location**: `Sources/SubtreeLib/Configuration/ExtractionMapping.swift` + +**Current**: +```swift +public struct ExtractionMapping: Codable, Equatable, Sendable { + public let from: [String] // Array of patterns (from 009) + public let to: String // Single destination + public let exclude: [String]? 
+} +``` + +**Proposed**: +```swift +public struct ExtractionMapping: Codable, Equatable, Sendable { + public let from: [String] // Array of patterns (unchanged) + public let to: [String] // Array of destinations (NEW - mirrors from) + public let exclude: [String]? + + private enum CodingKeys: String, CodingKey { + case from, to, exclude // Required: not synthesized when both init(from:)/encode(to:) are manual + } + + // MARK: - Initializers + + /// Single pattern, single destination (common case) + public init(from: String, to: String, exclude: [String]? = nil) { + self.from = [from] + self.to = [to] + self.exclude = exclude + } + + /// Multiple patterns, single destination + public init(fromPatterns: [String], to: String, exclude: [String]? = nil) { + self.from = fromPatterns + self.to = [to] + self.exclude = exclude + } + + /// Single pattern, multiple destinations (NEW) + public init(from: String, toDestinations: [String], exclude: [String]? = nil) { + self.from = [from] + self.to = toDestinations + self.exclude = exclude + } + + /// Multiple patterns, multiple destinations (NEW) + public init(fromPatterns: [String], toDestinations: [String], exclude: [String]? = nil) { + self.from = fromPatterns + self.to = toDestinations + self.exclude = exclude + } + + // MARK: - Custom Codable + + public init(from decoder: Decoder) throws { + let container = try decoder.container(keyedBy: CodingKeys.self) + + // Decode `from` (existing logic from 009) + if let patterns = try? container.decode([String].self, forKey: .from) { + guard !patterns.isEmpty else { + throw DecodingError.dataCorruptedError( + forKey: .from, in: container, + debugDescription: "from patterns cannot be empty" + ) + } + self.from = patterns + } else { + let single = try container.decode(String.self, forKey: .from) + self.from = [single] + } + + // Decode `to` (NEW - mirrors from) + if let destinations = try?
container.decode([String].self, forKey: .to) { + guard !destinations.isEmpty else { + throw DecodingError.dataCorruptedError( + forKey: .to, in: container, + debugDescription: "to destinations cannot be empty" + ) + } + self.to = destinations + } else { + let single = try container.decode(String.self, forKey: .to) + self.to = [single] + } + + self.exclude = try container.decodeIfPresent([String].self, forKey: .exclude) + } + + public func encode(to encoder: Encoder) throws { + var container = encoder.container(keyedBy: CodingKeys.self) + + // Encode `from` as string if single, array if multiple + if from.count == 1 { + try container.encode(from[0], forKey: .from) + } else { + try container.encode(from, forKey: .from) + } + + // Encode `to` as string if single, array if multiple (NEW) + if to.count == 1 { + try container.encode(to[0], forKey: .to) + } else { + try container.encode(to, forKey: .to) + } + + try container.encodeIfPresent(exclude, forKey: .exclude) + } +} +``` + +### Field Changes + +| Field | Before | After | Notes | +|-------|--------|-------|-------| +| `to` | `String` | `[String]` | Internal representation always array | + +### Validation Rules + +| Rule | Enforcement | Error | +|------|-------------|-------| +| `to` cannot be empty | Decoding + init | "to destinations cannot be empty" | +| All elements must be strings | Decoding | "to must contain only strings" | +| At least one destination required | Init validation | "at least one destination required" | + +### State Transitions + +No state transitions — `ExtractionMapping` is a value type with no lifecycle. + +## New Entity: PathNormalizer + +**Location**: `Sources/SubtreeLib/Utilities/PathNormalizer.swift` + +**Purpose**: Normalize and deduplicate destination paths. 
+ +```swift +/// Normalizes paths for deduplication (012-multi-destination-extraction) +/// +/// Handles common path variations: +/// - Trailing slashes: `Lib/` → `Lib` +/// - Leading `./`: `./Lib` → `Lib` +/// - Combinations: `./Lib/` → `Lib` +public enum PathNormalizer { + + /// Normalize a single path + public static func normalize(_ path: String) -> String { + var result = path + + // Remove leading ./ + while result.hasPrefix("./") { + result = String(result.dropFirst(2)) + } + + // Remove trailing / (except for root) + while result.hasSuffix("/") && result.count > 1 { + result = String(result.dropLast()) + } + + return result + } + + /// Deduplicate paths after normalization, preserving order + /// + /// Returns the first occurrence of each normalized path, + /// preserving the user's original formatting. + public static func deduplicate(_ paths: [String]) -> [String] { + var seen = Set<String>() + var unique: [String] = [] + + for path in paths { + let normalized = normalize(path) + if !seen.contains(normalized) { + seen.insert(normalized) + unique.append(path) // Keep original form + } + } + + return unique + } +} +``` + +## YAML Schema + +### Before (Single Destination Only) +```yaml +extractions: + - from: "include/**/*.h" + to: "vendor/headers/" + exclude: + - "**/internal/**" +``` + +### After (Both Formats Supported) +```yaml +extractions: + # Legacy format (still works) + - from: "include/**/*.h" + to: "vendor/headers/" + + # New array format for destinations + - from: "include/**/*.h" + to: + - "Lib/headers/" + - "Vendor/headers/" + exclude: + - "**/internal/**" + + # Combined: multi-pattern + multi-destination + - from: + - "include/**/*.h" + - "src/**/*.c" + to: + - "Lib/" + - "Vendor/" +``` + +## Relationships + +``` +SubtreeEntry + └── extractions: [ExtractionMapping]? + ├── from: [String] + ├── to: [String] # Modified: String → [String] + └── exclude: [String]?
+``` + +No relationship changes — `ExtractionMapping` remains a child of `SubtreeEntry.extractions`. + +## Migration Notes + +**Backward Compatibility**: 100% compatible. Existing configs with `to: "path/"` continue to work unchanged. The custom `Codable` implementation handles both string and array formats transparently. + +**No Migration Required**: Users can optionally adopt array format when needed. diff --git a/specs/012-multi-destination-extraction/plan.md b/specs/012-multi-destination-extraction/plan.md new file mode 100644 index 0000000..009dcc3 --- /dev/null +++ b/specs/012-multi-destination-extraction/plan.md @@ -0,0 +1,112 @@ +# Implementation Plan: Multi-Destination Extraction (Fan-Out) + +**Branch**: `012-multi-destination-extraction` | **Date**: 2025-11-30 | **Spec**: [spec.md](./spec.md) +**Input**: Feature specification from `/specs/012-multi-destination-extraction/spec.md` + +## Summary + +Enable multiple `--to` destination flags in a single extract command, with YAML config supporting both string (legacy) and array (new) formats. Files matching source patterns are copied to EVERY destination (fan-out semantics). Implementation mirrors 009-multi-pattern-extraction: `to` changes from `String` to `[String]` internally with custom `Codable` for backward compatibility. + +## Technical Context + +**Language/Version**: Swift 6.1 +**Primary Dependencies**: swift-argument-parser 1.6.1, Yams 6.1.0 +**Storage**: YAML config file (`subtree.yaml`) +**Testing**: Swift Testing (built into Swift 6.1 toolchain) +**Target Platform**: macOS 13+, Ubuntu 20.04 LTS +**Project Type**: CLI (Library + Executable pattern) +**Performance Goals**: Multi-destination extraction <5 seconds for ≤100 files × ≤5 destinations +**Constraints**: Backward compatible with existing single-destination configs +**Scale/Scope**: Soft limit 10 destinations (warn above, still allow) + +## Constitution Check + +*GATE: Must pass before Phase 0 research. 
Re-check after Phase 1 design.* + +| Principle | Status | Notes | +|-----------|--------|-------| +| I. Spec-First & TDD | ✅ | Spec complete (17 FRs, 6 clarifications), tests written per user story | +| II. Config as Source of Truth | ✅ | Extends subtree.yaml schema, backward compatible | +| III. Safe by Default | ✅ | Fail-fast validation, existing --force gates preserved | +| IV. Performance by Default | ✅ | <5s target for typical extractions | +| V. Security & Privacy | ✅ | No new shell invocations, reuses existing safe patterns | +| VI. Open Source Excellence | ✅ | KISS (mirrors 009 pattern), DRY (reuses extraction infrastructure) | + +**Legend**: ✅ Pass | ⬜ Not yet verified | ❌ Violation (requires justification) + +## Project Structure + +### Documentation (this feature) + +```text +specs/012-multi-destination-extraction/ +├── spec.md # Feature specification (complete) +├── plan.md # This file +├── research.md # Phase 0 output (009 pattern analysis) +├── data-model.md # Phase 1 output +├── quickstart.md # Phase 1 output +├── contracts/ # Phase 1 output (CLI contract) +├── checklists/ # Quality checklists +└── tasks.md # Phase 2 output (/speckit.tasks) +``` + +### Source Code (repository root) + +```text +Sources/ +├── SubtreeLib/ # Library (all business logic) +│ ├── Commands/ +│ │ └── ExtractCommand.swift # MODIFY: Accept multiple --to flags, fan-out logic +│ ├── Configuration/ +│ │ ├── ExtractionMapping.swift # MODIFY: Union type for to field (mirror from) +│ │ └── Models/ +│ │ └── SubtreeEntry.swift # No changes +│ └── Utilities/ +│ ├── ConfigFileManager.swift # MODIFY: Handle array format for to in persist +│ ├── PathNormalizer.swift # ADD: Normalize trailing slash + ./ +│ └── GlobMatcher.swift # No changes +└── subtree/ # Executable (no changes) + +Tests/ +├── SubtreeLibTests/ +│ ├── ExtractionMappingTests.swift # MODIFY: Add tests for to array parsing +│ └── PathNormalizerTests.swift # ADD: Unit tests for normalization +└── IntegrationTests/ + └── 
ExtractMultiDestTests.swift # ADD: Integration tests per user story +``` + +**Structure Decision**: Extends existing Library + Executable pattern. Changes mirror 009-multi-pattern-extraction: modify `ExtractionMapping.swift` (data model), `ExtractCommand.swift` (CLI + fan-out), and `ConfigFileManager.swift` (persistence). New `PathNormalizer.swift` utility for destination deduplication. + +## Implementation Phases + +### Phase 1: Data Model + CLI + Persist (Mechanical — Reuse 009) + +**Goal**: Core multi-destination functionality + +**User Stories**: US1 (Multiple CLI Destinations), US2 (Persist Multi-Destination) + +**Scope**: +- Modify `ExtractionMapping` with `to: [String]` internally (mirror `from`) +- Modify `ExtractCommand` to accept repeated `--to` flags +- Add `PathNormalizer` for destination deduplication +- Update extraction loop to copy to each destination +- Modify `ConfigFileManager` for array persist format +- Unit tests for parsing, integration tests for CLI + +### Phase 2: Behavioral Integration (New Logic) + +**Goal**: Fail-fast, clean mode, bulk mode integration + +**User Stories**: US3 (Clean Mode), US4 (Fail-Fast), US5 (Bulk Mode) + +**Scope**: +- Upfront validation across ALL destinations before any writes +- Aggregate conflict errors across destinations +- Clean mode support for multi-destination +- Per-destination progress output (FR-017) +- Soft limit warning for >10 destinations (FR-016) +- Integration tests for behavioral scenarios + +## Complexity Tracking + +No constitution violations. All changes extend existing patterns established in 009. 
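The Phase 2 fail-fast contract (FR-013: validate every destination before any write) can be mocked end to end outside Swift. The following is an illustrative POSIX shell sketch, not the actual implementation; the file names and destinations are invented for the demo:

```shell
# Mock of the two-phase fan-out: Phase 1 validates ALL destinations,
# Phase 2 copies only if no destination reported a conflict.
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p src Lib Vendor
: > src/a.h
: > src/b.h
: > Vendor/a.h            # pre-existing file: simulated conflict

conflicts=""
for d in Lib Vendor; do                   # Phase 1: check everything up front
  for f in a.h b.h; do
    if [ -e "$d/$f" ]; then conflicts="$conflicts $d/$f"; fi
  done
done

if [ -n "$conflicts" ]; then              # Phase 2a: fail fast, nothing copied yet
  echo "conflicts:$conflicts"
else                                      # Phase 2b: fan out to every destination
  for d in Lib Vendor; do
    for f in a.h b.h; do cp "src/$f" "$d/$f"; done
  done
fi
```

With the simulated conflict in place, the sketch reports the aggregated conflict list (the shape FR-014 requires) and leaves `Lib/` untouched — the "no partial state" guarantee.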
diff --git a/specs/012-multi-destination-extraction/quickstart.md b/specs/012-multi-destination-extraction/quickstart.md new file mode 100644 index 0000000..5a04090 --- /dev/null +++ b/specs/012-multi-destination-extraction/quickstart.md @@ -0,0 +1,144 @@ +# Quickstart: Multi-Destination Extraction (Fan-Out) + +**Feature**: 012-multi-destination-extraction +**Date**: 2025-11-30 + +## Build & Test + +```bash +# Build +swift build + +# Run all tests +swift test + +# Run specific test file (once created) +swift test --filter ExtractMultiDestTests +``` + +## Manual Verification + +### Setup Test Environment + +```bash +# Create test repository +cd /tmp && rm -rf test-multi-dest && mkdir test-multi-dest && cd test-multi-dest +git init + +# Initialize subtree config +.build/debug/subtree init + +# Create mock subtree structure +mkdir -p vendor/mylib/include vendor/mylib/src +echo "// header.h" > vendor/mylib/include/header.h +echo "// source.c" > vendor/mylib/src/source.c +git add . && git commit -m "Setup" + +# Add subtree entry manually (or use subtree add) +cat >> subtree.yaml << 'EOF' +subtrees: + - name: mylib + remote: https://example.com/mylib.git + prefix: vendor/mylib + ref: main +EOF +git add subtree.yaml && git commit -m "Add mylib config" +``` + +### Test Multi-Destination Extraction + +```bash +# Basic fan-out to two destinations +.build/debug/subtree extract --name mylib --from "**/*" --to Lib/ --to Vendor/ + +# Expected output: +# ✅ Extracted 2 file(s) to 'Lib/' +# ✅ Extracted 2 file(s) to 'Vendor/' + +# Verify files exist in both locations +ls -la Lib/include/ Lib/src/ +ls -la Vendor/include/ Vendor/src/ +``` + +### Test Path Normalization / Deduplication + +```bash +# These should deduplicate to single destination +.build/debug/subtree extract --name mylib --from "**/*" --to Lib/ --to ./Lib --to Lib + +# Expected: Only one "Extracted" line for Lib/ +``` + +### Test Persist with Array + +```bash +# Save multi-destination mapping +.build/debug/subtree 
extract --name mylib --from "**/*.h" --to Headers/ --to Backup/ --persist + +# Verify YAML has array format +cat subtree.yaml | grep -A5 "extractions" +# Should show: to: ["Headers/", "Backup/"] +``` + +### Test Fail-Fast Protection + +```bash +# Create conflict +mkdir -p Conflict/include +echo "tracked file" > Conflict/include/header.h +git add Conflict/ && git commit -m "Add tracked conflict" + +# This should fail before any writes +.build/debug/subtree extract --name mylib --from "**/*.h" --to Clean/ --to Conflict/ + +# Expected: Error listing conflicts, no files in Clean/ +ls Clean/ # Should be empty or not exist +``` + +### Test Clean Mode + +```bash +# Extract first +.build/debug/subtree extract --name mylib --from "**/*" --to ToClean1/ --to ToClean2/ + +# Clean all destinations +.build/debug/subtree extract --clean --name mylib --from "**/*" --to ToClean1/ --to ToClean2/ + +# Expected output: +# ✅ Cleaned 2 file(s) from 'ToClean1/' +# ✅ Cleaned 2 file(s) from 'ToClean2/' +``` + +### Test Soft Limit Warning + +```bash +# Create many destinations (>10) +.build/debug/subtree extract --name mylib --from "**/*.h" \ + --to D1/ --to D2/ --to D3/ --to D4/ --to D5/ \ + --to D6/ --to D7/ --to D8/ --to D9/ --to D10/ \ + --to D11/ + +# Expected: Warning about >10 destinations, then success +``` + +## Test Coverage Checklist + +| Test | Command | Expected | +|------|---------|----------| +| Basic fan-out | `--to A/ --to B/` | Files in both | +| Deduplication | `--to A/ --to ./A` | Single copy | +| Persist array | `--persist` | YAML array | +| Fail-fast | Tracked conflict | Error, no partial | +| Clean multi-dest | `--clean --to A/ --to B/` | Both cleaned | +| Soft limit | `>10 --to` flags | Warning | +| Backward compat | Single `--to` | Works unchanged | + +## CI Validation + +```bash +# Run full test suite +swift test 2>&1 | tail -20 + +# Check exit code +echo "Exit code: $?" 
+``` diff --git a/specs/012-multi-destination-extraction/research.md b/specs/012-multi-destination-extraction/research.md new file mode 100644 index 0000000..b0d3d18 --- /dev/null +++ b/specs/012-multi-destination-extraction/research.md @@ -0,0 +1,188 @@ +# Research: Multi-Destination Extraction (Fan-Out) + +**Feature**: 012-multi-destination-extraction +**Date**: 2025-11-30 +**Purpose**: Document patterns from 009-multi-pattern-extraction for reuse + +## Research Summary + +This feature mirrors 009-multi-pattern-extraction but applies to the `to` field instead of `from`. The implementation patterns are well-established and require no new research — only formal documentation for reference during implementation. + +## Decision 1: Data Model Pattern (from 009) + +**Decision**: Use `[String]` internally with custom `Codable` for backward compatibility. + +**Rationale**: Proven pattern from 009. Internal array simplifies processing logic while custom encoding preserves clean YAML for single-element case. + +**Alternatives Considered**: +- **Union type (enum)**: More complex code, no practical benefit over array approach +- **Always array in YAML**: Noisier config files for common single-destination case + +**Source Reference**: `specs/009-multi-pattern-extraction/data-model.md` + +```swift +// Pattern from 009 (for `from` field) +public struct ExtractionMapping: Codable { + public let from: [String] // Always array internally + + public init(from decoder: Decoder) throws { + // Try array first, then single string + if let patterns = try? 
container.decode([String].self, forKey: .from) { + self.from = patterns + } else { + let single = try container.decode(String.self, forKey: .from) + self.from = [single] + } + } + + public func encode(to encoder: Encoder) throws { + // Serialize as string if single, array if multiple + if from.count == 1 { + try container.encode(from[0], forKey: .from) + } else { + try container.encode(from, forKey: .from) + } + } +} +``` + +## Decision 2: CLI Flag Pattern (from 009) + +**Decision**: Use swift-argument-parser's native repeated `@Option` support. + +**Rationale**: ArgumentParser handles `--to X --to Y` automatically when declared as `[String]`. No custom parsing needed. + +**Source Reference**: `Sources/SubtreeLib/Commands/ExtractCommand.swift` line 131 + +```swift +// Current pattern (for --from) +@Option(name: .long, help: "Glob pattern to match files (can be repeated)") +var from: [String] = [] + +// New pattern (for --to) — identical +@Option(name: .long, help: "Destination path (can be repeated for fan-out)") +var to: [String] = [] +``` + +## Decision 3: Deduplication Pattern + +**Decision**: Normalize paths then deduplicate using `Set`. + +**Rationale**: Simple and deterministic. Normalization handles common user variations. + +**Normalization Rules** (from spec clarification): +1. Remove trailing slashes (`Lib/` → `Lib`) +2. Remove leading `./` (`./Lib` → `Lib`) +3. 
Compare after normalization + +```swift +// New utility: PathNormalizer +public enum PathNormalizer { + public static func normalize(_ path: String) -> String { + var result = path + // Remove leading ./ + while result.hasPrefix("./") { + result = String(result.dropFirst(2)) + } + // Remove trailing / + while result.hasSuffix("/") && result.count > 1 { + result = String(result.dropLast()) + } + return result + } + + public static func deduplicate(_ paths: [String]) -> [String] { + var seen = Set<String>() + var unique: [String] = [] + for path in paths { + let normalized = normalize(path) + if !seen.contains(normalized) { + seen.insert(normalized) + unique.append(path) // Preserve original form + } + } + return unique + } +} +``` + +## Decision 4: Fan-Out Execution Pattern + +**Decision**: Sequential iteration over destinations with per-destination output. + +**Rationale**: Simpler than parallel execution, sufficient for ≤10 destinations, easier to debug. + +**Pattern**: +```swift +// Fan-out: copy same files to each destination +for destination in normalizedDestinations { + for (sourcePath, relativePath) in matchedFiles { + let destFilePath = destination + "/" + relativePath + try copyFilePreservingStructure(from: sourcePath, to: destFilePath) + } + print("✅ Extracted \(matchedFiles.count) files to '\(destination)'") +} +``` + +## Decision 5: Fail-Fast Validation Pattern + +**Decision**: Collect all conflicts across all destinations before any writes. + +**Rationale**: Consistent with existing overwrite protection. Prevents partial state.
+ +**Pattern**: +```swift +// Phase 1: Validate ALL destinations +var allConflicts: [(destination: String, file: String)] = [] +for destination in destinations { + let conflicts = checkForTrackedFiles(matchedFiles, destination) + for file in conflicts { + allConflicts.append((destination, file)) + } +} + +// Phase 2: Fail if any conflicts (before any writes) +if !allConflicts.isEmpty { + displayConflictErrors(allConflicts) + exit(2) +} + +// Phase 3: Execute (all destinations validated) +for destination in destinations { + // ... copy files ... +} +``` + +## Decision 6: Progress Output Pattern + +**Decision**: Per-destination summary lines (FR-017). + +**Rationale**: Provides visibility without verbose per-file output. Matches existing emoji style. + +**Pattern**: +``` +✅ Extracted 5 files to 'Lib/' +✅ Extracted 5 files to 'Vendor/' +``` + +## Decision 7: Soft Limit Warning + +**Decision**: Warn at >10 destinations, don't block (FR-016). + +**Rationale**: Prevents accidental abuse while remaining permissive for edge cases. + +**Pattern**: +```swift +if destinations.count > 10 { + print("⚠️ Warning: \(destinations.count) destinations specified (>10)") +} +``` + +## No Further Research Needed + +All patterns are established from: +- **009-multi-pattern-extraction**: Data model, CLI, deduplication +- **008-extract-command**: Overwrite protection, file copying +- **010-extract-clean**: Clean mode infrastructure + +Implementation can proceed directly to Phase 1. 
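As a quick cross-check of Decision 3 outside the Swift toolchain, the same two normalization rules (strip leading `./`, strip trailing `/` except for root) can be restated in POSIX shell. Illustrative only; it mirrors the PathNormalizer sketch, not the shipped code:

```shell
# Re-statement of PathNormalizer.normalize in POSIX shell parameter expansion.
normalize() {
  p=$1
  while [ "${p#./}" != "$p" ]; do p=${p#./}; done                  # drop leading ./
  while [ "${p%/}" != "$p" ] && [ "$p" != "/" ]; do p=${p%/}; done # drop trailing /
  printf '%s\n' "$p"
}

normalize "Lib/"     # Lib
normalize "./Lib"    # Lib
normalize "././Lib/" # Lib
normalize "/"        # /
```

All three variants of `Lib` collapse to the same normalized form, which is exactly the equivalence class the `Set`-based deduplication keys on.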
diff --git a/specs/012-multi-destination-extraction/spec.md b/specs/012-multi-destination-extraction/spec.md new file mode 100644 index 0000000..fbda668 --- /dev/null +++ b/specs/012-multi-destination-extraction/spec.md @@ -0,0 +1,168 @@ +# Feature Specification: Multi-Destination Extraction (Fan-Out) + +**Feature Branch**: `012-multi-destination-extraction` +**Created**: 2025-11-30 +**Status**: Complete +**Input**: User description: "Multi-Destination Extraction (Fan-Out) — Allows extracting matched files to multiple destinations simultaneously (e.g., --to Lib/ --to Vendor/), enabling distribution of extracted files to multiple locations without repeated commands" + +## User Scenarios & Testing *(mandatory)* + +### User Story 1 - Multiple CLI Destinations (Priority: P1) + +As a developer distributing vendor files to multiple project locations, I want to specify multiple `--to` destinations in a single extract command so that I can copy matched files to several directories (e.g., both `Lib/` and `Vendor/`) without running multiple commands. + +**Why this priority**: This is the core feature — enabling multiple destinations in the CLI. Without this, users must run separate extract commands for each destination, which is tedious and error-prone. + +**Independent Test**: Can be fully tested by running `subtree extract --name foo --from '*.h' --to 'Lib/' --to 'Vendor/'` and verifying files appear in both destinations. + +**Acceptance Scenarios**: + +1. **Given** a configured subtree with header files, **When** user runs `subtree extract --name foo --from '**/*.h' --to 'Lib/' --to 'Vendor/'`, **Then** all matched files are copied to both `Lib/` and `Vendor/` directories. + +2. **Given** multiple `--to` destinations with different depths, **When** extraction runs, **Then** each destination receives the same files with their relative paths preserved. + +3. 
**Given** multiple `--from` patterns and multiple `--to` destinations, **When** extraction runs, **Then** the union of matched files is copied to every destination (fan-out semantics). + +--- + +### User Story 2 - Persist Multi-Destination Mappings (Priority: P1) + +As a developer, I want to persist a multi-destination extraction as a single mapping so that destinations I use together stay together when I run bulk extraction later. + +**Why this priority**: Persistence enables repeatable workflows — equally critical as CLI support for practical usage. + +**Independent Test**: Can be tested by running `subtree extract --from '*.h' --to 'Lib/' --to 'Vendor/' --persist` and verifying the config stores destinations as an array. + +**Acceptance Scenarios**: + +1. **Given** a multi-destination extract command with `--persist`, **When** the command completes successfully, **Then** the config stores `to: ["Lib/", "Vendor/"]` (array format) in a single mapping entry. + +2. **Given** a persisted multi-destination mapping, **When** user runs bulk extraction (`subtree extract --name foo`), **Then** files are extracted to all destinations from the stored array. + +3. **Given** an existing config with single-destination `to: "path/"` (string format), **When** extraction runs, **Then** it works exactly as before (backward compatible). + +--- + +### User Story 3 - Clean Mode with Multiple Destinations (Priority: P2) + +As a developer, I want `--clean` to remove files from all specified destinations so that cleanup is symmetric with extraction. + +**Why this priority**: Clean mode parity ensures users don't need to run separate cleanup commands per destination. + +**Independent Test**: Can be tested by running `subtree extract --clean --name foo --from '*.h' --to 'Lib/' --to 'Vendor/'` and verifying files are removed from both destinations. + +**Acceptance Scenarios**: + +1. 
**Given** files previously extracted to multiple destinations, **When** user runs `subtree extract --clean --name foo --from '**/*.h' --to 'Lib/' --to 'Vendor/'`, **Then** matching files are removed from both destinations. + +2. **Given** a persisted multi-destination mapping, **When** user runs `subtree extract --clean --name foo`, **Then** files are removed from all destinations in the stored array. + +3. **Given** clean mode with checksum validation, **When** one destination has modified files, **Then** clean fails before removing any files (fail-fast across all destinations). + +--- + +### User Story 4 - Fail-Fast Overwrite Protection (Priority: P2) + +As a developer, I want the system to validate all destinations upfront before copying any files so that I don't end up with partial extraction state. + +**Why this priority**: Prevents confusing partial state where some destinations have files and others don't. + +**Independent Test**: Can be tested by creating a git-tracked file conflict in one destination and verifying extraction fails before modifying any destination. + +**Acceptance Scenarios**: + +1. **Given** two destinations where `Lib/` is clear but `Vendor/foo.h` is git-tracked, **When** extraction runs without `--force`, **Then** the command fails with an error listing all conflicts and no files are copied to either destination. + +2. **Given** overwrite protection triggered by any destination, **When** user provides `--force` flag, **Then** extraction proceeds to all destinations, overwriting conflicting files. + +3. **Given** multiple destinations with multiple conflicts, **When** extraction fails, **Then** the error message lists all conflicting files across all destinations. + +--- + +### User Story 5 - Bulk Mode Interaction (Priority: P3) + +As a developer using `--all` to process all subtrees, I want multi-destination mappings to work correctly in bulk mode with continue-on-error semantics. 
+ +**Why this priority**: Ensures bulk mode remains consistent with existing behavior while supporting new multi-destination mappings. + +**Independent Test**: Can be tested by configuring multiple subtrees with multi-destination mappings and running `subtree extract --all`. + +**Acceptance Scenarios**: + +1. **Given** multiple subtrees each with multi-destination mappings, **When** user runs `subtree extract --all`, **Then** each mapping extracts to all its destinations. + +2. **Given** bulk mode where one subtree's mapping fails (e.g., protection triggered), **When** extraction runs, **Then** other subtrees complete successfully and failures are summarized at the end. + +3. **Given** bulk clean mode with multi-destination mappings, **When** user runs `subtree extract --clean --all`, **Then** files are removed from all destinations for each mapping. + +--- + +### Edge Cases + +- **Empty destination array**: Config with `to: []` (empty array) should fail validation with clear error. +- **Duplicate destinations**: `--to 'Lib/' --to 'Lib/'` should deduplicate to single destination (no double-copy). +- **Overlapping destinations**: `--to 'Lib/' --to 'Lib/Sub/'` extracts to both (file may appear at `Lib/foo.h` AND `Lib/Sub/foo.h`). No deduplication based on path hierarchy — each destination is independent. +- **Mixed `to` formats in config**: Some mappings with string, others with array — both work correctly. +- **Single `--to` backward compatibility**: Existing commands with single `--to` continue to work unchanged. +- **Destination with trailing slash**: `--to 'Lib'` and `--to 'Lib/'` should be treated equivalently. +- **Non-existent destination directories**: Directories should be created as needed (existing behavior). + +## Requirements *(mandatory)* + +### Functional Requirements + +- **FR-001**: CLI MUST accept multiple `--to` flags, each specifying one destination path. 
+- **FR-002**: CLI MUST copy matched files to EVERY specified destination (fan-out semantics: N files × M destinations). +- **FR-003**: Directory structure MUST be preserved identically at each destination. +- **FR-004**: Config MUST accept `to` as either a string (single destination) or array of strings (multiple destinations). +- **FR-005**: Config parsing MUST validate that array elements are all strings; reject non-string elements with clear error. +- **FR-006**: Config parsing MUST reject empty arrays (`to: []`) with clear error message. +- **FR-007**: When `--persist` is used with multiple destinations, system MUST store them as an array in a single mapping entry. +- **FR-008**: Single-destination commands (`--to 'path/'`) MUST continue to work exactly as before (backward compatible). +- **FR-009**: Bulk extraction with persisted multi-destination mappings MUST extract to all destinations in the array. +- **FR-010**: Duplicate destinations MUST be deduplicated after normalizing trailing slashes and leading `./` (e.g., `Lib`, `Lib/`, `./Lib/` collapse to one). +- **FR-011**: CLI MUST warn (but not fail) when more than 10 destinations are specified. +- **FR-012**: `--clean` mode MUST remove files from ALL specified destinations (symmetric with extraction). +- **FR-013**: Overwrite protection MUST validate ALL destinations upfront before any copy operations (fail-fast). +- **FR-014**: When protection fails, error MUST list all conflicting files across all destinations. +- **FR-015**: `--force` flag MUST bypass protection for all destinations. +- **FR-016**: In bulk mode (`--all`), multi-destination mappings MUST follow continue-on-error semantics per subtree. +- **FR-017**: Progress output MUST show per-destination summaries (e.g., `✅ Extracted 5 files to Lib/` then `✅ Extracted 5 files to Vendor/`). + +### Key Entities + +- **ExtractionMapping**: Extended to support `to` as either `String` or `[String]` (array), mirroring `from` array support. 
+- **Destination**: Represents a single target path; multiple destinations are processed independently with same source files. + +## Success Criteria *(mandatory)* + +### Measurable Outcomes + +- **SC-001**: Users can extract files to 3+ destinations in a single command (vs. 3+ separate commands previously). +- **SC-002**: Existing single-destination configurations work without modification (100% backward compatible). +- **SC-003**: Multi-destination extraction completes in <5 seconds for typical file sets (≤100 files × ≤5 destinations). +- **SC-004**: Fail-fast protection catches all conflicts before any files are copied, leaving no partial state. +- **SC-005**: Clean mode removes files from all destinations symmetrically with extraction. + +## Assumptions + +- Users understand that fan-out copies the same files to multiple locations (not different files to different locations). +- Destination order in config does not affect extraction behavior (parallel/sequential is implementation detail). +- Existing extract command infrastructure (GlobMatcher, file copying, overwrite protection) is reused. +- `--from` and `--to` counts are independent — no positional pairing between source patterns and destinations. + +## Clarifications + +### Session 2025-11-30 + +- Q: When using `--persist` with multiple destinations, should this create a single mapping with destination array or multiple separate mappings? → A: **Single mapping** with `to: ["path1/", "path2/"]` array format, mirroring `from` array semantics. + +- Q: How should `--clean` mode work with multiple destinations? → A: **Clean all specified destinations** — symmetric with extraction behavior. + +- Q: If extracting to one destination succeeds but another has overwrite protection conflicts, what should happen? → A: **Fail-fast** — validate all destinations upfront before any copy operations. No partial state; user gets single error listing all conflicts. + +- Q: Should there be a limit on the number of destinations? 
→ A: **Soft limit 10** — warn above the threshold but still allow the operation. Prevents accidental abuse while remaining permissive for real-world use.
+
+- Q: How should path deduplication handle equivalent paths? → A: **Normalize trailing slash + leading `./`** — `Lib`, `Lib/`, `./Lib/` all collapse to one destination. Simple, predictable, covers common cases without filesystem calls.
+
+- Q: How should progress be reported for multi-destination operations? → A: **Per-destination summary** — show `✅ Extracted N files to <destination>/` for each destination. Provides visibility without verbose per-file output.
diff --git a/specs/012-multi-destination-extraction/tasks.md b/specs/012-multi-destination-extraction/tasks.md
new file mode 100644
index 0000000..48e98a0
--- /dev/null
+++ b/specs/012-multi-destination-extraction/tasks.md
@@ -0,0 +1,248 @@
+# Tasks: Multi-Destination Extraction (Fan-Out)
+
+**Input**: Design documents from `/specs/012-multi-destination-extraction/`
+**Prerequisites**: plan.md, spec.md, data-model.md, contracts/, research.md, quickstart.md
+
+## Format: `[ID] [P?]
[Story] Description` + +- **[P]**: Can run in parallel (different files, no dependencies) +- **[Story]**: Which user story this task belongs to (US1, US2, US3, US4, US5) +- Include exact file paths in descriptions + +--- + +## Phase 1: Setup + +**Purpose**: Prepare test infrastructure for multi-destination feature + +- [x] T001 Create test file `Tests/SubtreeLibTests/PathNormalizerTests.swift` with test suite skeleton +- [x] T002 [P] Add test cases for `to` array parsing in `Tests/SubtreeLibTests/ExtractionMappingTests.swift` +- [x] T003 [P] Create test file `Tests/IntegrationTests/ExtractMultiDestTests.swift` with test suite skeleton + +**Checkpoint**: Test files exist and compile (empty suites) + +--- + +## Phase 2: Foundational (Data Model + PathNormalizer) + +**Purpose**: Modify ExtractionMapping for `to` array + create PathNormalizer — blocks all user stories + +**Note**: Corresponds to plan.md "Phase 1: Data Model + CLI + Persist" (tasks.md Phase 1 is setup scaffolding only). + +**⚠️ CRITICAL**: No user story work can begin until this phase is complete + +### Tests for PathNormalizer + +- [x] T004 [P] Unit test: Normalize removes trailing slash in `Tests/SubtreeLibTests/PathNormalizerTests.swift` +- [x] T005 [P] Unit test: Normalize removes leading `./` in `Tests/SubtreeLibTests/PathNormalizerTests.swift` +- [x] T006 [P] Unit test: Normalize handles combined `./path/` in `Tests/SubtreeLibTests/PathNormalizerTests.swift` +- [x] T007 [P] Unit test: Deduplicate removes equivalent paths in `Tests/SubtreeLibTests/PathNormalizerTests.swift` +- [x] T008 [P] Unit test: Deduplicate preserves order and original form in `Tests/SubtreeLibTests/PathNormalizerTests.swift` + +### Implementation for PathNormalizer + +- [x] T009 Create `Sources/SubtreeLib/Utilities/PathNormalizer.swift` with `normalize(_:)` function +- [x] T010 Add `deduplicate(_:)` function in `Sources/SubtreeLib/Utilities/PathNormalizer.swift` +- [x] T011 Verify PathNormalizer tests pass with `swift test 
--filter PathNormalizerTests` + +### Tests for ExtractionMapping `to` Array + +- [x] T012 [P] Unit test: Decode single string format `to: "path/"` in `Tests/SubtreeLibTests/ExtractionMappingTests.swift` +- [x] T013 [P] Unit test: Decode array format `to: ["p1/", "p2/"]` in `Tests/SubtreeLibTests/ExtractionMappingTests.swift` +- [x] T014 [P] Unit test: Encode single destination as string in `Tests/SubtreeLibTests/ExtractionMappingTests.swift` +- [x] T015 [P] Unit test: Encode multiple destinations as array in `Tests/SubtreeLibTests/ExtractionMappingTests.swift` +- [x] T016 [P] Unit test: Reject empty array `to: []` in `Tests/SubtreeLibTests/ExtractionMappingTests.swift` +- [x] T017 [P] Unit test: Single-destination initializer works in `Tests/SubtreeLibTests/ExtractionMappingTests.swift` +- [x] T018 [P] Unit test: Multi-destination initializer works in `Tests/SubtreeLibTests/ExtractionMappingTests.swift` +- [x] T018b [P] Unit test: Verify Yams coercion of non-string elements in `to` in `Tests/SubtreeLibTests/ExtractionMappingTests.swift` + +### Implementation for ExtractionMapping `to` Array + +- [x] T019 Change `to` field from `String` to `[String]` in `Sources/SubtreeLib/Configuration/ExtractionMapping.swift` +- [x] T020 Update `init(from decoder:)` with try-array/fallback-string logic for `to` in `Sources/SubtreeLib/Configuration/ExtractionMapping.swift` +- [x] T021 Update `encode(to:)` with single=string/multiple=array logic for `to` in `Sources/SubtreeLib/Configuration/ExtractionMapping.swift` +- [x] T022 Add `init(from:toDestinations:exclude:)` initializer in `Sources/SubtreeLib/Configuration/ExtractionMapping.swift` +- [x] T023 Add `init(fromPatterns:toDestinations:exclude:)` initializer in `Sources/SubtreeLib/Configuration/ExtractionMapping.swift` +- [x] T024 Update existing initializers to wrap single `to` in array in `Sources/SubtreeLib/Configuration/ExtractionMapping.swift` +- [x] T025 Add validation to reject empty `to` arrays in decoding in 
`Sources/SubtreeLib/Configuration/ExtractionMapping.swift` +- [x] T026 Verify all unit tests pass with `swift test --filter SubtreeLibTests` + +**Checkpoint**: ExtractionMapping accepts both `to` formats, PathNormalizer works, all unit tests pass + +--- + +## Phase 3: P1 User Stories (CLI + Persist) 🎯 MVP + +**Goal**: Core multi-destination functionality — users can specify multiple `--to` flags + +**User Stories**: US1 (Multiple CLI Destinations), US2 (Persist Multi-Destination) + +**Independent Test**: Run `subtree extract --name foo --from '*.h' --to 'Lib/' --to 'Vendor/'` and verify fan-out extraction + +### Tests for P1 + +- [x] T027 [P] [US1] Integration test: Multiple --to flags extract to all destinations in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T028 [P] [US1] Integration test: Same files appear in every destination in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T029 [P] [US1] Integration test: Directory structure preserved identically in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T030 [P] [US1] Integration test: Duplicate destinations deduplicated (./Lib = Lib/ = Lib) in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T031 [P] [US1] Integration test: Per-destination success output shown in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T031b [P] [US1] Integration test: Overlapping destinations (`Lib/` and `Lib/Sub/`) both receive files in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T032 [P] [US2] Integration test: Legacy string `to` format still works in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T033 [P] [US2] Integration test: --persist stores destinations as array in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T034 [P] [US2] Integration test: Bulk extract with persisted array destinations works in `Tests/IntegrationTests/ExtractMultiDestTests.swift` + +### Implementation for P1 + +- [x] T035 [US1] Change `@Option var to: 
String?` to `@Option var to: [String]` in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T036 [US1] Add destination deduplication using PathNormalizer in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T037 [US1] Update ad-hoc extraction to iterate over destinations (fan-out loop) in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T038 [US1] Add per-destination success output (FR-017) in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T039 [US1] Add soft limit warning for >10 destinations (FR-011) in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T040 [US2] Update bulk extraction to handle array `to` field in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T041 [US2] Update `saveMappingToConfig` for array destination format in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T042 Verify single --to still works (backward compat) with `swift test` + +**Checkpoint**: Multi-destination CLI works, persist saves arrays, legacy configs work — MVP complete + +--- + +## Phase 4: P2/P3 User Stories (Clean + Fail-Fast + Bulk) + +**Goal**: Behavioral integration — fail-fast validation, clean mode, bulk mode support + +**User Stories**: US3 (Clean Mode), US4 (Fail-Fast), US5 (Bulk Mode) + +**Independent Test**: Create conflict in one destination, verify extraction fails before any writes + +### Tests for US3 (Clean Mode) + +- [x] T043 [P] [US3] Integration test: --clean removes files from all destinations in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T044 [P] [US3] Integration test: Clean with persisted multi-dest mapping works in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T045 [P] [US3] Integration test: Clean fails-fast if checksum mismatch in any destination in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T046 [P] [US3] Integration test: Per-destination clean output shown in `Tests/IntegrationTests/ExtractMultiDestTests.swift` + +### Tests for US4 
(Fail-Fast) + +- [x] T047 [P] [US4] Integration test: Overwrite protection validates ALL destinations upfront in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T048 [P] [US4] Integration test: No files copied if any destination has conflicts in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T049 [P] [US4] Integration test: Error lists conflicts across all destinations in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T050 [P] [US4] Integration test: --force bypasses protection for all destinations in `Tests/IntegrationTests/ExtractMultiDestTests.swift` + +### Tests for US5 (Bulk Mode) + +- [x] T051 [P] [US5] Integration test: --all processes multi-dest mappings correctly in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T052 [P] [US5] Integration test: --clean --all removes from all destinations in `Tests/IntegrationTests/ExtractMultiDestTests.swift` +- [x] T053 [P] [US5] Integration test: Continue-on-error per subtree (not per destination) in `Tests/IntegrationTests/ExtractMultiDestTests.swift` + +### Implementation for US3 (Clean Mode) + +- [x] T054 [US3] Update `runAdHocClean` to iterate over destinations in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T055 [US3] Add per-destination clean output in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T056 [US3] Ensure checksum validation applies to all destinations before any deletes in `Sources/SubtreeLib/Commands/ExtractCommand.swift` + +### Implementation for US4 (Fail-Fast) + +- [x] T057 [US4] Refactor `checkForTrackedFiles` to collect conflicts from ALL destinations in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T058 [US4] Update `handleOverwriteProtection` to list conflicts grouped by destination in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T059 [US4] Ensure fan-out loop only executes after all destinations validated in `Sources/SubtreeLib/Commands/ExtractCommand.swift` + +### Implementation for US5 (Bulk 
Mode) + +- [x] T060 [US5] Update `runBulkExtraction` to handle multi-dest mappings in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T061 [US5] Update `runBulkClean` to handle multi-dest mappings in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T062 [US5] Verify continue-on-error semantics at subtree level (not destination) in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T063 Verify all integration tests pass with `swift test --filter ExtractMultiDestTests` + +**Checkpoint**: Fail-fast works, clean mode supports multi-dest, bulk mode handles arrays + +--- + +## Phase 5: Polish & Cross-Cutting Concerns + +**Purpose**: Documentation, cleanup, final validation + +- [x] T064 [P] Update command help text for multiple --to in `Sources/SubtreeLib/Commands/ExtractCommand.swift` +- [x] T065 [P] Add doc comments for new initializers in `Sources/SubtreeLib/Configuration/ExtractionMapping.swift` +- [x] T066 [P] Add doc comments for PathNormalizer in `Sources/SubtreeLib/Utilities/PathNormalizer.swift` +- [x] T067 Run full test suite: `swift test` — 571 tests pass +- [x] T068 Run quickstart.md validation steps manually — N/A (no quickstart.md for this feature) +- [x] T069 Update README.md with multi-destination examples (extract section, config format) + +**Checkpoint**: Documentation complete, all tests pass, feature ready for merge + +--- + +## Dependencies & Execution Order + +### Phase Dependencies + +- **Phase 1 (Setup)**: No dependencies — can start immediately +- **Phase 2 (Foundational)**: Depends on Phase 1 — BLOCKS all user stories +- **Phase 3 (P1)**: Depends on Phase 2 — Core MVP +- **Phase 4 (P2/P3)**: Depends on Phase 3 — Behavioral integration +- **Phase 5 (Polish)**: Depends on Phases 3-4 + +### User Story Dependencies + +- **US1 + US2**: Can proceed together (both P1, Phase 3) +- **US3 + US4 + US5**: Can proceed together after P1 (all in Phase 4) +- **US4 (Fail-Fast)** should be implemented before US3 (Clean) for correct 
validation order
+
+### Parallel Opportunities
+
+**Within Phase 2**:
+```
+T004, T005, T006, T007, T008 (PathNormalizer tests) — parallel
+T012, T013, T014, T015, T016, T017, T018, T018b (ExtractionMapping tests) — parallel
+```
+
+**Within Phase 3**:
+```
+T027, T028, T029, T030, T031, T031b, T032, T033, T034 (P1 integration tests) — parallel
+```
+
+**Within Phase 4**:
+```
+T043-T046 (US3 tests), T047-T050 (US4 tests), T051-T053 (US5 tests) — parallel
+```
+
+---
+
+## Implementation Strategy
+
+### MVP First (P1 Only)
+
+1. Complete Phase 1: Setup
+2. Complete Phase 2: Foundational (data model + PathNormalizer)
+3. Complete Phase 3: P1 User Stories (CLI + Persist)
+4. **STOP and VALIDATE**: Run the P1 integration tests (`swift test --filter ExtractMultiDestTests`) — this feature has no quickstart.md (see T068)
+5. Deploy/demo if ready — users can use multi-destination extraction
+
+### Incremental Delivery
+
+1. **P1 Complete** → Users can extract to multiple destinations
+2. **P2/P3 Complete** → Users get fail-fast protection, clean mode, bulk support
+3. **Polish Complete** → Documentation and help updated
+
+---
+
+## Task Summary
+
+| Phase | Task Range | Count | Purpose |
+|-------|------------|-------|---------|
+| Setup | T001-T003 | 3 | Test scaffolding |
+| Foundational | T004-T026 + T018b | 24 | Data model + PathNormalizer |
+| P1 (US1+US2) | T027-T042 + T031b | 17 | CLI + Persist MVP |
+| P2/P3 (US3+US4+US5) | T043-T063 | 21 | Clean + Fail-fast + Bulk |
+| Polish | T064-T069 | 6 | Docs + validation |
+| **Total** | | **71** | |
+
+---
+
+## Notes
+
+- All tests must FAIL before implementation (TDD)
+- Commit after each phase completion
+- Run `swift test` before moving to next phase
+- Phase 2 is critical — data model change affects everything downstream
+- Fail-fast (US4) should be implemented before clean mode (US3) within Phase 4
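The normalization rules clarified in the spec (strip a leading `./` and a trailing `/`; deduplicate while preserving order and each path's original form; no filesystem calls) can be sketched in Swift as follows. This is an illustrative sketch only — `PathNormalizer`, `normalize(_:)`, and `deduplicate(_:)` mirror the task names in T009-T010 but are not the shipped implementation:

```swift
/// Sketch of the PathNormalizer utility described in T009-T010 (illustrative,
/// not the shipped code). Normalization is purely textual — no filesystem calls.
enum PathNormalizer {
    /// Canonicalize a destination path for comparison: drop a leading "./"
    /// and a trailing "/".
    static func normalize(_ path: String) -> String {
        var result = path
        if result.hasPrefix("./") {
            result = String(result.dropFirst(2))
        }
        if result.hasSuffix("/"), result.count > 1 {
            result = String(result.dropLast())
        }
        return result
    }

    /// Remove equivalent paths, keeping the first occurrence's original form
    /// and preserving input order (T007-T008).
    static func deduplicate(_ paths: [String]) -> [String] {
        var seen = Set<String>()
        return paths.filter { seen.insert(normalize($0)).inserted }
    }
}

// "Lib/", "./Lib", and "Lib" collapse to a single destination.
print(PathNormalizer.deduplicate(["Lib/", "./Lib", "Vendor/", "Lib"]))
// → ["Lib/", "Vendor/"]
```

With this shape, a command line like `--to Lib/ --to ./Lib` fans out to a single destination, matching the T030 integration test.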
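T019-T025 describe a backward-compatible `to` field that decodes both the legacy string form and the new array form, rejects empty arrays, and encodes a single destination back as a plain string. A minimal Codable sketch of that try-array/fallback-string logic — the field handling and error text are assumptions, and while the real project decodes YAML (via Yams, which is Codable-based), this sketch uses Foundation's `JSONDecoder` to stay self-contained:

```swift
import Foundation

// Illustrative stand-in for Sources/SubtreeLib/Configuration/ExtractionMapping.swift.
struct ExtractionMapping: Codable {
    var from: [String]
    var to: [String]  // always an array internally; single values are wrapped

    enum CodingKeys: String, CodingKey { case from, to }

    init(from decoder: Decoder) throws {
        let c = try decoder.container(keyedBy: CodingKeys.self)
        // `from` already supports string-or-array in the shipped schema.
        from = (try? c.decode([String].self, forKey: .from))
            ?? [try c.decode(String.self, forKey: .from)]
        // Try the array form first, then fall back to the legacy string (T020).
        if let many = try? c.decode([String].self, forKey: .to) {
            guard !many.isEmpty else {  // T025: reject `to: []`
                throw DecodingError.dataCorruptedError(
                    forKey: .to, in: c,
                    debugDescription: "`to` must list at least one destination")
            }
            to = many
        } else {
            to = [try c.decode(String.self, forKey: .to)]
        }
    }

    func encode(to encoder: Encoder) throws {
        var c = encoder.container(keyedBy: CodingKeys.self)
        try c.encode(from, forKey: .from)
        // T021: a single destination round-trips as a plain string.
        if to.count == 1 {
            try c.encode(to[0], forKey: .to)
        } else {
            try c.encode(to, forKey: .to)
        }
    }
}
```

This is why legacy configs keep working (SC-002): a string `to:` decodes into a one-element array, and the rest of the code only ever sees `[String]`.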
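The fail-fast semantics (T057-T059) amount to a two-phase loop: collect conflicts from every destination before any write, then perform the N files × M destinations fan-out only when the conflict set is empty. A hypothetical sketch — the function, error type, and closures below are illustrative stand-ins, not the actual `ExtractCommand` internals:

```swift
enum ExtractError: Error {
    case overwriteConflicts([String])  // one error listing every conflict (T058)
}

/// Two-phase fan-out: validate all destinations up front, then copy to all.
/// `conflicts` and `copy` are injected here purely for illustration.
func extractFanOut(files: [String],
                   destinations: [String],
                   conflicts: (String) -> [String],
                   copy: (String, String) throws -> Void) throws {
    // Phase 1: fail fast — gather conflicts across ALL destinations before
    // any copy, so no partial state is ever left behind (T057, T059).
    let allConflicts = destinations.flatMap { dest in
        conflicts(dest).map { "\(dest)\($0)" }
    }
    guard allConflicts.isEmpty else {
        throw ExtractError.overwriteConflicts(allConflicts)
    }
    // Phase 2: N files × M destinations copy operations, with the
    // per-destination summary line described in FR-017.
    for dest in destinations {
        for file in files {
            try copy(file, dest)
        }
        print("✅ Extracted \(files.count) files to \(dest)")
    }
}
```

Because validation completes before the first `copy`, a conflict in the last destination leaves every destination untouched — the "no partial state" guarantee from the spec's clarifications.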