diff --git a/CHANGES b/CHANGES
index 7a603830..15b0f6c1 100644
--- a/CHANGES
+++ b/CHANGES
@@ -31,7 +31,57 @@ $ pipx install --suffix=@next 'vcspull' --pip-args '\--pre' --force
 
-_Notes on upcoming releases will be added here_
+### Breaking Changes
+
+This release modernizes the vcspull CLI to align with DevOps tool conventions (Terraform, Cargo, Ruff, Biome). **This is a breaking change release**.
+
+#### Command Changes (#472)
+
+- **REMOVED**: `vcspull import` command
+  - Replaced by `vcspull add NAME URL` to add a single repository
+  - Replaced by `vcspull discover DIR` to scan and add multiple repositories
+- **NEW**: `vcspull list` - List configured repositories with optional `--tree`, `--json`, `--ndjson` output
+- **NEW**: `vcspull status` - Check repository health (clean/dirty status, ahead/behind tracking with `--detailed`)
+
+#### Flag Changes (#472)
+
+- **RENAMED**: `-c/--config` → `-f/--file` (all commands)
+- **NEW**: `-w/--workspace/--workspace-root` - All three aliases supported for workspace root
+- **NEW**: `--dry-run/-n` - Preview changes without making modifications (sync, add, discover)
+- **NEW**: `--json/--ndjson` - Machine-readable output for automation (sync, list, status)
+- **NEW**: `--color {auto,always,never}` - Control color output
+
+#### Migration Guide (#472)
+
+```bash
+# Old → New
+vcspull import NAME URL → vcspull add NAME URL
+vcspull import --scan DIR → vcspull discover DIR
+vcspull sync -c FILE → vcspull sync -f FILE
+vcspull sync --workspace-root PATH → vcspull sync -w PATH # (or keep long form)
+vcspull fmt -c FILE → vcspull fmt -f FILE
+```
+
+### Features
+
+#### Developer Experience Improvements (#472)
+
+- Action commands (`sync`, `add`, `discover`) support `--dry-run` for safe previewing of changes
+- Structured output (`--json`, `--ndjson`) enables CI/CD integration and automation
+- Semantic colors with `NO_COLOR` environment variable support
+- Short `-w` flag for workspace root reduces typing
+- Consistent flag naming across all commands
+- `vcspull sync --dry-run` renders a Terraform-style plan (with live progress on
+  TTYs) and exposes the same data via a stable JSON/NDJSON schema for automation
+
+#### New Introspection Commands (#472)
+
+- `vcspull list` - View all configured repositories
+  - `--tree` mode groups by workspace root
+  - `--json/--ndjson` for programmatic access
+- `vcspull status` - Check repository health
+  - Shows which repos exist, are clean/dirty, or missing
+  - `--detailed` mode shows branch, ahead/behind tracking, and full paths
 
 ## vcspull v1.38.0 (2025-10-18)
 
diff --git a/README.md b/README.md
index db51f26c..4d7656bc 100644
--- a/README.md
+++ b/README.md
@@ -68,7 +68,7 @@ You can test the unpublished version of vcspull before its released.
 
 ## Configuration
 
 Add your repos to `~/.vcspull.yaml`. You can edit the file by hand or let
-`vcspull import` create entries for you.
+`vcspull add` or `vcspull discover` create entries for you.
 
 ```yaml
 ~/code/:
@@ -91,40 +91,68 @@ more [configuration](https://vcspull.git-pull.com/configuration.html))
be used as a declarative manifest to clone your repos consistently across
machines. Subsequent syncs of initialized repos will fetch the latest commits.
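Because the manifest is plain YAML, scripts can read it directly. A minimal sketch with PyYAML, assuming the simple name-to-URL form shown above (vcspull's own loader does more, including richer per-repo mappings):

```python
# Minimal sketch: read the manifest with PyYAML, assuming the simple
# "name: url" entries shown above. vcspull's own loader does more (path
# expansion, validation, richer per-repo mappings); this only illustrates
# the file's shape.
import pathlib

import yaml  # PyYAML

config_path = pathlib.Path("~/.vcspull.yaml").expanduser()
config = yaml.safe_load(config_path.read_text())

for workspace_root, repos in config.items():
    for name, url in repos.items():
        print(f"{workspace_root}{name}: {url}")
```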
-### Import repositories from the CLI +### Add repositories from the CLI -Register an existing remote without touching YAML manually: +Register a single repository without touching YAML manually: ```console -$ vcspull import my-lib https://github.com/example/my-lib.git --path ~/code/my-lib +$ vcspull add my-lib https://github.com/example/my-lib.git --path ~/code/my-lib ``` - Omit `--path` to default the entry under `./`. -- Use `--workspace-root` when you want to force a specific workspace root, e.g. - `--workspace-root ~/projects/libs`. -- Pass `-c/--config` to import into an alternate YAML file. +- Use `-w/--workspace` when you want to force a specific workspace root, e.g. + `-w ~/projects/libs`. +- Pass `-f/--file` to add to an alternate YAML file. +- Use `--dry-run` to preview changes before writing. - Follow with `vcspull sync my-lib` to clone or update the working tree after registration. -### Scan local checkouts and import en masse +### Discover local checkouts and add en masse Have a directory tree full of cloned Git repositories? Scan and append them to your configuration: ```console -$ vcspull import --scan ~/code --recursive +$ vcspull discover ~/code --recursive ``` The scan shows each repository before import unless you opt into `--yes`. Add -`--workspace-root ~/code/` to pin the resulting workspace root or `--config` to +`-w ~/code/` to pin the resulting workspace root or `-f` to write somewhere other than the default `~/.vcspull.yaml`. +### Inspect configured repositories + +List what vcspull already knows about without mutating anything: + +```console +$ vcspull list +$ vcspull list --tree +$ vcspull list --json | jq '.[].name' +``` + +`--json` emits a single JSON array, while `--ndjson` streams newline-delimited +objects that are easy to consume from shell pipelines. + +### Check repository status + +Get a quick health check for all configured workspaces: + +```console +$ vcspull status +$ vcspull status --detailed +$ vcspull status --ndjson | jq --slurp 'map(select(.reason == "summary"))' +``` + +The status command respects `--workspace/-w` filters and the global +`--color {auto,always,never}` flag. JSON and NDJSON output mirrors the list +command for automation workflows. + ### Normalize configuration files After importing or editing by hand, run the formatter to tidy up keys and keep entries sorted: ```console -$ vcspull fmt --config ~/.vcspull.yaml --write +$ vcspull fmt -f ~/.vcspull.yaml --write ``` Use `vcspull fmt --all --write` to format every YAML file that vcspull can @@ -136,6 +164,21 @@ discover under the standard config locations. $ vcspull sync ``` +Preview planned work with Terraform-style plan output or emit structured data +for CI/CD: + +```console +$ vcspull sync --dry-run "*" +$ vcspull sync --dry-run --show-unchanged "workspace-*" +$ vcspull sync --dry-run --json "*" | jq '.summary' +$ vcspull sync --dry-run --ndjson "*" | jq --slurp 'map(select(.type == "summary"))' +``` + +Dry runs stream a progress line when stdout is a TTY, then print a concise plan +summary (`+/~/✓/⚠/✗`) grouped by workspace. Use `--summary-only`, +`--relative-paths`, `--long`, or `-v/-vv` for alternate views, and +`--fetch`/`--offline` to control how remote metadata is refreshed. 
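For scripting beyond jq, the NDJSON stream is equally easy to consume from Python. A minimal sketch, assuming the `type`/`action`/`name` event fields used in the jq examples above:

```python
# Minimal sketch: consume `vcspull sync --dry-run --ndjson` events. The
# `type`, `action`, `name`, and `total` fields follow the jq examples
# above; treat this as an illustration, not a stable client API.
import json
import subprocess

proc = subprocess.run(
    ["vcspull", "sync", "--dry-run", "--ndjson", "*"],
    capture_output=True,
    text=True,
    check=True,
)
for line in proc.stdout.splitlines():
    event = json.loads(line)
    if event.get("type") == "operation":
        print(f"{event.get('action', '?'):>9}  {event.get('name', '?')}")
    elif event.get("type") == "summary":
        print(f"planned: {event.get('total', 0)} repositories")
```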
+ Keep nested VCS repositories updated too, lets say you have a mercurial or svn project with a git dependency: @@ -149,7 +192,7 @@ or svn project with a git dependency: Clone / update repos via config file: ```console -$ vcspull sync -c external_deps.yaml '*' +$ vcspull sync -f external_deps.yaml '*' ``` See the [Quickstart](https://vcspull.git-pull.com/quickstart.html) for diff --git a/docs/Makefile b/docs/Makefile index f50e525f..12f94c8c 100644 --- a/docs/Makefile +++ b/docs/Makefile @@ -6,9 +6,14 @@ WATCH_FILES= find .. -type f -not -path '*/\.*' | grep -i '.*[.]\(rst\|md\)\$\|. # You can set these variables from the command line. SPHINXOPTS = -SPHINXBUILD = sphinx-build +# Keep ANSI color codes out of generated docs (Sphinx + argparse) by forcing +# Python's colour support off for every build command. +SPHINX_ENV = PYTHON_COLORS=0 NO_COLOR=1 +SPHINXBUILD = $(SPHINX_ENV) sphinx-build PAPER = BUILDDIR = _build +# Apply the same environment when running the live-reload server. +SPHINX_AUTOBUILD = $(SPHINX_ENV) uv run sphinx-autobuild # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 @@ -182,8 +187,8 @@ dev: $(MAKE) -j watch serve start: - uv run sphinx-autobuild "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) --port ${HTTP_PORT} $(O) + $(SPHINX_AUTOBUILD) "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) --port ${HTTP_PORT} $(O) design: # This adds additional watch directories (for _static file changes) and disable incremental builds - uv run sphinx-autobuild "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) --port ${HTTP_PORT} --watch "." -a $(O) + $(SPHINX_AUTOBUILD) "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) --port ${HTTP_PORT} --watch "." -a $(O) diff --git a/docs/api/cli/add.md b/docs/api/cli/add.md new file mode 100644 index 00000000..446ae16f --- /dev/null +++ b/docs/api/cli/add.md @@ -0,0 +1,8 @@ +# vcspull add - `vcspull.cli.add` + +```{eval-rst} +.. automodule:: vcspull.cli.add + :members: + :show-inheritance: + :undoc-members: +``` diff --git a/docs/api/cli/discover.md b/docs/api/cli/discover.md new file mode 100644 index 00000000..c9e2202f --- /dev/null +++ b/docs/api/cli/discover.md @@ -0,0 +1,8 @@ +# vcspull discover - `vcspull.cli.discover` + +```{eval-rst} +.. automodule:: vcspull.cli.discover + :members: + :show-inheritance: + :undoc-members: +``` diff --git a/docs/api/cli/import.md b/docs/api/cli/import.md index c78c8059..06d64d1d 100644 --- a/docs/api/cli/import.md +++ b/docs/api/cli/import.md @@ -1,8 +1,16 @@ # vcspull import - `vcspull.cli._import` -```{eval-rst} -.. automodule:: vcspull.cli._import - :members: - :show-inheritance: - :undoc-members: +```{warning} +**This module has been removed** as of vcspull 1.38.0. + +The `vcspull.cli._import` module has been split into two separate modules: +- {py:mod}`vcspull.cli.add` - Add single repositories manually +- {py:mod}`vcspull.cli.discover` - Scan directories for existing repositories + +See the user-facing documentation at {ref}`cli-add` and {ref}`cli-discover`. ``` + +## Historical API Reference + +This module previously provided the `import` command functionality but has been +replaced with more focused commands. 
diff --git a/docs/api/cli/index.md b/docs/api/cli/index.md index e205c315..b4bee3d4 100644 --- a/docs/api/cli/index.md +++ b/docs/api/cli/index.md @@ -9,7 +9,10 @@ :maxdepth: 1 sync -import +add +discover +list +status fmt ``` diff --git a/docs/api/cli/list.md b/docs/api/cli/list.md new file mode 100644 index 00000000..ffce7935 --- /dev/null +++ b/docs/api/cli/list.md @@ -0,0 +1,8 @@ +# vcspull list - `vcspull.cli.list` + +```{eval-rst} +.. automodule:: vcspull.cli.list + :members: + :show-inheritance: + :undoc-members: +``` diff --git a/docs/api/cli/status.md b/docs/api/cli/status.md new file mode 100644 index 00000000..c7a56fc3 --- /dev/null +++ b/docs/api/cli/status.md @@ -0,0 +1,8 @@ +# vcspull status - `vcspull.cli.status` + +```{eval-rst} +.. automodule:: vcspull.cli.status + :members: + :show-inheritance: + :undoc-members: +``` diff --git a/docs/cli/add.md b/docs/cli/add.md new file mode 100644 index 00000000..6d40e06b --- /dev/null +++ b/docs/cli/add.md @@ -0,0 +1,176 @@ +(cli-add)= + +# vcspull add + +The `vcspull add` command adds a single repository to your vcspull configuration. +Provide a repository name and URL, and vcspull will append it to your config file +with the appropriate workspace root. + +```{note} +This command replaces the manual import functionality from `vcspull import`. +For bulk scanning of existing repositories, see {ref}`cli-discover`. +``` + +## Command + +```{eval-rst} +.. argparse:: + :module: vcspull.cli + :func: create_parser + :prog: vcspull + :path: add + :nodescription: +``` + +## Basic usage + +Add a repository by name and URL: + +```console +$ vcspull add flask https://github.com/pallets/flask.git +Successfully added 'flask' to ./.vcspull.yaml under './' +``` + +By default, the repository is added to the current directory's workspace root (`./`). + +## Specifying workspace root + +Use `-w/--workspace` or `--workspace-root` to control where the repository will be checked out: + +```console +$ vcspull add flask https://github.com/pallets/flask.git -w ~/code/ +Successfully added 'flask' to ~/.vcspull.yaml under '~/code/' +``` + +All three flag names work identically: + +```console +$ vcspull add django https://github.com/django/django.git --workspace ~/code/ +$ vcspull add requests https://github.com/psf/requests.git --workspace-root ~/code/ +``` + +## Custom repository path + +Override the inferred path with `--path` when the repository already exists on disk: + +```console +$ vcspull add my-lib https://github.com/example/my-lib.git \ + --path ~/code/libraries/my-lib +``` + +The `--path` flag is useful when: +- Migrating existing local repositories +- Using non-standard directory layouts +- The repository name doesn't match the desired directory name + +## Choosing configuration files + +By default, vcspull looks for the first YAML configuration file in: +1. Current directory (`.vcspull.yaml`) +2. Home directory (`~/.vcspull.yaml`) +3. XDG config directory (`~/.config/vcspull/`) + +If no config exists, a new `.vcspull.yaml` is created in the current directory. 
+ +Specify a custom config file with `-f/--file`: + +```console +$ vcspull add vcspull https://github.com/vcs-python/vcspull.git \ + -f ~/projects/.vcspull.yaml +``` + +## Dry run mode + +Preview changes without modifying your configuration with `--dry-run` or `-n`: + +```console +$ vcspull add flask https://github.com/pallets/flask.git -w ~/code/ --dry-run +Would add 'flask' (https://github.com/pallets/flask.git) to ~/.vcspull.yaml under '~/code/' +``` + +This is useful for: +- Verifying the workspace root is correct +- Checking which config file will be modified +- Testing path inference + +## URL formats + +Repositories use [pip VCS URL][pip vcs url] format with a scheme prefix: + +- Git: `git+https://github.com/user/repo.git` +- Mercurial: `hg+https://bitbucket.org/user/repo` +- Subversion: `svn+http://svn.example.org/repo/trunk` + +The URL scheme determines the VCS type. For Git, the `git+` prefix is required. + +## Examples + +Add to default location: + +```console +$ vcspull add myproject https://github.com/myuser/myproject.git +``` + +Add to specific workspace: + +```console +$ vcspull add django-blog https://github.com/example/django-blog.git \ + -w ~/code/django/ +``` + +Add with custom path: + +```console +$ vcspull add dotfiles https://github.com/myuser/dotfiles.git \ + --path ~/.dotfiles +``` + +Preview before adding: + +```console +$ vcspull add flask https://github.com/pallets/flask.git \ + -w ~/code/ --dry-run +``` + +Add to specific config file: + +```console +$ vcspull add tooling https://github.com/company/tooling.git \ + -f ~/company/.vcspull.yaml \ + -w ~/work/ +``` + +## Handling duplicates + +If a repository with the same name already exists in the workspace, vcspull will warn you: + +```console +$ vcspull add flask https://github.com/pallets/flask.git -w ~/code/ +WARNING: Repository 'flask' already exists in workspace '~/code/'. +``` + +The existing entry is preserved and not overwritten. + +## After adding repositories + +After adding repositories, consider: + +1. Running `vcspull fmt --write` to normalize and sort your configuration (see {ref}`cli-fmt`) +2. Running `vcspull list` to verify the repository was added correctly (see {ref}`cli-list`) +3. Running `vcspull sync` to clone the repository (see {ref}`cli-sync`) + +## Migration from vcspull import + +If you previously used `vcspull import `: + +```diff +- $ vcspull import flask https://github.com/pallets/flask.git -c ~/.vcspull.yaml ++ $ vcspull add flask https://github.com/pallets/flask.git -f ~/.vcspull.yaml +``` + +Changes: +- Command name: `import` → `add` +- Config flag: `-c` → `-f` +- Same functionality otherwise + +[pip vcs url]: https://pip.pypa.io/en/stable/topics/vcs-support/ diff --git a/docs/cli/discover.md b/docs/cli/discover.md new file mode 100644 index 00000000..b467563d --- /dev/null +++ b/docs/cli/discover.md @@ -0,0 +1,273 @@ +(cli-discover)= + +# vcspull discover + +The `vcspull discover` command scans directories for existing Git repositories +and adds them to your vcspull configuration. This is ideal for importing existing +workspaces or migrating from other tools. + +```{note} +This command replaces the filesystem scanning functionality from `vcspull import --scan`. +For adding single repositories manually, see {ref}`cli-add`. +``` + +## Command + +```{eval-rst} +.. 
argparse:: + :module: vcspull.cli + :func: create_parser + :prog: vcspull + :path: discover + :nodescription: +``` + +## Basic usage + +Scan a directory for Git repositories: + +```console +$ vcspull discover ~/code +Found 2 repositories in ~/code + +Repository: vcspull + Path: ~/code/vcspull + Remote: git+https://github.com/vcs-python/vcspull.git + Workspace: ~/code/ + +? Add to configuration? [y/N]: y +Successfully added 'vcspull' to ~/.vcspull.yaml + +Repository: libvcs + Path: ~/code/libvcs + Remote: git+https://github.com/vcs-python/libvcs.git + Workspace: ~/code/ + +? Add to configuration? [y/N]: y +Successfully added 'libvcs' to ~/.vcspull.yaml + +Scan complete: 2 repositories added, 0 skipped +``` + +The command prompts for each repository before adding it to your configuration. + +## Recursive scanning + +Search nested directories with `--recursive` or `-r`: + +```console +$ vcspull discover ~/code --recursive +``` + +This scans all subdirectories for Git repositories, making it ideal for: +- Workspaces with project categories (e.g., `~/code/python/`, `~/code/rust/`) +- Nested organization structures +- Home directory scans + +## Unattended mode + +Skip prompts and add all repositories with `--yes` or `-y`: + +```console +$ vcspull discover ~/code --recursive --yes +Found 15 repositories in ~/code +Added 15 repositories to ~/.vcspull.yaml +``` + +This is useful for: +- Automated workspace setup +- Migration scripts +- CI/CD environments + +## Dry run mode + +Preview what would be added without modifying your configuration: + +```console +$ vcspull discover ~/code --dry-run +``` + +Output shows: + +```console +Would add: vcspull (~/code/) + Remote: git+https://github.com/vcs-python/vcspull.git + +Would add: libvcs (~/code/) + Remote: git+https://github.com/vcs-python/libvcs.git + +Dry run complete: 2 repositories would be added +``` + +Combine with `--recursive` to preview large scans: + +```console +$ vcspull discover ~/ --recursive --dry-run +``` + +## Workspace root override + +Force all discovered repositories to use a specific workspace root: + +```console +$ vcspull discover ~/company/projects --workspace-root ~/work/ --yes +``` + +By default, vcspull infers the workspace root from the repository's location. +The `--workspace-root` override is useful when: + +- Consolidating repos from multiple locations +- Standardizing workspace organization +- The inferred workspace root doesn't match your desired structure + +Example - scanning home directory but organizing by workspace: + +```console +$ vcspull discover ~ --recursive --workspace-root ~/code/ --yes +``` + +## Choosing configuration files + +Specify a custom config file with `-f/--file`: + +```console +$ vcspull discover ~/company --recursive -f ~/company/.vcspull.yaml +``` + +If the config file doesn't exist, it will be created. + +## Repository detection + +`vcspull discover` identifies Git repositories by looking for `.git` directories. + +For each repository found: +1. The directory name becomes the repository name +2. The `origin` remote URL is extracted (if available) +3. The workspace root is inferred from the repository's location +4. You're prompted to confirm adding it + +### Repositories without remotes + +Repositories without an `origin` remote are detected but logged as a warning: + +```console +$ vcspull discover ~/code +WARNING: Could not determine remote URL for ~/code/local-project (no origin remote) +Skipping local-project +``` + +These repositories are skipped by default. 
You can add them manually with +`vcspull add` if needed. + +## Examples + +Scan current directory: + +```console +$ vcspull discover . +``` + +Scan recursively with confirmation: + +```console +$ vcspull discover ~/code --recursive +``` + +Bulk import without prompts: + +```console +$ vcspull discover ~/code --recursive --yes +``` + +Preview a large scan: + +```console +$ vcspull discover ~/code --recursive --dry-run +``` + +Scan with custom workspace: + +```console +$ vcspull discover /tmp/checkouts --workspace-root ~/code/ --yes +``` + +Scan to specific config: + +```console +$ vcspull discover ~/company/repos \ + --recursive \ + --yes \ + -f ~/company/.vcspull.yaml +``` + +## After discovering repositories + +After discovering repositories, consider: + +1. Running `vcspull fmt --write` to normalize and sort your configuration (see {ref}`cli-fmt`) +2. Running `vcspull list --tree` to verify the workspace organization (see {ref}`cli-list`) +3. Running `vcspull status` to confirm all repositories are tracked (see {ref}`cli-status`) + +## Handling existing entries + +If a repository already exists in your configuration, vcspull will detect it: + +```console +Repository: flask + Path: ~/code/flask + Remote: git+https://github.com/pallets/flask.git + Workspace: ~/code/ + +Note: Repository 'flask' already exists in ~/code/ +? Add anyway? [y/N]: n +Skipped flask (already exists) +``` + +You can choose to skip or overwrite the existing entry. + +## Migration from vcspull import --scan + +If you previously used `vcspull import --scan`: + +```diff +- $ vcspull import --scan ~/code --recursive -c ~/.vcspull.yaml --yes ++ $ vcspull discover ~/code --recursive -f ~/.vcspull.yaml --yes +``` + +Changes: +- Command: `import --scan` → `discover` +- Config flag: `-c` → `-f` +- `--scan` flag removed (discover always scans) +- Same functionality otherwise + +## Use cases + +**Initial workspace setup:** + +```console +$ vcspull discover ~/code --recursive --yes +$ vcspull fmt --write +``` + +**Migrate from another tool:** + +```console +$ vcspull discover ~/projects --recursive --dry-run +$ vcspull discover ~/projects --recursive --yes +``` + +**Add company repos to separate config:** + +```console +$ vcspull discover ~/company \ + --recursive \ + -f ~/company/.vcspull.yaml \ + --workspace-root ~/work/ \ + --yes +``` + +**Audit what's on disk:** + +```console +$ vcspull discover ~/code --recursive --dry-run | grep "Would add" +``` diff --git a/docs/cli/fmt.md b/docs/cli/fmt.md index 81489d47..20807009 100644 --- a/docs/cli/fmt.md +++ b/docs/cli/fmt.md @@ -51,8 +51,15 @@ Run the formatter in dry-run mode first to preview the adjustments, then add `--write` (or `-w`) to persist them back to disk: ```console -$ vcspull fmt --config ~/.vcspull.yaml -$ vcspull fmt --config ~/.vcspull.yaml --write +$ vcspull fmt --file ~/.vcspull.yaml +$ vcspull fmt --file ~/.vcspull.yaml --write +``` + +Short form: + +```console +$ vcspull fmt -f ~/.vcspull.yaml +$ vcspull fmt -f ~/.vcspull.yaml -w ``` Use `--all` to iterate over the default search locations: the current working @@ -63,5 +70,5 @@ file is reported individually. $ vcspull fmt --all --write ``` -Pair the formatter with [`vcspull import`](cli-import) after scanning the file +Pair the formatter with [`vcspull discover`](cli-discover) after scanning the file system to keep newly added repositories ordered and normalized. 
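The "audit what's on disk" pattern above also wraps neatly in a script. A minimal sketch, assuming the `Would add:` line format that `vcspull discover --dry-run` prints:

```python
# Minimal sketch: list checkouts on disk that are not yet in the config,
# assuming the "Would add:" lines that `vcspull discover --dry-run` prints.
import pathlib
import subprocess

scan_dir = pathlib.Path("~/code").expanduser()
proc = subprocess.run(
    ["vcspull", "discover", str(scan_dir), "--recursive", "--dry-run"],
    capture_output=True,
    text=True,
    check=True,
)
untracked = [
    line.removeprefix("Would add: ").strip()
    for line in proc.stdout.splitlines()
    if line.startswith("Would add:")
]
print(f"{len(untracked)} repositories not in config")
for repo in untracked:
    print(f"  {repo}")
```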
diff --git a/docs/cli/import.md b/docs/cli/import.md
index c39cc85a..ee7ba23c 100644
--- a/docs/cli/import.md
+++ b/docs/cli/import.md
@@ -2,21 +2,22 @@
 
 # vcspull import
 
-The `vcspull import` command registers existing repositories with your vcspull
-configuration. You can either provide a single repository name and URL or scan
-directories for Git repositories that already live on disk.
-
-## Command
-
-```{eval-rst}
-.. argparse::
-   :module: vcspull.cli
-   :func: create_parser
-   :prog: vcspull
-   :path: import
-   :nodescription:
+```{warning}
+**This command has been removed** as of vcspull 1.38.0.
+
+The `import` command has been split into two focused commands:
+- Use {ref}`cli-add` to add single repositories manually
+- Use {ref}`cli-discover` to scan directories for existing repositories
+
+See the migration guide below for detailed upgrade instructions.
 ```
 
+## Historical Documentation
+
+The `vcspull import` command previously registered existing repositories with your vcspull
+configuration. You could either provide a single repository name and URL or scan
+directories for Git repositories that already lived on disk.
+
 ## Manual import
 
 Provide a repository name and remote URL to append an entry to your
@@ -67,3 +68,32 @@ $ vcspull import --scan ~/company --recursive --config ~/company/.vcspull.yaml
 
 Use `--all` with the [`vcspull fmt`](cli-fmt) command after a large scan to
 keep configuration entries sorted and normalized.
+
+## Migration Guide
+
+### Manual import → add
+
+```diff
+- $ vcspull import myproject https://github.com/user/myproject.git -c ~/.vcspull.yaml
++ $ vcspull add myproject https://github.com/user/myproject.git -f ~/.vcspull.yaml
+```
+
+Changes:
+- Command name: `import` → `add`
+- Config flag: `-c/--config` → `-f/--file`
+
+See {ref}`cli-add` for full documentation.
+
+### Filesystem scanning → discover
+
+```diff
+- $ vcspull import --scan ~/code --recursive -c ~/.vcspull.yaml --yes
++ $ vcspull discover ~/code --recursive -f ~/.vcspull.yaml --yes
+```
+
+Changes:
+- Command: `import --scan` → `discover`
+- Config flag: `-c/--config` → `-f/--file`
+- `--scan` flag removed (discover always scans)
+
+See {ref}`cli-discover` for full documentation.
diff --git a/docs/cli/index.md b/docs/cli/index.md
index b416875d..0de86c86 100644
--- a/docs/cli/index.md
+++ b/docs/cli/index.md
@@ -7,7 +7,10 @@
 :maxdepth: 1
 
 sync
-import
+add
+discover
+list
+status
 fmt
 ```
 
@@ -33,5 +36,5 @@ completion
 :nodescription:
 
 subparser_name : @replace
-    See :ref:`cli-sync`, :ref:`cli-import`, :ref:`cli-fmt`
+    See :ref:`cli-sync`, :ref:`cli-add`, :ref:`cli-discover`, :ref:`cli-list`, :ref:`cli-status`, :ref:`cli-fmt`
 ```
diff --git a/docs/cli/list.md b/docs/cli/list.md
new file mode 100644
index 00000000..600c6e37
--- /dev/null
+++ b/docs/cli/list.md
@@ -0,0 +1,160 @@
+(cli-list)=
+
+# vcspull list
+
+The `vcspull list` command displays configured repositories from your vcspull
+configuration files. Use this introspection command to verify your configuration,
+filter repositories by patterns, and export structured data for automation.
+
+## Command
+
+```{eval-rst}
+..
argparse:: + :module: vcspull.cli + :func: create_parser + :prog: vcspull + :path: list + :nodescription: +``` + +## Basic usage + +List all configured repositories: + +```console +$ vcspull list +• tiktoken → /home/d/study/ai/tiktoken +• GeographicLib → /home/d/study/c++/GeographicLib +• flask → /home/d/code/flask +``` + +## Filtering repositories + +Filter repositories using fnmatch-style patterns: + +```console +$ vcspull list 'flask*' +• flask → /home/d/code/flask +• flask-sqlalchemy → /home/d/code/flask-sqlalchemy +``` + +Multiple patterns are supported: + +```console +$ vcspull list django flask +``` + +## Tree view + +Group repositories by workspace root with `--tree`: + +```console +$ vcspull list --tree + +~/study/ai/ + • tiktoken → /home/d/study/ai/tiktoken + +~/study/c++/ + • GeographicLib → /home/d/study/c++/GeographicLib + • anax → /home/d/study/c++/anax + +~/code/ + • flask → /home/d/code/flask +``` + +## JSON output + +Export repository information as JSON for automation and tooling: + +```console +$ vcspull list --json +``` + +Output format: + +```json +[ + { + "name": "tiktoken", + "url": "git+https://github.com/openai/tiktoken.git", + "path": "/home/d/study/ai/tiktoken", + "workspace_root": "~/study/ai/" + }, + { + "name": "flask", + "url": "git+https://github.com/pallets/flask.git", + "path": "/home/d/code/flask", + "workspace_root": "~/code/" + } +] +``` + +The `workspace_root` field shows which configuration section the repository +belongs to, matching the keys in your `.vcspull.yaml` file. + +Filter JSON output with tools like [jq]: + +```console +$ vcspull list --json | jq '.[] | select(.workspace_root | contains("study"))' +``` + +## NDJSON output + +For streaming and line-oriented processing, use `--ndjson`: + +```console +$ vcspull list --ndjson +{"name":"tiktoken","url":"git+https://github.com/openai/tiktoken.git","path":"/home/d/study/ai/tiktoken","workspace_root":"~/study/ai/"} +{"name":"flask","url":"git+https://github.com/pallets/flask.git","path":"/home/d/code/flask","workspace_root":"~/code/"} +``` + +Each line is a complete JSON object, making it ideal for: +- Processing large configurations line-by-line +- Streaming data to other tools +- Parsing with simple line-based tools + +```console +$ vcspull list --ndjson | grep 'study' | jq -r '.name' +``` + +## Choosing configuration files + +By default, vcspull searches for config files in standard locations +(`~/.vcspull.yaml`, `./.vcspull.yaml`, and XDG config directories). + +Specify a custom config file with `-f/--file`: + +```console +$ vcspull list -f ~/projects/.vcspull.yaml +``` + +## Workspace filtering + +Filter repositories by workspace root with `-w/--workspace/--workspace-root`: + +```console +$ vcspull list -w ~/code/ +• flask → /home/d/code/flask +• requests → /home/d/code/requests +``` + +Globbing is supported, so you can target multiple related workspaces: + +```console +$ vcspull list --workspace '*/work/*' +``` + +The workspace filter combines with pattern filters and structured output flags, +allowing you to export subsets of your configuration quickly. + +## Color output + +Control colored output with `--color`: + +- `--color auto` (default): Use colors if outputting to a terminal +- `--color always`: Always use colors +- `--color never`: Never use colors + +The `NO_COLOR` environment variable is also respected. 
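The same JSON is also easy to post-process in Python. A minimal sketch that groups entries by workspace root, assuming the `name`/`url`/`path`/`workspace_root` fields documented above:

```python
# Minimal sketch: group `vcspull list --json` output by workspace root,
# assuming the name/url/path/workspace_root fields documented above.
import json
import subprocess

raw = subprocess.check_output(["vcspull", "list", "--json"], text=True)
by_root: dict[str, list[str]] = {}
for repo in json.loads(raw):
    by_root.setdefault(repo["workspace_root"], []).append(repo["name"])

for root in sorted(by_root):
    print(f"{root} ({len(by_root[root])} repos)")
    for name in sorted(by_root[root]):
        print(f"  - {name}")
```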
+ +[jq]: https://stedolan.github.io/jq/ diff --git a/docs/cli/status.md b/docs/cli/status.md new file mode 100644 index 00000000..ac90e38e --- /dev/null +++ b/docs/cli/status.md @@ -0,0 +1,208 @@ +(cli-status)= + +# vcspull status + +The `vcspull status` command checks the health of configured repositories, +showing which repositories exist on disk, which are missing, and their Git status. +This introspection command helps verify your local workspace matches your configuration. + +## Command + +```{eval-rst} +.. argparse:: + :module: vcspull.cli + :func: create_parser + :prog: vcspull + :path: status + :nodescription: +``` + +## Basic usage + +Check the status of all configured repositories: + +```console +$ vcspull status +✗ tiktoken: missing +✓ flask: up to date +✓ django: up to date + +Summary: 3 repositories, 2 exist, 1 missing +``` + +The command shows: +- Repository name and path +- Whether the repository exists on disk +- If it's a Git repository +- Basic cleanliness status + +## Filtering repositories + +Filter repositories using fnmatch-style patterns: + +```console +$ vcspull status 'django*' +• django → /home/d/code/django (exists, clean) +• django-extensions → /home/d/code/django-extensions (missing) +``` + +Multiple patterns are supported: + +```console +$ vcspull status django flask requests +``` + +## Detailed status + +Show additional information with `--detailed` or `-d`: + +```console +$ vcspull status --detailed +✓ flask: up to date + Path: /home/d/code/flask + Branch: main + Ahead/Behind: 0/0 +``` + +This mode shows the full path, active branch, and divergence counters (`ahead` +and `behind`) relative to the tracked upstream. If the working tree has +uncommitted changes the headline reports `dirty` and the JSON payloads set +`clean` to `false`. 
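These health checks also make `vcspull status` usable as a CI gate. A minimal sketch that fails the build when any repository is missing or dirty, assuming the `reason`/`exists`/`clean` fields described in the JSON output section below:

```python
# Minimal sketch: fail CI when any configured repo is missing or dirty,
# assuming the reason/exists/clean fields shown in the JSON output
# section below.
import json
import subprocess
import sys

raw = subprocess.check_output(["vcspull", "status", "--json"], text=True)
problems = [
    entry["name"]
    for entry in json.loads(raw)
    if entry.get("reason") == "status"
    and (not entry.get("exists") or entry.get("clean") is False)
]
if problems:
    print("unhealthy repositories: " + ", ".join(problems))
    sys.exit(1)
```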
+
+## JSON output
+
+Export status information as JSON for automation and monitoring:
+
+```console
+$ vcspull status --json
+```
+
+Output format:
+
+```json
+[
+  {
+    "reason": "status",
+    "name": "tiktoken",
+    "path": "/home/d/study/ai/tiktoken",
+    "workspace_root": "~/study/ai/",
+    "exists": false,
+    "is_git": false,
+    "clean": null,
+    "branch": null,
+    "ahead": null,
+    "behind": null
+  },
+  {
+    "reason": "status",
+    "name": "flask",
+    "path": "/home/d/code/flask",
+    "workspace_root": "~/code/",
+    "exists": true,
+    "is_git": true,
+    "clean": true,
+    "branch": "main",
+    "ahead": 0,
+    "behind": 0
+  },
+  {
+    "reason": "summary",
+    "total": 2,
+    "exists": 1,
+    "missing": 1,
+    "clean": 1,
+    "dirty": 0
+  }
+]
+```
+
+Each status entry includes:
+- `reason`: Always `"status"` for repository entries, `"summary"` for the final summary
+- `name`: Repository name
+- `path`: Full filesystem path
+- `workspace_root`: Configuration section this repo belongs to
+- `exists`: Whether the directory exists
+- `is_git`: Whether it's a Git repository
+- `clean`: Git working tree status (`null` if not a git repo or missing)
+- `branch`: Current branch (when detailed information is available)
+- `ahead`, `behind`: Divergence counts relative to the upstream branch
+
+Filter with [jq]:
+
+```console
+$ vcspull status --json | jq '.[] | select(.reason == "status" and .exists == false)'
+$ vcspull status --json | jq '.[] | select(.reason == "summary")'
+```
+
+## NDJSON output
+
+For streaming output, use `--ndjson`:
+
+```console
+$ vcspull status --ndjson
+{"reason":"status","name":"tiktoken","path":"/home/d/study/ai/tiktoken","workspace_root":"~/study/ai/","exists":false,"is_git":false,"clean":null}
+{"reason":"status","name":"flask","path":"/home/d/code/flask","workspace_root":"~/code/","exists":true,"is_git":true,"clean":true}
+{"reason":"summary","total":2,"exists":1,"missing":1,"clean":1,"dirty":0}
+```
+
+Process line-by-line:
+
+```console
+$ vcspull status --ndjson | grep '"exists":false' | jq -r '.name'
+```
+
+## Use cases
+
+Monitor missing repositories:
+
+```console
+$ vcspull status --json | jq -r '.[] | select(.reason == "status" and .exists == false) | .name'
+```
+
+Check which repositories need syncing:
+
+```console
+$ vcspull status --json | jq -r '.[] | select(.reason == "status" and .exists == false) | .name' | xargs vcspull sync
+```
+
+Generate reports:
+
+```console
+$ vcspull status --json > workspace-status-$(date +%Y%m%d).json
+```
+
+## Choosing configuration files
+
+Specify a custom config file with `-f/--file`:
+
+```console
+$ vcspull status -f ~/projects/.vcspull.yaml
+```
+
+## Workspace filtering
+
+Filter repositories by workspace root with `-w/--workspace`:
+
+```console
+$ vcspull status -w ~/code/
+```
+
+## Color output
+
+Control colored output with `--color`:
+
+- `--color auto` (default): Use colors if outputting to a terminal
+- `--color always`: Always use colors
+- `--color never`: Never use colors
+
+The `NO_COLOR` environment variable is also respected.
+
+## Future enhancements
+
+The status command may later be expanded to cover:
+- Remote URL mismatches
+- Submodule status
+
+[jq]: https://stedolan.github.io/jq/
diff --git a/docs/cli/sync.md b/docs/cli/sync.md
index b2d25301..9131296f 100644
--- a/docs/cli/sync.md
+++ b/docs/cli/sync.md
@@ -4,6 +4,10 @@
 
 # vcspull sync
 
+The `vcspull sync` command clones and updates repositories defined in your
+vcspull configuration.
It's the primary command for keeping your local workspace +synchronized with remote repositories. + ## Command ```{eval-rst} @@ -15,6 +19,114 @@ :nodescription: ``` +## Dry run mode + +Preview what would be synchronized without making changes: + +```console +$ vcspull sync --dry-run '*' +Would sync flask at /home/d/code/flask +Would sync django at /home/d/code/django +Would sync requests at /home/d/code/requests +``` + +Use `--dry-run` or `-n` to: +- Verify your configuration before syncing +- Check which repositories would be updated +- Test pattern filters +- Preview operations in CI/CD + +## JSON output + +Export sync operations as JSON for automation: + +```console +$ vcspull sync --dry-run --json '*' +[ + { + "reason": "sync", + "name": "flask", + "path": "/home/d/code/flask", + "workspace_root": "~/code/", + "status": "preview" + }, + { + "reason": "summary", + "total": 3, + "synced": 0, + "previewed": 3, + "failed": 0 + } +] +``` + +Each event emitted during the run includes: + +- `reason`: `"sync"` for repository events, `"summary"` for the final summary +- `name`, `path`, `workspace_root`: Repository metadata from your config +- `status`: `"synced"`, `"preview"`, or `"error"` (with an `error` field) + +Use `--json` without `--dry-run` to capture actual sync executions—successful +and failed repositories are emitted with their final state. + +## NDJSON output + +Stream sync events line-by-line with `--ndjson`: + +```console +$ vcspull sync --dry-run --ndjson '*' +{"reason":"sync","name":"flask","path":"/home/d/code/flask","workspace_root":"~/code/","status":"preview"} +{"reason":"summary","total":3,"synced":0,"previewed":3,"failed":0} +``` + +Each line is a JSON object representing a sync event, ideal for: +- Real-time processing +- Progress monitoring +- Log aggregation + +## Configuration file selection + +Specify a custom config file with `-f/--file`: + +```console +$ vcspull sync -f ~/projects/.vcspull.yaml '*' +``` + +By default, vcspull searches for config files in: +1. Current directory (`.vcspull.yaml`) +2. Home directory (`~/.vcspull.yaml`) +3. XDG config directory (`~/.config/vcspull/`) + +## Workspace filtering + +Filter repositories by workspace root with `-w/--workspace` or `--workspace-root`: + +```console +$ vcspull sync -w ~/code/ '*' +``` + +This syncs only repositories in the specified workspace root, useful for: +- Selective workspace updates +- Multi-workspace setups +- Targeted sync operations + +All three flag names work identically: + +```console +$ vcspull sync --workspace ~/code/ '*' +$ vcspull sync --workspace-root ~/code/ '*' +``` + +## Color output + +Control colored output with `--color`: + +- `--color auto` (default): Use colors if outputting to a terminal +- `--color always`: Always use colors +- `--color never`: Never use colors + +The `NO_COLOR` environment variable is also respected. + ## Filtering repos As of 1.13.x, `$ vcspull sync` with no args passed will show a help dialog: diff --git a/docs/quickstart.md b/docs/quickstart.md index 460c2960..db5620f4 100644 --- a/docs/quickstart.md +++ b/docs/quickstart.md @@ -112,10 +112,10 @@ YAML? Create a `~/.vcspull.yaml` file: ``` Already have repositories cloned locally? Use -`vcspull import --scan ~/code --recursive` to detect existing Git checkouts and -append them to your configuration. See {ref}`cli-import` for more details and +`vcspull discover ~/code --recursive` to detect existing Git checkouts and +append them to your configuration. 
See {ref}`cli-discover` for more details and options such as `--workspace-root` and `--yes` for unattended runs. After editing or -importing, run `vcspull fmt --write` (documented in {ref}`cli-fmt`) to +discovering repositories, run `vcspull fmt --write` (documented in {ref}`cli-fmt`) to normalize keys and keep your configuration tidy. The `git+` in front of the repository URL. Mercurial repositories use @@ -138,10 +138,10 @@ be any name): sdl2pp: "git+https://github.com/libSDL2pp/libSDL2pp.git" ``` -Use `-c` to specify a config. +Use `-f/--file` to specify a config. ```console -$ vcspull sync -c .deps.yaml +$ vcspull sync -f .deps.yaml ``` You can also use [fnmatch] to pull repositories from your config in diff --git a/pyproject.toml b/pyproject.toml index 096d97f4..502be07d 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -86,6 +86,7 @@ dev = [ # Testing "gp-libs", "pytest", + "pytest-asyncio", "pytest-rerunfailures", "pytest-mock", "pytest-watcher", @@ -119,6 +120,7 @@ docs = [ testing = [ "gp-libs", "pytest", + "pytest-asyncio", "pytest-rerunfailures", "pytest-mock", "pytest-watcher", diff --git a/src/vcspull/cli/__init__.py b/src/vcspull/cli/__init__.py index 381f0daa..ca5cc0bb 100644 --- a/src/vcspull/cli/__init__.py +++ b/src/vcspull/cli/__init__.py @@ -15,12 +15,11 @@ from vcspull.log import setup_logger from ._formatter import VcspullHelpFormatter -from ._import import ( - create_import_subparser, - import_from_filesystem, - import_repo, -) +from .add import add_repo, create_add_subparser +from .discover import create_discover_subparser, discover_repos from .fmt import create_fmt_subparser, format_config_file +from .list import create_list_subparser, list_repos +from .status import create_status_subparser, status_repos from .sync import create_sync_subparser, sync log = logging.getLogger(__name__) @@ -57,31 +56,41 @@ def build_description( [ 'vcspull sync "*"', 'vcspull sync "django-*"', - 'vcspull sync "django-*" flask', - 'vcspull sync -c ./myrepos.yaml "*"', - "vcspull sync -c ./myrepos.yaml myproject", + 'vcspull sync --dry-run "*"', + 'vcspull sync -f ./myrepos.yaml "*"', + "vcspull sync -w ~/code myproject", + ], + ), + ( + "list", + [ + "vcspull list", + 'vcspull list "django-*"', + "vcspull list --tree", + "vcspull list --json", + ], + ), + ( + "add", + [ + "vcspull add mylib https://github.com/example/mylib.git", + "vcspull add mylib URL -w ~/code", + "vcspull add mylib URL --dry-run", ], ), ( - "import", + "discover", [ - "vcspull import mylib https://github.com/example/mylib.git", - ( - "vcspull import -c ./myrepos.yaml mylib " - "git@github.com:example/mylib.git" - ), - "vcspull import --scan ~/code", - ( - "vcspull import --scan ~/code --recursive " - "--workspace-root ~/code --yes" - ), + "vcspull discover ~/code", + "vcspull discover ~/code --recursive --yes", + "vcspull discover ~/code -w ~/projects --dry-run", ], ), ( "fmt", [ "vcspull fmt", - "vcspull fmt -c ./myrepos.yaml", + "vcspull fmt -f ./myrepos.yaml", "vcspull fmt --write", "vcspull fmt --all", ], @@ -91,7 +100,7 @@ def build_description( SYNC_DESCRIPTION = build_description( """ - sync vcs repos + Synchronize VCS repositories. 
""", ( ( @@ -99,35 +108,77 @@ def build_description( [ 'vcspull sync "*"', 'vcspull sync "django-*"', - 'vcspull sync "django-*" flask', - 'vcspull sync -c ./myrepos.yaml "*"', - "vcspull sync -c ./myrepos.yaml myproject", + 'vcspull sync --dry-run "*"', + 'vcspull sync -f ./myrepos.yaml "*"', + "vcspull sync -w ~/code myproject", ], ), ), ) -IMPORT_DESCRIPTION = build_description( +LIST_DESCRIPTION = build_description( """ - Import a repository to the vcspull configuration file. + List configured repositories. + """, + ( + ( + None, + [ + "vcspull list", + 'vcspull list "django-*"', + "vcspull list --tree", + "vcspull list --json", + ], + ), + ), +) - Provide NAME and URL to add a single repository, or use --scan to - discover existing git repositories within a directory. +STATUS_DESCRIPTION = build_description( + """ + Check status of repositories. """, ( ( None, [ - "vcspull import mylib https://github.com/example/mylib.git", - ( - "vcspull import -c ./myrepos.yaml mylib " - "git@github.com:example/mylib.git" - ), - "vcspull import --scan ~/code", - ( - "vcspull import --scan ~/code --recursive " - "--workspace-root ~/code --yes" - ), + "vcspull status", + 'vcspull status "django-*"', + "vcspull status --detailed", + "vcspull status --json", + ], + ), + ), +) + +ADD_DESCRIPTION = build_description( + """ + Add a single repository to the configuration. + """, + ( + ( + None, + [ + "vcspull add mylib https://github.com/example/mylib.git", + "vcspull add mylib URL -w ~/code", + "vcspull add mylib URL --dry-run", + ], + ), + ), +) + +DISCOVER_DESCRIPTION = build_description( + """ + Discover and add repositories from filesystem. + + Scans a directory for git repositories and adds them to the configuration. + """, + ( + ( + None, + [ + "vcspull discover ~/code", + "vcspull discover ~/code --recursive --yes", + "vcspull discover ~/code -w ~/projects --dry-run", ], ), ), @@ -145,7 +196,7 @@ def build_description( None, [ "vcspull fmt", - "vcspull fmt -c ./myrepos.yaml", + "vcspull fmt -f ./myrepos.yaml", "vcspull fmt --write", "vcspull fmt --all", ], @@ -188,25 +239,56 @@ def create_parser( ) subparsers = parser.add_subparsers(dest="subparser_name") + + # Sync command sync_parser = subparsers.add_parser( "sync", - help="synchronize repos", + help="synchronize repositories", formatter_class=VcspullHelpFormatter, description=SYNC_DESCRIPTION, ) create_sync_subparser(sync_parser) - import_parser = subparsers.add_parser( - "import", - help="import repository or scan filesystem for repositories", + # List command + list_parser = subparsers.add_parser( + "list", + help="list configured repositories", + formatter_class=VcspullHelpFormatter, + description=LIST_DESCRIPTION, + ) + create_list_subparser(list_parser) + + # Status command + status_parser = subparsers.add_parser( + "status", + help="check repository status", formatter_class=VcspullHelpFormatter, - description=IMPORT_DESCRIPTION, + description=STATUS_DESCRIPTION, ) - create_import_subparser(import_parser) + create_status_subparser(status_parser) + # Add command + add_parser = subparsers.add_parser( + "add", + help="add a single repository", + formatter_class=VcspullHelpFormatter, + description=ADD_DESCRIPTION, + ) + create_add_subparser(add_parser) + + # Discover command + discover_parser = subparsers.add_parser( + "discover", + help="discover repositories from filesystem", + formatter_class=VcspullHelpFormatter, + description=DISCOVER_DESCRIPTION, + ) + create_discover_subparser(discover_parser) + + # Fmt command fmt_parser = 
subparsers.add_parser( "fmt", - help="format vcspull configuration files", + help="format configuration files", formatter_class=VcspullHelpFormatter, description=FMT_DESCRIPTION, ) @@ -214,14 +296,28 @@ def create_parser( if return_subparsers: # Return all parsers needed by cli() function - return parser, (sync_parser, import_parser, fmt_parser) + return parser, ( + sync_parser, + list_parser, + status_parser, + add_parser, + discover_parser, + fmt_parser, + ) return parser def cli(_args: list[str] | None = None) -> None: """CLI entry point for vcspull.""" parser, subparsers = create_parser(return_subparsers=True) - sync_parser, _import_parser, _fmt_parser = subparsers + ( + sync_parser, + _list_parser, + _status_parser, + _add_parser, + _discover_parser, + _fmt_parser, + ) = subparsers args = parser.parse_args(_args) setup_logger(log=log, level=args.log_level.upper()) @@ -229,36 +325,63 @@ def cli(_args: list[str] | None = None) -> None: if args.subparser_name is None: parser.print_help() return + if args.subparser_name == "sync": sync( repo_patterns=args.repo_patterns, config=pathlib.Path(args.config) if args.config else None, + workspace_root=getattr(args, "workspace_root", None), + dry_run=getattr(args, "dry_run", False), + output_json=getattr(args, "output_json", False), + output_ndjson=getattr(args, "output_ndjson", False), + color=getattr(args, "color", "auto"), exit_on_error=args.exit_on_error, + show_unchanged=getattr(args, "show_unchanged", False), + summary_only=getattr(args, "summary_only", False), + long_view=getattr(args, "long_view", False), + relative_paths=getattr(args, "relative_paths", False), + fetch=getattr(args, "fetch", False), + offline=getattr(args, "offline", False), + verbosity=getattr(args, "verbosity", 0), parser=sync_parser, ) - elif args.subparser_name == "import": - # Unified import command - if args.scan_dir: - # Filesystem scan mode - import_from_filesystem( - scan_dir_str=args.scan_dir, - config_file_path_str=args.config, - recursive=args.recursive, - workspace_root_override=args.workspace_root_path, - yes=args.yes, - ) - elif args.name and args.url: - # Manual import mode - import_repo( - name=args.name, - url=args.url, - config_file_path_str=args.config, - path=args.path, - workspace_root_path=args.workspace_root_path, - ) - else: - # Error: need either name+url or --scan - log.error("Either provide NAME and URL, or use --scan DIR") - parser.exit(status=2) + elif args.subparser_name == "list": + list_repos( + repo_patterns=args.repo_patterns, + config_path=pathlib.Path(args.config) if args.config else None, + workspace_root=getattr(args, "workspace_root", None), + tree=args.tree, + output_json=args.output_json, + output_ndjson=args.output_ndjson, + color=args.color, + ) + elif args.subparser_name == "status": + status_repos( + repo_patterns=args.repo_patterns, + config_path=pathlib.Path(args.config) if args.config else None, + workspace_root=getattr(args, "workspace_root", None), + detailed=args.detailed, + output_json=args.output_json, + output_ndjson=args.output_ndjson, + color=args.color, + ) + elif args.subparser_name == "add": + add_repo( + name=args.name, + url=args.url, + config_file_path_str=args.config, + path=args.path, + workspace_root_path=args.workspace_root_path, + dry_run=args.dry_run, + ) + elif args.subparser_name == "discover": + discover_repos( + scan_dir_str=args.scan_dir, + config_file_path_str=args.config, + recursive=args.recursive, + workspace_root_override=args.workspace_root_path, + yes=args.yes, + dry_run=args.dry_run, + ) 
elif args.subparser_name == "fmt": format_config_file(args.config, args.write, args.all) diff --git a/src/vcspull/cli/_colors.py b/src/vcspull/cli/_colors.py new file mode 100644 index 00000000..41cffc4c --- /dev/null +++ b/src/vcspull/cli/_colors.py @@ -0,0 +1,126 @@ +"""Color output utilities for vcspull CLI.""" + +from __future__ import annotations + +import os +import sys +from enum import Enum + +from colorama import Fore, Style + + +class ColorMode(Enum): + """Color output modes.""" + + AUTO = "auto" + ALWAYS = "always" + NEVER = "never" + + +class Colors: + """Semantic color constants and utilities.""" + + # Semantic colors + SUCCESS = Fore.GREEN # Success, additions, up-to-date + WARNING = Fore.YELLOW # Warnings, changes needed, behind remote + ERROR = Fore.RED # Errors, deletions, conflicts + INFO = Fore.CYAN # Information, paths, URLs + HIGHLIGHT = Fore.MAGENTA # Workspace roots, important labels + MUTED = Fore.BLUE # Subdued info, bullets + RESET = Style.RESET_ALL + + def __init__(self, mode: ColorMode = ColorMode.AUTO) -> None: + """Initialize color manager. + + Parameters + ---------- + mode : ColorMode + Color mode to use (auto, always, never) + """ + self.mode = mode + self._enabled = self._should_enable_color() + + def _should_enable_color(self) -> bool: + """Determine if color should be enabled. + + Returns + ------- + bool + True if colors should be enabled + """ + # Respect NO_COLOR environment variable + if os.environ.get("NO_COLOR"): + return False + + if self.mode == ColorMode.NEVER: + return False + if self.mode == ColorMode.ALWAYS: + return True + + # AUTO mode: check if stdout is a TTY + return sys.stdout.isatty() + + def colorize(self, text: str, color: str) -> str: + """Apply color to text if colors are enabled. + + Parameters + ---------- + text : str + Text to colorize + color : str + Color code (e.g., Fore.GREEN) + + Returns + ------- + str + Colorized text if enabled, plain text otherwise + """ + if self._enabled: + return f"{color}{text}{self.RESET}" + return text + + def success(self, text: str) -> str: + """Format text as success (green).""" + return self.colorize(text, self.SUCCESS) + + def warning(self, text: str) -> str: + """Format text as warning (yellow).""" + return self.colorize(text, self.WARNING) + + def error(self, text: str) -> str: + """Format text as error (red).""" + return self.colorize(text, self.ERROR) + + def info(self, text: str) -> str: + """Format text as info (cyan).""" + return self.colorize(text, self.INFO) + + def highlight(self, text: str) -> str: + """Format text as highlighted (magenta).""" + return self.colorize(text, self.HIGHLIGHT) + + def muted(self, text: str) -> str: + """Format text as muted (blue).""" + return self.colorize(text, self.MUTED) + + +def get_color_mode(color_arg: str | None = None) -> ColorMode: + """Determine color mode from argument. 
+ + Parameters + ---------- + color_arg : str | None + Color mode argument (auto, always, never) + + Returns + ------- + ColorMode + The determined color mode + """ + if color_arg is None: + return ColorMode.AUTO + + try: + return ColorMode(color_arg.lower()) + except ValueError: + return ColorMode.AUTO diff --git a/src/vcspull/cli/_formatter.py b/src/vcspull/cli/_formatter.py index 809f7f93..da7ffcc1 100644 --- a/src/vcspull/cli/_formatter.py +++ b/src/vcspull/cli/_formatter.py @@ -7,24 +7,31 @@ import typing as t OPTIONS_EXPECTING_VALUE = { - "-c", - "--config", + "-f", + "--file", + "-w", + "--workspace", + "--workspace-root", "--log-level", "--path", - "--workspace-root", - "--scan", + "--color", } OPTIONS_FLAG_ONLY = { "-h", "--help", - "-w", "--write", "--all", "--recursive", "-r", "--yes", "-y", + "--dry-run", + "-n", + "--json", + "--ndjson", + "--tree", + "--detailed", } diff --git a/src/vcspull/cli/_output.py b/src/vcspull/cli/_output.py new file mode 100644 index 00000000..4a822bbb --- /dev/null +++ b/src/vcspull/cli/_output.py @@ -0,0 +1,233 @@ +"""Output formatting utilities for vcspull CLI.""" + +from __future__ import annotations + +import json +import sys +import typing as t +from dataclasses import dataclass, field +from enum import Enum + + +class OutputMode(Enum): + """Output format modes.""" + + HUMAN = "human" + JSON = "json" + NDJSON = "ndjson" + + +class PlanAction(Enum): + """Supported plan actions for repository synchronization.""" + + CLONE = "clone" + UPDATE = "update" + UNCHANGED = "unchanged" + BLOCKED = "blocked" + ERROR = "error" + + +@dataclass +class PlanEntry: + """Represents a single planned action for a repository.""" + + name: str + path: str + workspace_root: str + action: PlanAction + detail: str | None = None + url: str | None = None + branch: str | None = None + remote_branch: str | None = None + current_rev: str | None = None + target_rev: str | None = None + ahead: int | None = None + behind: int | None = None + dirty: bool | None = None + error: str | None = None + diagnostics: list[str] = field(default_factory=list) + + def to_payload(self) -> dict[str, t.Any]: + """Convert the plan entry into a serialisable payload.""" + payload: dict[str, t.Any] = { + "format_version": "1", + "type": "operation", + "name": self.name, + "path": self.path, + "workspace_root": self.workspace_root, + "action": self.action.value, + } + if self.detail: + payload["detail"] = self.detail + if self.url: + payload["url"] = self.url + if self.branch: + payload["branch"] = self.branch + if self.remote_branch: + payload["remote_branch"] = self.remote_branch + if self.current_rev: + payload["current_rev"] = self.current_rev + if self.target_rev: + payload["target_rev"] = self.target_rev + if isinstance(self.ahead, int): + payload["ahead"] = self.ahead + if isinstance(self.behind, int): + payload["behind"] = self.behind + if isinstance(self.dirty, bool): + payload["dirty"] = self.dirty + if self.error: + payload["error"] = self.error + if self.diagnostics: + payload["diagnostics"] = list(self.diagnostics) + return payload + + +@dataclass +class PlanSummary: + """Aggregate summary for a synchronization plan.""" + + clone: int = 0 + update: int = 0 + unchanged: int = 0 + blocked: int = 0 + errors: int = 0 + duration_ms: int | None = None + + def total(self) -> int: + """Return the total number of repositories accounted for.""" + return self.clone + self.update + self.unchanged + self.blocked + self.errors + + def to_payload(self) -> dict[str, t.Any]: + """Convert the summary 
to a serialisable payload.""" + payload: dict[str, t.Any] = { + "format_version": "1", + "type": "summary", + "clone": self.clone, + "update": self.update, + "unchanged": self.unchanged, + "blocked": self.blocked, + "errors": self.errors, + "total": self.total(), + } + if isinstance(self.duration_ms, int): + payload["duration_ms"] = self.duration_ms + return payload + + +@dataclass +class PlanRenderOptions: + """Rendering options for human plan output.""" + + show_unchanged: bool = False + summary_only: bool = False + long: bool = False + verbosity: int = 0 + relative_paths: bool = False + + +@dataclass +class PlanResult: + """Container for plan entries and their summary.""" + + entries: list[PlanEntry] + summary: PlanSummary + + def to_workspace_mapping(self) -> dict[str, list[PlanEntry]]: + """Group plan entries by workspace root.""" + grouped: dict[str, list[PlanEntry]] = {} + for entry in self.entries: + grouped.setdefault(entry.workspace_root, []).append(entry) + return grouped + + def to_json_object(self) -> dict[str, t.Any]: + """Return the JSON structure for ``--json`` output.""" + workspaces: list[dict[str, t.Any]] = [] + for workspace_root, entries in self.to_workspace_mapping().items(): + workspaces.append( + { + "path": workspace_root, + "operations": [entry.to_payload() for entry in entries], + } + ) + return { + "format_version": "1", + "workspaces": workspaces, + "summary": self.summary.to_payload(), + } + + +class OutputFormatter: + """Manages output formatting for different modes (human, JSON, NDJSON).""" + + def __init__(self, mode: OutputMode = OutputMode.HUMAN) -> None: + """Initialize the output formatter. + + Parameters + ---------- + mode : OutputMode + The output mode to use (human, json, ndjson) + """ + self.mode = mode + self._json_buffer: list[dict[str, t.Any]] = [] + + def emit(self, data: dict[str, t.Any] | PlanEntry | PlanSummary) -> None: + """Emit a data event. + + Parameters + ---------- + data : dict | PlanEntry | PlanSummary + Event data to emit. PlanEntry and PlanSummary instances are serialised + automatically. + """ + if isinstance(data, (PlanEntry, PlanSummary)): + payload = data.to_payload() + else: + payload = data + + if self.mode == OutputMode.NDJSON: + # Stream one JSON object per line immediately + print(json.dumps(payload), file=sys.stdout) + sys.stdout.flush() + elif self.mode == OutputMode.JSON: + # Buffer for later output as single array + self._json_buffer.append(payload) + # Human mode: handled by specific command implementations + + def emit_text(self, text: str) -> None: + """Emit human-readable text (only in HUMAN mode). + + Parameters + ---------- + text : str + Text to output + """ + if self.mode == OutputMode.HUMAN: + print(text) + + def finalize(self) -> None: + """Finalize output (flush JSON buffer if needed).""" + if self.mode == OutputMode.JSON and self._json_buffer: + print(json.dumps(self._json_buffer, indent=2), file=sys.stdout) + self._json_buffer.clear() + + +def get_output_mode(json_flag: bool, ndjson_flag: bool) -> OutputMode: + """Determine output mode from command flags. 
+ + Parameters + ---------- + json_flag : bool + Whether --json was specified + ndjson_flag : bool + Whether --ndjson was specified + + Returns + ------- + OutputMode + The determined output mode (NDJSON takes precedence over JSON) + """ + if ndjson_flag: + return OutputMode.NDJSON + if json_flag: + return OutputMode.JSON + return OutputMode.HUMAN diff --git a/src/vcspull/cli/_workspaces.py b/src/vcspull/cli/_workspaces.py new file mode 100644 index 00000000..e1eb9744 --- /dev/null +++ b/src/vcspull/cli/_workspaces.py @@ -0,0 +1,68 @@ +"""Workspace filtering helpers for vcspull CLI.""" + +from __future__ import annotations + +import fnmatch +import pathlib + +from vcspull.config import canonicalize_workspace_path, workspace_root_label +from vcspull.types import ConfigDict + + +def _normalize_workspace_label( + workspace_root: str, + *, + cwd: pathlib.Path, + home: pathlib.Path, +) -> str: + canonical_path = canonicalize_workspace_path(workspace_root, cwd=cwd) + return workspace_root_label(canonical_path, cwd=cwd, home=home) + + +def _repo_workspace_label( + repo: ConfigDict, + *, + cwd: pathlib.Path, + home: pathlib.Path, +) -> str: + raw_label = repo.get("workspace_root") + if raw_label: + return _normalize_workspace_label(str(raw_label), cwd=cwd, home=home) + + repo_path = pathlib.Path(repo["path"]).expanduser() + return workspace_root_label(repo_path.parent, cwd=cwd, home=home) + + +def filter_by_workspace( + repos: list[ConfigDict], + workspace_root: str | None, + *, + cwd: pathlib.Path | None = None, + home: pathlib.Path | None = None, +) -> list[ConfigDict]: + """Filter repositories by workspace root pattern.""" + if not workspace_root: + return repos + + cwd = cwd or pathlib.Path.cwd() + home = home or pathlib.Path.home() + + normalized_filter = _normalize_workspace_label( + workspace_root, + cwd=cwd, + home=home, + ) + has_glob = any(char in workspace_root for char in "*?[]") + + filtered: list[ConfigDict] = [] + for repo in repos: + repo_label = _repo_workspace_label(repo, cwd=cwd, home=home) + if has_glob: + if fnmatch.fnmatch(repo_label, workspace_root) or fnmatch.fnmatch( + repo_label, + normalized_filter, + ): + filtered.append(repo) + elif repo_label == normalized_filter: + filtered.append(repo) + return filtered diff --git a/src/vcspull/cli/add.py b/src/vcspull/cli/add.py new file mode 100644 index 00000000..b5c7644c --- /dev/null +++ b/src/vcspull/cli/add.py @@ -0,0 +1,307 @@ +"""Add single repository functionality for vcspull.""" + +from __future__ import annotations + +import argparse +import logging +import pathlib +import traceback +import typing as t + +from colorama import Fore, Style + +from vcspull._internal.config_reader import ConfigReader +from vcspull.config import ( + canonicalize_workspace_path, + expand_dir, + find_home_config_files, + normalize_workspace_roots, + save_config_yaml, + workspace_root_label, +) + +log = logging.getLogger(__name__) + + +def create_add_subparser(parser: argparse.ArgumentParser) -> None: + """Create ``vcspull add`` argument subparser. 
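+
+    Wires up the ``name``/``url`` positionals and the ``-f/--file``,
+    ``--path``, ``-w/--workspace/--workspace-root``, and ``-n/--dry-run``
+    options consumed by ``add_repo``.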
+ + Parameters + ---------- + parser : argparse.ArgumentParser + The parser to configure + """ + parser.add_argument( + "name", + help="Name for the repository in the config", + ) + parser.add_argument( + "url", + help="Repository URL (e.g., https://github.com/user/repo.git)", + ) + parser.add_argument( + "-f", + "--file", + dest="config", + metavar="FILE", + help="path to config file (default: ~/.vcspull.yaml or ./.vcspull.yaml)", + ) + parser.add_argument( + "--path", + dest="path", + help="Local directory path where repo will be cloned " + "(determines workspace root if not specified with --workspace)", + ) + parser.add_argument( + "-w", + "--workspace", + "--workspace-root", + dest="workspace_root_path", + metavar="DIR", + help=( + "Workspace root directory in config (e.g., '~/projects/'). " + "If not specified, will be inferred from --path or use current directory." + ), + ) + parser.add_argument( + "--dry-run", + "-n", + action="store_true", + help="Preview changes without writing to config file", + ) + + +def _resolve_workspace_path( + workspace_root: str | None, + repo_path_str: str | None, + *, + cwd: pathlib.Path, +) -> pathlib.Path: + """Resolve workspace path from arguments. + + Parameters + ---------- + workspace_root : str | None + Workspace root path from user + repo_path_str : str | None + Repo path from user + cwd : pathlib.Path + Current working directory + + Returns + ------- + pathlib.Path + Resolved workspace path + """ + if workspace_root: + return canonicalize_workspace_path(workspace_root, cwd=cwd) + if repo_path_str: + return expand_dir(pathlib.Path(repo_path_str), cwd) + return cwd + + +def add_repo( + name: str, + url: str, + config_file_path_str: str | None, + path: str | None, + workspace_root_path: str | None, + dry_run: bool, +) -> None: + """Add a repository to the vcspull configuration. + + Parameters + ---------- + name : str + Repository name for the config + url : str + Repository URL + config_file_path_str : str | None + Path to config file, or None to use default + path : str | None + Local path where repo will be cloned + workspace_root_path : str | None + Workspace root to use in config + dry_run : bool + If True, preview changes without writing + """ + # Determine config file + config_file_path: pathlib.Path + if config_file_path_str: + config_file_path = pathlib.Path(config_file_path_str).expanduser().resolve() + else: + home_configs = find_home_config_files(filetype=["yaml"]) + if not home_configs: + config_file_path = pathlib.Path.cwd() / ".vcspull.yaml" + log.info( + "No config specified and no default found, will create at %s", + config_file_path, + ) + elif len(home_configs) > 1: + log.error( + "Multiple home config files found, please specify one with -f/--file", + ) + return + else: + config_file_path = home_configs[0] + + # Load existing config + raw_config: dict[str, t.Any] = {} + if config_file_path.exists() and config_file_path.is_file(): + try: + loaded_config = ConfigReader._from_file(config_file_path) + except Exception: + log.exception("Error loading YAML from %s. Aborting.", config_file_path) + if log.isEnabledFor(logging.DEBUG): + traceback.print_exc() + return + + if loaded_config is None: + raw_config = {} + elif isinstance(loaded_config, dict): + raw_config = loaded_config + else: + log.error( + "Config file %s is not a valid YAML dictionary.", + config_file_path, + ) + return + else: + log.info( + "Config file %s not found. 
A new one will be created.", + config_file_path, + ) + + cwd = pathlib.Path.cwd() + home = pathlib.Path.home() + + normalization_result = normalize_workspace_roots( + raw_config, + cwd=cwd, + home=home, + ) + raw_config, workspace_map, merge_conflicts, _merge_changes = normalization_result + config_was_normalized = _merge_changes > 0 + + for message in merge_conflicts: + log.warning(message) + + workspace_path = _resolve_workspace_path( + workspace_root_path, + path, + cwd=cwd, + ) + workspace_label = workspace_map.get(workspace_path) + if workspace_label is None: + workspace_label = workspace_root_label( + workspace_path, + cwd=cwd, + home=home, + ) + workspace_map[workspace_path] = workspace_label + raw_config.setdefault(workspace_label, {}) + + if workspace_label not in raw_config: + raw_config[workspace_label] = {} + elif not isinstance(raw_config[workspace_label], dict): + log.error( + "Workspace root '%s' in configuration is not a dictionary. Aborting.", + workspace_label, + ) + return + + # Check if repo already exists + if name in raw_config[workspace_label]: + existing_config = raw_config[workspace_label][name] + # Handle both string and dict formats + current_url: str + if isinstance(existing_config, str): + current_url = existing_config + elif isinstance(existing_config, dict): + repo_value = existing_config.get("repo") + url_value = existing_config.get("url") + current_url = repo_value or url_value or "unknown" + else: + current_url = str(existing_config) + + log.warning( + "Repository '%s' already exists under '%s'. Current URL: %s. " + "To update, remove and re-add, or edit the YAML file manually.", + name, + workspace_label, + current_url, + ) + if config_was_normalized: + if dry_run: + log.info( + "%s→%s Would save normalized workspace roots to %s%s%s.", + Fore.YELLOW, + Style.RESET_ALL, + Fore.BLUE, + config_file_path, + Style.RESET_ALL, + ) + else: + try: + save_config_yaml(config_file_path, raw_config) + log.info( + "%s✓%s Normalized workspace roots saved to %s%s%s.", + Fore.GREEN, + Style.RESET_ALL, + Fore.BLUE, + config_file_path, + Style.RESET_ALL, + ) + except Exception: + log.exception("Error saving config to %s", config_file_path) + if log.isEnabledFor(logging.DEBUG): + traceback.print_exc() + return + + # Add the repository in verbose format + raw_config[workspace_label][name] = {"repo": url} + + # Save or preview config + if dry_run: + log.info( + "%s→%s Would add %s'%s'%s (%s%s%s) to %s%s%s under '%s%s%s'.", + Fore.YELLOW, + Style.RESET_ALL, + Fore.CYAN, + name, + Style.RESET_ALL, + Fore.YELLOW, + url, + Style.RESET_ALL, + Fore.BLUE, + config_file_path, + Style.RESET_ALL, + Fore.MAGENTA, + workspace_label, + Style.RESET_ALL, + ) + else: + try: + save_config_yaml(config_file_path, raw_config) + log.info( + "%s✓%s Successfully added %s'%s'%s (%s%s%s) to %s%s%s under '%s%s%s'.", + Fore.GREEN, + Style.RESET_ALL, + Fore.CYAN, + name, + Style.RESET_ALL, + Fore.YELLOW, + url, + Style.RESET_ALL, + Fore.BLUE, + config_file_path, + Style.RESET_ALL, + Fore.MAGENTA, + workspace_label, + Style.RESET_ALL, + ) + except Exception: + log.exception("Error saving config to %s", config_file_path) + if log.isEnabledFor(logging.DEBUG): + traceback.print_exc() + return diff --git a/src/vcspull/cli/_import.py b/src/vcspull/cli/discover.py similarity index 65% rename from src/vcspull/cli/_import.py rename to src/vcspull/cli/discover.py index bdbbf164..1103e679 100644 --- a/src/vcspull/cli/_import.py +++ b/src/vcspull/cli/discover.py @@ -1,4 +1,4 @@ -"""Import repository functionality 
for vcspull.""" +"""Discover repositories from filesystem for vcspull.""" from __future__ import annotations @@ -22,9 +22,6 @@ workspace_root_label, ) -if t.TYPE_CHECKING: - import argparse - log = logging.getLogger(__name__) @@ -55,64 +52,55 @@ def get_git_origin_url(repo_path: pathlib.Path) -> str | None: return None -def create_import_subparser(parser: argparse.ArgumentParser) -> None: - """Create ``vcspull import`` argument subparser.""" - parser.add_argument( - "-c", - "--config", - dest="config", - metavar="file", - help="path to custom config file (default: .vcspull.yaml or ~/.vcspull.yaml)", - ) +def create_discover_subparser(parser: argparse.ArgumentParser) -> None: + """Create ``vcspull discover`` argument subparser. - # Positional arguments for single repo import - parser.add_argument( - "name", - nargs="?", - help="Name for the repository in the config", - ) + Parameters + ---------- + parser : argparse.ArgumentParser + The parser to configure + """ parser.add_argument( - "url", - nargs="?", - help="Repository URL (e.g., https://github.com/user/repo.git)", + "scan_dir", + metavar="PATH", + help="Directory to scan for git repositories", ) - - # Options for single repo import parser.add_argument( - "--path", - dest="path", - help="Local directory path where repo will be cloned " - "(determines workspace root if not specified with --workspace-root)", + "-f", + "--file", + dest="config", + metavar="FILE", + help="path to config file (default: ~/.vcspull.yaml or ./.vcspull.yaml)", ) parser.add_argument( + "-w", + "--workspace", "--workspace-root", dest="workspace_root_path", metavar="DIR", help=( "Workspace root directory in config (e.g., '~/projects/'). " - "If not specified, will be inferred from --path or use current directory. " - "When used with --scan, applies the workspace root to all discovered repos." + "If not specified, uses the scan directory. " + "Applies the workspace root to all discovered repos." ), ) - - # Filesystem scan mode - parser.add_argument( - "--scan", - dest="scan_dir", - metavar="DIR", - help="Scan directory for git repositories and import them", - ) parser.add_argument( "--recursive", "-r", action="store_true", - help="Scan directories recursively (use with --scan)", + help="Scan directories recursively", ) parser.add_argument( "--yes", "-y", action="store_true", - help="Skip confirmation prompt (use with --scan)", + help="Skip confirmation prompt", + ) + parser.add_argument( + "--dry-run", + "-n", + action="store_true", + help="Preview changes without writing to config file", ) @@ -122,6 +110,22 @@ def _resolve_workspace_path( *, cwd: pathlib.Path, ) -> pathlib.Path: + """Resolve workspace path from arguments. + + Parameters + ---------- + workspace_root : str | None + Workspace root path from user + repo_path_str : str | None + Repo path from user + cwd : pathlib.Path + Current working directory + + Returns + ------- + pathlib.Path + Resolved workspace path + """ if workspace_root: return canonicalize_workspace_path(workspace_root, cwd=cwd) if repo_path_str: @@ -129,189 +133,15 @@ def _resolve_workspace_path( return cwd -def import_repo( - name: str, - url: str, - config_file_path_str: str | None, - path: str | None, - workspace_root_path: str | None, -) -> None: - """Import a repository to the vcspull configuration. 
- - Parameters - ---------- - name : str - Repository name for the config - url : str - Repository URL - config_file_path_str : str | None - Path to config file, or None to use default - path : str | None - Local path where repo will be cloned - workspace_root_path : str | None - Workspace root to use in config - """ - # Determine config file - config_file_path: pathlib.Path - if config_file_path_str: - config_file_path = pathlib.Path(config_file_path_str).expanduser().resolve() - else: - home_configs = find_home_config_files(filetype=["yaml"]) - if not home_configs: - config_file_path = pathlib.Path.cwd() / ".vcspull.yaml" - log.info( - "No config specified and no default found, will create at %s", - config_file_path, - ) - elif len(home_configs) > 1: - log.error( - "Multiple home config files found, please specify one with -c/--config", - ) - return - else: - config_file_path = home_configs[0] - - # Load existing config - raw_config: dict[str, t.Any] = {} - if config_file_path.exists() and config_file_path.is_file(): - try: - loaded_config = ConfigReader._from_file(config_file_path) - except Exception: - log.exception("Error loading YAML from %s. Aborting.", config_file_path) - if log.isEnabledFor(logging.DEBUG): - traceback.print_exc() - return - - if loaded_config is None: - raw_config = {} - elif isinstance(loaded_config, dict): - raw_config = loaded_config - else: - log.error( - "Config file %s is not a valid YAML dictionary.", - config_file_path, - ) - return - else: - log.info( - "Config file %s not found. A new one will be created.", - config_file_path, - ) - - cwd = pathlib.Path.cwd() - home = pathlib.Path.home() - - normalization_result = normalize_workspace_roots( - raw_config, - cwd=cwd, - home=home, - ) - raw_config, workspace_map, merge_conflicts, _merge_changes = normalization_result - config_was_normalized = _merge_changes > 0 - - for message in merge_conflicts: - log.warning(message) - - workspace_path = _resolve_workspace_path( - workspace_root_path, - path, - cwd=cwd, - ) - workspace_label = workspace_map.get(workspace_path) - if workspace_label is None: - workspace_label = workspace_root_label( - workspace_path, - cwd=cwd, - home=home, - ) - workspace_map[workspace_path] = workspace_label - raw_config.setdefault(workspace_label, {}) - - if workspace_label not in raw_config: - raw_config[workspace_label] = {} - elif not isinstance(raw_config[workspace_label], dict): - log.error( - "Workspace root '%s' in configuration is not a dictionary. Aborting.", - workspace_label, - ) - return - - # Check if repo already exists - if name in raw_config[workspace_label]: - existing_config = raw_config[workspace_label][name] - # Handle both string and dict formats - current_url: str - if isinstance(existing_config, str): - current_url = existing_config - elif isinstance(existing_config, dict): - repo_value = existing_config.get("repo") - url_value = existing_config.get("url") - current_url = repo_value or url_value or "unknown" - else: - current_url = str(existing_config) - - log.warning( - "Repository '%s' already exists under '%s'. Current URL: %s. 
" - "To update, remove and re-add, or edit the YAML file manually.", - name, - workspace_label, - current_url, - ) - if config_was_normalized: - try: - save_config_yaml(config_file_path, raw_config) - log.info( - "%s✓%s Normalized workspace roots saved to %s%s%s.", - Fore.GREEN, - Style.RESET_ALL, - Fore.BLUE, - config_file_path, - Style.RESET_ALL, - ) - except Exception: - log.exception("Error saving config to %s", config_file_path) - if log.isEnabledFor(logging.DEBUG): - traceback.print_exc() - return - - # Add the repository in verbose format - raw_config[workspace_label][name] = {"repo": url} - - # Save config - try: - save_config_yaml(config_file_path, raw_config) - log.info( - "%s✓%s Successfully imported %s'%s'%s (%s%s%s) to %s%s%s under '%s%s%s'.", - Fore.GREEN, - Style.RESET_ALL, - Fore.CYAN, - name, - Style.RESET_ALL, - Fore.YELLOW, - url, - Style.RESET_ALL, - Fore.BLUE, - config_file_path, - Style.RESET_ALL, - Fore.MAGENTA, - workspace_label, - Style.RESET_ALL, - ) - except Exception: - log.exception("Error saving config to %s", config_file_path) - if log.isEnabledFor(logging.DEBUG): - traceback.print_exc() - return - - -def import_from_filesystem( +def discover_repos( scan_dir_str: str, config_file_path_str: str | None, recursive: bool, workspace_root_override: str | None, yes: bool, + dry_run: bool, ) -> None: - """Scan filesystem for git repositories and import to vcspull config. + """Scan filesystem for git repositories and add to vcspull config. Parameters ---------- @@ -325,6 +155,8 @@ def import_from_filesystem( Workspace root to use in config (overrides automatic detection) yes : bool Whether to skip confirmation prompt + dry_run : bool + If True, preview changes without writing """ scan_dir = expand_dir(pathlib.Path(scan_dir_str)) @@ -346,7 +178,7 @@ def import_from_filesystem( ) elif len(home_configs) > 1: log.error( - "Multiple home_config files found, please specify one with -c/--config", + "Multiple home_config files found, please specify one with -f/--file", ) return else: @@ -528,7 +360,7 @@ def import_from_filesystem( Fore.GREEN, Style.RESET_ALL, ) - if changes_made: + if changes_made and not dry_run: try: save_config_yaml(config_file_path, raw_config) log.info( @@ -548,10 +380,11 @@ def import_from_filesystem( # Show what will be added log.info( - "\n%sFound %d new %s to import:%s", + "\n%sFound %d new %s to %s:%s", Fore.GREEN, len(repos_to_add), "repository" if len(repos_to_add) == 1 else "repositories", + "preview" if dry_run else "import", Style.RESET_ALL, ) for repo_name, repo_url, _determined_base_key in repos_to_add: @@ -567,6 +400,17 @@ def import_from_filesystem( Style.RESET_ALL, ) + if dry_run: + log.info( + "\n%s→%s Dry run complete. No changes made to %s%s%s.", + Fore.YELLOW, + Style.RESET_ALL, + Fore.BLUE, + config_file_path, + Style.RESET_ALL, + ) + return + if not yes: confirm = input( f"\n{Fore.CYAN}Import these repositories? 
[y/N]: {Style.RESET_ALL}", diff --git a/src/vcspull/cli/fmt.py b/src/vcspull/cli/fmt.py index 0be52b5c..41552e48 100644 --- a/src/vcspull/cli/fmt.py +++ b/src/vcspull/cli/fmt.py @@ -26,11 +26,11 @@ def create_fmt_subparser(parser: argparse.ArgumentParser) -> None: """Create ``vcspull fmt`` argument subparser.""" parser.add_argument( - "-c", - "--config", + "-f", + "--file", dest="config", - metavar="file", - help="path to custom config file (default: .vcspull.yaml or ~/.vcspull.yaml)", + metavar="FILE", + help="path to config file (default: .vcspull.yaml or ~/.vcspull.yaml)", ) parser.add_argument( "--write", @@ -407,7 +407,7 @@ def format_config_file( elif len(home_configs) > 1: log.error( "Multiple home config files found, " - "please specify one with -c/--config", + "please specify one with -f/--file", ) return else: diff --git a/src/vcspull/cli/list.py b/src/vcspull/cli/list.py new file mode 100644 index 00000000..60b326a3 --- /dev/null +++ b/src/vcspull/cli/list.py @@ -0,0 +1,228 @@ +"""List repositories functionality for vcspull.""" + +from __future__ import annotations + +import argparse +import logging +import pathlib +import typing as t + +from vcspull.config import filter_repos, find_config_files, load_configs + +from ._colors import Colors, get_color_mode +from ._output import OutputFormatter, get_output_mode +from ._workspaces import filter_by_workspace + +if t.TYPE_CHECKING: + from vcspull.types import ConfigDict + +log = logging.getLogger(__name__) + + +def create_list_subparser(parser: argparse.ArgumentParser) -> None: + """Create ``vcspull list`` argument subparser. + + Parameters + ---------- + parser : argparse.ArgumentParser + The parser to configure + """ + parser.add_argument( + "-f", + "--file", + dest="config", + metavar="FILE", + help="path to config file (default: ~/.vcspull.yaml or ./.vcspull.yaml)", + ) + parser.add_argument( + "-w", + "--workspace", + "--workspace-root", + dest="workspace_root", + metavar="DIR", + help="filter by workspace root directory", + ) + parser.add_argument( + "repo_patterns", + metavar="pattern", + nargs="*", + help="filter repositories by name pattern (supports fnmatch)", + ) + parser.add_argument( + "--tree", + action="store_true", + help="display repositories grouped by workspace root", + ) + parser.add_argument( + "--json", + action="store_true", + dest="output_json", + help="output as JSON", + ) + parser.add_argument( + "--ndjson", + action="store_true", + dest="output_ndjson", + help="output as NDJSON (one JSON per line)", + ) + parser.add_argument( + "--color", + choices=["auto", "always", "never"], + default="auto", + help="when to use colors (default: auto)", + ) + + +def list_repos( + repo_patterns: list[str], + config_path: pathlib.Path | None, + workspace_root: str | None, + tree: bool, + output_json: bool, + output_ndjson: bool, + color: str, +) -> None: + """List configured repositories. 
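+
+    Patterns are matched with ``fnmatch`` semantics, the optional
+    ``workspace_root`` filter narrows the result, and output is rendered
+    flat or as a tree in human, JSON, or NDJSON form.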
+ + Parameters + ---------- + repo_patterns : list[str] + Patterns to filter repositories (fnmatch) + config_path : pathlib.Path | None + Path to config file, or None to auto-discover + workspace_root : str | None + Filter by workspace root + tree : bool + Group by workspace root in tree view + output_json : bool + Output as JSON + output_ndjson : bool + Output as NDJSON + color : str + Color mode (auto, always, never) + """ + # Load configs + if config_path: + configs = load_configs([config_path]) + else: + configs = load_configs(find_config_files(include_home=True)) + + # Filter by patterns if provided + if repo_patterns: + found_repos: list[ConfigDict] = [] + for pattern in repo_patterns: + found_repos.extend(filter_repos(configs, name=pattern)) + else: + # No patterns = all repos + found_repos = configs + + # Further filter by workspace root if specified + if workspace_root: + found_repos = filter_by_workspace(found_repos, workspace_root) + + # Initialize output formatter and colors + output_mode = get_output_mode(output_json, output_ndjson) + formatter = OutputFormatter(output_mode) + colors = Colors(get_color_mode(color)) + + if not found_repos: + formatter.emit_text(colors.warning("No repositories found.")) + formatter.finalize() + return + + # Output based on mode + if tree: + _output_tree(found_repos, formatter, colors) + else: + _output_flat(found_repos, formatter, colors) + + formatter.finalize() + + +def _output_flat( + repos: list[ConfigDict], + formatter: OutputFormatter, + colors: Colors, +) -> None: + """Output repositories in flat list format. + + Parameters + ---------- + repos : list[ConfigDict] + Repositories to display + formatter : OutputFormatter + Output formatter + colors : Colors + Color manager + """ + for repo in repos: + repo_name = repo.get("name", "unknown") + repo_url = repo.get("url", repo.get("pip_url", "unknown")) + repo_path = repo.get("path", "unknown") + + # JSON/NDJSON output + formatter.emit( + { + "name": repo_name, + "url": str(repo_url), + "path": str(repo_path), + "workspace_root": str(repo.get("workspace_root", "")), + } + ) + + # Human output + formatter.emit_text( + f"{colors.muted('•')} {colors.info(repo_name)} " + f"{colors.muted('→')} {repo_path}", + ) + + +def _output_tree( + repos: list[ConfigDict], + formatter: OutputFormatter, + colors: Colors, +) -> None: + """Output repositories grouped by workspace root (tree view). 
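+
+    Workspace roots are printed as sorted headers with repositories
+    indented beneath them; JSON/NDJSON modes still receive one flat object
+    per repository, carrying the grouping key in ``workspace_root``.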
+ + Parameters + ---------- + repos : list[ConfigDict] + Repositories to display + formatter : OutputFormatter + Output formatter + colors : Colors + Color manager + """ + # Group by workspace root + by_workspace: dict[str, list[ConfigDict]] = {} + for repo in repos: + workspace = str(repo.get("workspace_root", "unknown")) + by_workspace.setdefault(workspace, []).append(repo) + + # Output grouped + for workspace in sorted(by_workspace.keys()): + workspace_repos = by_workspace[workspace] + + # Human output: workspace header + formatter.emit_text(f"\n{colors.highlight(workspace)}") + + for repo in workspace_repos: + repo_name = repo.get("name", "unknown") + repo_url = repo.get("url", repo.get("pip_url", "unknown")) + repo_path = repo.get("path", "unknown") + + # JSON/NDJSON output + formatter.emit( + { + "name": repo_name, + "url": str(repo_url), + "path": str(repo_path), + "workspace_root": workspace, + } + ) + + # Human output: indented repo + formatter.emit_text( + f" {colors.muted('•')} {colors.info(repo_name)} " + f"{colors.muted('→')} {repo_path}", + ) diff --git a/src/vcspull/cli/status.py b/src/vcspull/cli/status.py new file mode 100644 index 00000000..692eda15 --- /dev/null +++ b/src/vcspull/cli/status.py @@ -0,0 +1,354 @@ +"""Repository status checking functionality for vcspull.""" + +from __future__ import annotations + +import argparse +import logging +import pathlib +import subprocess +import typing as t + +from vcspull.config import filter_repos, find_config_files, load_configs + +from ._colors import Colors, get_color_mode +from ._output import OutputFormatter, get_output_mode +from ._workspaces import filter_by_workspace + +if t.TYPE_CHECKING: + from vcspull.types import ConfigDict + +log = logging.getLogger(__name__) + + +def create_status_subparser(parser: argparse.ArgumentParser) -> None: + """Create ``vcspull status`` argument subparser. + + Parameters + ---------- + parser : argparse.ArgumentParser + The parser to configure + """ + parser.add_argument( + "-f", + "--file", + dest="config", + metavar="FILE", + help="path to config file (default: ~/.vcspull.yaml or ./.vcspull.yaml)", + ) + parser.add_argument( + "-w", + "--workspace", + "--workspace-root", + dest="workspace_root", + metavar="DIR", + help="filter by workspace root directory", + ) + parser.add_argument( + "repo_patterns", + metavar="pattern", + nargs="*", + help="filter repositories by name pattern (supports fnmatch)", + ) + parser.add_argument( + "--detailed", + "-d", + action="store_true", + help="show detailed status information", + ) + parser.add_argument( + "--json", + action="store_true", + dest="output_json", + help="output as JSON", + ) + parser.add_argument( + "--ndjson", + action="store_true", + dest="output_ndjson", + help="output as NDJSON (one JSON per line)", + ) + parser.add_argument( + "--color", + choices=["auto", "always", "never"], + default="auto", + help="when to use colors (default: auto)", + ) + + +def _run_git_command( + repo_path: pathlib.Path, + *args: str, +) -> subprocess.CompletedProcess[str] | None: + """Execute a git command and return the completed process.""" + try: + return subprocess.run( + ["git", *args], + cwd=repo_path, + capture_output=True, + text=True, + check=True, + ) + except (subprocess.CalledProcessError, FileNotFoundError): + return None + + +def check_repo_status(repo: ConfigDict, detailed: bool = False) -> dict[str, t.Any]: + """Check the status of a single repository. 
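+
+    Cleanliness comes from ``git status --porcelain``; with ``detailed``
+    the branch is read via ``git rev-parse --abbrev-ref HEAD`` and
+    ahead/behind counts via ``git rev-list --left-right --count`` against
+    ``@{upstream}``. Failed git invocations degrade to best-effort
+    defaults rather than raising.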
+ + Parameters + ---------- + repo : ConfigDict + Repository configuration + detailed : bool + Whether to include detailed status information + + Returns + ------- + dict + Repository status information + """ + repo_path = pathlib.Path(str(repo.get("path", ""))) + repo_name = repo.get("name", "unknown") + workspace_root = repo.get("workspace_root", "") + + status: dict[str, t.Any] = { + "name": repo_name, + "path": str(repo_path), + "workspace_root": workspace_root, + "exists": False, + "is_git": False, + "clean": None, + "branch": None, + "ahead": None, + "behind": None, + } + + # Check if repository exists + if repo_path.exists(): + status["exists"] = True + + # Check if it's a git repository + if (repo_path / ".git").exists(): + status["is_git"] = True + + porcelain_result = _run_git_command(repo_path, "status", "--porcelain") + if porcelain_result is not None: + status["clean"] = porcelain_result.stdout.strip() == "" + else: + status["clean"] = True + + if detailed: + branch_result = _run_git_command( + repo_path, + "rev-parse", + "--abbrev-ref", + "HEAD", + ) + if branch_result is not None: + status["branch"] = branch_result.stdout.strip() + + ahead = 0 + behind = 0 + upstream_available = _run_git_command( + repo_path, + "rev-parse", + "--abbrev-ref", + "@{upstream}", + ) + if upstream_available is not None: + counts = _run_git_command( + repo_path, + "rev-list", + "--left-right", + "--count", + "@{upstream}...HEAD", + ) + if counts is not None: + parts = counts.stdout.strip().split() + if len(parts) == 2: + behind, ahead = (int(parts[0]), int(parts[1])) + status["ahead"] = ahead + status["behind"] = behind + + # Maintain clean flag if porcelain failed + if status["clean"] is None: + status["clean"] = True + else: + status["branch"] = None + status["ahead"] = None + status["behind"] = None + + return status + + +def status_repos( + repo_patterns: list[str], + config_path: pathlib.Path | None, + workspace_root: str | None, + detailed: bool, + output_json: bool, + output_ndjson: bool, + color: str, +) -> None: + """Check status of configured repositories. 
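+
+    One ``reason: "status"`` event is emitted per repository, followed by
+    a single ``reason: "summary"`` event with aggregate counts; automation
+    consuming ``--ndjson`` output keys off that ``reason`` field.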
+ + Parameters + ---------- + repo_patterns : list[str] + Patterns to filter repositories (fnmatch) + config_path : pathlib.Path | None + Path to config file, or None to auto-discover + workspace_root : str | None + Filter by workspace root + detailed : bool + Show detailed status information + output_json : bool + Output as JSON + output_ndjson : bool + Output as NDJSON + color : str + Color mode (auto, always, never) + """ + # Load configs + if config_path: + configs = load_configs([config_path]) + else: + configs = load_configs(find_config_files(include_home=True)) + + # Filter by patterns if provided + if repo_patterns: + found_repos: list[ConfigDict] = [] + for pattern in repo_patterns: + found_repos.extend(filter_repos(configs, name=pattern)) + else: + # No patterns = all repos + found_repos = configs + + # Further filter by workspace root if specified + if workspace_root: + found_repos = filter_by_workspace(found_repos, workspace_root) + + # Initialize output formatter and colors + output_mode = get_output_mode(output_json, output_ndjson) + formatter = OutputFormatter(output_mode) + colors = Colors(get_color_mode(color)) + + if not found_repos: + formatter.emit_text(colors.warning("No repositories found.")) + formatter.finalize() + return + + # Check status of each repository + summary = {"total": 0, "exists": 0, "missing": 0, "clean": 0, "dirty": 0} + + for repo in found_repos: + status = check_repo_status(repo, detailed=detailed) + summary["total"] += 1 + + if status["exists"]: + summary["exists"] += 1 + if status["clean"]: + summary["clean"] += 1 + else: + summary["dirty"] += 1 + else: + summary["missing"] += 1 + + # Emit status + formatter.emit( + { + "reason": "status", + **status, + } + ) + + # Human output + _format_status_line(status, formatter, colors, detailed) + + # Emit summary + formatter.emit( + { + "reason": "summary", + **summary, + } + ) + + # Human summary + formatter.emit_text( + f"\n{colors.info('Summary:')} {summary['total']} repositories, " + f"{colors.success(str(summary['exists']))} exist, " + f"{colors.error(str(summary['missing']))} missing", + ) + + formatter.finalize() + + +def _format_status_line( + status: dict[str, t.Any], + formatter: OutputFormatter, + colors: Colors, + detailed: bool, +) -> None: + """Format a single repository status line for human output. 
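+
+    Chooses the leading symbol (``✓`` healthy git checkout, ``⚠`` not a
+    git repository, ``✗`` missing) and a colorized state message such as
+    ``dirty``, ``diverged``, ``ahead by N``, or ``behind by N``.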
+ + Parameters + ---------- + status : dict + Repository status information + formatter : OutputFormatter + Output formatter + colors : Colors + Color manager + detailed : bool + Whether to show detailed information + """ + name = status["name"] + + if not status["exists"]: + symbol = colors.error("✗") + message = "missing" + status_color = colors.error(message) + elif status["is_git"]: + symbol = colors.success("✓") + clean_state = status["clean"] + ahead = status.get("ahead") + behind = status.get("behind") + if clean_state is False: + message = "dirty" + status_color = colors.warning(message) + elif isinstance(ahead, int) and isinstance(behind, int): + if ahead > 0 and behind > 0: + message = f"diverged (ahead {ahead}, behind {behind})" + status_color = colors.warning(message) + elif ahead > 0: + message = f"ahead by {ahead}" + status_color = colors.info(message) + elif behind > 0: + message = f"behind by {behind}" + status_color = colors.warning(message) + else: + message = "up to date" + status_color = colors.success(message) + else: + message = "up to date" if clean_state else "dirty" + status_color = ( + colors.success(message) + if clean_state in {True, None} + else colors.warning(message) + ) + else: + symbol = colors.warning("⚠") + message = "not a git repo" + status_color = colors.warning(message) + + formatter.emit_text(f"{symbol} {colors.info(name)}: {status_color}") + + if detailed: + formatter.emit_text(f" {colors.muted('Path:')} {status['path']}") + branch = status.get("branch") + if branch: + formatter.emit_text(f" {colors.muted('Branch:')} {branch}") + ahead = status.get("ahead") + behind = status.get("behind") + if isinstance(ahead, int) and isinstance(behind, int): + formatter.emit_text(f" {colors.muted('Ahead/Behind:')} {ahead}/{behind}") diff --git a/src/vcspull/cli/sync.py b/src/vcspull/cli/sync.py index a332f4e2..640777d1 100644 --- a/src/vcspull/cli/sync.py +++ b/src/vcspull/cli/sync.py @@ -2,27 +2,150 @@ from __future__ import annotations +import asyncio +import contextlib +import json import logging +import os +import pathlib +import re +import subprocess import sys import typing as t +from collections.abc import Callable from copy import deepcopy +from dataclasses import dataclass +from datetime import datetime +from io import StringIO +from time import perf_counter from libvcs._internal.shortcuts import create_project from libvcs.url import registry as url_tools from vcspull import exc from vcspull.config import filter_repos, find_config_files, load_configs +from vcspull.types import ConfigDict + +from ._colors import Colors, get_color_mode +from ._output import ( + OutputFormatter, + OutputMode, + PlanAction, + PlanEntry, + PlanRenderOptions, + PlanResult, + PlanSummary, + get_output_mode, +) +from ._workspaces import filter_by_workspace +from .status import check_repo_status if t.TYPE_CHECKING: import argparse import pathlib - from datetime import datetime from libvcs._internal.types import VCSLiteral from libvcs.sync.git import GitSync log = logging.getLogger(__name__) +ProgressCallback = Callable[[str, datetime], None] + + +PLAN_SYMBOLS: dict[PlanAction, str] = { + PlanAction.CLONE: "+", + PlanAction.UPDATE: "~", + PlanAction.UNCHANGED: "✓", + PlanAction.BLOCKED: "⚠", + PlanAction.ERROR: "✗", +} + +PLAN_ORDER: dict[PlanAction, int] = { + PlanAction.ERROR: 0, + PlanAction.BLOCKED: 1, + PlanAction.CLONE: 2, + PlanAction.UPDATE: 3, + PlanAction.UNCHANGED: 4, +} + +PLAN_TIP_MESSAGE = ( + "Tip: run without --dry-run to apply. 
Use --show-unchanged to include ✓ rows." +) + +DEFAULT_PLAN_CONCURRENCY = max(1, min(32, (os.cpu_count() or 4) * 2)) +ANSI_ESCAPE_RE = re.compile(r"\x1b\[[0-9;]*m") + + +@dataclass +class SyncPlanConfig: + """Configuration options for building sync plans.""" + + fetch: bool + offline: bool + + +def _visible_length(text: str) -> int: + """Return the printable length of string stripped of ANSI codes.""" + return len(ANSI_ESCAPE_RE.sub("", text)) + + +class PlanProgressPrinter: + """Render incremental plan progress for human-readable dry runs.""" + + def __init__(self, total: int, colors: Colors, enabled: bool) -> None: + self.total = total + self._colors = colors + self._enabled = enabled and total > 0 + self._stream = sys.stdout + self._last_render_len = 0 + + def update(self, summary: PlanSummary, processed: int) -> None: + """Update the progress line with the latest summary counts.""" + if not self._enabled: + return + + line = " ".join( + ( + f"Progress: {processed}/{self.total}", + self._colors.success(f"+:{summary.clone}"), + self._colors.warning(f"~:{summary.update}"), + self._colors.muted(f"✓:{summary.unchanged}"), + self._colors.warning(f"⚠:{summary.blocked}"), + self._colors.error(f"✗:{summary.errors}"), + ) + ) + clean_len = _visible_length(line) + padding = max(self._last_render_len - clean_len, 0) + self._stream.write("\r" + line + " " * padding) + self._stream.flush() + self._last_render_len = clean_len + + def finish(self) -> None: + """Ensure the progress line is terminated with a newline.""" + if not self._enabled: + return + self._stream.write("\n") + self._stream.flush() + + +def _extract_repo_url(repo: ConfigDict) -> str | None: + """Extract the primary repository URL from a config dictionary.""" + url = repo.get("url") + if isinstance(url, str): + return url + pip_url = repo.get("pip_url") + if isinstance(pip_url, str): + return pip_url + return None + + +def _get_repo_path(repo: ConfigDict) -> pathlib.Path: + """Return the resolved filesystem path for a repository entry.""" + raw_path = repo.get("path") + if raw_path is None: + return pathlib.Path().resolve() + return pathlib.Path(str(raw_path)).expanduser() + def clamp(n: int, _min: int, _max: int) -> int: """Clamp a number between a min and max value.""" @@ -33,20 +156,398 @@ def clamp(n: int, _min: int, _max: int) -> int: NO_REPOS_FOR_TERM_MSG = 'No repo found in config(s) for "{name}"' +def _maybe_fetch( + repo_path: pathlib.Path, + *, + config: SyncPlanConfig, +) -> tuple[bool, str | None]: + """Optionally fetch remote refs to provide accurate status.""" + if config.offline or not config.fetch: + return True, None + if not (repo_path / ".git").exists(): + return True, None + + try: + result = subprocess.run( + ["git", "fetch", "--prune"], + cwd=repo_path, + capture_output=True, + text=True, + check=False, + ) + except FileNotFoundError: + return False, "git executable not found" + except OSError as exc: + return False, str(exc) + + if result.returncode != 0: + message = result.stderr.strip() or result.stdout.strip() + if not message: + message = f"git fetch failed with exit code {result.returncode}" + return False, message + + return True, None + + +def _determine_plan_action( + status: dict[str, t.Any], + *, + config: SyncPlanConfig, +) -> tuple[PlanAction, str | None]: + """Decide which plan action applies to a repository.""" + if not status.get("exists"): + return PlanAction.CLONE, "missing" + + if not status.get("is_git"): + return PlanAction.BLOCKED, "not a git repository" + + clean_state = status.get("clean") 
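+    # From here the ordering is deliberate: a dirty tree blocks outright,
+    # divergence or unpushed commits block rather than guess, only a
+    # cleanly-behind branch becomes an update, and unknown remote state
+    # falls through to a cautious update (hinting at --fetch when online).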
+ if clean_state is False: + return PlanAction.BLOCKED, "working tree has local changes" + + ahead = status.get("ahead") + behind = status.get("behind") + + if isinstance(ahead, int) and isinstance(behind, int): + if ahead > 0 and behind > 0: + return PlanAction.BLOCKED, f"diverged (ahead {ahead}, behind {behind})" + if behind > 0: + return PlanAction.UPDATE, f"behind {behind}" + if ahead > 0: + return PlanAction.BLOCKED, f"ahead by {ahead}" + return PlanAction.UNCHANGED, "up to date" + + if config.offline: + return PlanAction.UPDATE, "remote state unknown (offline)" + + return PlanAction.UPDATE, "remote state unknown; use --fetch" + + +def _update_summary(summary: PlanSummary, action: PlanAction) -> None: + """Update summary counters for the given plan action.""" + if action is PlanAction.CLONE: + summary.clone += 1 + elif action is PlanAction.UPDATE: + summary.update += 1 + elif action is PlanAction.UNCHANGED: + summary.unchanged += 1 + elif action is PlanAction.BLOCKED: + summary.blocked += 1 + elif action is PlanAction.ERROR: + summary.errors += 1 + + +def _build_plan_entry( + repo: ConfigDict, + *, + config: SyncPlanConfig, +) -> PlanEntry: + """Construct a plan entry for a repository configuration.""" + repo_path = _get_repo_path(repo) + workspace_root = str(repo.get("workspace_root", "")) + + fetch_ok = True + fetch_error: str | None = None + if repo_path.exists() and (repo_path / ".git").exists(): + fetch_ok, fetch_error = _maybe_fetch(repo_path, config=config) + + status = check_repo_status(repo, detailed=True) + + action: PlanAction + detail: str | None + if not fetch_ok: + action = PlanAction.ERROR + detail = fetch_error or "failed to refresh remotes" + else: + action, detail = _determine_plan_action(status, config=config) + + return PlanEntry( + name=str(repo.get("name", "unknown")), + path=str(repo_path), + workspace_root=workspace_root, + action=action, + detail=detail, + url=_extract_repo_url(repo), + branch=status.get("branch"), + remote_branch=None, + current_rev=None, + target_rev=None, + ahead=status.get("ahead"), + behind=status.get("behind"), + dirty=status.get("clean") is False if status.get("clean") is not None else None, + error=fetch_error if not fetch_ok else None, + ) + + +async def _build_plan_result_async( + repos: list[ConfigDict], + *, + config: SyncPlanConfig, + progress: PlanProgressPrinter | None, +) -> PlanResult: + """Build a plan asynchronously while updating progress output.""" + if not repos: + return PlanResult(entries=[], summary=PlanSummary()) + + semaphore = asyncio.Semaphore(min(DEFAULT_PLAN_CONCURRENCY, len(repos))) + entries: list[PlanEntry] = [] + summary = PlanSummary() + + async def evaluate(repo: ConfigDict) -> PlanEntry: + async with semaphore: + return await asyncio.to_thread(_build_plan_entry, repo=repo, config=config) + + tasks = [asyncio.create_task(evaluate(repo)) for repo in repos] + + for index, task in enumerate(asyncio.as_completed(tasks), start=1): + entry = await task + entries.append(entry) + _update_summary(summary, entry.action) + if progress is not None: + progress.update(summary, index) + + return PlanResult(entries=entries, summary=summary) + + +def _filter_entries_for_display( + entries: list[PlanEntry], + *, + show_unchanged: bool, +) -> list[PlanEntry]: + """Filter entries based on whether unchanged repos should be rendered.""" + if show_unchanged: + return list(entries) + return [entry for entry in entries if entry.action is not PlanAction.UNCHANGED] + + +def _format_detail_text( + entry: PlanEntry, + *, + colors: 
Colors, + include_extras: bool, +) -> str: + """Generate the detail text for a plan entry.""" + detail = entry.detail or "" + extra_bits: list[str] = [] + + if include_extras: + if entry.action is PlanAction.UPDATE and entry.behind: + extra_bits.append(f"behind {entry.behind}") + if entry.action is PlanAction.CLONE and entry.url: + extra_bits.append(entry.url) + if entry.action is PlanAction.BLOCKED and entry.error: + extra_bits.append(entry.error) + + if extra_bits: + detail = f"{detail} {'; '.join(extra_bits)}".strip() + + color_map: dict[PlanAction, t.Callable[[str], str]] = { + PlanAction.CLONE: colors.success, + PlanAction.UPDATE: colors.warning, + PlanAction.UNCHANGED: colors.muted, + PlanAction.BLOCKED: colors.warning, + PlanAction.ERROR: colors.error, + } + + formatter = color_map.get(entry.action, colors.info) + return formatter(detail) if detail else "" + + +def _render_plan( + formatter: OutputFormatter, + colors: Colors, + plan: PlanResult, + render_options: PlanRenderOptions, + *, + dry_run: bool, + total_repos: int, +) -> None: + """Render the plan in human-readable format.""" + summary = plan.summary + summary_line = ( + f"Plan: " + f"{colors.success(str(summary.clone))} to clone (+), " + f"{colors.warning(str(summary.update))} to update (~), " + f"{colors.muted(str(summary.unchanged))} unchanged (✓), " + f"{colors.warning(str(summary.blocked))} blocked (⚠), " + f"{colors.error(str(summary.errors))} errors (✗)" + ) + formatter.emit_text(summary_line) + + if total_repos == 0: + formatter.emit_text(colors.warning("No repositories matched the criteria.")) + return + + if render_options.summary_only: + if dry_run: + formatter.emit_text(colors.muted(PLAN_TIP_MESSAGE)) + return + + display_entries = _filter_entries_for_display( + sorted( + plan.entries, + key=lambda entry: ( + PLAN_ORDER.get(entry.action, 99), + entry.workspace_root or "", + entry.name.lower(), + ), + ), + show_unchanged=render_options.show_unchanged, + ) + + if not display_entries: + formatter.emit_text(colors.muted("All repositories are up to date.")) + if dry_run: + formatter.emit_text(colors.muted(PLAN_TIP_MESSAGE)) + return + + formatter.emit_text("") + + grouped: dict[str, list[PlanEntry]] = {} + for entry in display_entries: + key = entry.workspace_root or "(no workspace)" + grouped.setdefault(key, []).append(entry) + + for idx, (workspace, group_entries) in enumerate(grouped.items()): + if idx > 0: + formatter.emit_text("") + formatter.emit_text(colors.highlight(workspace)) + name_width = max(len(entry.name) for entry in group_entries) + + for entry in group_entries: + symbol = PLAN_SYMBOLS.get(entry.action, "?") + color_map: dict[PlanAction, t.Callable[[str], str]] = { + PlanAction.CLONE: colors.success, + PlanAction.UPDATE: colors.warning, + PlanAction.UNCHANGED: colors.muted, + PlanAction.BLOCKED: colors.warning, + PlanAction.ERROR: colors.error, + } + symbol_text = color_map.get(entry.action, colors.info)(symbol) + + display_path = entry.path + if render_options.relative_paths and entry.workspace_root: + workspace_path = pathlib.Path(entry.workspace_root).expanduser() + try: + rel_path = pathlib.Path(entry.path).relative_to(workspace_path) + display_path = str(rel_path) + except ValueError: + display_path = entry.path + + detail_text = _format_detail_text( + entry, + colors=colors, + include_extras=render_options.verbosity > 0 or render_options.long, + ) + + line = ( + f" {symbol_text} {colors.info(entry.name.ljust(name_width))} " + f"{colors.muted(display_path)}" + ) + if detail_text: + line = 
f"{line} {detail_text}" + formatter.emit_text(line.rstrip()) + + if render_options.long or render_options.verbosity > 1: + extra_lines: list[str] = [] + if entry.url: + extra_lines.append(f"url: {entry.url}") + if entry.ahead is not None or entry.behind is not None: + extra_lines.append( + f"ahead/behind: {entry.ahead or 0}/{entry.behind or 0}" + ) + if entry.error: + extra_lines.append(f"error: {entry.error}") + for msg in extra_lines: + formatter.emit_text(f" {colors.muted(msg)}") + + if dry_run: + formatter.emit_text(colors.muted(PLAN_TIP_MESSAGE)) + + +def _emit_plan_output( + formatter: OutputFormatter, + colors: Colors, + plan: PlanResult, + render_options: PlanRenderOptions, + *, + dry_run: bool, + total_repos: int, +) -> None: + """Emit plan output for the requested format.""" + if formatter.mode == OutputMode.HUMAN: + _render_plan( + formatter=formatter, + colors=colors, + plan=plan, + render_options=render_options, + dry_run=dry_run, + total_repos=total_repos, + ) + return + + display_entries = _filter_entries_for_display( + plan.entries, + show_unchanged=render_options.show_unchanged, + ) + + if formatter.mode == OutputMode.NDJSON: + for entry in display_entries: + formatter.emit(entry) + formatter.emit(plan.summary) + return + + structured = PlanResult(entries=display_entries, summary=plan.summary) + print(json.dumps(structured.to_json_object(), indent=2)) + + def create_sync_subparser(parser: argparse.ArgumentParser) -> argparse.ArgumentParser: """Create ``vcspull sync`` argument subparser.""" config_file = parser.add_argument( - "--config", - "-c", - metavar="config-file", - help="optional filepath to specify vcspull config", + "-f", + "--file", + dest="config", + metavar="FILE", + help="path to config file (default: ~/.vcspull.yaml or ./.vcspull.yaml)", + ) + parser.add_argument( + "-w", + "--workspace", + "--workspace-root", + dest="workspace_root", + metavar="DIR", + help="filter by workspace root directory", ) parser.add_argument( "repo_patterns", - metavar="filter", + metavar="pattern", nargs="*", help="patterns / terms of repos, accepts globs / fnmatch(3)", ) + parser.add_argument( + "--dry-run", + "-n", + action="store_true", + help="preview what would be synced without making changes", + ) + parser.add_argument( + "--json", + action="store_true", + dest="output_json", + help="output as JSON", + ) + parser.add_argument( + "--ndjson", + action="store_true", + dest="output_ndjson", + help="output as NDJSON (one JSON per line)", + ) + parser.add_argument( + "--color", + choices=["auto", "always", "never"], + default="auto", + help="when to use colors (default: auto)", + ) parser.add_argument( "--exit-on-error", "-x", @@ -54,6 +555,47 @@ def create_sync_subparser(parser: argparse.ArgumentParser) -> argparse.ArgumentP dest="exit_on_error", help="exit immediately encountering error (when syncing multiple repos)", ) + parser.add_argument( + "--show-unchanged", + action="store_true", + help="include repositories that are already up to date", + ) + parser.add_argument( + "--summary-only", + action="store_true", + dest="summary_only", + help="print only the plan summary line", + ) + parser.add_argument( + "--long", + action="store_true", + dest="long_view", + help="show extended details for each repository", + ) + parser.add_argument( + "--relative-paths", + action="store_true", + dest="relative_paths", + help="display repository paths relative to the workspace root", + ) + parser.add_argument( + "--fetch", + action="store_true", + help="refresh remote tracking information 
before planning", + ) + parser.add_argument( + "--offline", + action="store_true", + help="skip network access while planning (overrides --fetch)", + ) + parser.add_argument( + "-v", + "--verbose", + action="count", + dest="verbosity", + default=0, + help="increase plan verbosity (-vv for maximum detail)", + ) try: import shtab @@ -67,21 +609,42 @@ def create_sync_subparser(parser: argparse.ArgumentParser) -> argparse.ArgumentP def sync( repo_patterns: list[str], config: pathlib.Path | None, + workspace_root: str | None, + dry_run: bool, + output_json: bool, + output_ndjson: bool, + color: str, exit_on_error: bool, + show_unchanged: bool, + summary_only: bool, + long_view: bool, + relative_paths: bool, + fetch: bool, + offline: bool, + verbosity: int, parser: argparse.ArgumentParser | None = None, # optional so sync can be unit tested ) -> None: """Entry point for ``vcspull sync``.""" - if isinstance(repo_patterns, list) and len(repo_patterns) == 0: - if parser is not None: - parser.print_help() - sys.exit(2) + output_mode = get_output_mode(output_json, output_ndjson) + formatter = OutputFormatter(output_mode) + colors = Colors(get_color_mode(color)) + + verbosity_level = clamp(verbosity, 0, 2) + render_options = PlanRenderOptions( + show_unchanged=show_unchanged, + summary_only=summary_only, + long=long_view, + verbosity=verbosity_level, + relative_paths=relative_paths, + ) + plan_config = SyncPlanConfig(fetch=bool(fetch and not offline), offline=offline) if config: configs = load_configs([config]) else: configs = load_configs(find_config_files(include_home=True)) - found_repos = [] + found_repos: list[ConfigDict] = [] for repo_pattern in repo_patterns: path, vcs_url, name = None, None, None @@ -92,27 +655,149 @@ def sync( else: name = repo_pattern - # collect the repos from the config files found = filter_repos(configs, path=path, vcs_url=vcs_url, name=name) - if len(found) == 0: + if not found and formatter.mode == OutputMode.HUMAN: log.info(NO_REPOS_FOR_TERM_MSG.format(name=name)) - found_repos.extend(filter_repos(configs, path=path, vcs_url=vcs_url, name=name)) + found_repos.extend(found) + + if workspace_root: + found_repos = filter_by_workspace(found_repos, workspace_root) + + total_repos = len(found_repos) + + if dry_run: + progress_enabled = formatter.mode == OutputMode.HUMAN and sys.stdout.isatty() + progress_printer = PlanProgressPrinter(total_repos, colors, progress_enabled) + start_time = perf_counter() + plan_result = asyncio.run( + _build_plan_result_async( + found_repos, + config=plan_config, + progress=progress_printer if progress_enabled else None, + ) + ) + plan_result.summary.duration_ms = int((perf_counter() - start_time) * 1000) + if progress_enabled: + progress_printer.finish() + _emit_plan_output( + formatter=formatter, + colors=colors, + plan=plan_result, + render_options=render_options, + dry_run=True, + total_repos=total_repos, + ) + formatter.finalize() + return + + if total_repos == 0: + formatter.emit_text(colors.warning("No repositories matched the criteria.")) + formatter.finalize() + return + + is_human = formatter.mode == OutputMode.HUMAN + + summary = {"total": 0, "synced": 0, "previewed": 0, "failed": 0} + + progress_callback: ProgressCallback + if is_human: + progress_callback = progress_cb + else: + + def silent_progress(output: str, timestamp: datetime) -> None: + """Suppress progress for machine-readable output.""" + return None + + progress_callback = silent_progress for repo in found_repos: + repo_name = repo.get("name", "unknown") + repo_path = 
repo.get("path", "unknown") + workspace_label = repo.get("workspace_root", "") + + summary["total"] += 1 + + event: dict[str, t.Any] = { + "reason": "sync", + "name": repo_name, + "path": str(repo_path), + "workspace_root": str(workspace_label), + } + + buffer: StringIO | None = None + captured_output: str | None = None try: - update_repo(repo) - except Exception as e: # noqa: PERF203 - log.info( - f"Failed syncing {repo.get('name')}", - ) + if is_human: + update_repo(repo, progress_callback=progress_callback) + else: + buffer = StringIO() + with ( + contextlib.redirect_stdout(buffer), + contextlib.redirect_stderr( + buffer, + ), + ): + update_repo(repo, progress_callback=progress_callback) + captured_output = buffer.getvalue() + except Exception as e: + summary["failed"] += 1 + event["status"] = "error" + event["error"] = str(e) + if not is_human and buffer is not None and not captured_output: + captured_output = buffer.getvalue() + if captured_output: + event["details"] = captured_output.strip() + formatter.emit(event) + if is_human: + log.info( + f"Failed syncing {repo_name}", + ) if log.isEnabledFor(logging.DEBUG): import traceback traceback.print_exc() + formatter.emit_text( + f"{colors.error('✗')} Failed syncing {colors.info(repo_name)}: " + f"{colors.error(str(e))}", + ) if exit_on_error: + formatter.emit( + { + "reason": "summary", + **summary, + } + ) + formatter.finalize() if parser is not None: parser.exit(status=1, message=EXIT_ON_ERROR_MSG) raise SystemExit(EXIT_ON_ERROR_MSG) from e + continue + + summary["synced"] += 1 + event["status"] = "synced" + formatter.emit(event) + formatter.emit_text( + f"{colors.success('✓')} Synced {colors.info(repo_name)} " + f"{colors.muted('→')} {repo_path}", + ) + + formatter.emit( + { + "reason": "summary", + **summary, + } + ) + + if formatter.mode == OutputMode.HUMAN: + formatter.emit_text( + f"\n{colors.info('Summary:')} " + f"{summary['total']} repos, " + f"{colors.success(str(summary['synced']))} synced, " + f"{colors.warning(str(summary['previewed']))} previewed, " + f"{colors.error(str(summary['failed']))} failed", + ) + + formatter.finalize() def progress_cb(output: str, timestamp: datetime) -> None: @@ -144,6 +829,7 @@ def __init__(self, repo_url: str, *args: object, **kwargs: object) -> None: def update_repo( repo_dict: t.Any, + progress_callback: ProgressCallback | None = None, # repo_dict: Dict[str, Union[str, Dict[str, GitRemote], pathlib.Path]] ) -> GitSync: """Synchronize a single repository.""" @@ -152,7 +838,8 @@ def update_repo( repo_dict["pip_url"] = repo_dict.pop("url") if "url" not in repo_dict: repo_dict["url"] = repo_dict.pop("pip_url") - repo_dict["progress_callback"] = progress_cb + + repo_dict["progress_callback"] = progress_callback or progress_cb if repo_dict.get("vcs") is None: vcs = guess_vcs(url=repo_dict["url"]) diff --git a/src/vcspull/config.py b/src/vcspull/config.py index 1b74a3e3..5fcd9e10 100644 --- a/src/vcspull/config.py +++ b/src/vcspull/config.py @@ -108,6 +108,9 @@ def extract_repos( if "name" not in conf: conf["name"] = repo + if "workspace_root" not in conf: + conf["workspace_root"] = directory + if "path" not in conf: conf["path"] = expand_dir( pathlib.Path(expand_dir(pathlib.Path(directory), cwd=cwd)) diff --git a/src/vcspull/types.py b/src/vcspull/types.py index 9053956e..e26574c3 100644 --- a/src/vcspull/types.py +++ b/src/vcspull/types.py @@ -60,6 +60,7 @@ class ConfigDict(TypedDict): name: str path: pathlib.Path url: str + workspace_root: str remotes: NotRequired[GitSyncRemoteDict | None] 
     shell_command_after: NotRequired[list[str] | None]
diff --git a/tests/cli/test_add.py b/tests/cli/test_add.py
new file mode 100644
index 00000000..bc8fe534
--- /dev/null
+++ b/tests/cli/test_add.py
@@ -0,0 +1,299 @@
+"""Tests for vcspull add command."""
+
+from __future__ import annotations
+
+import pathlib
+import typing as t
+
+import pytest
+
+from vcspull.cli.add import add_repo
+
+if t.TYPE_CHECKING:
+    from _pytest.monkeypatch import MonkeyPatch
+
+
+class AddRepoFixture(t.NamedTuple):
+    """Fixture for add repo test cases."""
+
+    test_id: str
+    name: str
+    url: str
+    workspace_root: str | None
+    path_relative: str | None
+    dry_run: bool
+    use_default_config: bool
+    preexisting_config: dict[str, t.Any] | None
+    expected_in_config: dict[str, t.Any]
+    expected_log_messages: list[str]
+
+
+ADD_REPO_FIXTURES: list[AddRepoFixture] = [
+    AddRepoFixture(
+        test_id="simple-add-default-workspace",
+        name="myproject",
+        url="git+https://github.com/user/myproject.git",
+        workspace_root=None,
+        path_relative=None,
+        dry_run=False,
+        use_default_config=False,
+        preexisting_config=None,
+        expected_in_config={
+            "./": {
+                "myproject": {"repo": "git+https://github.com/user/myproject.git"},
+            },
+        },
+        expected_log_messages=["Successfully added 'myproject'"],
+    ),
+    AddRepoFixture(
+        test_id="add-with-custom-workspace",
+        name="flask",
+        url="git+https://github.com/pallets/flask.git",
+        workspace_root="~/code/",
+        path_relative=None,
+        dry_run=False,
+        use_default_config=False,
+        preexisting_config=None,
+        expected_in_config={
+            "~/code/": {
+                "flask": {"repo": "git+https://github.com/pallets/flask.git"},
+            },
+        },
+        expected_log_messages=["Successfully added 'flask'"],
+    ),
+    AddRepoFixture(
+        test_id="dry-run-no-write",
+        name="django",
+        url="git+https://github.com/django/django.git",
+        workspace_root=None,
+        path_relative=None,
+        dry_run=True,
+        use_default_config=False,
+        preexisting_config=None,
+        expected_in_config={},  # Nothing written in dry-run
+        expected_log_messages=["Would add 'django'"],
+    ),
+    AddRepoFixture(
+        test_id="default-config-created-when-missing",
+        name="autoproject",
+        url="git+https://github.com/user/autoproject.git",
+        workspace_root=None,
+        path_relative=None,
+        dry_run=False,
+        use_default_config=True,
+        preexisting_config=None,
+        expected_in_config={
+            "./": {
+                "autoproject": {
+                    "repo": "git+https://github.com/user/autoproject.git",
+                },
+            },
+        },
+        expected_log_messages=[
+            "No config specified and no default found",
+            "Successfully added 'autoproject'",
+        ],
+    ),
+    AddRepoFixture(
+        test_id="path-infers-workspace-root",
+        name="lib",
+        url="git+https://github.com/user/lib.git",
+        workspace_root=None,
+        path_relative="code/lib",
+        dry_run=False,
+        use_default_config=False,
+        preexisting_config=None,
+        expected_in_config={
+            "~/code/lib/": {
+                "lib": {"repo": "git+https://github.com/user/lib.git"},
+            },
+        },
+        expected_log_messages=["Successfully added 'lib'"],
+    ),
+    AddRepoFixture(
+        test_id="normalizes-existing-workspace-label",
+        name="extra",
+        url="git+https://github.com/user/extra.git",
+        workspace_root=None,
+        path_relative=None,
+        dry_run=False,
+        use_default_config=False,
+        preexisting_config={
+            "~/code": {
+                "existing": {"repo": "git+https://github.com/user/existing.git"},
+            },
+        },
+        expected_in_config={
+            "~/code/": {
+                "existing": {"repo": "git+https://github.com/user/existing.git"},
+            },
+            "./": {
+                "extra": {"repo": "git+https://github.com/user/extra.git"},
+            },
+        },
+        expected_log_messages=["Successfully added 'extra'"],
+    ),
+]
+
+
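+# A note on the pattern below: `list(AddRepoFixture._fields)` hands pytest the
+# NamedTuple's field names as parameter names, so each AddRepoFixture above is
+# unpacked field-by-field into the test. Adding a scenario only requires
+# appending another entry. A hypothetical new case (names illustrative only)
+# would look like:
+#
+#     AddRepoFixture(
+#         test_id="example-new-case",  # hypothetical
+#         name="demo",
+#         url="git+https://github.com/user/demo.git",
+#         workspace_root=None,
+#         path_relative=None,
+#         dry_run=False,
+#         use_default_config=False,
+#         preexisting_config=None,
+#         expected_in_config={
+#             "./": {"demo": {"repo": "git+https://github.com/user/demo.git"}},
+#         },
+#         expected_log_messages=["Successfully added 'demo'"],
+#     )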
+@pytest.mark.parametrize( + list(AddRepoFixture._fields), + ADD_REPO_FIXTURES, + ids=[fixture.test_id for fixture in ADD_REPO_FIXTURES], +) +def test_add_repo( + test_id: str, + name: str, + url: str, + workspace_root: str | None, + path_relative: str | None, + dry_run: bool, + use_default_config: bool, + preexisting_config: dict[str, t.Any] | None, + expected_in_config: dict[str, t.Any], + expected_log_messages: list[str], + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + caplog: t.Any, +) -> None: + """Test adding a repository to the config.""" + # Set logging level to capture INFO messages + import logging + + caplog.set_level(logging.INFO) + + # Set up temp directory as home + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + target_config_file = tmp_path / ".vcspull.yaml" + config_argument: str | None = ( + None if use_default_config else str(target_config_file) + ) + + if preexisting_config is not None: + import yaml + + target_config_file.write_text( + yaml.dump(preexisting_config), + encoding="utf-8", + ) + + path_argument = str(tmp_path / path_relative) if path_relative else None + + # Run add_repo + add_repo( + name=name, + url=url, + config_file_path_str=config_argument, + path=path_argument, + workspace_root_path=workspace_root, + dry_run=dry_run, + ) + + # Check log messages + log_output = caplog.text + for expected_msg in expected_log_messages: + assert expected_msg in log_output, ( + f"Expected '{expected_msg}' in log output, got: {log_output}" + ) + + # Check config file + if dry_run: + # In dry-run mode, config file should not be created + if len(expected_in_config) == 0 and not use_default_config: + assert not target_config_file.exists(), ( + "Config file should not be created in dry-run mode" + ) + else: + # In normal mode, check the config was written correctly + if len(expected_in_config) > 0: + assert target_config_file.exists(), "Config file should be created" + + import yaml + + with target_config_file.open() as f: + actual_config = yaml.safe_load(f) + + for workspace, repos in expected_in_config.items(): + assert workspace in actual_config, ( + f"Workspace '{workspace}' should be in config" + ) + for repo_name, repo_data in repos.items(): + assert repo_name in actual_config[workspace], ( + f"Repo '{repo_name}' should be in workspace '{workspace}'" + ) + assert actual_config[workspace][repo_name] == repo_data + + +def test_add_repo_duplicate_warning( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + caplog: t.Any, +) -> None: + """Test that adding a duplicate repository shows a warning.""" + import logging + + caplog.set_level(logging.INFO) + + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + config_file = tmp_path / ".vcspull.yaml" + + # Add repo first time + add_repo( + name="myproject", + url="git+https://github.com/user/myproject.git", + config_file_path_str=str(config_file), + path=None, + workspace_root_path=None, + dry_run=False, + ) + + # Clear logs + caplog.clear() + + # Try to add again + add_repo( + name="myproject", + url="git+https://github.com/user/myproject-v2.git", + config_file_path_str=str(config_file), + path=None, + workspace_root_path=None, + dry_run=False, + ) + + # Should have warning + assert "already exists" in caplog.text + + +def test_add_repo_creates_new_file( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, +) -> None: + """Test that add_repo creates a new config file if it doesn't exist.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + 
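+    # The config file does not exist yet; add_repo is expected to create it
+    # on demand rather than fail.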
config_file = tmp_path / ".vcspull.yaml" + assert not config_file.exists() + + add_repo( + name="newrepo", + url="git+https://github.com/user/newrepo.git", + config_file_path_str=str(config_file), + path=None, + workspace_root_path=None, + dry_run=False, + ) + + assert config_file.exists() + + import yaml + + with config_file.open() as f: + config = yaml.safe_load(f) + + assert "./" in config + assert "newrepo" in config["./"] diff --git a/tests/cli/test_discover.py b/tests/cli/test_discover.py new file mode 100644 index 00000000..bed0bfbd --- /dev/null +++ b/tests/cli/test_discover.py @@ -0,0 +1,688 @@ +"""Tests for vcspull discover command.""" + +from __future__ import annotations + +import pathlib +import subprocess +import typing as t + +import pytest + +from vcspull.cli.discover import discover_repos + +if t.TYPE_CHECKING: + from _pytest.monkeypatch import MonkeyPatch + + +def init_git_repo(repo_path: pathlib.Path, remote_url: str) -> None: + """Initialize a git repository with a remote.""" + repo_path.mkdir(parents=True, exist_ok=True) + subprocess.run(["git", "init"], cwd=repo_path, check=True, capture_output=True) + subprocess.run( + ["git", "remote", "add", "origin", remote_url], + cwd=repo_path, + check=True, + capture_output=True, + ) + + +class DiscoverFixture(t.NamedTuple): + """Fixture for discover test cases.""" + + test_id: str + repos_to_create: list[tuple[str, str]] # (name, remote_url) + recursive: bool + workspace_override: str | None + dry_run: bool + yes: bool + expected_repo_count: int + config_relpath: str | None + preexisting_config: dict[str, t.Any] | None + user_input: str | None + expected_workspace_labels: set[str] | None + + +DISCOVER_FIXTURES: list[DiscoverFixture] = [ + DiscoverFixture( + test_id="discover-single-level", + repos_to_create=[ + ("repo1", "git+https://github.com/user/repo1.git"), + ("repo2", "git+https://github.com/user/repo2.git"), + ], + recursive=False, + workspace_override=None, + dry_run=False, + yes=True, + expected_repo_count=2, + config_relpath=".vcspull.yaml", + preexisting_config=None, + user_input=None, + expected_workspace_labels={"~/code/"}, + ), + DiscoverFixture( + test_id="discover-recursive", + repos_to_create=[ + ("repo1", "git+https://github.com/user/repo1.git"), + ("subdir/repo2", "git+https://github.com/user/repo2.git"), + ("subdir/nested/repo3", "git+https://github.com/user/repo3.git"), + ], + recursive=True, + workspace_override=None, + dry_run=False, + yes=True, + expected_repo_count=3, + config_relpath=".vcspull.yaml", + preexisting_config=None, + user_input=None, + expected_workspace_labels={"~/code/"}, + ), + DiscoverFixture( + test_id="discover-dry-run", + repos_to_create=[ + ("repo1", "git+https://github.com/user/repo1.git"), + ], + recursive=False, + workspace_override=None, + dry_run=True, + yes=True, + expected_repo_count=0, # Nothing written in dry-run + config_relpath=".vcspull.yaml", + preexisting_config=None, + user_input=None, + expected_workspace_labels=None, + ), + DiscoverFixture( + test_id="discover-default-config", + repos_to_create=[ + ("repo1", "git+https://github.com/user/repo1.git"), + ], + recursive=False, + workspace_override=None, + dry_run=False, + yes=True, + expected_repo_count=1, + config_relpath=None, + preexisting_config=None, + user_input=None, + expected_workspace_labels={"~/code/"}, + ), + DiscoverFixture( + test_id="discover-workspace-normalization", + repos_to_create=[ + ("repo1", "git+https://github.com/user/repo1.git"), + ], + recursive=False, + workspace_override=None, + 
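+        # The pre-existing "~/code" label (no trailing slash) below should be
+        # rewritten to the normalized "~/code/" form when the config is saved.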
dry_run=False, + yes=True, + expected_repo_count=2, + config_relpath=".vcspull.yaml", + preexisting_config={ + "~/code": { + "existing": {"repo": "git+https://github.com/user/existing.git"}, + }, + }, + user_input=None, + expected_workspace_labels={"~/code/"}, + ), + DiscoverFixture( + test_id="discover-interactive-confirm", + repos_to_create=[ + ("repo1", "git+https://github.com/user/repo1.git"), + ], + recursive=False, + workspace_override=None, + dry_run=False, + yes=False, + expected_repo_count=1, + config_relpath=".vcspull.yaml", + preexisting_config=None, + user_input="y", + expected_workspace_labels={"~/code/"}, + ), + DiscoverFixture( + test_id="discover-interactive-abort", + repos_to_create=[ + ("repo1", "git+https://github.com/user/repo1.git"), + ], + recursive=False, + workspace_override=None, + dry_run=False, + yes=False, + expected_repo_count=0, + config_relpath=".vcspull.yaml", + preexisting_config=None, + user_input="n", + expected_workspace_labels=None, + ), +] + + +class DiscoverLoadEdgeFixture(t.NamedTuple): + """Fixture describing discover configuration loading edge cases.""" + + test_id: str + mode: t.Literal["multi_home", "non_dict", "exception"] + expected_log_fragment: str + + +DISCOVER_LOAD_EDGE_FIXTURES: list[DiscoverLoadEdgeFixture] = [ + DiscoverLoadEdgeFixture( + test_id="multiple-home-configs", + mode="multi_home", + expected_log_fragment="Multiple home_config files found", + ), + DiscoverLoadEdgeFixture( + test_id="non-dict-config", + mode="non_dict", + expected_log_fragment="is not a valid YAML dictionary", + ), + DiscoverLoadEdgeFixture( + test_id="config-reader-exception", + mode="exception", + expected_log_fragment="Error loading YAML", + ), +] + + +class DiscoverNormalizationFixture(t.NamedTuple): + """Fixture for normalization-only save branches.""" + + test_id: str + preexisting_config: dict[str, dict[str, dict[str, str]]] + expected_workspace_label: str + + +DISCOVER_NORMALIZATION_FIXTURES: list[DiscoverNormalizationFixture] = [ + DiscoverNormalizationFixture( + test_id="normalizes-and-saves-existing", + preexisting_config={ + "~/code": { + "existing-repo": {"repo": "git+https://example.com/existing.git"}, + }, + }, + expected_workspace_label="~/code/", + ), +] + + +class DiscoverInvalidWorkspaceFixture(t.NamedTuple): + """Fixture describing non-dict workspace entries.""" + + test_id: str + workspace_section: list[str] + expected_warning: str + + +DISCOVER_INVALID_WORKSPACE_FIXTURES: list[DiscoverInvalidWorkspaceFixture] = [ + DiscoverInvalidWorkspaceFixture( + test_id="non-dict-workspace-entry", + workspace_section=[], + expected_warning="Workspace root", + ), +] + + +class DiscoverExistingSummaryFixture(t.NamedTuple): + """Fixture asserting existing repository summary messaging.""" + + test_id: str + repo_count: int + expected_log_fragment: str + + +DISCOVER_EXISTING_SUMMARY_FIXTURES: list[DiscoverExistingSummaryFixture] = [ + DiscoverExistingSummaryFixture( + test_id="existing-summary-detailed", + repo_count=3, + expected_log_fragment="Found 3 existing repositories in configuration:", + ), + DiscoverExistingSummaryFixture( + test_id="existing-summary-aggregate", + repo_count=6, + expected_log_fragment="Found 6 existing repositories already in configuration.", + ), +] + + +@pytest.mark.parametrize( + list(DiscoverFixture._fields), + DISCOVER_FIXTURES, + ids=[fixture.test_id for fixture in DISCOVER_FIXTURES], +) +def test_discover_repos( + test_id: str, + repos_to_create: list[tuple[str, str]], + recursive: bool, + workspace_override: str | None, + 
dry_run: bool, + yes: bool, + expected_repo_count: int, + config_relpath: str | None, + preexisting_config: dict[str, t.Any] | None, + user_input: str | None, + expected_workspace_labels: set[str] | None, + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + caplog: t.Any, +) -> None: + """Test discovering repositories from filesystem.""" + import logging + + caplog.set_level(logging.INFO) + + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + scan_dir = tmp_path / "code" + scan_dir.mkdir() + + # Create git repos + for repo_name, remote_url in repos_to_create: + repo_path = scan_dir / repo_name + init_git_repo(repo_path, remote_url) + + if config_relpath is None: + target_config_file = tmp_path / ".vcspull.yaml" + config_argument = None + else: + target_config_file = tmp_path / config_relpath + target_config_file.parent.mkdir(parents=True, exist_ok=True) + config_argument = str(target_config_file) + + if preexisting_config is not None: + import yaml + + target_config_file.write_text( + yaml.dump(preexisting_config), + encoding="utf-8", + ) + + if user_input is not None: + monkeypatch.setattr("builtins.input", lambda _: user_input) + + # Run discover + discover_repos( + scan_dir_str=str(scan_dir), + config_file_path_str=config_argument, + recursive=recursive, + workspace_root_override=workspace_override, + yes=yes, + dry_run=dry_run, + ) + + if dry_run: + # In dry-run mode, config file should not be created/modified + if expected_repo_count == 0: + assert "Dry run complete" in caplog.text + return + + # Check config file was created and has expected repos + if expected_repo_count > 0: + assert target_config_file.exists() + + import yaml + + with target_config_file.open() as f: + config = yaml.safe_load(f) + + if expected_workspace_labels is not None: + assert set(config.keys()) == expected_workspace_labels + + # Count repos in config + total_repos = sum( + len(repos) for repos in config.values() if isinstance(repos, dict) + ) + assert total_repos == expected_repo_count, ( + f"Expected {expected_repo_count} repos, got {total_repos}" + ) + + +@pytest.mark.parametrize( + list(DiscoverLoadEdgeFixture._fields), + DISCOVER_LOAD_EDGE_FIXTURES, + ids=[fixture.test_id for fixture in DISCOVER_LOAD_EDGE_FIXTURES], +) +def test_discover_config_load_edges( + test_id: str, + mode: t.Literal["multi_home", "non_dict", "exception"], + expected_log_fragment: str, + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + caplog: t.Any, +) -> None: + """Ensure discover handles configuration loading edge cases gracefully.""" + import logging + + caplog.set_level(logging.INFO) + + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + scan_dir = tmp_path / "scan" + scan_dir.mkdir(parents=True, exist_ok=True) + + if mode == "multi_home": + fake_paths = [tmp_path / "a.yaml", tmp_path / "b.yaml"] + monkeypatch.setattr( + "vcspull.cli.discover.find_home_config_files", + lambda filetype=None: fake_paths, + ) + discover_repos( + scan_dir_str=str(scan_dir), + config_file_path_str=None, + recursive=False, + workspace_root_override=None, + yes=True, + dry_run=False, + ) + else: + config_file = tmp_path / "config.yaml" + config_file.write_text("[]\n", encoding="utf-8") + + if mode == "non_dict": + monkeypatch.setattr( + "vcspull.cli.discover.ConfigReader._from_file", + lambda _path: ["invalid"], + ) + else: # mode == "exception" + + def _raise(_path: pathlib.Path) -> t.NoReturn: + error_message = "ConfigReader failed" + raise ValueError(error_message) + + monkeypatch.setattr( 
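+                # Patch the reader so parsing raises, exercising the
+                # "Error loading YAML" branch asserted below.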
+ "vcspull.cli.discover.ConfigReader._from_file", + _raise, + ) + + discover_repos( + scan_dir_str=str(scan_dir), + config_file_path_str=str(config_file), + recursive=False, + workspace_root_override=None, + yes=True, + dry_run=False, + ) + + assert expected_log_fragment in caplog.text + + +def test_discover_skips_repos_without_remote( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + caplog: t.Any, +) -> None: + """Test that discover skips git repos without a remote.""" + import logging + + caplog.set_level(logging.INFO) + + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + scan_dir = tmp_path / "code" + scan_dir.mkdir() + + # Create a repo without remote + repo_path = scan_dir / "no-remote" + repo_path.mkdir() + subprocess.run(["git", "init"], cwd=repo_path, check=True, capture_output=True) + + config_file = tmp_path / ".vcspull.yaml" + + discover_repos( + scan_dir_str=str(scan_dir), + config_file_path_str=str(config_file), + recursive=False, + workspace_root_override=None, + yes=True, + dry_run=False, + ) + + # Should log a warning + assert "Could not determine remote URL" in caplog.text + + +def test_discover_shows_existing_repos( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + caplog: t.Any, +) -> None: + """Test that discover shows which repos already exist in config.""" + import logging + + caplog.set_level(logging.INFO) + + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + scan_dir = tmp_path / "code" + scan_dir.mkdir() + + # Create a git repo + repo_path = scan_dir / "existing-repo" + init_git_repo(repo_path, "git+https://github.com/user/existing-repo.git") + + config_file = tmp_path / ".vcspull.yaml" + + # First discovery + discover_repos( + scan_dir_str=str(scan_dir), + config_file_path_str=str(config_file), + recursive=False, + workspace_root_override=None, + yes=True, + dry_run=False, + ) + + # Clear logs + caplog.clear() + + # Second discovery (should find existing repo) + discover_repos( + scan_dir_str=str(scan_dir), + config_file_path_str=str(config_file), + recursive=False, + workspace_root_override=None, + yes=True, + dry_run=False, + ) + + # Should mention existing repos + assert "existing" in caplog.text.lower() or "already" in caplog.text.lower() + + +def test_discover_with_workspace_override( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, +) -> None: + """Test discover with workspace root override.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + scan_dir = tmp_path / "code" + scan_dir.mkdir() + + # Create a git repo + repo_path = scan_dir / "myrepo" + init_git_repo(repo_path, "git+https://github.com/user/myrepo.git") + + config_file = tmp_path / ".vcspull.yaml" + + # Discover with workspace override + discover_repos( + scan_dir_str=str(scan_dir), + config_file_path_str=str(config_file), + recursive=False, + workspace_root_override="~/projects/", + yes=True, + dry_run=False, + ) + + import yaml + + with config_file.open() as f: + config = yaml.safe_load(f) + + # Should use the overridden workspace root + assert "~/projects/" in config + assert "myrepo" in config["~/projects/"] + + +@pytest.mark.parametrize( + list(DiscoverExistingSummaryFixture._fields), + DISCOVER_EXISTING_SUMMARY_FIXTURES, + ids=[fixture.test_id for fixture in DISCOVER_EXISTING_SUMMARY_FIXTURES], +) +def test_discover_existing_summary_branches( + test_id: str, + repo_count: int, + expected_log_fragment: str, + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + caplog: t.Any, +) -> None: + 
"""Ensure existing repository summaries cover both detailed and aggregate forms.""" + import logging + + import yaml + + caplog.set_level(logging.INFO) + + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + scan_dir = tmp_path / "code" + scan_dir.mkdir() + + repos_config: dict[str, dict[str, dict[str, str]]] = {"~/code/": {}} + for idx in range(repo_count): + repo_name = f"repo-{idx}" + repo_path = scan_dir / repo_name + init_git_repo(repo_path, f"git+https://example.com/{repo_name}.git") + repos_config["~/code/"][repo_name] = { + "repo": f"git+https://example.com/{repo_name}.git", + } + + config_file = tmp_path / ".vcspull.yaml" + config_file.write_text(yaml.dump(repos_config), encoding="utf-8") + + discover_repos( + scan_dir_str=str(scan_dir), + config_file_path_str=str(config_file), + recursive=False, + workspace_root_override=None, + yes=True, + dry_run=True, + ) + + assert expected_log_fragment in caplog.text + + +@pytest.mark.parametrize( + list(DiscoverNormalizationFixture._fields), + DISCOVER_NORMALIZATION_FIXTURES, + ids=[fixture.test_id for fixture in DISCOVER_NORMALIZATION_FIXTURES], +) +def test_discover_normalization_only_save( + test_id: str, + preexisting_config: dict[str, dict[str, dict[str, str]]], + expected_workspace_label: str, + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + caplog: t.Any, +) -> None: + """Normalization-only changes should still trigger a save.""" + import logging + + import yaml + + caplog.set_level(logging.INFO) + + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + scan_dir = tmp_path / "code" + scan_dir.mkdir() + + repo_path = scan_dir / "existing-repo" + init_git_repo(repo_path, "git+https://example.com/existing.git") + + config_file = tmp_path / ".vcspull.yaml" + config_file.write_text(yaml.dump(preexisting_config), encoding="utf-8") + + save_calls: list[tuple[pathlib.Path, dict[str, t.Any]]] = [] + + def _fake_save(path: pathlib.Path, data: dict[str, t.Any]) -> None: + save_calls.append((path, data)) + + monkeypatch.setattr("vcspull.cli.discover.save_config_yaml", _fake_save) + + discover_repos( + scan_dir_str=str(scan_dir), + config_file_path_str=str(config_file), + recursive=False, + workspace_root_override=None, + yes=True, + dry_run=False, + ) + + assert save_calls, "Expected normalization changes to trigger a save." 
+ saved_path, saved_config = save_calls[-1] + assert saved_path == config_file + assert expected_workspace_label in saved_config + assert "Successfully updated" in caplog.text + + +@pytest.mark.parametrize( + list(DiscoverInvalidWorkspaceFixture._fields), + DISCOVER_INVALID_WORKSPACE_FIXTURES, + ids=[fixture.test_id for fixture in DISCOVER_INVALID_WORKSPACE_FIXTURES], +) +def test_discover_skips_non_dict_workspace( + test_id: str, + workspace_section: list[str], + expected_warning: str, + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + caplog: t.Any, +) -> None: + """Repos targeting non-dict workspaces should be skipped without saving.""" + import logging + + import yaml + + caplog.set_level(logging.INFO) + + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + scan_dir = tmp_path / "code" + scan_dir.mkdir() + + repo_path = scan_dir / "new-repo" + init_git_repo(repo_path, "git+https://example.com/new.git") + + config_file = tmp_path / ".vcspull.yaml" + config_file.write_text( + yaml.dump({"~/code/": workspace_section}), + encoding="utf-8", + ) + + def _fail_save(path: pathlib.Path, data: dict[str, t.Any]) -> None: + error_message = "save_config_yaml should not be called when skipping repo" + raise AssertionError(error_message) + + monkeypatch.setattr("vcspull.cli.discover.save_config_yaml", _fail_save) + + discover_repos( + scan_dir_str=str(scan_dir), + config_file_path_str=str(config_file), + recursive=False, + workspace_root_override=None, + yes=True, + dry_run=False, + ) + + assert expected_warning in caplog.text diff --git a/tests/cli/test_import.py b/tests/cli/test_import.py deleted file mode 100644 index 2c5f5724..00000000 --- a/tests/cli/test_import.py +++ /dev/null @@ -1,1003 +0,0 @@ -"""Tests for vcspull import command functionality.""" - -from __future__ import annotations - -import contextlib -import logging -import subprocess -import typing as t - -import pytest -import yaml - -from vcspull.cli import cli -from vcspull.cli._import import get_git_origin_url, import_from_filesystem, import_repo -from vcspull.config import canonicalize_workspace_path, workspace_root_label - -if t.TYPE_CHECKING: - import pathlib - - from libvcs.pytest_plugin import CreateRepoPytestFixtureFn - from typing_extensions import TypeAlias - - ExpectedOutput: TypeAlias = t.Optional[t.Union[str, list[str]]] - - -def setup_git_repo( - path: pathlib.Path, - remote_url: str | None, - git_envvars: dict[str, str], -) -> None: - """Set up a git repository.""" - path.mkdir(parents=True, exist_ok=True) - subprocess.run( - ["git", "init"], - cwd=path, - check=True, - capture_output=True, - env=git_envvars, - ) - - if remote_url: - subprocess.run( - ["git", "remote", "add", "origin", remote_url], - cwd=path, - check=True, - capture_output=True, - env=git_envvars, - ) - - -def clone_repo( - remote_url: str, - local_path: pathlib.Path, - git_envvars: dict[str, str], -) -> None: - """Clone a git repository.""" - subprocess.run( - ["git", "clone", remote_url, str(local_path)], - check=True, - capture_output=True, - env=git_envvars, - ) - - -# ============================================================================= -# Test fixtures for single repo import -# ============================================================================= - - -class ImportRepoFixture(t.NamedTuple): - """Pytest fixture for vcspull import command (single repo mode).""" - - # pytest internal: used for naming test - test_id: str - - # test parameters - cli_args: list[str] - initial_config: dict[str, t.Any] | None 
- expected_config_contains: dict[str, t.Any] - expected_in_output: ExpectedOutput = None - expected_not_in_output: ExpectedOutput = None - expected_log_level: str = "INFO" - should_create_config: bool = False - - -IMPORT_REPO_FIXTURES: list[ImportRepoFixture] = [ - # Simple repo import with default workspace root - ImportRepoFixture( - test_id="simple-repo-default-root", - cli_args=["import", "myproject", "git@github.com:user/myproject.git"], - initial_config=None, - should_create_config=True, - expected_config_contains={ - "./": { - "myproject": {"repo": "git@github.com:user/myproject.git"}, - }, - }, - expected_in_output="Successfully imported 'myproject'", - ), - # Import with custom workspace root - ImportRepoFixture( - test_id="custom-workspace-root", - cli_args=[ - "import", - "mylib", - "https://github.com/org/mylib", - "--workspace-root", - "~/projects/libs", - ], - initial_config=None, - should_create_config=True, - expected_config_contains={ - "~/projects/libs/": { - "mylib": {"repo": "https://github.com/org/mylib"}, - }, - }, - expected_in_output="Successfully imported 'mylib'", - ), - # Import to existing config under specific workspace root - ImportRepoFixture( - test_id="import-to-existing", - cli_args=[ - "import", - "project2", - "git@github.com:user/project2.git", - "--workspace-root", - "~/work", - ], - initial_config={ - "~/work/": { - "project1": {"repo": "git@github.com:user/project1.git"}, - }, - }, - expected_config_contains={ - "~/work/": { - "project1": {"repo": "git@github.com:user/project1.git"}, - "project2": {"repo": "git@github.com:user/project2.git"}, - }, - }, - expected_in_output="Successfully imported 'project2'", - ), - # Duplicate repo detection - ImportRepoFixture( - test_id="duplicate-repo", - cli_args=[ - "import", - "existing", - "git@github.com:other/existing.git", - "--workspace-root", - "~/code", - ], - initial_config={ - "~/code/": { - "existing": {"repo": "git@github.com:user/existing.git"}, - }, - }, - expected_config_contains={ - "~/code/": { - "existing": {"repo": "git@github.com:user/existing.git"}, - }, - }, - expected_in_output=[ - "Repository 'existing' already exists", - "Current URL: git@github.com:user/existing.git", - ], - expected_log_level="WARNING", - ), - # Path inference - ImportRepoFixture( - test_id="path-inference", - cli_args=[ - "import", - "inferred", - "git@github.com:user/inferred.git", - "--path", - "~/dev/projects/inferred", - ], - initial_config=None, - should_create_config=True, - expected_config_contains={ - "~/dev/projects/inferred/": { - "inferred": {"repo": "git@github.com:user/inferred.git"}, - }, - }, - expected_in_output="Successfully imported 'inferred'", - ), -] - - -class ScanExistingFixture(t.NamedTuple): - """Fixture for scan behaviour when configuration already includes a repo.""" - - test_id: str - workspace_root_key_style: str - scan_arg_style: str - - -SCAN_EXISTING_FIXTURES: list[ScanExistingFixture] = [ - ScanExistingFixture( - test_id="tilde-no-slash_scan-tilde", - workspace_root_key_style="tilde_no_slash", - scan_arg_style="tilde", - ), - ScanExistingFixture( - test_id="tilde-no-slash_scan-abs", - workspace_root_key_style="tilde_no_slash", - scan_arg_style="absolute", - ), - ScanExistingFixture( - test_id="tilde-with-slash_scan-tilde", - workspace_root_key_style="tilde_with_slash", - scan_arg_style="tilde", - ), - ScanExistingFixture( - test_id="tilde-with-slash_scan-abs", - workspace_root_key_style="tilde_with_slash", - scan_arg_style="absolute", - ), - ScanExistingFixture( - 
test_id="absolute-no-slash_scan-abs", - workspace_root_key_style="absolute_no_slash", - scan_arg_style="absolute", - ), - ScanExistingFixture( - test_id="absolute-with-slash_scan-abs", - workspace_root_key_style="absolute_with_slash", - scan_arg_style="absolute", - ), -] - - -@pytest.mark.parametrize( - list(ImportRepoFixture._fields), - IMPORT_REPO_FIXTURES, - ids=[test.test_id for test in IMPORT_REPO_FIXTURES], -) -def test_import_repo_cli( - tmp_path: pathlib.Path, - capsys: pytest.CaptureFixture[str], - caplog: pytest.LogCaptureFixture, - monkeypatch: pytest.MonkeyPatch, - test_id: str, - cli_args: list[str], - initial_config: dict[str, t.Any] | None, - expected_config_contains: dict[str, t.Any], - expected_in_output: ExpectedOutput, - expected_not_in_output: ExpectedOutput, - expected_log_level: str, - should_create_config: bool, -) -> None: - """Test vcspull import command through CLI (single repo mode).""" - caplog.set_level(expected_log_level) - - # Set up config file path - config_file = tmp_path / ".vcspull.yaml" - - # Create initial config if provided - if initial_config: - yaml_content = yaml.dump(initial_config, default_flow_style=False) - config_file.write_text(yaml_content, encoding="utf-8") - - # Add config path to CLI args if not specified - if "-c" not in cli_args and "--config" not in cli_args: - cli_args = [*cli_args[:1], "-c", str(config_file), *cli_args[1:]] - - # Change to tmp directory - monkeypatch.chdir(tmp_path) - - # Run CLI command - with contextlib.suppress(SystemExit): - cli(cli_args) - - # Capture output - captured = capsys.readouterr() - output = "".join([*caplog.messages, captured.out, captured.err]) - - # Check expected output (strip ANSI codes for comparison) - import re - - clean_output = re.sub(r"\x1b\[[0-9;]*m", "", output) # Strip ANSI codes - - if expected_in_output is not None: - if isinstance(expected_in_output, str): - expected_in_output = [expected_in_output] - for needle in expected_in_output: - assert needle in clean_output, ( - f"Expected '{needle}' in output, got: {clean_output}" - ) - - if expected_not_in_output is not None: - if isinstance(expected_not_in_output, str): - expected_not_in_output = [expected_not_in_output] - for needle in expected_not_in_output: - assert needle not in clean_output, f"Unexpected '{needle}' in output" - - # Verify config file - if should_create_config or initial_config: - assert config_file.exists(), "Config file should exist" - - # Load and verify config - with config_file.open() as f: - config_data = yaml.safe_load(f) - - # Check expected config contents - for key, value in expected_config_contains.items(): - assert key in config_data, f"Expected key '{key}' in config" - if isinstance(value, dict): - for subkey, subvalue in value.items(): - assert subkey in config_data[key], ( - f"Expected '{subkey}' in config['{key}']" - ) - assert config_data[key][subkey] == subvalue, ( - f"Config mismatch for {key}/{subkey}: " - f"expected {subvalue}, got {config_data[key][subkey]}" - ) - - -# ============================================================================= -# Test fixtures for filesystem scan import -# ============================================================================= - - -class ImportScanFixture(t.NamedTuple): - """Pytest fixture for vcspull import --scan command.""" - - # pytest internal: used for naming test - test_id: str - - # test parameters - repo_setup: list[tuple[str, str, bool]] # (name, subdir, has_remote) - cli_args: list[str] - initial_config: dict[str, t.Any] | None - 
expected_config_contains: dict[str, t.Any] | None - expected_in_output: ExpectedOutput = None - expected_not_in_output: ExpectedOutput = None - expected_log_level: str = "INFO" - should_create_config: bool = False - user_input: str | None = None # For confirmation prompts - - -IMPORT_SCAN_FIXTURES: list[ImportScanFixture] = [ - # Single repository scan - ImportScanFixture( - test_id="single-repo-scan", - repo_setup=[("myproject", "", True)], # One repo with remote - cli_args=["import", "--scan", ".", "-y"], - initial_config=None, - should_create_config=True, - expected_config_contains={"has_repos": True}, # Will verify dynamically - expected_in_output=[ - "Found 1 new repository to import:", - "+ myproject", - "Successfully updated", - ], - ), - # Multiple repositories non-recursive - ImportScanFixture( - test_id="multiple-repos-non-recursive-scan", - repo_setup=[ - ("repo1", "", True), - ("repo2", "", True), - ("nested", "subdir", True), # Should be ignored without -r - ], - cli_args=["import", "--scan", ".", "-y"], - initial_config=None, - should_create_config=True, - expected_config_contains={"has_repos": True}, - expected_in_output=[ - "Found 2 new repositories to import:", - "+ repo1", - "+ repo2", - "Successfully updated", - ], - expected_not_in_output="nested", - ), - # Recursive scan - ImportScanFixture( - test_id="recursive-scan", - repo_setup=[ - ("repo1", "", True), - ("nested", "subdir", True), - ], - cli_args=["import", "--scan", ".", "-r", "-y"], - initial_config=None, - should_create_config=True, - expected_config_contains={"has_repos": True}, - expected_in_output=[ - "Found 2 new repositories to import:", - "+ repo1", - "+ nested", - "Successfully updated", - ], - ), - # Custom workspace root override - ImportScanFixture( - test_id="custom-workspace-root-scan", - repo_setup=[("myrepo", "", True)], - cli_args=[ - "import", - "--scan", - ".", - "--workspace-root", - "~/custom/path", - "-y", - ], - initial_config=None, - should_create_config=True, - expected_config_contains={ - "~/custom/path/": {"myrepo": {}}, - }, # Just check repo exists - expected_in_output=[ - "Found 1 new repository to import:", - "Successfully updated", - ], - ), - # No repositories found - ImportScanFixture( - test_id="no-repos-scan", - repo_setup=[], # No repositories - cli_args=["import", "--scan", ".", "-y"], - initial_config=None, - should_create_config=False, - expected_config_contains=None, - expected_in_output="No git repositories found", - ), - # Repository without remote - ImportScanFixture( - test_id="repo-without-remote-scan", - repo_setup=[("local_only", "", False)], # No remote - cli_args=["import", "--scan", ".", "-y"], - initial_config=None, - should_create_config=False, - expected_config_contains=None, - expected_in_output="No git repositories found", - expected_log_level="WARNING", - ), - # All repositories already exist - ImportScanFixture( - test_id="all-existing-scan", - repo_setup=[("existing1", "", True), ("existing2", "", True)], - cli_args=["import", "--scan", ".", "-y"], - initial_config={"dynamic": "will_be_set_in_test"}, # Will be set dynamically - should_create_config=False, - expected_config_contains=None, - expected_in_output=[ - "Found 2 existing repositories", - "All found repositories already exist", - ], - ), - # Mixed existing and new - ImportScanFixture( - test_id="mixed-existing-new-scan", - repo_setup=[ - ("existing", "", True), - ("newrepo", "", True), - ], - cli_args=["import", "--scan", ".", "-y"], - initial_config={"dynamic": "will_be_set_in_test"}, # Will be 
set for existing - should_create_config=False, - expected_config_contains={"has_repos": True}, - expected_in_output=[ - "Found 1 existing repositories", # Note: plural form in message - "Found 1 new repository to import:", - "+ newrepo", - "Successfully updated", - ], - ), - # User confirmation - yes - ImportScanFixture( - test_id="user-confirm-yes-scan", - repo_setup=[("repo_confirm", "", True)], - cli_args=["import", "--scan", "."], # No -y flag - initial_config=None, - should_create_config=True, - expected_config_contains={"has_repos": True}, - expected_in_output=[ - "Found 1 new repository to import:", - "Successfully updated", - ], - user_input="y\n", - ), - # User confirmation - no - ImportScanFixture( - test_id="user-confirm-no-scan", - repo_setup=[("repo_no_confirm", "", True)], - cli_args=["import", "--scan", "."], # No -y flag - initial_config=None, - should_create_config=False, - expected_config_contains=None, - expected_in_output=[ - "Found 1 new repository to import:", - "Aborted by user", - ], - user_input="n\n", - ), -] - - -@pytest.mark.parametrize( - list(ImportScanFixture._fields), - IMPORT_SCAN_FIXTURES, - ids=[test.test_id for test in IMPORT_SCAN_FIXTURES], -) -def test_import_scan_cli( - tmp_path: pathlib.Path, - capsys: pytest.CaptureFixture[str], - caplog: pytest.LogCaptureFixture, - monkeypatch: pytest.MonkeyPatch, - create_git_remote_repo: CreateRepoPytestFixtureFn, - git_commit_envvars: dict[str, str], - test_id: str, - repo_setup: list[tuple[str, str, bool]], - cli_args: list[str], - initial_config: dict[str, t.Any] | None, - expected_config_contains: dict[str, t.Any] | None, - expected_in_output: ExpectedOutput, - expected_not_in_output: ExpectedOutput, - expected_log_level: str, - should_create_config: bool, - user_input: str | None, -) -> None: - """Test vcspull import --scan command through CLI.""" - # Set up scan directory - scan_dir = tmp_path / "scan_dir" - scan_dir.mkdir() - - # Set up repositories based on fixture - repo_urls = {} - for repo_name, subdir, has_remote in repo_setup: - repo_parent = scan_dir / subdir if subdir else scan_dir - repo_parent.mkdir(exist_ok=True, parents=True) - repo_path = repo_parent / repo_name - - if has_remote: - # Create remote and clone - remote_path = create_git_remote_repo() - remote_url = f"file://{remote_path}" - clone_repo(remote_url, repo_path, git_commit_envvars) - repo_urls[repo_name] = remote_url - else: - # Create local repo without remote - setup_git_repo(repo_path, None, git_commit_envvars) - - # Set up config file - config_file = tmp_path / ".vcspull.yaml" - - # Handle dynamic initial config for existing repo tests - if initial_config and "dynamic" in initial_config: - if test_id == "all-existing-scan": - # All repos should be in config - initial_config = { - str(scan_dir) + "/": { - name: {"repo": repo_urls[name]} - for name, _, has_remote in repo_setup - if has_remote - }, - } - elif test_id == "mixed-existing-new-scan": - # Only "existing" repo should be in config - initial_config = { - str(scan_dir) + "/": {"existing": {"repo": repo_urls["existing"]}}, - } - - if initial_config: - yaml_content = yaml.dump(initial_config, default_flow_style=False) - config_file.write_text(yaml_content, encoding="utf-8") - - # Update CLI args: replace "." with scan_dir and add config - updated_cli_args = [] - for i, arg in enumerate(cli_args): - if arg == "." 
and i > 0 and cli_args[i - 1] == "--scan": - updated_cli_args.append(str(scan_dir)) - else: - updated_cli_args.append(arg) - - # Insert config argument after "import" - import_idx = updated_cli_args.index("import") - updated_cli_args = [ - *updated_cli_args[: import_idx + 1], - "-c", - str(config_file), - *updated_cli_args[import_idx + 1 :], - ] - - # Change to tmp directory - monkeypatch.chdir(tmp_path) - - # Mock user input if needed - if user_input: - monkeypatch.setattr("builtins.input", lambda _: user_input.strip()) - - # Run CLI command - with contextlib.suppress(SystemExit): - cli(updated_cli_args) - - # Capture output - captured = capsys.readouterr() - output = "".join([*caplog.messages, captured.out, captured.err]) - - # Strip ANSI codes for comparison - import re - - clean_output = re.sub(r"\x1b\[[0-9;]*m", "", output) - - # Check expected output - if expected_in_output is not None: - if isinstance(expected_in_output, str): - expected_in_output = [expected_in_output] - for needle in expected_in_output: - assert needle in clean_output, ( - f"Expected '{needle}' in output, got: {clean_output}" - ) - - if expected_not_in_output is not None: - if isinstance(expected_not_in_output, str): - expected_not_in_output = [expected_not_in_output] - for needle in expected_not_in_output: - assert needle not in clean_output, f"Unexpected '{needle}' in output" - - # Verify config file - if should_create_config or (initial_config and expected_config_contains): - assert config_file.exists(), "Config file should exist" - - # Load and verify config - with config_file.open() as f: - config_data = yaml.safe_load(f) - - # Check expected config contents - if expected_config_contains: - if "has_repos" in expected_config_contains: - # Just check that repos were added - assert config_data, "Config should have content" - assert any(isinstance(v, dict) for v in config_data.values()), ( - "Should have repo entries" - ) - else: - for key, value in expected_config_contains.items(): - assert key in config_data, f"Expected key '{key}' in config" - if isinstance(value, dict): - for subkey, subvalue in value.items(): - assert subkey in config_data[key], ( - f"Expected '{subkey}' in config['{key}']" - ) - # If subvalue is empty dict, just check that the key exists - if subvalue == {}: - assert isinstance(config_data[key][subkey], dict) - elif subvalue != t.Any: - assert config_data[key][subkey] == subvalue - - -# ============================================================================= -# Unit tests -# ============================================================================= - - -class TestImportRepoUnit: - """Unit tests for import_repo function.""" - - def test_import_repo_direct_call( - self, - tmp_path: pathlib.Path, - caplog: pytest.LogCaptureFixture, - ) -> None: - """Test direct import_repo function call.""" - config_file = tmp_path / ".vcspull.yaml" - - # Call import_repo directly - import_repo( - name="direct-test", - url="git@github.com:user/direct.git", - config_file_path_str=str(config_file), - path=None, - workspace_root_path=None, - ) - - # Verify - assert config_file.exists() - with config_file.open() as f: - config_data = yaml.safe_load(f) - - assert "./" in config_data - assert "direct-test" in config_data["./"] - assert config_data["./"]["direct-test"] == { - "repo": "git@github.com:user/direct.git", - } - - def test_import_repo_invalid_config( - self, - tmp_path: pathlib.Path, - caplog: pytest.LogCaptureFixture, - monkeypatch: pytest.MonkeyPatch, - ) -> None: - """Test handling of invalid 
config file.""" - config_file = tmp_path / ".vcspull.yaml" - - # Write invalid YAML - config_file.write_text("invalid: yaml: content:", encoding="utf-8") - - # Change to tmp directory - monkeypatch.chdir(tmp_path) - - # Try to import repo and capture log output - with caplog.at_level(logging.ERROR): - import_repo( - name="test", - url="git@github.com:user/test.git", - config_file_path_str=str(config_file), - path=None, - workspace_root_path=None, - ) - - assert "Error loading YAML" in caplog.text - - -class TestGetGitOriginUrl: - """Unit tests for get_git_origin_url function.""" - - def test_get_origin_url_success( - self, - create_git_remote_repo: CreateRepoPytestFixtureFn, - tmp_path: pathlib.Path, - git_commit_envvars: dict[str, str], - ) -> None: - """Test successfully getting origin URL.""" - # Create and clone a repo - remote_path = create_git_remote_repo() - remote_url = f"file://{remote_path}" - local_path = tmp_path / "test_repo" - - clone_repo(remote_url, local_path, git_commit_envvars) - - # Test getting URL - url = get_git_origin_url(local_path) - assert url == remote_url - - def test_get_origin_url_no_remote( - self, - tmp_path: pathlib.Path, - git_commit_envvars: dict[str, str], - ) -> None: - """Test handling repo without origin.""" - repo_path = tmp_path / "local_only" - setup_git_repo(repo_path, None, git_commit_envvars) - - url = get_git_origin_url(repo_path) - assert url is None - - def test_get_origin_url_not_git( - self, - tmp_path: pathlib.Path, - ) -> None: - """Test handling non-git directory.""" - regular_dir = tmp_path / "not_git" - regular_dir.mkdir() - - url = get_git_origin_url(regular_dir) - assert url is None - - -class TestImportFromFilesystemUnit: - """Unit tests for import_from_filesystem function.""" - - def test_import_scan_direct_call( - self, - create_git_remote_repo: CreateRepoPytestFixtureFn, - tmp_path: pathlib.Path, - git_commit_envvars: dict[str, str], - capsys: pytest.CaptureFixture[str], - ) -> None: - """Test direct import_from_filesystem call.""" - # Set up a repo - scan_dir = tmp_path / "repos" - scan_dir.mkdir() - - remote_path = create_git_remote_repo() - remote_url = f"file://{remote_path}" - repo_path = scan_dir / "test_repo" - clone_repo(remote_url, repo_path, git_commit_envvars) - - config_file = tmp_path / ".vcspull.yaml" - - # Call function directly - import_from_filesystem( - scan_dir_str=str(scan_dir), - config_file_path_str=str(config_file), - recursive=False, - workspace_root_override=None, - yes=True, - ) - - # Verify config created - assert config_file.exists() - with config_file.open() as f: - config_data = yaml.safe_load(f) - - expected_key = str(scan_dir) + "/" - assert expected_key in config_data - assert "test_repo" in config_data[expected_key] - - def test_many_existing_repos_summary( - self, - create_git_remote_repo: CreateRepoPytestFixtureFn, - tmp_path: pathlib.Path, - git_commit_envvars: dict[str, str], - capsys: pytest.CaptureFixture[str], - caplog: pytest.LogCaptureFixture, - monkeypatch: pytest.MonkeyPatch, - ) -> None: - """Test summary output when many repos already exist.""" - scan_dir = tmp_path / "many_repos" - scan_dir.mkdir() - - # Create many repos (>5 for summary mode) - repo_data = {} - for i in range(8): - remote_path = create_git_remote_repo() - remote_url = f"file://{remote_path}" - repo_name = f"repo{i}" - repo_path = scan_dir / repo_name - clone_repo(remote_url, repo_path, git_commit_envvars) - repo_data[repo_name] = {"repo": remote_url} - - # Pre-create config with all repos - config_file = tmp_path 
/ ".vcspull.yaml" - initial_config = {str(scan_dir) + "/": repo_data} - yaml_content = yaml.dump(initial_config, default_flow_style=False) - config_file.write_text(yaml_content, encoding="utf-8") - - # Change to tmp directory - monkeypatch.chdir(tmp_path) - - # Run scan through CLI - with contextlib.suppress(SystemExit): - cli(["import", "--scan", str(scan_dir), "-c", str(config_file), "-y"]) - - # Check for summary message (not detailed list) - captured = capsys.readouterr() - output = "\n".join(caplog.messages) + captured.out + captured.err - - # Strip ANSI codes - import re - - clean_output = re.sub(r"\x1b\[[0-9;]*m", "", output) - - assert "Found 8 existing repositories already in configuration" in clean_output - assert "All found repositories already exist" in clean_output - - @pytest.mark.parametrize( - list(ScanExistingFixture._fields), - [ - ( - fixture.test_id, - fixture.workspace_root_key_style, - fixture.scan_arg_style, - ) - for fixture in SCAN_EXISTING_FIXTURES - ], - ids=[fixture.test_id for fixture in SCAN_EXISTING_FIXTURES], - ) - def test_scan_respects_existing_config_sections( - self, - tmp_path: pathlib.Path, - monkeypatch: pytest.MonkeyPatch, - test_id: str, - workspace_root_key_style: str, - scan_arg_style: str, - ) -> None: - """Ensure filesystem scan does not duplicate repositories in config.""" - home_dir = tmp_path / "home" - scan_dir = home_dir / "study" / "c" - repo_dir = scan_dir / "cpython" - repo_git_dir = repo_dir / ".git" - repo_git_dir.mkdir(parents=True, exist_ok=True) - - expected_repo_url = "git+https://github.com/python/cpython.git" - config_file = home_dir / ".vcspull.yaml" - - if workspace_root_key_style == "tilde_no_slash": - workspace_root_key = "~/study/c" - elif workspace_root_key_style == "tilde_with_slash": - workspace_root_key = "~/study/c/" - elif workspace_root_key_style == "absolute_no_slash": - workspace_root_key = str(scan_dir) - elif workspace_root_key_style == "absolute_with_slash": - workspace_root_key = str(scan_dir) + "/" - else: - error_msg = ( - f"Unhandled workspace_root_key_style: {workspace_root_key_style}" - ) - raise AssertionError(error_msg) - - config_file.write_text( - (f"{workspace_root_key}:\n cpython:\n repo: {expected_repo_url}\n"), - encoding="utf-8", - ) - - monkeypatch.setenv("HOME", str(home_dir)) - monkeypatch.delenv("XDG_CONFIG_HOME", raising=False) - monkeypatch.setattr( - "vcspull.cli._import.get_git_origin_url", - lambda _path: expected_repo_url, - ) - - if scan_arg_style == "tilde": - scan_arg = "~/study/c" - elif scan_arg_style == "absolute": - scan_arg = str(scan_dir) - else: - error_msg = f"Unhandled scan_arg_style: {scan_arg_style}" - raise AssertionError(error_msg) - - import_from_filesystem( - scan_dir_str=scan_arg, - config_file_path_str=None, - recursive=False, - workspace_root_override=None, - yes=True, - ) - - with config_file.open(encoding="utf-8") as f: - config_data = yaml.safe_load(f) - - expected_path = canonicalize_workspace_path( - workspace_root_key, - cwd=home_dir, - ) - expected_label = workspace_root_label( - expected_path, - cwd=home_dir, - home=home_dir, - ) - assert expected_label in config_data - assert "cpython" in config_data[expected_label] - assert config_data[expected_label]["cpython"]["repo"] == expected_repo_url - assert len(config_data) == 1 - assert "~/study/c" not in config_data - - -# ============================================================================= -# Help and output tests -# ============================================================================= - - -def 
test_import_command_help( - capsys: pytest.CaptureFixture[str], -) -> None: - """Test import command help output.""" - with contextlib.suppress(SystemExit): - cli(["import", "--help"]) - - captured = capsys.readouterr() - output = captured.out + captured.err - - # Check help content - assert "Import a repository to the vcspull configuration file" in output - assert "name" in output - assert "url" in output - assert "--path" in output - assert "--workspace-root" in output - assert "--scan" in output - assert "--config" in output - - -def test_import_scan_stream_output( - tmp_path: pathlib.Path, - capsys: pytest.CaptureFixture[str], - monkeypatch: pytest.MonkeyPatch, - create_git_remote_repo: CreateRepoPytestFixtureFn, - git_commit_envvars: dict[str, str], -) -> None: - """Ensure the CLI prints repo summaries to the user-facing stream.""" - scan_dir = tmp_path / "scan" - scan_dir.mkdir() - - repo_name = "sample" - remote_path = create_git_remote_repo() - remote_url = f"file://{remote_path}" - clone_repo(remote_url, scan_dir / repo_name, git_commit_envvars) - - config_file = tmp_path / ".vcspull.yaml" - - monkeypatch.chdir(tmp_path) - - with contextlib.suppress(SystemExit): - cli( - [ - "import", - "--scan", - str(scan_dir), - "--yes", - "-c", - str(config_file), - ], - ) - - captured = capsys.readouterr() - - import re - - clean_output = re.sub(r"\x1b\[[0-9;]*m", "", captured.out + captured.err) - - assert "Found 1 new repository to import:" in clean_output - assert repo_name in clean_output diff --git a/tests/cli/test_list.py b/tests/cli/test_list.py new file mode 100644 index 00000000..87a1cdc3 --- /dev/null +++ b/tests/cli/test_list.py @@ -0,0 +1,267 @@ +"""Tests for vcspull list command.""" + +from __future__ import annotations + +import json +import pathlib +import typing as t + +import pytest +import yaml + +from vcspull.cli.list import list_repos + +if t.TYPE_CHECKING: + from _pytest.monkeypatch import MonkeyPatch + + +def create_test_config(config_path: pathlib.Path, repos: dict[str, t.Any]) -> None: + """Create a test config file.""" + with config_path.open("w") as f: + yaml.dump(repos, f) + + +class ListReposFixture(t.NamedTuple): + """Fixture for list repos test cases.""" + + test_id: str + config_data: dict[str, t.Any] + patterns: list[str] + tree: bool + output_json: bool + output_ndjson: bool + workspace_filter: str | None + expected_repo_names: list[str] + + +LIST_REPOS_FIXTURES: list[ListReposFixture] = [ + ListReposFixture( + test_id="list-all-repos", + config_data={ + "~/code/": { + "flask": {"repo": "git+https://github.com/pallets/flask.git"}, + "django": {"repo": "git+https://github.com/django/django.git"}, + }, + }, + patterns=[], + tree=False, + output_json=False, + output_ndjson=False, + workspace_filter=None, + expected_repo_names=["flask", "django"], + ), + ListReposFixture( + test_id="list-with-pattern", + config_data={ + "~/code/": { + "flask": {"repo": "git+https://github.com/pallets/flask.git"}, + "django": {"repo": "git+https://github.com/django/django.git"}, + "requests": {"repo": "git+https://github.com/psf/requests.git"}, + }, + }, + patterns=["fla*"], + tree=False, + output_json=False, + output_ndjson=False, + workspace_filter=None, + expected_repo_names=["flask"], + ), + ListReposFixture( + test_id="list-json-output", + config_data={ + "~/code/": { + "flask": {"repo": "git+https://github.com/pallets/flask.git"}, + }, + }, + patterns=[], + tree=False, + output_json=True, + output_ndjson=False, + workspace_filter=None, + expected_repo_names=["flask"], + ), + 
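+    # NDJSON mirrors the JSON payload but emits one object per line.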
ListReposFixture( + test_id="list-ndjson-output", + config_data={ + "~/code/": { + "flask": {"repo": "git+https://github.com/pallets/flask.git"}, + }, + }, + patterns=[], + tree=False, + output_json=False, + output_ndjson=True, + workspace_filter=None, + expected_repo_names=["flask"], + ), + ListReposFixture( + test_id="list-workspace-filter", + config_data={ + "~/code/": { + "flask": {"repo": "git+https://github.com/pallets/flask.git"}, + }, + "~/work/": { + "internal": {"repo": "git+https://github.com/user/internal.git"}, + }, + }, + patterns=[], + tree=False, + output_json=False, + output_ndjson=False, + workspace_filter="~/code/", + expected_repo_names=["flask"], + ), +] + + +@pytest.mark.parametrize( + list(ListReposFixture._fields), + LIST_REPOS_FIXTURES, + ids=[fixture.test_id for fixture in LIST_REPOS_FIXTURES], +) +def test_list_repos( + test_id: str, + config_data: dict[str, t.Any], + patterns: list[str], + tree: bool, + output_json: bool, + output_ndjson: bool, + workspace_filter: str | None, + expected_repo_names: list[str], + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + capsys: t.Any, +) -> None: + """Test listing repositories.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + config_file = tmp_path / ".vcspull.yaml" + create_test_config(config_file, config_data) + + # Run list_repos + list_repos( + repo_patterns=patterns, + config_path=config_file, + workspace_root=workspace_filter, + tree=tree, + output_json=output_json, + output_ndjson=output_ndjson, + color="never", + ) + + captured = capsys.readouterr() + + if output_json: + # Parse JSON output + output_data = json.loads(captured.out) + assert isinstance(output_data, list) + repo_names_in_output = [item["name"] for item in output_data] + for expected_name in expected_repo_names: + assert expected_name in repo_names_in_output + elif output_ndjson: + # Parse NDJSON output + lines = [line for line in captured.out.strip().split("\n") if line] + repo_names_in_output = [json.loads(line)["name"] for line in lines] + for expected_name in expected_repo_names: + assert expected_name in repo_names_in_output + else: + # Human-readable output + for expected_name in expected_repo_names: + assert expected_name in captured.out + + +def test_list_repos_tree_mode( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + capsys: t.Any, +) -> None: + """Test listing repositories in tree mode.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + config_file = tmp_path / ".vcspull.yaml" + config_data = { + "~/code/": { + "flask": {"repo": "git+https://github.com/pallets/flask.git"}, + }, + "~/work/": { + "myproject": {"repo": "git+https://github.com/user/myproject.git"}, + }, + } + create_test_config(config_file, config_data) + + list_repos( + repo_patterns=[], + config_path=config_file, + workspace_root=None, + tree=True, + output_json=False, + output_ndjson=False, + color="never", + ) + + captured = capsys.readouterr() + + # Should show repos (workspace roots may be normalized to paths) + assert "flask" in captured.out + assert "myproject" in captured.out + # Tree mode should group repos + assert "•" in captured.out + + +def test_list_repos_empty_config( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + capsys: t.Any, +) -> None: + """Test listing with empty config shows appropriate message.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + config_file = tmp_path / ".vcspull.yaml" + create_test_config(config_file, {}) + + list_repos( + 
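+        # An empty config should fall through to the "No repositories found"
+        # notice asserted below.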
repo_patterns=[], + config_path=config_file, + workspace_root=None, + tree=False, + output_json=False, + output_ndjson=False, + color="never", + ) + + captured = capsys.readouterr() + assert "No repositories found" in captured.out + + +def test_list_repos_pattern_no_match( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + capsys: t.Any, +) -> None: + """Test listing with pattern that matches nothing.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + config_file = tmp_path / ".vcspull.yaml" + config_data = { + "~/code/": { + "flask": {"repo": "git+https://github.com/pallets/flask.git"}, + }, + } + create_test_config(config_file, config_data) + + list_repos( + repo_patterns=["nonexistent*"], + config_path=config_file, + workspace_root=None, + tree=False, + output_json=False, + output_ndjson=False, + color="never", + ) + + captured = capsys.readouterr() + assert "No repositories found" in captured.out diff --git a/tests/cli/test_plan_output_helpers.py b/tests/cli/test_plan_output_helpers.py new file mode 100644 index 00000000..96ac228e --- /dev/null +++ b/tests/cli/test_plan_output_helpers.py @@ -0,0 +1,231 @@ +"""Unit tests for sync plan output helpers.""" + +from __future__ import annotations + +import io +import json +import typing as t +from contextlib import redirect_stdout + +import pytest + +from vcspull.cli._colors import ColorMode, Colors +from vcspull.cli._output import ( + OutputFormatter, + OutputMode, + PlanAction, + PlanEntry, + PlanResult, + PlanSummary, +) +from vcspull.cli.sync import PlanProgressPrinter + + +class PlanEntryPayloadFixture(t.NamedTuple): + """Fixture for PlanEntry payload serialization.""" + + test_id: str + kwargs: dict[str, t.Any] + expected_keys: dict[str, t.Any] + unexpected_keys: set[str] + + +PLAN_ENTRY_PAYLOAD_FIXTURES: list[PlanEntryPayloadFixture] = [ + PlanEntryPayloadFixture( + test_id="clone-with-url", + kwargs={ + "name": "repo-one", + "path": "/tmp/repo-one", + "workspace_root": "~/code/", + "action": PlanAction.CLONE, + "detail": "missing", + "url": "git+https://example.com/repo-one.git", + }, + expected_keys={ + "type": "operation", + "action": "clone", + "detail": "missing", + "url": "git+https://example.com/repo-one.git", + }, + unexpected_keys={"branch", "ahead", "behind", "dirty", "error"}, + ), + PlanEntryPayloadFixture( + test_id="update-with-status", + kwargs={ + "name": "repo-two", + "path": "/tmp/repo-two", + "workspace_root": "~/code/", + "action": PlanAction.UPDATE, + "detail": "behind 2", + "branch": "main", + "remote_branch": "origin/main", + "current_rev": "abc1234", + "target_rev": "def5678", + "ahead": 0, + "behind": 2, + "dirty": False, + }, + expected_keys={ + "branch": "main", + "remote_branch": "origin/main", + "current_rev": "abc1234", + "target_rev": "def5678", + "ahead": 0, + "behind": 2, + "dirty": False, + }, + unexpected_keys={"url", "error"}, + ), +] + + +@pytest.mark.parametrize( + list(PlanEntryPayloadFixture._fields), + PLAN_ENTRY_PAYLOAD_FIXTURES, + ids=[fixture.test_id for fixture in PLAN_ENTRY_PAYLOAD_FIXTURES], +) +def test_plan_entry_to_payload( + test_id: str, + kwargs: dict[str, t.Any], + expected_keys: dict[str, t.Any], + unexpected_keys: set[str], +) -> None: + """Ensure PlanEntry serialises optional fields correctly.""" + entry = PlanEntry(**kwargs) + payload = entry.to_payload() + + for key, value in expected_keys.items(): + assert payload[key] == value + + for key in unexpected_keys: + assert key not in payload + + assert payload["format_version"] == "1" + assert 
payload["type"] == "operation" + assert payload["name"] == kwargs["name"] + assert payload["path"] == kwargs["path"] + assert payload["workspace_root"] == kwargs["workspace_root"] + + +class PlanSummaryPayloadFixture(t.NamedTuple): + """Fixture for PlanSummary payload serialization.""" + + test_id: str + summary: PlanSummary + expected_total: int + + +PLAN_SUMMARY_PAYLOAD_FIXTURES: list[PlanSummaryPayloadFixture] = [ + PlanSummaryPayloadFixture( + test_id="basic-counts", + summary=PlanSummary(clone=1, update=2, unchanged=3, blocked=4, errors=5), + expected_total=15, + ), + PlanSummaryPayloadFixture( + test_id="with-duration", + summary=PlanSummary( + clone=0, update=1, unchanged=0, blocked=0, errors=0, duration_ms=120 + ), + expected_total=1, + ), +] + + +@pytest.mark.parametrize( + list(PlanSummaryPayloadFixture._fields), + PLAN_SUMMARY_PAYLOAD_FIXTURES, + ids=[fixture.test_id for fixture in PLAN_SUMMARY_PAYLOAD_FIXTURES], +) +def test_plan_summary_to_payload( + test_id: str, + summary: PlanSummary, + expected_total: int, +) -> None: + """Validate PlanSummary total and serialization behaviour.""" + payload = summary.to_payload() + assert payload["total"] == expected_total + assert payload["clone"] == summary.clone + assert payload["update"] == summary.update + assert payload["unchanged"] == summary.unchanged + assert payload["blocked"] == summary.blocked + assert payload["errors"] == summary.errors + if summary.duration_ms is not None: + assert payload["duration_ms"] == summary.duration_ms + else: + assert "duration_ms" not in payload + + +def test_plan_result_grouping_and_json_output() -> None: + """PlanResult should group entries and produce stable JSON.""" + entries = [ + PlanEntry( + name="repo-a", + path="/tmp/workspace-a/repo-a", + workspace_root="~/workspace-a/", + action=PlanAction.CLONE, + ), + PlanEntry( + name="repo-b", + path="/tmp/workspace-b/repo-b", + workspace_root="~/workspace-b/", + action=PlanAction.UPDATE, + ), + PlanEntry( + name="repo-c", + path="/tmp/workspace-a/repo-c", + workspace_root="~/workspace-a/", + action=PlanAction.UNCHANGED, + ), + ] + summary = PlanSummary(clone=1, update=1, unchanged=1) + result = PlanResult(entries=entries, summary=summary) + + mapping = result.to_workspace_mapping() + assert set(mapping.keys()) == {"~/workspace-a/", "~/workspace-b/"} + assert {entry.name for entry in mapping["~/workspace-a/"]} == {"repo-a", "repo-c"} + assert {entry.name for entry in mapping["~/workspace-b/"]} == {"repo-b"} + + json_object = result.to_json_object() + assert json_object["summary"]["total"] == 3 + workspaces = { + workspace["path"]: workspace for workspace in json_object["workspaces"] + } + assert set(workspaces) == {"~/workspace-a/", "~/workspace-b/"} + assert len(workspaces["~/workspace-a/"]["operations"]) == 2 + assert workspaces["~/workspace-b/"]["operations"][0]["name"] == "repo-b" + + +def test_output_formatter_json_mode_finalises_buffer() -> None: + """OutputFormatter should flush buffered JSON payloads on finalize.""" + entry = PlanEntry( + name="repo-buffer", + path="/tmp/repo-buffer", + workspace_root="~/code/", + action=PlanAction.CLONE, + ) + formatter = OutputFormatter(mode=OutputMode.JSON) + captured = io.StringIO() + with redirect_stdout(captured): + formatter.emit(entry) + formatter.emit(PlanSummary(clone=1)) + formatter.finalize() + + output = json.loads(captured.getvalue()) + assert len(output) == 2 + assert output[0]["name"] == "repo-buffer" + assert output[1]["type"] == "summary" + + +def 
test_plan_progress_printer_updates_and_finishes() -> None: + """Progress printer should render a single line and terminate cleanly.""" + colors = Colors(mode=ColorMode.NEVER) + printer = PlanProgressPrinter(total=3, colors=colors, enabled=True) + buffer = io.StringIO() + printer._stream = buffer + + summary = PlanSummary(clone=1) + printer.update(summary, processed=1) + assert "Progress: 1/3" in buffer.getvalue() + + printer.finish() + assert buffer.getvalue().endswith("\n") diff --git a/tests/cli/test_status.py b/tests/cli/test_status.py new file mode 100644 index 00000000..2712f123 --- /dev/null +++ b/tests/cli/test_status.py @@ -0,0 +1,549 @@ +"""Tests for vcspull status command.""" + +from __future__ import annotations + +import json +import pathlib +import subprocess +import typing as t + +import pytest +import yaml + +from vcspull.cli.status import check_repo_status, status_repos + +if t.TYPE_CHECKING: + from _pytest.monkeypatch import MonkeyPatch + + +def create_test_config(config_path: pathlib.Path, repos: dict[str, t.Any]) -> None: + """Create a test config file.""" + with config_path.open("w") as f: + yaml.dump(repos, f) + + +def init_git_repo(repo_path: pathlib.Path) -> None: + """Initialize a git repository.""" + repo_path.mkdir(parents=True, exist_ok=True) + subprocess.run(["git", "init"], cwd=repo_path, check=True, capture_output=True) + + +def git(repo_path: pathlib.Path, *args: str) -> subprocess.CompletedProcess[bytes]: + """Run a git command in the provided repository.""" + return subprocess.run( + ["git", *args], + cwd=repo_path, + check=True, + capture_output=True, + ) + + +def configure_git_identity(repo_path: pathlib.Path) -> None: + """Configure Git author information for disposable repositories.""" + git(repo_path, "config", "user.email", "ci@example.com") + git(repo_path, "config", "user.name", "vcspull-tests") + + +def commit_file( + repo_path: pathlib.Path, + filename: str, + content: str, + message: str, +) -> None: + """Create a file, add it, and commit.""" + file_path = repo_path / filename + file_path.parent.mkdir(parents=True, exist_ok=True) + file_path.write_text(content) + git(repo_path, "add", filename) + git(repo_path, "commit", "-m", message) + + +def setup_repo_with_remote( + base_path: pathlib.Path, +) -> tuple[pathlib.Path, pathlib.Path]: + """Create a repository with a bare remote and an initial commit.""" + remote_path = base_path / "remote.git" + subprocess.run( + ["git", "init", "--bare", str(remote_path)], + check=True, + capture_output=True, + ) + + repo_path = base_path / "workspace" / "project" + repo_path.mkdir(parents=True, exist_ok=True) + git(repo_path, "init") + configure_git_identity(repo_path) + commit_file(repo_path, "README.md", "initial", "feat: initial commit") + git(repo_path, "branch", "-M", "main") + git(repo_path, "remote", "add", "origin", str(remote_path)) + git(repo_path, "push", "-u", "origin", "main") + + return repo_path, remote_path + + +class CheckRepoStatusFixture(t.NamedTuple): + """Fixture for check_repo_status test cases.""" + + test_id: str + create_repo: bool + create_git: bool + expected_exists: bool + expected_is_git: bool + + +CHECK_REPO_STATUS_FIXTURES: list[CheckRepoStatusFixture] = [ + CheckRepoStatusFixture( + test_id="repo-exists-with-git", + create_repo=True, + create_git=True, + expected_exists=True, + expected_is_git=True, + ), + CheckRepoStatusFixture( + test_id="repo-exists-no-git", + create_repo=True, + create_git=False, + expected_exists=True, + expected_is_git=False, + ), + CheckRepoStatusFixture( 
+ test_id="repo-missing", + create_repo=False, + create_git=False, + expected_exists=False, + expected_is_git=False, + ), +] + + +class StatusRunFixture(t.NamedTuple): + """Fixture for end-to-end status command runs.""" + + test_id: str + workspace_filter: str | None + output_ndjson: bool + expected_names: list[str] + + +STATUS_RUN_FIXTURES: list[StatusRunFixture] = [ + StatusRunFixture( + test_id="workspace-filter", + workspace_filter="~/code/", + output_ndjson=False, + expected_names=["repo1"], + ), + StatusRunFixture( + test_id="ndjson-output", + workspace_filter=None, + output_ndjson=True, + expected_names=["repo1"], + ), +] + + +class StatusDetailedFixture(t.NamedTuple): + """Fixture for detailed status scenarios.""" + + test_id: str + make_dirty: bool + local_ahead: bool + local_behind: bool + expected_clean: bool + expected_ahead: int + expected_behind: int + + +STATUS_DETAILED_FIXTURES: list[StatusDetailedFixture] = [ + StatusDetailedFixture( + test_id="clean-in-sync", + make_dirty=False, + local_ahead=False, + local_behind=False, + expected_clean=True, + expected_ahead=0, + expected_behind=0, + ), + StatusDetailedFixture( + test_id="dirty-working-tree", + make_dirty=True, + local_ahead=False, + local_behind=False, + expected_clean=False, + expected_ahead=0, + expected_behind=0, + ), + StatusDetailedFixture( + test_id="ahead-of-remote", + make_dirty=False, + local_ahead=True, + local_behind=False, + expected_clean=True, + expected_ahead=1, + expected_behind=0, + ), + StatusDetailedFixture( + test_id="behind-remote", + make_dirty=False, + local_ahead=False, + local_behind=True, + expected_clean=True, + expected_ahead=0, + expected_behind=1, + ), +] + + +@pytest.mark.parametrize( + list(CheckRepoStatusFixture._fields), + CHECK_REPO_STATUS_FIXTURES, + ids=[fixture.test_id for fixture in CHECK_REPO_STATUS_FIXTURES], +) +def test_check_repo_status( + test_id: str, + create_repo: bool, + create_git: bool, + expected_exists: bool, + expected_is_git: bool, + tmp_path: pathlib.Path, +) -> None: + """Test checking individual repository status.""" + repo_path = tmp_path / "test-repo" + + if create_repo: + if create_git: + init_git_repo(repo_path) + else: + repo_path.mkdir(parents=True) + + repo_dict: t.Any = {"name": "test-repo", "path": str(repo_path)} + + status = check_repo_status(repo_dict, detailed=False) + + assert status["exists"] == expected_exists + assert status["is_git"] == expected_is_git + assert status["name"] == "test-repo" + + +def test_status_repos_all( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + capsys: t.Any, +) -> None: + """Test checking status of all repositories.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + # Create config with repos + config_file = tmp_path / ".vcspull.yaml" + repo1_path = tmp_path / "code" / "repo1" + + config_data = { + str(tmp_path / "code") + "/": { + "repo1": {"repo": "git+https://github.com/user/repo1.git"}, + "repo2": {"repo": "git+https://github.com/user/repo2.git"}, + }, + } + create_test_config(config_file, config_data) + + # Create one repo, leave other missing + init_git_repo(repo1_path) + + # Run status + status_repos( + repo_patterns=[], + config_path=config_file, + workspace_root=None, + detailed=False, + output_json=False, + output_ndjson=False, + color="never", + ) + + captured = capsys.readouterr() + + # Should mention repo1 exists + assert "repo1" in captured.out + # Should mention repo2 is missing + assert "repo2" in captured.out + assert "missing" in captured.out.lower() + # Should 
have summary + assert "Summary" in captured.out + + +def test_status_repos_json_output( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + capsys: t.Any, +) -> None: + """Test status output in JSON format.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + config_file = tmp_path / ".vcspull.yaml" + repo_path = tmp_path / "code" / "myrepo" + + config_data = { + str(tmp_path / "code") + "/": { + "myrepo": {"repo": "git+https://github.com/user/myrepo.git"}, + }, + } + create_test_config(config_file, config_data) + + # Create the repo + init_git_repo(repo_path) + + # Run status with JSON output + status_repos( + repo_patterns=[], + config_path=config_file, + workspace_root=None, + detailed=False, + output_json=True, + output_ndjson=False, + color="never", + ) + + captured = capsys.readouterr() + + # Parse JSON output + output_data = json.loads(captured.out) + assert isinstance(output_data, list) + + # Find status and summary entries + status_entries = [item for item in output_data if item.get("reason") == "status"] + summary_entries = [item for item in output_data if item.get("reason") == "summary"] + + assert len(status_entries) > 0 + assert len(summary_entries) == 1 + + # Check status entry + repo_status = status_entries[0] + assert repo_status["name"] == "myrepo" + assert repo_status["exists"] is True + assert repo_status["is_git"] is True + + +def test_status_repos_detailed( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + capsys: t.Any, +) -> None: + """Test detailed status output.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + config_file = tmp_path / ".vcspull.yaml" + repo_path, remote_path = setup_repo_with_remote(tmp_path) + + config_data = { + str(repo_path.parent) + "/": { + "project": {"repo": f"git+file://{remote_path}"}, + }, + } + create_test_config(config_file, config_data) + + # Run status with detailed mode + status_repos( + repo_patterns=[], + config_path=config_file, + workspace_root=None, + detailed=True, + output_json=False, + output_ndjson=False, + color="never", + ) + + captured = capsys.readouterr() + + # Should show path and branch details in detailed mode + assert "Path:" in captured.out or str(repo_path) in captured.out + assert "Branch:" in captured.out + assert "Ahead/Behind:" in captured.out + + +def test_status_repos_pattern_filter( + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + capsys: t.Any, +) -> None: + """Test status with pattern filtering.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + config_file = tmp_path / ".vcspull.yaml" + + config_data = { + str(tmp_path / "code") + "/": { + "flask": {"repo": "git+https://github.com/pallets/flask.git"}, + "django": {"repo": "git+https://github.com/django/django.git"}, + }, + } + create_test_config(config_file, config_data) + + # Run status with pattern + status_repos( + repo_patterns=["fla*"], + config_path=config_file, + workspace_root=None, + detailed=False, + output_json=False, + output_ndjson=False, + color="never", + ) + + captured = capsys.readouterr() + + # Should only show flask + assert "flask" in captured.out + assert "django" not in captured.out + + +@pytest.mark.parametrize( + list(StatusRunFixture._fields), + STATUS_RUN_FIXTURES, + ids=[fixture.test_id for fixture in STATUS_RUN_FIXTURES], +) +def test_status_repos_workspace_filter_and_ndjson( + test_id: str, + workspace_filter: str | None, + output_ndjson: bool, + expected_names: list[str], + tmp_path: pathlib.Path, + monkeypatch: 
MonkeyPatch, + capsys: t.Any, +) -> None: + """Test status workspace filtering and NDJSON output.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + config_file = tmp_path / ".vcspull.yaml" + repo_path = tmp_path / "code" / "repo1" + other_repo_path = tmp_path / "work" / "repo2" + + config_data = { + str(tmp_path / "code") + "/": { + "repo1": {"repo": "git+https://github.com/user/repo1.git"}, + }, + str(tmp_path / "work") + "/": { + "repo2": {"repo": "git+https://github.com/user/repo2.git"}, + }, + } + create_test_config(config_file, config_data) + + init_git_repo(repo_path) + init_git_repo(other_repo_path) + + status_repos( + repo_patterns=[], + config_path=config_file, + workspace_root=workspace_filter, + detailed=False, + output_json=False, + output_ndjson=output_ndjson, + color="never", + ) + + captured = capsys.readouterr() + + if output_ndjson: + status_entries = [] + for line in captured.out.splitlines(): + line = line.strip() + if not line: + continue + payload = json.loads(line) + if payload.get("reason") == "status": + status_entries.append(payload) + names = [entry["name"] for entry in status_entries] + for expected in expected_names: + assert expected in names + else: + for expected in expected_names: + assert expected in captured.out + # Ensure other repo is not shown when filtered + if workspace_filter: + assert "repo2" not in captured.out + + +@pytest.mark.parametrize( + list(StatusDetailedFixture._fields), + STATUS_DETAILED_FIXTURES, + ids=[fixture.test_id for fixture in STATUS_DETAILED_FIXTURES], +) +def test_status_repos_detailed_metrics( + test_id: str, + make_dirty: bool, + local_ahead: bool, + local_behind: bool, + expected_clean: bool, + expected_ahead: int, + expected_behind: int, + tmp_path: pathlib.Path, + monkeypatch: MonkeyPatch, + capsys: t.Any, +) -> None: + """Detailed output includes branch and ahead/behind counters.""" + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + repo_path, remote_path = setup_repo_with_remote(tmp_path) + + if make_dirty: + dirty_file = repo_path / f"dirty-{test_id}.txt" + dirty_file.write_text("dirty worktree") + + if local_ahead: + commit_file( + repo_path, + f"ahead-{test_id}.txt", + "ahead", + f"feat: ahead commit for {test_id}", + ) + + if local_behind: + other_clone = tmp_path / "other" + subprocess.run( + ["git", "clone", str(remote_path), str(other_clone)], + check=True, + capture_output=True, + ) + git(other_clone, "checkout", "-B", "main", "origin/main") + configure_git_identity(other_clone) + commit_file( + other_clone, + f"remote-{test_id}.txt", + "remote", + f"feat: remote commit for {test_id}", + ) + git(other_clone, "push", "origin", "main") + git(repo_path, "fetch", "origin") + + config_file = tmp_path / ".vcspull.yaml" + config_data = { + str(repo_path.parent) + "/": { + "project": {"repo": f"git+file://{remote_path}"}, + }, + } + create_test_config(config_file, config_data) + + status_repos( + repo_patterns=[], + config_path=config_file, + workspace_root=None, + detailed=True, + output_json=True, + output_ndjson=False, + color="never", + ) + + captured = capsys.readouterr() + payload = json.loads(captured.out) + status_entries = [entry for entry in payload if entry.get("reason") == "status"] + assert len(status_entries) == 1 + + entry = status_entries[0] + assert entry["name"] == "project" + assert entry["branch"] == "main" + assert entry["clean"] == expected_clean + assert entry["ahead"] == expected_ahead + assert entry["behind"] == expected_behind diff --git 
a/tests/cli/test_sync_plan_helpers.py b/tests/cli/test_sync_plan_helpers.py new file mode 100644 index 00000000..51b12958 --- /dev/null +++ b/tests/cli/test_sync_plan_helpers.py @@ -0,0 +1,248 @@ +"""Tests for sync planner helper utilities.""" + +from __future__ import annotations + +import pathlib +import subprocess +import typing as t + +import pytest + +from vcspull.cli._output import PlanAction +from vcspull.cli.sync import SyncPlanConfig, _determine_plan_action, _maybe_fetch + + +class MaybeFetchFixture(t.NamedTuple): + """Fixture for _maybe_fetch behaviours.""" + + test_id: str + fetch: bool + offline: bool + create_repo: bool + create_git_dir: bool + subprocess_behavior: str | None + expected_result: tuple[bool, str | None] + + +MAYBE_FETCH_FIXTURES: list[MaybeFetchFixture] = [ + MaybeFetchFixture( + test_id="offline-short-circuit", + fetch=True, + offline=True, + create_repo=True, + create_git_dir=True, + subprocess_behavior=None, + expected_result=(True, None), + ), + MaybeFetchFixture( + test_id="no-git-directory", + fetch=True, + offline=False, + create_repo=True, + create_git_dir=False, + subprocess_behavior=None, + expected_result=(True, None), + ), + MaybeFetchFixture( + test_id="missing-git-executable", + fetch=True, + offline=False, + create_repo=True, + create_git_dir=True, + subprocess_behavior="file-not-found", + expected_result=(False, "git executable not found"), + ), + MaybeFetchFixture( + test_id="fetch-non-zero-exit", + fetch=True, + offline=False, + create_repo=True, + create_git_dir=True, + subprocess_behavior="non-zero", + expected_result=(False, "remote rejected"), + ), + MaybeFetchFixture( + test_id="fetch-oserror", + fetch=True, + offline=False, + create_repo=True, + create_git_dir=True, + subprocess_behavior="os-error", + expected_result=(False, "Permission denied"), + ), + MaybeFetchFixture( + test_id="fetch-disabled", + fetch=False, + offline=False, + create_repo=True, + create_git_dir=True, + subprocess_behavior="non-zero", + expected_result=(True, None), + ), +] + + +@pytest.mark.parametrize( + list(MaybeFetchFixture._fields), + MAYBE_FETCH_FIXTURES, + ids=[fixture.test_id for fixture in MAYBE_FETCH_FIXTURES], +) +def test_maybe_fetch_behaviour( + tmp_path: pathlib.Path, + monkeypatch: pytest.MonkeyPatch, + test_id: str, + fetch: bool, + offline: bool, + create_repo: bool, + create_git_dir: bool, + subprocess_behavior: str | None, + expected_result: tuple[bool, str | None], +) -> None: + """Ensure _maybe_fetch handles subprocess outcomes correctly.""" + repo_path = tmp_path / "repo" + if create_repo: + repo_path.mkdir() + if create_git_dir: + (repo_path / ".git").mkdir(parents=True, exist_ok=True) + + if subprocess_behavior: + + def _patched_run( + *args: t.Any, + **kwargs: t.Any, + ) -> subprocess.CompletedProcess[str]: + if subprocess_behavior == "file-not-found": + error_message = "git executable not found" + raise FileNotFoundError(error_message) + if subprocess_behavior == "os-error": + error_message = "Permission denied" + raise OSError(error_message) + if subprocess_behavior == "non-zero": + return subprocess.CompletedProcess( + args=args[0], + returncode=1, + stdout="", + stderr="remote rejected", + ) + return subprocess.CompletedProcess( + args=args[0], + returncode=0, + stdout="", + stderr="", + ) + + monkeypatch.setattr("subprocess.run", _patched_run) + + result = _maybe_fetch( + repo_path=repo_path, + config=SyncPlanConfig(fetch=fetch, offline=offline), + ) + + assert result == expected_result + + +class 
DeterminePlanActionFixture(t.NamedTuple): + """Fixture for _determine_plan_action outcomes.""" + + test_id: str + status: dict[str, t.Any] + config: SyncPlanConfig + expected_action: PlanAction + expected_detail: str + + +DETERMINE_PLAN_ACTION_FIXTURES: list[DeterminePlanActionFixture] = [ + DeterminePlanActionFixture( + test_id="missing-repo", + status={"exists": False}, + config=SyncPlanConfig(fetch=False, offline=False), + expected_action=PlanAction.CLONE, + expected_detail="missing", + ), + DeterminePlanActionFixture( + test_id="not-git", + status={"exists": True, "is_git": False}, + config=SyncPlanConfig(fetch=True, offline=False), + expected_action=PlanAction.BLOCKED, + expected_detail="not a git repository", + ), + DeterminePlanActionFixture( + test_id="dirty-working-tree", + status={"exists": True, "is_git": True, "clean": False}, + config=SyncPlanConfig(fetch=True, offline=False), + expected_action=PlanAction.BLOCKED, + expected_detail="working tree has local changes", + ), + DeterminePlanActionFixture( + test_id="diverged", + status={"exists": True, "is_git": True, "clean": True, "ahead": 2, "behind": 3}, + config=SyncPlanConfig(fetch=True, offline=False), + expected_action=PlanAction.BLOCKED, + expected_detail="diverged (ahead 2, behind 3)", + ), + DeterminePlanActionFixture( + test_id="behind-remote", + status={"exists": True, "is_git": True, "clean": True, "ahead": 0, "behind": 4}, + config=SyncPlanConfig(fetch=True, offline=False), + expected_action=PlanAction.UPDATE, + expected_detail="behind 4", + ), + DeterminePlanActionFixture( + test_id="ahead-remote", + status={"exists": True, "is_git": True, "clean": True, "ahead": 1, "behind": 0}, + config=SyncPlanConfig(fetch=True, offline=False), + expected_action=PlanAction.BLOCKED, + expected_detail="ahead by 1", + ), + DeterminePlanActionFixture( + test_id="up-to-date", + status={"exists": True, "is_git": True, "clean": True, "ahead": 0, "behind": 0}, + config=SyncPlanConfig(fetch=True, offline=False), + expected_action=PlanAction.UNCHANGED, + expected_detail="up to date", + ), + DeterminePlanActionFixture( + test_id="offline-remote-unknown", + status={ + "exists": True, + "is_git": True, + "clean": True, + "ahead": None, + "behind": None, + }, + config=SyncPlanConfig(fetch=True, offline=True), + expected_action=PlanAction.UPDATE, + expected_detail="remote state unknown (offline)", + ), + DeterminePlanActionFixture( + test_id="needs-fetch", + status={ + "exists": True, + "is_git": True, + "clean": True, + "ahead": None, + "behind": None, + }, + config=SyncPlanConfig(fetch=True, offline=False), + expected_action=PlanAction.UPDATE, + expected_detail="remote state unknown; use --fetch", + ), +] + + +@pytest.mark.parametrize( + list(DeterminePlanActionFixture._fields), + DETERMINE_PLAN_ACTION_FIXTURES, + ids=[fixture.test_id for fixture in DETERMINE_PLAN_ACTION_FIXTURES], +) +def test_determine_plan_action( + test_id: str, + status: dict[str, t.Any], + config: SyncPlanConfig, + expected_action: PlanAction, + expected_detail: str, +) -> None: + """Verify _determine_plan_action handles edge cases.""" + action, detail = _determine_plan_action(status, config=config) + assert action is expected_action + assert detail == expected_detail diff --git a/tests/fixtures/example.py b/tests/fixtures/example.py index e84d19ce..eb7c4ca2 100644 --- a/tests/fixtures/example.py +++ b/tests/fixtures/example.py @@ -46,30 +46,35 @@ "name": "linux", "path": pathlib.Path("/home/me/myproject/study/linux"), "url": 
"git+git://git.kernel.org/linux/torvalds/linux.git", + "workspace_root": "/home/me/myproject/study/", }, { "vcs": "git", "name": "freebsd", "path": pathlib.Path("/home/me/myproject/study/freebsd"), "url": "git+https://github.com/freebsd/freebsd.git", + "workspace_root": "/home/me/myproject/study/", }, { "vcs": "git", "name": "sphinx", "path": pathlib.Path("/home/me/myproject/study/sphinx"), "url": "hg+https://bitbucket.org/birkenfeld/sphinx", + "workspace_root": "/home/me/myproject/study/", }, { "vcs": "git", "name": "docutils", "path": pathlib.Path("/home/me/myproject/study/docutils"), "url": "svn+http://svn.code.sf.net/p/docutils/code/trunk", + "workspace_root": "/home/me/myproject/study/", }, { "vcs": "git", "name": "kaptan", "url": "git+git@github.com:tony/kaptan.git", "path": pathlib.Path("/home/me/myproject/github_projects/kaptan"), + "workspace_root": "/home/me/myproject/github_projects/", "remotes": { "upstream": GitRemote( name="upstream", @@ -88,6 +93,7 @@ "name": ".vim", "path": pathlib.Path("/home/me/myproject/.vim"), "url": "git+git@github.com:tony/vim-config.git", + "workspace_root": "/home/me/myproject", "shell_command_after": ["ln -sf /home/me/.vim/.vimrc /home/me/.vimrc"], }, { @@ -95,6 +101,7 @@ "name": ".tmux", "path": pathlib.Path("/home/me/myproject/.tmux"), "url": "git+git@github.com:tony/tmux-config.git", + "workspace_root": "/home/me/myproject", "shell_command_after": ["ln -sf /home/me/.tmux/.tmux.conf /home/me/.tmux.conf"], }, ] diff --git a/tests/test_cli.py b/tests/test_cli.py index cc221542..ba502786 100644 --- a/tests/test_cli.py +++ b/tests/test_cli.py @@ -3,7 +3,11 @@ from __future__ import annotations import contextlib +import importlib +import json +import pathlib import shutil +import sys import typing as t import pytest @@ -11,8 +15,11 @@ from vcspull.__about__ import __version__ from vcspull.cli import cli +from vcspull.cli._output import PlanAction, PlanEntry, PlanResult, PlanSummary from vcspull.cli.sync import EXIT_ON_ERROR_MSG, NO_REPOS_FOR_TERM_MSG +sync_module = importlib.import_module("vcspull.cli.sync") + if t.TYPE_CHECKING: import pathlib @@ -173,7 +180,7 @@ class SyncFixture(t.NamedTuple): test_id="sync--empty", sync_args=["sync"], expected_exit_code=0, - expected_in_out=["positional arguments:"], + expected_in_out=["No repositories matched the criteria."], ), # Sync: Help SyncFixture( @@ -200,6 +207,34 @@ class SyncFixture(t.NamedTuple): ] +class CLINegativeFixture(t.NamedTuple): + """Fixture for CLI negative flow validation.""" + + test_id: str + cli_args: list[str] + scenario: t.Literal["discover-non-dict-config", "status-missing-git"] + expected_log_fragment: str | None + expected_stdout_fragment: str | None + + +CLI_NEGATIVE_FIXTURES: list[CLINegativeFixture] = [ + CLINegativeFixture( + test_id="discover-invalid-config", + cli_args=["discover"], + scenario="discover-non-dict-config", + expected_log_fragment="not a valid YAML dictionary", + expected_stdout_fragment=None, + ), + CLINegativeFixture( + test_id="status-missing-git", + cli_args=["status", "--detailed"], + scenario="status-missing-git", + expected_log_fragment=None, + expected_stdout_fragment="Summary:", + ), +] + + @pytest.mark.parametrize( list(SyncFixture._fields), SYNC_REPO_FIXTURES, @@ -410,3 +445,532 @@ def test_sync_broken( expected_not_in_err = [expected_not_in_err] for needle in expected_not_in_err: assert needle not in err + + +@pytest.mark.parametrize( + list(CLINegativeFixture._fields), + CLI_NEGATIVE_FIXTURES, + ids=[fixture.test_id for fixture in 
CLI_NEGATIVE_FIXTURES], +) +def test_cli_negative_flows( + test_id: str, + cli_args: list[str], + scenario: t.Literal["discover-non-dict-config", "status-missing-git"], + expected_log_fragment: str | None, + expected_stdout_fragment: str | None, + tmp_path: pathlib.Path, + capsys: pytest.CaptureFixture[str], + caplog: pytest.LogCaptureFixture, + monkeypatch: pytest.MonkeyPatch, +) -> None: + """Exercise common CLI error flows without raising.""" + import logging + import subprocess + + import yaml + + caplog.set_level(logging.INFO) + + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + if scenario == "discover-non-dict-config": + scan_dir = tmp_path / "scan" + scan_dir.mkdir(parents=True, exist_ok=True) + config_file = tmp_path / "config.yaml" + config_file.write_text("[]\n", encoding="utf-8") + + with contextlib.suppress(SystemExit): + cli([*cli_args, str(scan_dir), "--file", str(config_file)]) + else: + workspace_dir = tmp_path / "workspace" + repo_dir = workspace_dir / "project" + repo_dir.mkdir(parents=True, exist_ok=True) + (repo_dir / ".git").mkdir() + + config_file = tmp_path / "status.yaml" + config_file.write_text( + yaml.dump( + { + "~/workspace/": { + "project": { + "url": "git+https://example.com/project.git", + "path": str(repo_dir), + }, + }, + } + ), + encoding="utf-8", + ) + + def _missing_git( + cmd: list[str], **kwargs: object + ) -> subprocess.CompletedProcess[str]: + if cmd and cmd[0] == "git": + error_message = "git not installed" + raise FileNotFoundError(error_message) + return subprocess.CompletedProcess(cmd, 0, stdout="", stderr="") + + monkeypatch.setattr("vcspull.cli.status.subprocess.run", _missing_git) + + with contextlib.suppress(SystemExit): + cli([*cli_args, "--file", str(config_file)]) + + captured = capsys.readouterr() + + if expected_log_fragment is not None: + assert expected_log_fragment in caplog.text + + if expected_stdout_fragment is not None: + assert expected_stdout_fragment in captured.out + + +class DryRunPlanFixture(t.NamedTuple): + """Fixture for Terraform-style dry-run plan output.""" + + test_id: str + cli_args: list[str] + pre_sync: bool = False + expected_contains: list[str] | None = None + expected_not_contains: list[str] | None = None + repository_names: tuple[str, ...] 
= ("my_git_repo",) + force_tty: bool = False + plan_entries: list[PlanEntry] | None = None + plan_summary: PlanSummary | None = None + set_no_color: bool = True + + +DRY_RUN_PLAN_FIXTURES: list[DryRunPlanFixture] = [ + DryRunPlanFixture( + test_id="clone-default", + cli_args=["sync", "--dry-run", "my_git_repo"], + expected_contains=[ + "Plan: 1 to clone (+)", + "+ my_git_repo", + "missing", + ], + ), + DryRunPlanFixture( + test_id="summary-only", + cli_args=["sync", "--dry-run", "--summary-only", "my_git_repo"], + expected_contains=["Plan: 1 to clone (+)", "Tip: run without --dry-run"], + expected_not_contains=["~/github_projects/"], + ), + DryRunPlanFixture( + test_id="unchanged-show", + cli_args=["sync", "--dry-run", "--show-unchanged", "my_git_repo"], + pre_sync=True, + expected_contains=["Plan: 0 to clone (+)", "✓ my_git_repo"], + ), + DryRunPlanFixture( + test_id="long-format", + cli_args=["sync", "--dry-run", "--long", "repo-long"], + expected_contains=[ + "Plan: 1 to clone (+)", + "+ repo-long", + "url: git+https://example.com/repo-long.git", + ], + repository_names=("repo-long",), + plan_entries=[ + PlanEntry( + name="repo-long", + path="~/github_projects/repo-long", + workspace_root="~/github_projects/", + action=PlanAction.CLONE, + detail="missing", + url="git+https://example.com/repo-long.git", + ), + ], + ), + DryRunPlanFixture( + test_id="relative-paths", + cli_args=["sync", "--dry-run", "--relative-paths", "repo-rel"], + expected_contains=[ + "Plan: 0 to clone (+), 1 to update (~)", + "~ repo-rel", + "repo-rel remote state unknown; use --fetch", + ], + expected_not_contains=["~/github_projects/repo-rel"], + repository_names=("repo-rel",), + plan_entries=[ + PlanEntry( + name="repo-rel", + path="~/github_projects/repo-rel", + workspace_root="~/github_projects/", + action=PlanAction.UPDATE, + detail="remote state unknown; use --fetch", + ), + ], + ), + DryRunPlanFixture( + test_id="offline-detail", + cli_args=["sync", "--dry-run", "--offline", "repo-offline"], + expected_contains=[ + "Plan: 0 to clone (+), 1 to update (~)", + "~ repo-offline", + "remote state unknown (offline)", + ], + repository_names=("repo-offline",), + plan_entries=[ + PlanEntry( + name="repo-offline", + path="~/github_projects/repo-offline", + workspace_root="~/github_projects/", + action=PlanAction.UPDATE, + detail="remote state unknown (offline)", + ), + ], + ), +] + + +@pytest.mark.parametrize( + list(DryRunPlanFixture._fields), + DRY_RUN_PLAN_FIXTURES, + ids=[fixture.test_id for fixture in DRY_RUN_PLAN_FIXTURES], +) +def test_sync_dry_run_plan_human( + test_id: str, + cli_args: list[str], + pre_sync: bool, + expected_contains: list[str] | None, + expected_not_contains: list[str] | None, + repository_names: tuple[str, ...], + force_tty: bool, + plan_entries: list[PlanEntry] | None, + plan_summary: PlanSummary | None, + set_no_color: bool, + tmp_path: pathlib.Path, + capsys: pytest.CaptureFixture[str], + monkeypatch: pytest.MonkeyPatch, + user_path: pathlib.Path, + config_path: pathlib.Path, + git_repo: GitSync, +) -> None: + """Validate human-readable plan output variants.""" + if set_no_color: + monkeypatch.setenv("NO_COLOR", "1") + + config: dict[str, dict[str, dict[str, t.Any]]] = {"~/github_projects/": {}} + for name in repository_names: + config["~/github_projects/"][name] = { + "url": f"git+file://{git_repo.path}", + "remotes": {"origin": f"git+file://{git_repo.path}"}, + } + + yaml_config = config_path / ".vcspull.yaml" + yaml_config.write_text( + yaml.dump(config, default_flow_style=False), + 
encoding="utf-8", + ) + + monkeypatch.chdir(tmp_path) + + workspace_root = pathlib.Path(user_path) / "github_projects" + for name in repository_names: + candidate = workspace_root / name + if candidate.exists(): + shutil.rmtree(candidate) + + if force_tty: + monkeypatch.setattr(sys.stdout, "isatty", lambda: True) + + if pre_sync: + with contextlib.suppress(SystemExit): + cli(["sync", repository_names[0]]) + + if plan_entries is not None: + for entry in plan_entries: + entry.path = str(workspace_root / entry.name) + computed_summary = plan_summary + if computed_summary is None: + computed_summary = PlanSummary( + clone=sum(entry.action is PlanAction.CLONE for entry in plan_entries), + update=sum(entry.action is PlanAction.UPDATE for entry in plan_entries), + unchanged=sum( + entry.action is PlanAction.UNCHANGED for entry in plan_entries + ), + blocked=sum( + entry.action is PlanAction.BLOCKED for entry in plan_entries + ), + errors=sum(entry.action is PlanAction.ERROR for entry in plan_entries), + ) + + async def _fake_plan(*args: t.Any, **kwargs: t.Any) -> PlanResult: + return PlanResult(entries=plan_entries, summary=computed_summary) + + monkeypatch.setattr(sync_module, "_build_plan_result_async", _fake_plan) + + with contextlib.suppress(SystemExit): + cli(cli_args) + + captured = capsys.readouterr() + output = "".join([captured.out, captured.err]) + + if expected_contains: + for needle in expected_contains: + assert needle in output + + if expected_not_contains: + for needle in expected_not_contains: + assert needle not in output + + +class DryRunPlanMachineFixture(t.NamedTuple): + """Fixture for JSON/NDJSON plan output.""" + + test_id: str + cli_args: list[str] + mode: t.Literal["json", "ndjson"] + expected_summary: dict[str, int] + repository_names: tuple[str, ...] 
= ("my_git_repo",) + pre_sync: bool = True + plan_entries: list[PlanEntry] | None = None + plan_summary: PlanSummary | None = None + expected_operation_subset: dict[str, t.Any] | None = None + + +DRY_RUN_PLAN_MACHINE_FIXTURES: list[DryRunPlanMachineFixture] = [ + DryRunPlanMachineFixture( + test_id="json-summary", + cli_args=["sync", "--dry-run", "--json", "--show-unchanged", "my_git_repo"], + mode="json", + expected_summary={ + "clone": 0, + "update": 0, + "unchanged": 1, + "blocked": 0, + "errors": 0, + }, + ), + DryRunPlanMachineFixture( + test_id="ndjson-summary", + cli_args=["sync", "--dry-run", "--ndjson", "--show-unchanged", "my_git_repo"], + mode="ndjson", + expected_summary={ + "clone": 0, + "update": 0, + "unchanged": 1, + "blocked": 0, + "errors": 0, + }, + ), + DryRunPlanMachineFixture( + test_id="json-operation-fields", + cli_args=["sync", "--dry-run", "--json", "repo-json"], + mode="json", + expected_summary={ + "clone": 0, + "update": 1, + "unchanged": 0, + "blocked": 0, + "errors": 0, + }, + repository_names=("repo-json",), + pre_sync=False, + plan_entries=[ + PlanEntry( + name="repo-json", + path="~/github_projects/repo-json", + workspace_root="~/github_projects/", + action=PlanAction.UPDATE, + detail="behind 2", + ahead=0, + behind=2, + branch="main", + remote_branch="origin/main", + ) + ], + expected_operation_subset={ + "name": "repo-json", + "detail": "behind 2", + "behind": 2, + "branch": "main", + }, + ), + DryRunPlanMachineFixture( + test_id="ndjson-operation-fields", + cli_args=["sync", "--dry-run", "--ndjson", "repo-ndjson"], + mode="ndjson", + expected_summary={ + "clone": 1, + "update": 0, + "unchanged": 0, + "blocked": 0, + "errors": 0, + }, + repository_names=("repo-ndjson",), + pre_sync=False, + plan_entries=[ + PlanEntry( + name="repo-ndjson", + path="~/github_projects/repo-ndjson", + workspace_root="~/github_projects/", + action=PlanAction.CLONE, + detail="missing", + url="git+https://example.com/repo-ndjson.git", + ) + ], + expected_operation_subset={ + "name": "repo-ndjson", + "action": "clone", + "url": "git+https://example.com/repo-ndjson.git", + }, + ), +] + + +@pytest.mark.parametrize( + list(DryRunPlanMachineFixture._fields), + DRY_RUN_PLAN_MACHINE_FIXTURES, + ids=[fixture.test_id for fixture in DRY_RUN_PLAN_MACHINE_FIXTURES], +) +def test_sync_dry_run_plan_machine( + test_id: str, + cli_args: list[str], + mode: t.Literal["json", "ndjson"], + expected_summary: dict[str, int], + repository_names: tuple[str, ...], + pre_sync: bool, + plan_entries: list[PlanEntry] | None, + plan_summary: PlanSummary | None, + expected_operation_subset: dict[str, t.Any] | None, + tmp_path: pathlib.Path, + capsys: pytest.CaptureFixture[str], + monkeypatch: pytest.MonkeyPatch, + user_path: pathlib.Path, + config_path: pathlib.Path, + git_repo: GitSync, +) -> None: + """Validate machine-readable plan parity.""" + monkeypatch.setenv("NO_COLOR", "1") + + config: dict[str, dict[str, dict[str, t.Any]]] = {"~/github_projects/": {}} + for name in repository_names: + config["~/github_projects/"][name] = { + "url": f"git+file://{git_repo.path}", + "remotes": {"origin": f"git+file://{git_repo.path}"}, + } + + yaml_config = config_path / ".vcspull.yaml" + yaml_config.write_text( + yaml.dump(config, default_flow_style=False), + encoding="utf-8", + ) + + monkeypatch.chdir(tmp_path) + + workspace_root = pathlib.Path(user_path) / "github_projects" + for name in repository_names: + candidate = workspace_root / name + if candidate.exists(): + shutil.rmtree(candidate) + + if pre_sync: + 
with contextlib.suppress(SystemExit): + cli(["sync", repository_names[0]]) + capsys.readouterr() + + if plan_entries is not None: + for entry in plan_entries: + entry.path = str(workspace_root / entry.name) + computed_summary = plan_summary + if computed_summary is None: + computed_summary = PlanSummary( + clone=sum(entry.action is PlanAction.CLONE for entry in plan_entries), + update=sum(entry.action is PlanAction.UPDATE for entry in plan_entries), + unchanged=sum( + entry.action is PlanAction.UNCHANGED for entry in plan_entries + ), + blocked=sum( + entry.action is PlanAction.BLOCKED for entry in plan_entries + ), + errors=sum(entry.action is PlanAction.ERROR for entry in plan_entries), + ) + + async def _fake_plan(*args: t.Any, **kwargs: t.Any) -> PlanResult: + return PlanResult(entries=plan_entries, summary=computed_summary) + + monkeypatch.setattr(sync_module, "_build_plan_result_async", _fake_plan) + + with contextlib.suppress(SystemExit): + cli(cli_args) + + captured = capsys.readouterr() + + if mode == "json": + payload = json.loads(captured.out) + summary = payload["summary"] + else: + events = [ + json.loads(line) for line in captured.out.splitlines() if line.strip() + ] + assert events, "Expected NDJSON payload" + summary = events[-1] + if expected_operation_subset: + operation_payload = next( + (event for event in events if event.get("type") == "operation"), + None, + ) + assert operation_payload is not None + for key, value in expected_operation_subset.items(): + assert operation_payload[key] == value + + assert summary["clone"] == expected_summary["clone"] + assert summary["update"] == expected_summary["update"] + assert summary["unchanged"] == expected_summary["unchanged"] + assert summary["blocked"] == expected_summary["blocked"] + assert summary["errors"] == expected_summary["errors"] + + if mode == "json" and expected_operation_subset: + operations: list[dict[str, t.Any]] = [] + for workspace in payload["workspaces"]: + operations.extend(workspace["operations"]) + assert operations, "Expected at least one operation payload" + for key, value in expected_operation_subset.items(): + assert operations[0][key] == value + + +def test_sync_dry_run_plan_progress( + tmp_path: pathlib.Path, + capsys: pytest.CaptureFixture[str], + monkeypatch: pytest.MonkeyPatch, + user_path: pathlib.Path, + config_path: pathlib.Path, + git_repo: GitSync, +) -> None: + """TTY dry-run should surface a live progress line.""" + config = { + "~/github_projects/": { + "repo_one": { + "url": f"git+file://{git_repo.path}", + "remotes": {"origin": f"git+file://{git_repo.path}"}, + }, + "repo_two": { + "url": f"git+file://{git_repo.path}", + "remotes": {"origin": f"git+file://{git_repo.path}"}, + }, + } + } + yaml_config = config_path / ".vcspull.yaml" + yaml_config.write_text( + yaml.dump(config, default_flow_style=False), + encoding="utf-8", + ) + + monkeypatch.chdir(tmp_path) + monkeypatch.setattr(sys.stdout, "isatty", lambda: True) + + workspace_root = pathlib.Path(user_path) / "github_projects" + for name in ("repo_one", "repo_two"): + candidate = workspace_root / name + if candidate.exists(): + shutil.rmtree(candidate) + + with contextlib.suppress(SystemExit): + cli(["sync", "--dry-run", "repo_*"]) + + captured = capsys.readouterr() + output = "".join([captured.out, captured.err]) + assert "Progress:" in output + assert "Plan:" in output diff --git a/tests/test_config.py b/tests/test_config.py index 9baaea13..c61d231b 100644 --- a/tests/test_config.py +++ b/tests/test_config.py @@ -2,6 +2,7 @@ from 
__future__ import annotations +import pathlib import typing as t import pytest @@ -9,9 +10,7 @@ from vcspull import config if t.TYPE_CHECKING: - import pathlib - - from vcspull.types import ConfigDict + from vcspull.types import ConfigDict, RawConfigDict class LoadYAMLFn(t.Protocol): @@ -82,3 +81,64 @@ def test_relative_dir(load_yaml: LoadYAMLFn) -> None: assert path / "relativedir" == repo["path"].parent assert path / "relativedir" / "docutils" == repo["path"] + + +class ExtractWorkspaceFixture(t.NamedTuple): + """Fixture capturing workspace root injection scenarios.""" + + test_id: str + raw_config: dict[str, dict[str, str | dict[str, str]]] + expected_roots: dict[str, str] + + +EXTRACT_WORKSPACE_FIXTURES: list[ExtractWorkspaceFixture] = [ + ExtractWorkspaceFixture( + test_id="tilde-workspace", + raw_config={ + "~/code/": { + "alpha": {"repo": "git+https://example.com/alpha.git"}, + }, + }, + expected_roots={"alpha": "~/code/"}, + ), + ExtractWorkspaceFixture( + test_id="relative-workspace", + raw_config={ + "./projects": { + "beta": "git+https://example.com/beta.git", + }, + }, + expected_roots={"beta": "./projects"}, + ), +] + + +@pytest.mark.parametrize( + list(ExtractWorkspaceFixture._fields), + EXTRACT_WORKSPACE_FIXTURES, + ids=[fixture.test_id for fixture in EXTRACT_WORKSPACE_FIXTURES], +) +def test_extract_repos_injects_workspace_root( + test_id: str, + raw_config: dict[str, dict[str, str | dict[str, str]]], + expected_roots: dict[str, str], + tmp_path: pathlib.Path, + monkeypatch: pytest.MonkeyPatch, +) -> None: + """Ensure extract_repos assigns workspace_root consistently.""" + import pathlib as pl + + monkeypatch.setenv("HOME", str(tmp_path)) + monkeypatch.chdir(tmp_path) + + typed_raw_config = t.cast("RawConfigDict", raw_config) + repos = config.extract_repos(typed_raw_config, cwd=tmp_path) + + assert len(repos) == len(expected_roots) + + for repo in repos: + name = repo["name"] + expected_root = expected_roots[name] + assert repo["workspace_root"] == expected_root + expected_path = config.expand_dir(pl.Path(expected_root), cwd=tmp_path) / name + assert repo["path"] == expected_path diff --git a/tests/test_log.py b/tests/test_log.py index 22eef6a0..6631b272 100644 --- a/tests/test_log.py +++ b/tests/test_log.py @@ -426,8 +426,14 @@ def test_get_cli_logger_names_includes_base() -> None: names = get_cli_logger_names(include_self=True) expected = [ "vcspull.cli", - "vcspull.cli._import", + "vcspull.cli._colors", + "vcspull.cli._output", + "vcspull.cli._workspaces", + "vcspull.cli.add", + "vcspull.cli.discover", "vcspull.cli.fmt", + "vcspull.cli.list", + "vcspull.cli.status", "vcspull.cli.sync", ] assert names == expected diff --git a/tests/test_sync.py b/tests/test_sync.py index e7a379ed..9ba9deed 100644 --- a/tests/test_sync.py +++ b/tests/test_sync.py @@ -268,6 +268,7 @@ def test_updating_remote( "name": "myclone", "path": tmp_path / "study/myrepo/myclone", "url": f"git+file://{dummy_repo}", + "workspace_root": str(tmp_path / "study/myrepo/"), "remotes": { mirror_name: GitRemote( name=mirror_name, diff --git a/uv.lock b/uv.lock index b7e72028..ac793425 100644 --- a/uv.lock +++ b/uv.lock @@ -68,6 +68,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/b7/b8/3fe70c75fe32afc4bb507f75563d39bc5642255d1d94f1f23604725780bf/babel-2.17.0-py3-none-any.whl", hash = "sha256:4d0b53093fdfb4b21c92b5213dba5a1b23885afa8383709427046b21c366e5f2", size = 10182537, upload-time = "2025-02-01T15:17:37.39Z" }, ] +[[package]] +name = "backports-asyncio-runner" +version = "1.2.0" +source = 
{ registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/8e/ff/70dca7d7cb1cbc0edb2c6cc0c38b65cba36cccc491eca64cabd5fe7f8670/backports_asyncio_runner-1.2.0.tar.gz", hash = "sha256:a5aa7b2b7d8f8bfcaa2b57313f70792df84e32a2a746f585213373f900b42162", size = 69893, upload-time = "2025-07-02T02:27:15.685Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a0/59/76ab57e3fe74484f48a53f8e337171b4a2349e506eabe136d7e01d059086/backports_asyncio_runner-1.2.0-py3-none-any.whl", hash = "sha256:0da0a936a8aeb554eccb426dc55af3ba63bcdc69fa1a600b5bb305413a4477b5", size = 12313, upload-time = "2025-07-02T02:27:14.263Z" }, +] + [[package]] name = "beautifulsoup4" version = "4.14.2" @@ -932,6 +941,20 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/a8/a4/20da314d277121d6534b3a980b29035dcd51e6744bd79075a6ce8fa4eb8d/pytest-8.4.2-py3-none-any.whl", hash = "sha256:872f880de3fc3a5bdc88a11b39c9710c3497a547cfa9320bc3c5e62fbf272e79", size = 365750, upload-time = "2025-09-04T14:34:20.226Z" }, ] +[[package]] +name = "pytest-asyncio" +version = "1.2.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "backports-asyncio-runner", marker = "python_full_version < '3.11'" }, + { name = "pytest" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/42/86/9e3c5f48f7b7b638b216e4b9e645f54d199d7abbbab7a64a13b4e12ba10f/pytest_asyncio-1.2.0.tar.gz", hash = "sha256:c609a64a2a8768462d0c99811ddb8bd2583c33fd33cf7f21af1c142e824ffb57", size = 50119, upload-time = "2025-09-12T07:33:53.816Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/93/2fa34714b7a4ae72f2f8dad66ba17dd9a2c793220719e736dda28b7aec27/pytest_asyncio-1.2.0-py3-none-any.whl", hash = "sha256:8e17ae5e46d8e7efe51ab6494dd2010f4ca8dae51652aa3c8d55acf50bfb2e99", size = 15095, upload-time = "2025-09-12T07:33:52.639Z" }, +] + [[package]] name = "pytest-cov" version = "7.0.0" @@ -1572,6 +1595,7 @@ dev = [ { name = "myst-parser", version = "3.0.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.10'" }, { name = "myst-parser", version = "4.0.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.10'" }, { name = "pytest" }, + { name = "pytest-asyncio" }, { name = "pytest-cov" }, { name = "pytest-mock" }, { name = "pytest-rerunfailures", version = "16.0.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.10'" }, @@ -1618,6 +1642,7 @@ lint = [ testing = [ { name = "gp-libs" }, { name = "pytest" }, + { name = "pytest-asyncio" }, { name = "pytest-mock" }, { name = "pytest-rerunfailures", version = "16.0.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.10'" }, { name = "pytest-rerunfailures", version = "16.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.10'" }, @@ -1651,6 +1676,7 @@ dev = [ { name = "mypy" }, { name = "myst-parser" }, { name = "pytest" }, + { name = "pytest-asyncio" }, { name = "pytest-cov" }, { name = "pytest-mock" }, { name = "pytest-rerunfailures" }, @@ -1689,6 +1715,7 @@ lint = [ testing = [ { name = "gp-libs" }, { name = "pytest" }, + { name = "pytest-asyncio" }, { name = "pytest-mock" }, { name = "pytest-rerunfailures" }, { name = "pytest-watcher" },