diff --git a/CHANGELOG.md b/CHANGELOG.md index 24297ae51..2fb788f8a 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -18,6 +18,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 - Regression tests for `apm compile` placement of narrow `applyTo` patterns: instructions whose matches all live deep inside one subtree are now pinned to the deepest covering directory instead of being hoisted to the project root, across both selective and single-point placement strategies. Also covers the file-walk cache that skips repeated filesystem scans for the same glob. (#871) - **`apm pack` marketplace builder hardening.** Local source paths are now emitted relative to `metadata.pluginRoot` (fixes double-prefix bug). New pass-through fields: `author`, `license`, `repository`, `keywords` (alias for `tags`). Curator-wins override semantics for `description`/`version` on remote entries. Security guards reject path traversal and absolute paths post-subtraction. (#1061) - **Plugin manifest schema-conformance tests.** `tests/unit/test_plugin_exporter_schema.py` validates every shape of `plugin.json` produced by `apm pack` (synthesized, authored, and authored-with-stale-keys) against the vendored official schema. Companion marketplace conformance lives in `tests/unit/marketplace/test_schema_conformance.py`. (#1061) +- **APM now compiles and integrates with Windsurf/Cascade.** New first-class `--target windsurf` support: instructions deploy as `.windsurf/rules/` with trigger frontmatter, agents deploy as `.windsurf/skills/<skill-name>/SKILL.md`, commands as `.windsurf/workflows/`, hooks merge into `.windsurf/hooks.json`, and MCP servers are configured via `~/.codeium/windsurf/mcp_config.json`. Auto-detection, user-scope deployment, and `apm pack` all support the new target.
(#1066) - Slash commands installed from APM packages now surface argument hints in Claude Code -- `apm install` automatically maps prompt `input:` to Claude's `arguments:` front-matter, rewrites `${input:name}` references to `$name`, and auto-generates `argument-hint`. Argument names are validated against an allowlist to prevent YAML injection from third-party packages, and the mapping is reported at install time. (#1039) ### Changed diff --git a/README.md b/README.md index d84b8bb82..68b2bab9c 100644 --- a/README.md +++ b/README.md @@ -4,7 +4,7 @@ Think `package.json`, `requirements.txt`, or `Cargo.toml` — but for AI agent configuration. -GitHub Copilot · Claude Code · Cursor · OpenCode · Codex · Gemini +GitHub Copilot · Claude Code · Cursor · OpenCode · Codex · Gemini · Windsurf **[Documentation](https://microsoft.github.io/apm/)** · **[Quick Start](https://microsoft.github.io/apm/getting-started/quick-start/)** · **[CLI Reference](https://microsoft.github.io/apm/reference/cli-commands/)** · **[Roadmap](https://github.com/orgs/microsoft/projects/2304)** @@ -67,7 +67,7 @@ One command, no configuration -- VS Code and GitHub Copilot read the file automa One `apm.yml` describes every primitive your agents need — instructions, skills, prompts, agents, hooks, plugins, MCP servers — and `apm install` reproduces the exact same setup across every client on every machine. `apm.lock.yaml` pins the resolved tree the way `package-lock.json` does for npm. 
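The manifest/lockfile split described above can be sketched in a few lines of Python, in the spirit of `package-lock.json`: the manifest names what you ask for, the lockfile pins what you got. The dict shapes and the `resolve` helper below are hypothetical illustrations, not APM's real file formats.

```python
# Illustrative sketch of manifest-vs-lockfile semantics. The structures
# and the resolve() helper are hypothetical -- NOT APM's real formats.
manifest = {"dependencies": ["github/awesome-copilot"]}            # what you ask for
lock = {"github/awesome-copilot": {"resolved_ref": "a1b2c3d"}}     # what you got

def resolve(dep: str, lockfile: dict) -> str:
    """Prefer the pinned ref from the lockfile so installs are reproducible."""
    pinned = lockfile.get(dep)
    return pinned["resolved_ref"] if pinned else "HEAD"  # fresh resolve otherwise
```

With the lockfile present, every machine resolves the same ref; without an entry, the dependency would be resolved fresh.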
-- **[One manifest for everything](https://microsoft.github.io/apm/reference/primitive-types/)** — declared once, deployed across Copilot, Claude, Cursor, OpenCode, Codex, Gemini +- **[One manifest for everything](https://microsoft.github.io/apm/reference/primitive-types/)** — declared once, deployed across Copilot, Claude, Cursor, OpenCode, Codex, Gemini, Windsurf - **[Install from anywhere](https://microsoft.github.io/apm/guides/dependencies/)** — GitHub, GitLab, Bitbucket, Azure DevOps, GitHub Enterprise, any git host - **[Transitive dependencies](https://microsoft.github.io/apm/guides/dependencies/)** — packages can depend on packages; APM resolves the full tree - **[Author plugins](https://microsoft.github.io/apm/guides/plugins/)** — build Copilot, Claude, and Cursor plugins with dependency management, then export standard `plugin.json` @@ -145,7 +145,7 @@ apm marketplace add github/awesome-copilot apm install azure-cloud-development@awesome-copilot ``` -Or add an MCP server (wired into Copilot, Claude, Cursor, Codex, OpenCode, and Gemini): +Or add an MCP server (wired into Copilot, Claude, Cursor, Codex, OpenCode, Gemini, and Windsurf): ```bash apm install --mcp io.github.github/github-mcp-server --transport http # connects over HTTPS diff --git a/build/apm.spec b/build/apm.spec index d6401dd79..0a37c4979 100644 --- a/build/apm.spec +++ b/build/apm.spec @@ -129,6 +129,7 @@ hiddenimports = [ 'apm_cli.adapters.client', 'apm_cli.adapters.client.base', 'apm_cli.adapters.client.vscode', + 'apm_cli.adapters.client.windsurf', 'apm_cli.adapters.package_manager', 'apm_cli.compilation', # Add compilation module 'apm_cli.compilation.agents_compiler', diff --git a/docs/src/content/docs/guides/compilation.md b/docs/src/content/docs/guides/compilation.md index 6e3d0883e..40137d365 100644 --- a/docs/src/content/docs/guides/compilation.md +++ b/docs/src/content/docs/guides/compilation.md @@ -4,7 +4,7 @@ sidebar: order: 1 --- -Compilation is **optional for some users**. 
If your team uses GitHub Copilot, Claude, or Cursor, `apm install` deploys all primitives in their native format -- you can skip this guide entirely. For Gemini, `apm install` deploys commands, skills, and hooks, but instructions require `apm compile` to generate `GEMINI.md`. For OpenCode and Codex, `apm install` deploys agents, commands, skills, and hooks, but instructions require `apm compile` to generate `AGENTS.md`. +Compilation is **optional for some users**. If your team uses GitHub Copilot, Claude, or Cursor, `apm install` deploys all primitives in their native format -- you can skip this guide entirely. For Gemini, `apm install` deploys commands, skills, and hooks, but instructions require `apm compile` to generate `GEMINI.md`. For OpenCode and Codex, `apm install` deploys agents, commands, skills, and hooks, but instructions require `apm compile` to generate `AGENTS.md`. For Windsurf, `apm install` deploys all primitives natively (instructions to `.windsurf/rules/`, agents to `.windsurf/skills/`); `apm compile` is optional if you also want a compiled `AGENTS.md` roll-up. 
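For targets that need the compiled roll-up, the core idea reduces to concatenating instruction bodies into one file. The helper below is a hypothetical sketch only; the real `apm compile` additionally handles placement strategies, frontmatter, and markdown link resolution.

```python
from pathlib import Path

# Hypothetical sketch of an instructions roll-up: concatenate instruction
# bodies into a single AGENTS.md. The real `apm compile` does far more
# (placement, frontmatter, link resolution); names here are illustrative.
def rollup_instructions(instruction_files: list[str], output: str = "AGENTS.md") -> str:
    parts = [Path(p).read_text(encoding="utf-8").strip() for p in instruction_files]
    Path(output).write_text("\n\n".join(parts) + "\n", encoding="utf-8")
    return output
```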
**Solving the AI agent scalability problem through constraint satisfaction optimization** @@ -24,6 +24,7 @@ When you run `apm compile` without specifying a target, APM automatically detect | `.claude/` folder only | `claude` | CLAUDE.md (instructions only) | | `.codex/` folder exists | `codex` | AGENTS.md (instructions only) | | `.gemini/` folder exists | `gemini` | GEMINI.md (instructions only) | +| `.windsurf/` folder exists | `windsurf` | AGENTS.md (instructions only) | | Multiple folders exist | `all` | AGENTS.md + CLAUDE.md + GEMINI.md | | Neither folder exists | `minimal` | AGENTS.md only (universal format) | @@ -33,6 +34,7 @@ apm compile --target copilot # Force GitHub Copilot, Cursor apm compile --target claude # Force Claude Code, Claude Desktop apm compile --target gemini # Force Gemini CLI apm compile --target codex # Force Codex CLI +apm compile --target windsurf # Force Windsurf/Cascade apm compile -t claude,copilot # Multiple targets (comma-separated) ``` @@ -57,6 +59,7 @@ target: [claude, copilot] # multiple targets -- only these are compiled | `claude` | `CLAUDE.md` | Claude Code, Claude Desktop | | `gemini` | `GEMINI.md` | Gemini CLI | | `codex` | `AGENTS.md` | Codex CLI | +| `windsurf` | `AGENTS.md` | Windsurf/Cascade | | `all` | `AGENTS.md` + `CLAUDE.md` + `GEMINI.md` | Universal compatibility | | `minimal` | `AGENTS.md` only | Works everywhere, no folder integration | @@ -451,8 +454,9 @@ Different AI tools get different levels of support from `apm install` vs `apm co | OpenCode | `.opencode/agents/`, `.opencode/commands/`, `.opencode/skills/`, `opencode.json` (MCP) | Via `AGENTS.md` | **Full** | | Codex CLI | `.agents/skills/`, `.codex/agents/`, `.codex/hooks.json` | `AGENTS.md` (instructions) | **Full** | | Gemini | `.gemini/commands/`, `.gemini/skills/`, `.gemini/settings.json` (MCP, hooks) | `GEMINI.md` (instructions) | **Full** | +| Windsurf | `.windsurf/rules/`, `.windsurf/skills/`, `.windsurf/workflows/`, `.windsurf/hooks.json` | 
`AGENTS.md` (instructions) | **Full** | -For Copilot, Claude, and Cursor users, `apm install` handles everything natively. Gemini, OpenCode, and Codex users should also run `apm compile` to generate their instruction roll-up (`GEMINI.md` or `AGENTS.md`). +For Copilot, Claude, and Cursor users, `apm install` handles everything natively. Gemini, OpenCode, and Codex users should also run `apm compile` to generate their instruction roll-up (`GEMINI.md` or `AGENTS.md`); for Windsurf users the roll-up is optional, since rules deploy natively to `.windsurf/rules/`. ## Theoretical Foundations diff --git a/docs/src/content/docs/integrations/ide-tool-integration.md b/docs/src/content/docs/integrations/ide-tool-integration.md index df0254453..ae0e80fb0 100644 --- a/docs/src/content/docs/integrations/ide-tool-integration.md +++ b/docs/src/content/docs/integrations/ide-tool-integration.md @@ -433,6 +433,47 @@ APM maintains synchronization between packages and Claude primitives: - **Update**: Refreshes rules, agents, commands, and skills when package version changes - **Virtual Packages**: Individual files and skills (e.g., `github/awesome-copilot/skills/review-and-refactor`) are tracked via `apm.lock.yaml` and removed correctly on uninstall +## Windsurf Integration + +APM integrates with Windsurf/Cascade by deploying primitives into the workspace `.windsurf/` directory. + +> **Auto-Detection**: Windsurf integration is enabled only when a `.windsurf/` folder ALREADY exists in your project. Unlike VS Code or Claude, `apm install` will NOT create the folder for you. To opt in, either run `mkdir .windsurf` first, pass an explicit target (`apm install --target windsurf`), or set `target: windsurf` in `apm.yml`.
+ +### Native Windsurf Primitives + +When you run `apm install` (with `.windsurf/` present or `--target windsurf`), APM deploys package primitives to Windsurf's native locations: + +| APM Primitive | Windsurf Destination | Format | +|---|---|---| +| Instructions (`.instructions.md`) | `.windsurf/rules/*.md` | Windsurf rules markdown with `trigger`/`globs` frontmatter | +| Agents (`.agent.md`) | `.windsurf/skills/<skill-name>/SKILL.md` | Converted to a Windsurf Skill (lossy, see below) | +| Skills (`SKILL.md`) | `.windsurf/skills/<skill-name>/SKILL.md` | Standard `SKILL.md` format | +| Commands (`.prompt.md`) | `.windsurf/workflows/*.md` | Windsurf workflow markdown | +| Hooks | `.windsurf/hooks.json` | Hook definitions merged into a single file | + +Reference: [Windsurf memories and rules](https://docs.windsurf.com/windsurf/cascade/memories). + +### Lossy Agent-to-Skill Conversion + +Windsurf has no native equivalent of an `.agent.md` persona, so APM converts each agent into a Windsurf Skill (`SKILL.md`). The conversion is deliberately lossy: + +- **Preserved**: `name`, `description`, and the full markdown body (verbatim). +- **Dropped**: `tools` and `model` frontmatter keys -- Windsurf Skills do not support them. + +APM prints a warning when a dropped field is detected, so the loss is never silent. If your agent depends on a specific tool list or model pin, prefer a target that supports those keys natively (Copilot, Claude). + +### MCP Configuration + +Windsurf reads MCP server definitions from a USER-SCOPE file at `~/.codeium/windsurf/mcp_config.json`. The schema is the standard `mcpServers` JSON used by GitHub Copilot CLI. The workspace `.windsurf/` directory does NOT contain MCP config -- there is nothing to commit per project. + +Reference: [Windsurf MCP integration](https://docs.windsurf.com/windsurf/cascade/mcp). + +### User-Scope Installation Limitations + +Windsurf has partial user-scope support.
`apm install -g --target windsurf` deploys agents, skills, commands, and hooks under `~/.codeium/windsurf/`, but **instructions (rules) are skipped** at user scope -- Windsurf does not expose a user-level rules directory in the same shape as the workspace one. Keep instruction packages workspace-local. + +See [Global Installation](../../guides/dependencies/#global-user-scope-installation) for cross-target user-scope coverage. + ## Other IDE Support ### IDEs with GitHub Copilot @@ -649,7 +690,6 @@ dependencies: The following IDE integrations are planned for future releases: - **JetBrains IDE support**: Native integration with IntelliJ, PyCharm, WebStorm, and other JetBrains IDEs -- **Windsurf support**: Integration with the Windsurf AI coding environment - **Cursor deeper integration**: Enhanced Cursor support including rule versioning and conflict resolution ## Related Resources diff --git a/docs/src/content/docs/introduction/how-it-works.md b/docs/src/content/docs/introduction/how-it-works.md index bb511da98..e025cd20d 100644 --- a/docs/src/content/docs/introduction/how-it-works.md +++ b/docs/src/content/docs/introduction/how-it-works.md @@ -251,7 +251,7 @@ These tools support the full set of APM primitives. Running `apm install` deploy - **GitHub Copilot** (AGENTS.md + .github/) - instructions, prompts, chat modes, context, hooks, MCP - **Claude Code** (CLAUDE.md + .claude/) - commands, skills, MCP configuration -APM auto-detects targets based on project structure -- deploying to every recognized directory (`.github/`, `.claude/`, `.cursor/`, `.opencode/`) that exists, falling back to `.github/` when none do. Set `target` in `apm.yml` to restrict to specific targets (single string or list). +APM auto-detects targets based on project structure -- deploying to every recognized directory (`.github/`, `.claude/`, `.cursor/`, `.opencode/`, `.windsurf/`) that exists, falling back to `.github/` when none do. 
Set `target` in `apm.yml` to restrict to specific targets (single string or list). ### Compiled instructions diff --git a/docs/src/content/docs/reference/cli-commands.md b/docs/src/content/docs/reference/cli-commands.md index cf55df07b..c6b139266 100644 --- a/docs/src/content/docs/reference/cli-commands.md +++ b/docs/src/content/docs/reference/cli-commands.md @@ -88,10 +88,11 @@ apm install [PACKAGES...] [OPTIONS] - `PACKAGES` - Optional APM packages to add and install. Accepts shorthand (`owner/repo`), HTTPS URLs, SSH URLs, FQDN shorthand (`host/owner/repo`), local filesystem paths (`./path`, `../path`, `/absolute/path`, `~/path`), or marketplace references (`NAME@MARKETPLACE[#ref]`). All forms are normalized to canonical format in `apm.yml`. **Options:** -- `--runtime TEXT` - Target specific runtime only (copilot, codex, vscode, cursor, opencode, gemini, claude) +- `--runtime TEXT` - Target specific runtime only (copilot, codex, vscode, cursor, opencode, gemini, claude, windsurf) - `--exclude TEXT` - Exclude specific runtime from installation - `--only [apm|mcp]` - Install only specific dependency type -- `--target [copilot|claude|cursor|codex|opencode|gemini|agent-skills|copilot-cowork|all]` - Force deployment to specific target(s). Accepts comma-separated values for multiple targets (e.g., `-t claude,copilot`). Overrides auto-detection. `agent-skills` deploys to `.agents/skills/` (cross-client). `all` = copilot+claude+cursor+opencode+codex+gemini (excludes agent-skills); combine with `agent-skills` for both. +- `--target [copilot|claude|cursor|codex|opencode|gemini|windsurf|agent-skills|copilot-cowork|all]` - Force deployment to specific target(s). Accepts comma-separated values for multiple targets (e.g., `-t claude,copilot`). Overrides auto-detection. `agent-skills` deploys to `.agents/skills/` (cross-client). `all` = copilot+claude+cursor+opencode+codex+gemini+windsurf (excludes agent-skills); combine with `agent-skills` for both.
+ - `windsurf` - Windsurf/Cascade (`.windsurf/rules/`, `.windsurf/skills/`, `.windsurf/workflows/`, `.windsurf/hooks.json`) - `copilot-cowork` - Microsoft 365 Copilot Cowork skills (user scope only, requires `copilot-cowork` experimental flag) - `vscode`, `agents` - Deprecated aliases for `copilot` (`.github/`). Still accepted by the parser; prefer `copilot` for GitHub Copilot deployment, or `agent-skills` for cross-client `.agents/skills/` deployment. Removal in v1.0. - `--update` - Update dependencies to latest Git references @@ -594,7 +595,7 @@ apm pack [OPTIONS] **Options:** - `-o, --output PATH` - Bundle output directory (default: `./build`). Does not affect `marketplace.json` path. -- `-t, --target [copilot|vscode|claude|cursor|codex|opencode|gemini|all]` - Filter bundle files by target. Accepts comma-separated values (e.g., `-t claude,copilot`). Auto-detects from `apm.yml` if omitted. `vscode` is an alias for `copilot`. No-op for marketplace output. +- `-t, --target [copilot|vscode|claude|cursor|codex|opencode|gemini|windsurf|all]` - Filter bundle files by target. Accepts comma-separated values (e.g., `-t claude,copilot`). Auto-detects from `apm.yml` if omitted. `vscode` is an alias for `copilot`. No-op for marketplace output. - `--archive` - Produce a `.tar.gz` archive instead of a directory. Bundle only. - `--format [plugin|apm]` - Bundle format (default: `plugin`). `plugin` emits a Claude Code plugin directory with a schema-conformant `plugin.json` ([official schema](https://json.schemastore.org/claude-code-plugin.json)). `apm` produces the legacy APM bundle layout (consumed by `microsoft/apm-action@v1` restore mode and other bundle-aware tooling). No-op for marketplace output. - `--force` - On collision (plugin format), last writer wins instead of first. Bundle only. @@ -1017,7 +1018,7 @@ apm deps update [PACKAGES...] 
[OPTIONS] - `--verbose, -v` - Show detailed update information - `--force` - Overwrite locally-authored files on collision - `-g, --global` - Update user-scope dependencies (`~/.apm/`) -- `--target, -t` - Force deployment to specific target(s). Accepts comma-separated values (e.g., `-t claude,copilot`). Valid values: copilot, claude, cursor, opencode, codex, gemini, agent-skills, vscode, agents (deprecated), all. `agent-skills` deploys to `.agents/skills/` (cross-client). `all` excludes agent-skills. +- `--target, -t` - Force deployment to specific target(s). Accepts comma-separated values (e.g., `-t claude,copilot`). Valid values: copilot, claude, cursor, opencode, codex, gemini, windsurf, agent-skills, vscode, agents (deprecated), all. `agent-skills` deploys to `.agents/skills/` (cross-client). `all` excludes agent-skills. - `--parallel-downloads` - Max concurrent downloads (default: 4) **Policy enforcement:** `apm deps update` runs the install pipeline and is therefore gated by org `apm-policy.yml`. There is no `--no-policy` flag on this command -- the only escape hatch is `APM_POLICY_DISABLE=1` for the shell session. See [Policy reference](../../enterprise/policy-reference/#install-time-enforcement). @@ -1691,7 +1692,7 @@ apm compile [OPTIONS] **Options:** - `-o, --output TEXT` - Output file path (for single-file mode) -- `-t, --target [copilot|claude|cursor|codex|opencode|gemini|agent-skills|all]` - Target agent format. Accepts comma-separated values for multiple targets (e.g., `-t claude,copilot`). `vscode` and `agents` are accepted as deprecated aliases for `copilot` (removal in v1.0). `agent-skills` is a no-op for compile (skills-only target). Auto-detects if not specified. +- `-t, --target [copilot|claude|cursor|codex|opencode|gemini|windsurf|agent-skills|all]` - Target agent format. Accepts comma-separated values for multiple targets (e.g., `-t claude,copilot`). `vscode` and `agents` are accepted as deprecated aliases for `copilot` (removal in v1.0). 
`agent-skills` is a no-op for compile (skills-only target). Auto-detects if not specified. - `--chatmode TEXT` - Chatmode to prepend to the AGENTS.md file - `--dry-run` - Preview compilation without writing files (shows placement decisions) - `--no-links` - Skip markdown link resolution @@ -1713,6 +1714,7 @@ When `--target` is not specified, APM auto-detects based on existing project str | `.claude/` exists only | `claude` | CLAUDE.md + .claude/ | | `.codex/` exists | `codex` | AGENTS.md + .codex/ + .agents/ | | `.gemini/` exists | `gemini` | GEMINI.md + .gemini/ | +| `.windsurf/` exists | `windsurf` | AGENTS.md + .windsurf/ | | Both folders exist | `all` | All outputs | | Neither folder exists | `minimal` | AGENTS.md only | @@ -1738,6 +1740,7 @@ target: [claude, copilot] # multiple targets -- only these are compiled/install | `codex` | AGENTS.md, .agents/skills/, .codex/agents/, .codex/hooks.json | Codex CLI | | `opencode` | AGENTS.md, .opencode/agents/, .opencode/commands/, .opencode/skills/ | OpenCode | | `gemini` | GEMINI.md, .gemini/commands/, .gemini/skills/ | Gemini CLI | +| `windsurf` | AGENTS.md, .windsurf/rules/, .windsurf/skills/, .windsurf/workflows/ | Windsurf/Cascade | | `agent-skills` | .agents/skills/ only | Cross-client shared skills | | `agents` | *(deprecated)* alias for `vscode` | Use `copilot` or `agent-skills` instead | | `all` | All of the above (excludes `agent-skills`) | Universal compatibility | diff --git a/docs/src/content/docs/reference/manifest-schema.md b/docs/src/content/docs/reference/manifest-schema.md index a4b5e1dda..80de788ef 100644 --- a/docs/src/content/docs/reference/manifest-schema.md +++ b/docs/src/content/docs/reference/manifest-schema.md @@ -113,8 +113,8 @@ marketplace: # OPTIONAL; marketplace authoring |---|---| | **Type** | `string \| list` | | **Required** | OPTIONAL | -| **Default** | Auto-detect: `vscode` if `.github/` exists, `claude` if `.claude/` exists, `codex` if `.codex/` exists, `all` if multiple target 
folders exist, `minimal` if none | -| **Allowed values** | `vscode` · `agents` · `copilot` · `claude` · `cursor` · `opencode` · `codex` · `all` | +| **Default** | Auto-detect: `vscode` if `.github/` exists, `claude` if `.claude/` exists, `codex` if `.codex/` exists, `windsurf` if `.windsurf/` exists, `all` if multiple target folders exist, `minimal` if none | +| **Allowed values** | `vscode` · `agents` · `copilot` · `claude` · `cursor` · `opencode` · `codex` · `gemini` · `windsurf` · `all` | Controls which output targets are generated during compilation and installation. Accepts a single string or a list of strings. When unset, a conforming resolver SHOULD auto-detect based on folder presence. Unknown values MUST raise a parse error pointing at the offending token. Auto-detection applies only when `target:` is unset. @@ -137,6 +137,8 @@ When a list is specified, only those targets are compiled, installed, and packed | `cursor` | Emits to `.cursor/rules/`, `.cursor/agents/`, `.cursor/skills/` | | `opencode` | Emits to `.opencode/agents/`, `.opencode/commands/`, `.opencode/skills/` | | `codex` | Emits `AGENTS.md` and deploys skills to `.agents/skills/`, agents to `.codex/agents/` | +| `gemini` | Emits `GEMINI.md` and deploys to `.gemini/commands/`, `.gemini/skills/`, `.gemini/settings.json` | +| `windsurf` | Emits `AGENTS.md` and deploys to `.windsurf/rules/`, `.windsurf/skills/`, `.windsurf/workflows/`, `.windsurf/hooks.json` | | `all` | All targets. Cannot be combined with other values in a list. | | `minimal` | AGENTS.md only at project root. **Auto-detected only** -- this value MUST NOT be set explicitly in manifests; it is an internal fallback when no target folder is detected. | @@ -478,7 +480,7 @@ The `compilation` key is OPTIONAL. It controls `apm compile` behaviour. 
All fiel | Field | Type | Default | Constraint | Description | |---|---|---|---|---| -| `target` | `enum` | `all` | `vscode` · `agents` · `claude` · `codex` · `all` | Output target (same values as §3.6). Defaults to `all` when set explicitly in compilation config. | +| `target` | `enum` | `all` | `vscode` · `agents` · `claude` · `codex` · `gemini` · `windsurf` · `all` | Output target (same values as §3.6). Defaults to `all` when set explicitly in compilation config. | | `strategy` | `enum` | `distributed` | `distributed` · `single-file` | `distributed` generates per-directory AGENTS.md files. `single-file` generates one monolithic file. | | `single_file` | `bool` | `false` | | Legacy alias. When `true`, overrides `strategy` to `single-file`. | | `output` | `string` | `AGENTS.md` | File path | Custom output path for the compiled file. | diff --git a/packages/apm-guide/.apm/skills/apm-usage/commands.md b/packages/apm-guide/.apm/skills/apm-usage/commands.md index 948cdb9e9..ff14e95c7 100644 --- a/packages/apm-guide/.apm/skills/apm-usage/commands.md +++ b/packages/apm-guide/.apm/skills/apm-usage/commands.md @@ -101,9 +101,9 @@ Set `MCP_REGISTRY_URL` (default `https://api.mcp.github.com`) to point all `apm | Command | Purpose | Key flags | |---------|---------|-----------| -| `apm runtime setup {copilot\|codex\|llm\|gemini}` | Install a runtime | `--version`, `--vanilla` | +| `apm runtime setup {copilot\|codex\|llm\|gemini\|windsurf}` | Install a runtime | `--version`, `--vanilla` | | `apm runtime list` | Show installed runtimes | -- | -| `apm runtime remove {copilot\|codex\|llm\|gemini}` | Remove a runtime | `-y`, `--yes` | +| `apm runtime remove {copilot\|codex\|llm\|gemini\|windsurf}` | Remove a runtime | `-y`, `--yes` | | `apm runtime status` | Show active runtime | -- | ## Experimental features diff --git a/packages/apm-guide/.apm/skills/apm-usage/package-authoring.md b/packages/apm-guide/.apm/skills/apm-usage/package-authoring.md index 843927210..4a4bba096 100644 
--- a/packages/apm-guide/.apm/skills/apm-usage/package-authoring.md +++ b/packages/apm-guide/.apm/skills/apm-usage/package-authoring.md @@ -72,6 +72,7 @@ hooks: | `*-claude-hooks.json` | Claude Code only | | `*-codex-hooks.json` | Codex CLI only | | `*-gemini-hooks.json` | Gemini CLI only | +| `*-windsurf-hooks.json` | Windsurf only | | Any other name (e.g. `hooks.json`, `telemetry-hooks.json`) | All targets | Example directory tree for a multi-target hook package: @@ -101,10 +102,10 @@ silently fall through to auto-detect. | Form | Behaviour | |------|-----------| -| `target: copilot` | Single token; allowed values: `vscode`, `agents`, `copilot`, `claude`, `cursor`, `opencode`, `codex`, `all` | +| `target: copilot` | Single token; allowed values: `vscode`, `agents`, `copilot`, `claude`, `cursor`, `opencode`, `codex`, `gemini`, `windsurf`, `all` | | `target: [claude, copilot]` | List form; only listed targets are compiled/installed | | `target: claude,copilot` | CSV-string form; parses identically to the list form (the shared validator splits on `,`). Before #820 was fixed, this silently produced zero deployment | -| `target:` omitted entirely | Auto-detect from project folders (`.github/`, `.claude/`, `.codex/`) | +| `target:` omitted entirely | Auto-detect from project folders (`.github/`, `.claude/`, `.codex/`, `.windsurf/`, etc.) | | `target: bogus` (unknown token) | **Parse error** -- fix the typo | | `target: ""` or `target: []` (empty) | **Parse error** -- remove the line if you meant auto-detect | | `target: [all, claude]` (`all` mixed with other targets) | **Parse error** -- use `all` alone | diff --git a/src/apm_cli/adapters/client/base.py b/src/apm_cli/adapters/client/base.py index b0698f599..59cf18d2a 100644 --- a/src/apm_cli/adapters/client/base.py +++ b/src/apm_cli/adapters/client/base.py @@ -18,6 +18,24 @@ class MCPClientAdapter(ABC): """Base adapter for MCP clients.""" + # Identifier matching the corresponding ``KNOWN_TARGETS`` entry name. 
+ # Subclasses MUST override this so target-aware code can look up + # per-target metadata via ``KNOWN_TARGETS[adapter.target_name]`` + # instead of sniffing class names. The ``vscode`` adapter is the + # only MCP-only pseudo-target (no entry in ``KNOWN_TARGETS``), so + # downstream code that joins on this field must tolerate misses. + target_name: str = "" + + # Top-level config key under which this adapter's MCP server entries + # live (``"mcpServers"``, ``"mcp_servers"``, ``"servers"``, ...). + # Subclasses MUST override this; ``MCPConflictDetector`` reads it to + # extract existing server configs without classname dispatch. + # The adapter is the canonical owner of its config schema, so this + # field lives here rather than on ``TargetProfile`` (which is + # primitive-focused) and applies uniformly to MCP-only adapters + # (e.g. ``VSCodeClientAdapter``) that have no ``KNOWN_TARGETS`` entry. + mcp_servers_key: str = "" + # Whether this adapter's config path is user/global-scoped (e.g. # ``~/.copilot/``) rather than workspace-scoped (e.g. ``.vscode/``). 
# Adapters that target a global path should override this to ``True`` diff --git a/src/apm_cli/adapters/client/claude.py b/src/apm_cli/adapters/client/claude.py index 0625e3b36..487740299 100644 --- a/src/apm_cli/adapters/client/claude.py +++ b/src/apm_cli/adapters/client/claude.py @@ -39,6 +39,8 @@ class ClaudeClientAdapter(CopilotClientAdapter): """ supports_user_scope: bool = True + target_name: str = "claude" + mcp_servers_key: str = "mcpServers" @staticmethod def _normalize_mcp_entry_for_claude_code(entry: dict) -> dict: diff --git a/src/apm_cli/adapters/client/codex.py b/src/apm_cli/adapters/client/codex.py index a0c469b93..ea8f2eda7 100644 --- a/src/apm_cli/adapters/client/codex.py +++ b/src/apm_cli/adapters/client/codex.py @@ -23,6 +23,8 @@ class CodexClientAdapter(MCPClientAdapter): """ supports_user_scope: bool = True + target_name: str = "codex" + mcp_servers_key: str = "mcp_servers" def __init__( self, diff --git a/src/apm_cli/adapters/client/copilot.py b/src/apm_cli/adapters/client/copilot.py index 8ffb04a3c..5445b4b60 100644 --- a/src/apm_cli/adapters/client/copilot.py +++ b/src/apm_cli/adapters/client/copilot.py @@ -37,6 +37,9 @@ class CopilotClientAdapter(MCPClientAdapter): """ supports_user_scope: bool = True + _client_label: str = "Copilot CLI" + target_name: str = "copilot" + mcp_servers_key: str = "mcpServers" def __init__( self, @@ -171,7 +174,7 @@ def configure_mcp_server( # Update configuration using the chosen key self.update_config({config_key: server_config}) - print(f"Successfully configured MCP server '{config_key}' for Copilot CLI") + print(f"Successfully configured MCP server '{config_key}' for {self._client_label}") return True except Exception as e: diff --git a/src/apm_cli/adapters/client/cursor.py b/src/apm_cli/adapters/client/cursor.py index 46d7bef98..be910cb29 100644 --- a/src/apm_cli/adapters/client/cursor.py +++ b/src/apm_cli/adapters/client/cursor.py @@ -26,6 +26,8 @@ class CursorClientAdapter(CopilotClientAdapter): """ 
supports_user_scope: bool = False + target_name: str = "cursor" + mcp_servers_key: str = "mcpServers" # ------------------------------------------------------------------ # # Config path diff --git a/src/apm_cli/adapters/client/gemini.py b/src/apm_cli/adapters/client/gemini.py index 27c60cd25..9e2e0a0b2 100644 --- a/src/apm_cli/adapters/client/gemini.py +++ b/src/apm_cli/adapters/client/gemini.py @@ -44,6 +44,8 @@ class GeminiClientAdapter(CopilotClientAdapter): """ supports_user_scope: bool = True + target_name: str = "gemini" + mcp_servers_key: str = "mcpServers" def get_config_path(self): """Return the path to ``.gemini/settings.json`` in the repository root.""" diff --git a/src/apm_cli/adapters/client/opencode.py b/src/apm_cli/adapters/client/opencode.py index 6e547e3cd..3f4b12233 100644 --- a/src/apm_cli/adapters/client/opencode.py +++ b/src/apm_cli/adapters/client/opencode.py @@ -41,6 +41,8 @@ class OpenCodeClientAdapter(CopilotClientAdapter): """ supports_user_scope: bool = False + target_name: str = "opencode" + mcp_servers_key: str = "mcpServers" def get_config_path(self): """Return the path to ``opencode.json`` in the repository root.""" diff --git a/src/apm_cli/adapters/client/vscode.py b/src/apm_cli/adapters/client/vscode.py index 5a6643957..d129d47ac 100644 --- a/src/apm_cli/adapters/client/vscode.py +++ b/src/apm_cli/adapters/client/vscode.py @@ -29,6 +29,9 @@ class VSCodeClientAdapter(MCPClientAdapter): in the VSCode documentation. """ + target_name: str = "vscode" + mcp_servers_key: str = "servers" + def __init__( self, registry_url=None, diff --git a/src/apm_cli/adapters/client/windsurf.py b/src/apm_cli/adapters/client/windsurf.py new file mode 100644 index 000000000..cdd7615be --- /dev/null +++ b/src/apm_cli/adapters/client/windsurf.py @@ -0,0 +1,42 @@ +"""Windsurf/Cascade implementation of MCP client adapter. + +Windsurf uses the standard ``mcpServers`` JSON format at +``~/.codeium/windsurf/mcp_config.json`` (global). 
The config schema is +identical to GitHub Copilot CLI, so this adapter subclasses +:class:`CopilotClientAdapter` and only overrides the config-path logic +and the ``_client_label`` used in log messages. + +Ref: https://docs.windsurf.com/windsurf/cascade/mcp +""" + +from pathlib import Path + +from .copilot import CopilotClientAdapter + + +class WindsurfClientAdapter(CopilotClientAdapter): + """Windsurf/Cascade MCP client adapter. + + Inherits all config formatting and MCP server configuration logic + from :class:`CopilotClientAdapter` (``mcpServers`` JSON with + ``command``/``args``/``env``). Only the config-file location and + the user-facing label differ. + """ + + supports_user_scope: bool = True + _client_label: str = "Windsurf" + target_name: str = "windsurf" + mcp_servers_key: str = "mcpServers" + + # ------------------------------------------------------------------ # + # Config path + # ------------------------------------------------------------------ # + + def get_config_path(self) -> str: + """Return the path to ``~/.codeium/windsurf/mcp_config.json``. + + This is a **global** config path -- Windsurf reads MCP server + definitions from the user-level directory, not the workspace. + """ + windsurf_dir = Path.home() / ".codeium" / "windsurf" + return str(windsurf_dir / "mcp_config.json") diff --git a/src/apm_cli/bundle/lockfile_enrichment.py b/src/apm_cli/bundle/lockfile_enrichment.py index 8d523eb28..04e0e0804 100644 --- a/src/apm_cli/bundle/lockfile_enrichment.py +++ b/src/apm_cli/bundle/lockfile_enrichment.py @@ -5,18 +5,7 @@ from typing import Dict, List, Tuple, Union # noqa: F401, UP035 from ..deps.lockfile import LockFile - -# Authoritative mapping of target names to deployed-file path prefixes. 
-_TARGET_PREFIXES = { - "copilot": [".github/"], - "vscode": [".github/"], - "claude": [".claude/"], - "cursor": [".cursor/"], - "opencode": [".opencode/"], - "codex": [".codex/", ".agents/"], - "agent-skills": [".agents/"], - "all": [".github/", ".claude/", ".cursor/", ".opencode/", ".codex/", ".agents/"], -} +from ..integration.targets import KNOWN_TARGETS # Cross-target path equivalences for skills/ and agents/ directories. # Only these two directory types are semantically identical across targets; @@ -27,6 +16,10 @@ # maps FROM .claude/ for the common case of Claude-first projects packing # for Copilot. Cursor/opencode sources are niche; if someone publishes # skills exclusively under .cursor/, they must pack with --target cursor. +# +# Windsurf converts agents -> skills (lossy: AGENTS.md format is collapsed +# into the windsurf skill envelope), so .github/agents/ maps to +# .windsurf/skills/. _CROSS_TARGET_MAPS: dict[str, dict[str, str]] = { "claude": { ".github/skills/": ".claude/skills/", @@ -52,12 +45,62 @@ ".github/skills/": ".agents/skills/", ".github/agents/": ".codex/agents/", }, + "windsurf": { + ".github/skills/": ".windsurf/skills/", + ".github/agents/": ".windsurf/skills/", + }, "agent-skills": { ".github/skills/": ".agents/skills/", }, } +def _all_target_prefixes() -> list[str]: + """Union of pack prefixes for every real (deployable) target. + + A target is considered deployable when ``detect_by_dir`` or + ``auto_create`` is True; ``copilot-cowork`` (both False) is excluded + because it is an opt-in pseudo-target. + + Order is stable: KNOWN_TARGETS insertion order, with deduplication + preserving first occurrence. This keeps downstream YAML deterministic. 
+ """ + prefixes: list[str] = [] + seen: set[str] = set() + for profile in KNOWN_TARGETS.values(): + if not (profile.detect_by_dir or profile.auto_create): + continue + for prefix in profile.effective_pack_prefixes: + if prefix not in seen: + seen.add(prefix) + prefixes.append(prefix) + return prefixes + + +def _get_target_prefixes(target: str) -> list[str]: + """Resolve pack-prefixes for a single target name. + + Reads from ``KNOWN_TARGETS[target].effective_pack_prefixes``. Special + cases: + + * ``"all"`` -- union of every deployable target's prefixes (see + :func:`_all_target_prefixes`). + * ``"vscode"`` -- treated as an alias for ``"copilot"`` (both deploy + to ``.github/``); kept for backward compatibility because + ``vscode`` is a valid MCP-only adapter target_name. + * Unknown targets -- fall back to the union, matching the previous + behavior of falling through to the all-targets default. + """ + if target == "all": + return _all_target_prefixes() + if target == "vscode": + return list(KNOWN_TARGETS["copilot"].effective_pack_prefixes) + profile = KNOWN_TARGETS.get(target) + if profile is None: + return _all_target_prefixes() + return list(profile.effective_pack_prefixes) + + def _filter_files_by_target( deployed_files: list[str], target: str | list[str] ) -> tuple[list[str], dict[str, str]]: @@ -81,7 +124,7 @@ def _filter_files_by_target( prefixes: list[str] = [] seen_prefixes: set = set() for t in target: - for p in _TARGET_PREFIXES.get(t, []): + for p in _get_target_prefixes(t): if p not in seen_prefixes: seen_prefixes.add(p) prefixes.append(p) @@ -94,7 +137,7 @@ def _filter_files_by_target( for t in target: cross_map.update(_CROSS_TARGET_MAPS.get(t, {})) else: - prefixes = _TARGET_PREFIXES.get(target, _TARGET_PREFIXES["all"]) + prefixes = _get_target_prefixes(target) cross_map = _CROSS_TARGET_MAPS.get(target, {}) direct = [f for f in deployed_files if any(f.startswith(p) for p in prefixes)] diff --git a/src/apm_cli/commands/compile/cli.py 
b/src/apm_cli/commands/compile/cli.py index dc8a64256..d61f3e0a9 100644 --- a/src/apm_cli/commands/compile/cli.py +++ b/src/apm_cli/commands/compile/cli.py @@ -175,64 +175,80 @@ def _resolve_compile_target(target): collapsing to ``"all"`` (which would incorrectly generate files for every family). + Family resolution reads ``TargetProfile.compile_family`` from + ``KNOWN_TARGETS`` so adding a new compile-eligible target only + requires populating that field. The CLI alias ``"vscode"`` is + treated as ``"copilot"`` for this purpose. + Args: target: A single target string, a list of target strings, or ``None``. Returns: A single string, a ``frozenset`` of compiler families, or ``None``. """ + from ...integration.targets import KNOWN_TARGETS + if target is None: return None # will trigger detect_target() auto-detection if isinstance(target, list): target_set = set(target) - # Strip agent-skills from the family-resolution set -- it is a - # deployment-only target with no compilation output. - target_set.discard("agent-skills") + # Strip targets with no compile output (compile_family is None); + # they would silently fall through the family resolution otherwise. + # ``vscode`` is a CLI alias for ``copilot`` and shares its profile. + skip = {name for name, profile in KNOWN_TARGETS.items() if profile.compile_family is None} + target_set -= skip if not target_set: - # Solo agent-skills in a list -- pass through as a string so - # the compiler's no-op path fires. - return "agent-skills" - # Two distinct families overlap on copilot/vscode/agents: - # copilot_family -> requests .github/copilot-instructions.md AND AGENTS.md - # agents_md_family -> requests AGENTS.md only (cursor/opencode/codex) - # Splitting these prevents the over-fire bug where -t cursor,claude or - # -t cursor,opencode,codex used to incorrectly emit copilot-instructions.md. 
- copilot_family = {"copilot", "vscode", "agents"} - agents_md_family = {"cursor", "opencode", "codex"} - has_copilot = bool(target_set & copilot_family) - has_agents_md_only = bool(target_set & agents_md_family) - has_claude = "claude" in target_set - has_gemini = "gemini" in target_set - families = set() - if has_copilot: - families.add("vscode") # gates copilot-instructions.md - families.add("agents") # also gates AGENTS.md - elif has_agents_md_only: - families.add("agents") # AGENTS.md only -- no copilot-instructions - if has_claude: - families.add("claude") - if has_gemini: - families.add("gemini") + # Solo agent-skills (or another no-compile target) in a list -- + # pass through as a string so the compiler's no-op path fires. + for sentinel in target: + if sentinel in skip: + return sentinel + return None + + # The "vscode" family handles copilot AND emits AGENTS.md as a + # bonus; the "agents" family emits AGENTS.md only. When both + # appear in a multi-target compile we still need both family + # tokens so the agents compiler routes correctly. + def _family_of(name: str) -> str | None: + if name == "vscode": + return "vscode" + profile = KNOWN_TARGETS.get(name) + return profile.compile_family if profile else None + + families: set[str] = set() + for name in target_set: + family = _family_of(name) + if family is None: + continue + families.add(family) + if family == "vscode": + # copilot also emits AGENTS.md; mirror legacy behavior. + families.add("agents") + if len(families) >= 2: - # Single-target copilot collapses {"vscode","agents"} to bare "vscode" - # for routing parity with single-string -t copilot. + # Single-target copilot collapses {"vscode","agents"} to bare + # "vscode" for routing parity with single-string -t copilot. 
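The refactored family resolution above can be sketched standalone. This is a simplified model, not the shipped code: `COMPILE_FAMILY` is a hypothetical flat stand-in for `KNOWN_TARGETS[...].compile_family`, and the solo no-compile sentinel pass-through is reduced to a one-liner.

```python
# Hypothetical minimal registry: target name -> compile family
# (None = deployment-only target with no compile output).
COMPILE_FAMILY = {
    "copilot": "vscode", "claude": "claude", "gemini": "gemini",
    "cursor": "agents", "opencode": "agents", "codex": "agents",
    "windsurf": "agents", "agent-skills": None,
}

def resolve_families(targets):
    target_set = {t for t in targets if COMPILE_FAMILY.get(t) is not None}
    if not target_set:
        # solo no-compile target passes through as a sentinel (simplified)
        return targets[0] if targets else None
    families = set()
    for name in target_set:
        family = COMPILE_FAMILY[name]
        families.add(family)
        if family == "vscode":
            families.add("agents")  # copilot also emits AGENTS.md
    if len(families) >= 2:
        # single-target copilot collapses {"vscode","agents"} to bare "vscode"
        if families == {"vscode", "agents"}:
            return "vscode"
        return frozenset(families)
    for fam in ("claude", "gemini", "vscode"):
        if fam in families:
            return fam
    # bare agents-family target: preserve the original name, registry order
    for name, family in COMPILE_FAMILY.items():
        if family == "agents" and name in target_set:
            return name
    return None
```

Note how `-t cursor,claude` now yields two family tokens instead of over-firing the copilot family, which was the bug the split fixed.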
if families == {"vscode", "agents"}: return "vscode" return frozenset(families) - elif has_claude: + if "claude" in families: return "claude" - elif has_gemini: + if "gemini" in families: return "gemini" - elif has_copilot: + if "vscode" in families: return "vscode" - else: - # cursor/opencode/codex only -- preserve the bare target name so - # single-element list routing matches single-string semantics - # (-t cursor and -t cursor both end up as "cursor"). - for bare in ("cursor", "opencode", "codex"): - if bare in target_set: - return bare - return "vscode" # defensive fallback (unreachable) + # Bare agents-family target: preserve the original target name so + # single-element list routing matches single-string semantics + # (-t cursor and -t [cursor] both end up as "cursor"). Iterate + # KNOWN_TARGETS in insertion order so priority ties (e.g. + # ["opencode","codex"]) resolve deterministically to the + # earliest-registered target. Adding a new agents-family + # target (e.g. zed, cline) costs zero edits here -- it inherits + # whatever priority position it occupies in the registry. + for name, profile in KNOWN_TARGETS.items(): + if profile.compile_family == "agents" and name in target_set: + return name + return "vscode" # defensive fallback (unreachable) return target # single string pass-through @@ -248,7 +264,7 @@ def _resolve_compile_target(target): "-t", type=TargetParamType(), default=None, - help="Target platform (comma-separated). Values: copilot, claude, cursor, opencode, codex, gemini, agent-skills, all. 'agent-skills' deploys to .agents/skills/ (cross-client). 'all' = copilot+claude+cursor+opencode+codex+gemini (excludes agent-skills); combine with 'agent-skills' for both.", + help="Target platform (comma-separated). Values: copilot, claude, cursor, opencode, codex, gemini, windsurf, agent-skills, all. 'agent-skills' deploys to .agents/skills/ (cross-client). 
'all' = copilot+claude+cursor+opencode+codex+gemini+windsurf (excludes agent-skills); combine with 'agent-skills' for both.", ) @click.option( "--dry-run", diff --git a/src/apm_cli/commands/install.py b/src/apm_cli/commands/install.py index 83c0cd252..30da0c163 100644 --- a/src/apm_cli/commands/install.py +++ b/src/apm_cli/commands/install.py @@ -838,7 +838,7 @@ def _handle_mcp_install( "target", type=TargetParamType(), default=None, - help="Target platform (comma-separated). Values: copilot, claude, cursor, opencode, codex, gemini, agent-skills, all. 'agent-skills' deploys to .agents/skills/ (cross-client). 'all' = copilot+claude+cursor+opencode+codex+gemini (excludes agent-skills); combine with 'agent-skills' for both.", + help="Target platform (comma-separated). Values: copilot, claude, cursor, opencode, codex, gemini, windsurf, agent-skills, all. 'agent-skills' deploys to .agents/skills/ (cross-client). 'all' = copilot+claude+cursor+opencode+codex+gemini+windsurf (excludes agent-skills); combine with 'agent-skills' for both.", ) @click.option( "--allow-insecure", diff --git a/src/apm_cli/commands/pack.py b/src/apm_cli/commands/pack.py index a1054c9f6..cdf543884 100644 --- a/src/apm_cli/commands/pack.py +++ b/src/apm_cli/commands/pack.py @@ -65,7 +65,7 @@ "-t", type=TargetParamType(), default=None, - help="Target platform (comma-separated). Values: copilot, claude, cursor, opencode, codex, gemini, agent-skills, all. 'agent-skills' deploys to .agents/skills/ (cross-client). 'all' = copilot+claude+cursor+opencode+codex+gemini (excludes agent-skills); combine with 'agent-skills' for both.", + help="Target platform (comma-separated). Values: copilot, claude, cursor, opencode, codex, gemini, windsurf, agent-skills, all. 'agent-skills' deploys to .agents/skills/ (cross-client). 
'all' = copilot+claude+cursor+opencode+codex+gemini+windsurf (excludes agent-skills); combine with 'agent-skills' for both.", ) @click.option( "--archive", diff --git a/src/apm_cli/compilation/agents_compiler.py b/src/apm_cli/compilation/agents_compiler.py index f36045580..40ce31b83 100644 --- a/src/apm_cli/compilation/agents_compiler.py +++ b/src/apm_cli/compilation/agents_compiler.py @@ -46,6 +46,7 @@ "codex", "agent-skills", "gemini", + "windsurf", "all", "minimal", ) + _VSCODE_TARGET_ALIASES diff --git a/src/apm_cli/core/conflict_detector.py b/src/apm_cli/core/conflict_detector.py index 910c655b9..07523f255 100644 --- a/src/apm_cli/core/conflict_detector.py +++ b/src/apm_cli/core/conflict_detector.py @@ -90,45 +90,35 @@ def get_canonical_server_name(self, server_ref: str) -> str: def get_existing_server_configs(self) -> dict[str, Any]: """Extract all existing server configurations. + Reads the adapter's MCP servers using ``adapter.mcp_servers_key`` so + every adapter class is handled uniformly. Codex carries an extra + TOML-flat-key fallback because its config can spell entries as + ``mcp_servers.<name>`` at the top level instead of nested under a + ``mcp_servers`` table. + Returns: Dictionary of existing server configurations keyed by server name. 
""" - # Get fresh config each time existing_config = self.adapter.get_current_config() - - # Determine runtime type from adapter class name or type - adapter_class_name = getattr(self.adapter, "__class__", type(self.adapter)).__name__.lower() - - if "copilot" in adapter_class_name: - return existing_config.get("mcpServers", {}) - elif "codex" in adapter_class_name: - # Extract mcp_servers section from TOML config, handling both nested and flat formats - servers = {} - - # Direct mcp_servers section - if "mcp_servers" in existing_config: - servers.update(existing_config["mcp_servers"]) - - # Handle TOML-style nested keys like 'mcp_servers.github' and 'mcp_servers."quoted-name"' - for key, value in existing_config.items(): - if key.startswith("mcp_servers."): - # Extract server name from key - server_name = key[len("mcp_servers.") :] - # Remove quotes if present - if server_name.startswith('"') and server_name.endswith('"'): - server_name = server_name[1:-1] - - # Only add if it looks like server config (has command or args) - if isinstance(value, dict) and ("command" in value or "args" in value): - servers[server_name] = value - - return servers - elif "vscode" in adapter_class_name: - return existing_config.get("servers", {}) - elif "claude" in adapter_class_name: - return existing_config.get("mcpServers", {}) - - return {} + key = self.adapter.mcp_servers_key + if not key: + return {} + + servers: dict[str, Any] = dict(existing_config.get(key, {}) or {}) + + if key == "mcp_servers": + # Codex TOML quirk: handle ``mcp_servers."name"`` flat keys in + # addition to the nested table. 
+ for raw_key, value in existing_config.items(): + if not raw_key.startswith("mcp_servers."): + continue + server_name = raw_key[len("mcp_servers.") :] + if server_name.startswith('"') and server_name.endswith('"'): + server_name = server_name[1:-1] + if isinstance(value, dict) and ("command" in value or "args" in value): + servers[server_name] = value + + return servers def get_conflict_summary(self, server_reference: str) -> dict[str, Any]: """Get detailed information about a conflict. diff --git a/src/apm_cli/core/target_detection.py b/src/apm_cli/core/target_detection.py index 9efc091ce..6e4ca9c1e 100644 --- a/src/apm_cli/core/target_detection.py +++ b/src/apm_cli/core/target_detection.py @@ -51,7 +51,16 @@ def agents_alias_was_detected() -> bool: # Valid target values (internal canonical form) TargetType = Literal[ - "vscode", "claude", "cursor", "opencode", "codex", "gemini", "agent-skills", "all", "minimal" + "vscode", + "claude", + "cursor", + "opencode", + "codex", + "gemini", + "windsurf", + "agent-skills", + "all", + "minimal", ] # Compiler families used inside a multi-target frozenset. 
Narrower than @@ -87,6 +96,7 @@ def agents_alias_was_detected() -> bool: "opencode", "codex", "gemini", + "windsurf", "agent-skills", "all", "minimal", @@ -124,6 +134,8 @@ def detect_target( # noqa: PLR0911 return "codex", "explicit --target flag" elif explicit_target == "gemini": return "gemini", "explicit --target flag" + elif explicit_target == "windsurf": + return "windsurf", "explicit --target flag" elif explicit_target == "agent-skills": return "agent-skills", "explicit --target flag" elif explicit_target == "all": @@ -143,6 +155,8 @@ def detect_target( # noqa: PLR0911 return "codex", "apm.yml target" elif config_target == "gemini": return "gemini", "apm.yml target" + elif config_target == "windsurf": + return "windsurf", "apm.yml target" elif config_target == "agent-skills": return "agent-skills", "apm.yml target" elif config_target == "all": @@ -155,6 +169,7 @@ def detect_target( # noqa: PLR0911 opencode_exists = (project_root / ".opencode").is_dir() codex_exists = (project_root / ".codex").is_dir() gemini_exists = (project_root / ".gemini").is_dir() + windsurf_exists = (project_root / ".windsurf").is_dir() detected = [] if github_exists: detected.append(".github/") @@ -168,6 +183,8 @@ def detect_target( # noqa: PLR0911 detected.append(".codex/") if gemini_exists: detected.append(".gemini/") + if windsurf_exists: + detected.append(".windsurf/") if len(detected) >= 2: return "all", f"detected {' and '.join(detected)} folders" @@ -183,6 +200,8 @@ def detect_target( # noqa: PLR0911 return "codex", "detected .codex/ folder" elif gemini_exists: return "gemini", "detected .gemini/ folder" + elif windsurf_exists: + return "windsurf", "detected .windsurf/ folder" else: return "minimal", REASON_NO_TARGET_FOLDER @@ -202,7 +221,7 @@ def should_compile_agents_md(target: CompileTargetType) -> bool: """ if isinstance(target, frozenset): return "agents" in target or "gemini" in target - return target in ("vscode", "opencode", "codex", "gemini", "all", "minimal") + return 
target in ("vscode", "opencode", "codex", "gemini", "windsurf", "all", "minimal") def should_compile_claude_md(target: CompileTargetType) -> bool: @@ -279,8 +298,9 @@ def get_target_description(target: UserTargetType) -> str: "opencode": "AGENTS.md + .opencode/agents/ + .opencode/commands/ + .opencode/skills/", "codex": "AGENTS.md + .agents/skills/ + .codex/agents/ + .codex/hooks.json", "gemini": "GEMINI.md + .gemini/commands/ + .gemini/skills/ + .gemini/settings.json (MCP/hooks)", + "windsurf": "AGENTS.md + .windsurf/rules/ + .windsurf/skills/ + .windsurf/workflows/ + .windsurf/hooks.json", "agent-skills": ".agents/skills/ only (cross-client shared skills -- no agents, hooks, or commands)", - "all": "AGENTS.md + CLAUDE.md + GEMINI.md + .github/copilot-instructions.md + .github/ + .claude/ + .cursor/ + .opencode/ + .codex/ + .gemini/ + .agents/", + "all": "AGENTS.md + CLAUDE.md + GEMINI.md + .github/copilot-instructions.md + .github/ + .claude/ + .cursor/ + .opencode/ + .codex/ + .gemini/ + .windsurf/ + .agents/", "minimal": "AGENTS.md only (create .github/, .claude/, or .gemini/ for full integration)", } return descriptions.get(normalized, "unknown target") @@ -292,7 +312,9 @@ def get_target_description(target: UserTargetType) -> str: #: The complete set of real (non-pseudo) canonical targets. #: "minimal" is intentionally excluded -- it is a fallback pseudo-target. 
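The folder-based auto-detection branch above (now including `.windsurf/`) reduces to a table walk. A sketch under assumptions: `DETECT_DIRS` is a hypothetical mapping (the `.github` entry in particular is an assumption, since that branch is outside the hunk), and the real `detect_target` checks explicit flags and `apm.yml` before falling back to folders.

```python
from pathlib import Path

# Hypothetical dir -> canonical target mapping for the fallback branch.
DETECT_DIRS = {
    ".github": "vscode", ".claude": "claude", ".cursor": "cursor",
    ".opencode": "opencode", ".codex": "codex", ".gemini": "gemini",
    ".windsurf": "windsurf",
}

def detect_by_folders(project_root: Path) -> tuple[str, str]:
    """Return (target, reason) from which client folders exist."""
    detected = [d for d in DETECT_DIRS if (project_root / d).is_dir()]
    if len(detected) >= 2:
        # Two or more client folders -> deploy to all of them
        return "all", "detected " + " and ".join(f"{d}/" for d in detected) + " folders"
    if detected:
        return DETECT_DIRS[detected[0]], f"detected {detected[0]}/ folder"
    return "minimal", "no target folder detected"
```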
-ALL_CANONICAL_TARGETS = frozenset({"vscode", "claude", "cursor", "opencode", "codex", "gemini"}) +ALL_CANONICAL_TARGETS = frozenset( + {"vscode", "claude", "cursor", "opencode", "codex", "gemini", "windsurf"} +) #: Targets that the parser must accept but that are gated at runtime by #: ``is_enabled()`` in ``core/experimental.py`` and ``_flag_gated()`` in @@ -495,7 +517,7 @@ def parse_target_field( # preserves the long-standing CLI contract where ``--target copilot`` # yields ``"copilot"`` rather than the canonical ``"vscode"``; every # downstream consumer (active_targets, agents_compiler, - # _CROSS_TARGET_MAPS, _TARGET_PREFIXES) already accepts both alias + # _CROSS_TARGET_MAPS, _get_target_prefixes) already accepts both alias # spellings, so resolving here would be a visible behaviour change # with zero functional benefit and would break the CLI test suite # (~10 ``test_single_*`` cases). This is the one asymmetry #820's diff --git a/src/apm_cli/factory.py b/src/apm_cli/factory.py index 2efb5cff5..cfeabaf7f 100644 --- a/src/apm_cli/factory.py +++ b/src/apm_cli/factory.py @@ -9,8 +9,26 @@ from .adapters.client.gemini import GeminiClientAdapter from .adapters.client.opencode import OpenCodeClientAdapter from .adapters.client.vscode import VSCodeClientAdapter +from .adapters.client.windsurf import WindsurfClientAdapter from .adapters.package_manager.default_manager import DefaultMCPPackageManager +# Single source of truth for MCP client adapter registration. Adding a +# new MCP-capable target means a single dict entry here -- callers that +# need "every MCP runtime APM supports" should iterate +# ``ClientFactory.supported_clients()`` rather than maintain parallel +# lists (see ``mcp_integrator`` runtime loops for the canonical +# consumers). 
+_MCP_CLIENT_REGISTRY: dict[str, type] = { + "copilot": CopilotClientAdapter, + "vscode": VSCodeClientAdapter, + "codex": CodexClientAdapter, + "cursor": CursorClientAdapter, + "gemini": GeminiClientAdapter, + "opencode": OpenCodeClientAdapter, + "windsurf": WindsurfClientAdapter, + "claude": ClaudeClientAdapter, +} + class ClientFactory: """Factory for creating MCP client adapters.""" @@ -35,25 +53,26 @@ def create_client( Raises: ValueError: If the client type is not supported. """ - clients = { - "copilot": CopilotClientAdapter, - "vscode": VSCodeClientAdapter, - "codex": CodexClientAdapter, - "cursor": CursorClientAdapter, - "gemini": GeminiClientAdapter, - "opencode": OpenCodeClientAdapter, - "claude": ClaudeClientAdapter, - # Add more clients as needed - } - - if client_type.lower() not in clients: + key = client_type.lower() + if key not in _MCP_CLIENT_REGISTRY: raise ValueError(f"Unsupported client type: {client_type}") - return clients[client_type.lower()]( + return _MCP_CLIENT_REGISTRY[key]( project_root=project_root, user_scope=user_scope, ) + @staticmethod + def supported_clients() -> frozenset[str]: + """Return the set of MCP client target names supported by APM. + + This is the canonical "what MCP runtimes does APM know about?" + query. Use this from any module that iterates over MCP-capable + targets (e.g. cleanup loops, availability probes) instead of + hand-maintaining a parallel list. 
+ """ + return frozenset(_MCP_CLIENT_REGISTRY.keys()) + class PackageManagerFactory: """Factory for creating MCP package manager adapters.""" diff --git a/src/apm_cli/install/services.py b/src/apm_cli/install/services.py index 1f145817c..10edbf959 100644 --- a/src/apm_cli/install/services.py +++ b/src/apm_cli/install/services.py @@ -207,12 +207,8 @@ def _log_integration(msg): elif _prim_name == "instructions": _label = "instruction(s)" elif _prim_name == "hooks": - if _target.name == "claude": - _deploy_dir = ".claude/settings.json" - elif _target.name == "cursor": - _deploy_dir = ".cursor/hooks.json" - elif _target.name == "codex": - _deploy_dir = ".codex/hooks.json" + if _target.hooks_config_display: + _deploy_dir = _target.hooks_config_display _label = "hook(s)" else: _label = _prim_name diff --git a/src/apm_cli/integration/agent_integrator.py b/src/apm_cli/integration/agent_integrator.py index 9faf07e16..0436d027e 100644 --- a/src/apm_cli/integration/agent_integrator.py +++ b/src/apm_cli/integration/agent_integrator.py @@ -11,6 +11,8 @@ from pathlib import Path from typing import TYPE_CHECKING, Dict, List # noqa: F401, UP035 +import yaml + from apm_cli.integration.base_integrator import BaseIntegrator, IntegrationResult from apm_cli.utils.paths import portable_relpath @@ -150,6 +152,10 @@ def integrate_agents_for_target( if mapping.format_id == "codex_agent": self._write_codex_agent(source_file, target_path) links_resolved = 0 + elif mapping.format_id == "windsurf_agent_skill": + links_resolved = self._write_windsurf_agent_skill( + source_file, target_path, diagnostics=diagnostics + ) else: links_resolved = self.copy_agent(source_file, target_path) total_links_resolved += links_resolved @@ -256,8 +262,6 @@ def _write_codex_agent(source: Path, target: Path) -> None: if fm_match: body = content[fm_match.end() :] try: - import yaml - fm = yaml.safe_load(fm_match.group(1)) or {} name = fm.get("name", name) description = fm.get("description", description) @@ -271,6 
+275,75 @@ def _write_codex_agent(source: Path, target: Path) -> None: } target.write_text(_toml.dumps(doc), encoding="utf-8") + # ------------------------------------------------------------------ + # Windsurf agent-skill transformer (agent.md -> skills/<name>/SKILL.md) + # ------------------------------------------------------------------ + + def _write_windsurf_agent_skill( + self, source: Path, target: Path, diagnostics=None + ) -> int: # not @staticmethod: needs self.resolve_links() + """Transform an ``.agent.md`` file to a Windsurf Skill (``SKILL.md``). + + Windsurf Skills are the closest equivalent to a specialist persona: + - Invocable with ``@skill-name`` (like ``@agent-name`` in Copilot) + - Auto-invoked by Cascade when the description matches the task + - Support a directory with supplementary resource files + + The conversion: + - Keeps ``name`` (or derives from filename) and ``description``. + - Strips agent-specific keys (``model``, ``tools``) and emits a + diagnostic warning when those fields are dropped. + - Preserves the markdown body verbatim. 
+ """ + content = source.read_text(encoding="utf-8") + + stem = source.name + if stem.endswith(".agent.md"): + stem = stem[:-9] + elif stem.endswith(".chatmode.md"): + stem = stem[:-12] + else: + stem = Path(stem).stem + + fm_match = AgentIntegrator._FRONTMATTER_RE.match(content) + if fm_match: + body = content[fm_match.end() :] + try: + fm = yaml.safe_load(fm_match.group(1)) or {} + except Exception: + fm = {} + else: + body = content + fm = {} + + dropped = [k for k in ("tools", "model") if fm.get(k)] + if dropped and diagnostics is not None: + diagnostics.warn( + f"Windsurf skill conversion dropped frontmatter field(s) " + f"{', '.join(dropped)} from {source.name}", + detail="Windsurf Skills do not support agent-only fields; " + "only name, description, and body are preserved.", + ) + + name = fm.get("name", stem) + description = fm.get("description", "") + + # Use yaml.safe_dump to safely serialize values -- prevents YAML key + # injection via multi-line name/description strings. + + fm_data: dict = {"name": name} + if description: + fm_data["description"] = description + fm_yaml = yaml.safe_dump( # yaml-io-exempt: serializes to string, not file handle + fm_data, default_flow_style=False, allow_unicode=True + ).rstrip("\n") + + result = f"---\n{fm_yaml}\n---\n" + body + result, links_resolved = self.resolve_links(result, source, target) + target.parent.mkdir(parents=True, exist_ok=True) + target.write_text(result, encoding="utf-8") + return links_resolved + # DEPRECATED: use integrate_agents_for_target(KNOWN_TARGETS["copilot"], ...) instead. 
def integrate_package_agents( self, diff --git a/src/apm_cli/integration/hook_integrator.py b/src/apm_cli/integration/hook_integrator.py index a2af1e8dc..5c7f2475a 100644 --- a/src/apm_cli/integration/hook_integrator.py +++ b/src/apm_cli/integration/hook_integrator.py @@ -52,6 +52,7 @@ from typing import Dict, List, Optional, Tuple # noqa: F401, UP035 from apm_cli.integration.base_integrator import BaseIntegrator, IntegrationResult +from apm_cli.utils.path_security import ensure_path_within from apm_cli.utils.paths import portable_relpath _log = logging.getLogger(__name__) @@ -171,6 +172,11 @@ def _copilot_keys_to_gemini(hook: dict) -> None: target_key="gemini", require_dir=True, ), + "windsurf": _MergeHookConfig( + config_filename="hooks.json", + target_key="windsurf", + require_dir=True, + ), } @@ -183,6 +189,7 @@ def _copilot_keys_to_gemini(hook: dict) -> None: "claude-hooks": {"claude"}, "codex-hooks": {"codex"}, "gemini-hooks": {"gemini"}, + "windsurf-hooks": {"windsurf"}, } @@ -352,6 +359,9 @@ def _rewrite_command_for_target( elif target == "codex": base_root = root_dir or ".codex" scripts_base = f"{base_root}/hooks/{package_name}" + elif target == "windsurf": + base_root = root_dir or ".windsurf" + scripts_base = f"{base_root}/hooks/{package_name}" else: base_root = root_dir or ".claude" scripts_base = f"{base_root}/hooks/{package_name}" @@ -588,6 +598,7 @@ def integrate_package_hooks( # Copy referenced scripts (individual file tracking) for source_file, target_rel in scripts: target_script = project_root / target_rel + ensure_path_within(target_script, project_root) if self.check_collision( target_script, target_rel, managed_files, force, diagnostics=diagnostics ): @@ -768,6 +779,7 @@ def _integrate_merged_hooks( # Copy referenced scripts for source_file, target_rel in scripts: target_script = project_root / target_rel + ensure_path_within(target_script, project_root) if self.check_collision( target_script, target_rel, diff --git 
a/src/apm_cli/integration/instruction_integrator.py b/src/apm_cli/integration/instruction_integrator.py index 15e128ea8..01ec489ba 100644 --- a/src/apm_cli/integration/instruction_integrator.py +++ b/src/apm_cli/integration/instruction_integrator.py @@ -14,6 +14,7 @@ from typing import TYPE_CHECKING, Dict, List, Optional, Set # noqa: F401, UP035 from apm_cli.integration.base_integrator import BaseIntegrator, IntegrationResult +from apm_cli.utils.path_security import ensure_path_within from apm_cli.utils.paths import portable_relpath if TYPE_CHECKING: @@ -70,9 +71,10 @@ def integrate_instructions_for_target( Selects the content transform via ``format_id``: - * ``cursor_rules`` -- convert ``applyTo:`` to ``globs:`` frontmatter - * ``claude_rules`` -- convert ``applyTo:`` to ``paths:`` frontmatter - * anything else -- copy verbatim (identity transform) + * ``cursor_rules`` -- convert ``applyTo:`` to ``globs:`` frontmatter + * ``claude_rules`` -- convert ``applyTo:`` to ``paths:`` frontmatter + * ``windsurf_rules`` -- convert ``applyTo:`` to ``trigger: glob`` frontmatter + * anything else -- copy verbatim (identity transform) """ mapping = target.primitives.get("instructions") if not mapping: @@ -92,7 +94,7 @@ def integrate_instructions_for_target( deploy_dir.mkdir(parents=True, exist_ok=True) fmt = mapping.format_id - needs_rename = fmt in ("cursor_rules", "claude_rules") + needs_rename = fmt in ("cursor_rules", "claude_rules", "windsurf_rules") files_integrated = 0 files_skipped = 0 @@ -109,6 +111,12 @@ def integrate_instructions_for_target( target_name = source_file.name target_path = deploy_dir / target_name + # target_name is Path.name (no separators), so traversal via + # deploy_dir is impossible. Validated against deploy_dir (not + # project_root) so user-scope targets whose root resolves + # outside the workspace still work correctly. 
+ ensure_path_within(target_path, deploy_dir) + rel_path = portable_relpath(target_path, project_root) if self.check_collision( @@ -125,6 +133,8 @@ def integrate_instructions_for_target( links_resolved = self.copy_instruction_cursor(source_file, target_path) elif fmt == "claude_rules": links_resolved = self.copy_instruction_claude(source_file, target_path) + elif fmt == "windsurf_rules": + links_resolved = self.copy_instruction_windsurf(source_file, target_path) else: links_resolved = self.copy_instruction(source_file, target_path) @@ -156,6 +166,10 @@ def sync_for_target( legacy_dir = project_root / effective_root / mapping.subdir if mapping.format_id == "cursor_rules": legacy_pattern = "*.mdc" + elif mapping.format_id == "windsurf_rules": + # Do not use a broad legacy glob for Windsurf rules to avoid + # deleting user-authored .md files under .windsurf/rules/. + legacy_pattern = None elif mapping.format_id == "claude_rules": # Do not use a broad legacy glob for Claude rules to avoid # deleting user-authored .md files under .claude/rules/. @@ -318,6 +332,62 @@ def sync_integration_cursor( managed_files=managed_files, ) + # ------------------------------------------------------------------ + # Windsurf Rules (.md with trigger/globs frontmatter) + # ------------------------------------------------------------------ + + @staticmethod + def _convert_to_windsurf_rules(content: str) -> str: + """Convert APM instruction content to Windsurf rules ``.md`` format. + + Parses existing YAML frontmatter via ``yaml.safe_load``, maps + ``applyTo`` to Windsurf's ``trigger: glob`` + ``globs`` frontmatter. + Instructions without ``applyTo`` become ``trigger: always_on`` rules. + + Ref: https://docs.windsurf.com/windsurf/cascade/memories + """ + import yaml + + body = content + apply_to = "" + + # Parse existing frontmatter with yaml.safe_load (consistent with + # _write_windsurf_agent_skill and all other frontmatter parsers). 
+ fm_match = re.match(r"^---\s*\n(.*?)\n---\s*\n?", content, re.DOTALL) + if fm_match: + body = content[fm_match.end() :] + try: + fm = yaml.safe_load(fm_match.group(1)) or {} + except Exception: + fm = {} + apply_to = str(fm.get("applyTo", "")).strip() + + # Build Windsurf rules frontmatter + parts = ["---"] + if apply_to: + # Sanitize: strip newlines to prevent frontmatter injection + # via crafted applyTo values (e.g. "**\ntrigger: always_on"). + safe_apply_to = apply_to.replace("\n", " ").replace("\r", " ").strip() + parts.append("trigger: glob") + parts.append(f'globs: "{safe_apply_to}"') + else: + parts.append("trigger: always_on") + parts.append("---") + + return "\n".join(parts) + "\n\n" + body.lstrip("\n") + + def copy_instruction_windsurf(self, source: Path, target: Path) -> int: + """Copy instruction file converted to Windsurf rules format. + + Converts ``applyTo:`` to ``trigger: glob`` + ``globs:`` frontmatter + and resolves links. + """ + content = source.read_text(encoding="utf-8") + content = self._convert_to_windsurf_rules(content) + content, links_resolved = self.resolve_links(content, source, target) + target.write_text(content, encoding="utf-8") + return links_resolved + # ------------------------------------------------------------------ # Claude Code Rules (.md with paths: frontmatter) # ------------------------------------------------------------------ diff --git a/src/apm_cli/integration/mcp_integrator.py b/src/apm_cli/integration/mcp_integrator.py index eee17e639..97fb40154 100644 --- a/src/apm_cli/integration/mcp_integrator.py +++ b/src/apm_cli/integration/mcp_integrator.py @@ -458,7 +458,11 @@ def remove_stale( return # Determine which runtimes to clean, mirroring install-time logic. - all_runtimes = {"vscode", "copilot", "codex", "cursor", "opencode", "gemini", "claude"} + # Derived from ClientFactory so adding a new MCP-capable target + # extends cleanup automatically (no parallel list to maintain). 
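The `applyTo` → `trigger` mapping implemented by `_convert_to_windsurf_rules` above is compact enough to demonstrate standalone. A sketch of the same transformation, substituting a simplified line-based frontmatter parse for the `yaml.safe_load` call the real integrator uses, so it runs without PyYAML:

```python
import re


def to_windsurf_rules(content: str) -> str:
    """Convert APM ``applyTo`` frontmatter to Windsurf ``trigger``/``globs``."""
    body, apply_to = content, ""
    match = re.match(r"^---\s*\n(.*?)\n---\s*\n?", content, re.DOTALL)
    if match:
        body = content[match.end():]
        # Simplified parse: find a single applyTo: line (the real
        # integrator uses yaml.safe_load here instead).
        for line in match.group(1).splitlines():
            key, _, value = line.partition(":")
            if key.strip() == "applyTo":
                apply_to = value.strip().strip("'\"")
    if apply_to:
        # Strip newlines defensively, mirroring the injection guard above.
        safe = apply_to.replace("\n", " ").replace("\r", " ").strip()
        header = f'---\ntrigger: glob\nglobs: "{safe}"\n---'
    else:
        header = "---\ntrigger: always_on\n---"
    return header + "\n\n" + body.lstrip("\n")
```

Given `---\napplyTo: '**/*.py'\n---\n\n# Python rules`, this emits a `trigger: glob` header with `globs: "**/*.py"` and preserves the body; content with no `applyTo` falls back to `trigger: always_on`.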
+ from apm_cli.factory import ClientFactory + + all_runtimes = ClientFactory.supported_clients() if runtime: # noqa: SIM108 target_runtimes = {runtime} else: @@ -632,6 +636,31 @@ def remove_stale( exc_info=True, ) + # Clean ~/.codeium/windsurf/mcp_config.json + if "windsurf" in target_runtimes: + windsurf_mcp = Path.home() / ".codeium" / "windsurf" / "mcp_config.json" + if windsurf_mcp.exists(): + try: + import json as _json + + config = _json.loads(windsurf_mcp.read_text(encoding="utf-8")) + servers = config.get("mcpServers", {}) + removed = [n for n in expanded_stale if n in servers] + for name in removed: + del servers[name] + if removed: + windsurf_mcp.write_text(_json.dumps(config, indent=2), encoding="utf-8") + for name in removed: + _rich_success( + f"Removed stale MCP server '{name}' from Windsurf config", + symbol="check", + ) + except Exception: + _log.debug( + "Failed to clean stale MCP servers from Windsurf config", + exc_info=True, + ) + # Clean .gemini/settings.json (only if .gemini/ directory exists) if "gemini" in target_runtimes: gemini_cfg = Path.cwd() / ".gemini" / "settings.json" @@ -779,6 +808,8 @@ def _detect_runtimes(scripts: dict) -> list[str]: detected.add("claude") if re.search(r"\bllm\b", command): detected.add("llm") + if re.search(r"\bwindsurf\b", command): + detected.add("windsurf") return builtins.list(detected) @@ -811,10 +842,11 @@ def _filter_runtimes(detected_runtimes: list[str]) -> list[str]: return available except ImportError: + # Derived from ClientFactory; see _MCP_CLIENT_REGISTRY. 
+ from apm_cli.factory import ClientFactory + mcp_compatible = [ - rt - for rt in detected_runtimes - if rt in ["vscode", "copilot", "codex", "cursor", "opencode", "gemini", "claude"] + rt for rt in detected_runtimes if rt in ClientFactory.supported_clients() ] return [rt for rt in mcp_compatible if shutil.which(rt)] @@ -886,7 +918,7 @@ def _install_for_runtime( except ValueError as e: logger.warning(f"Runtime {runtime} not supported: {e}") logger.progress( - "Supported runtimes: vscode, copilot, codex, cursor, opencode, gemini, claude, llm" + "Supported runtimes: vscode, copilot, codex, cursor, opencode, gemini, claude, windsurf, llm" ) return False except Exception as e: @@ -1061,6 +1093,7 @@ def install( "cursor", "opencode", "gemini", + "windsurf", "claude", ]: try: @@ -1083,6 +1116,11 @@ def install( if (Path.cwd() / ".gemini").is_dir(): ClientFactory.create_client(runtime_name) installed_runtimes.append(runtime_name) + elif runtime_name == "windsurf": + # Windsurf is opt-in: only target when .windsurf/ exists + if (project_root_path / ".windsurf").is_dir(): + ClientFactory.create_client(runtime_name) + installed_runtimes.append(runtime_name) elif runtime_name == "claude": # Claude Code is opt-in: target when .claude/ exists # in the project (project-scope writes) OR when the @@ -1117,6 +1155,9 @@ def install( # Gemini CLI is directory-presence based if (Path.cwd() / ".gemini").is_dir(): installed_runtimes.append("gemini") + # Windsurf is directory-presence based + if (project_root_path / ".windsurf").is_dir(): + installed_runtimes.append("windsurf") # Claude Code: directory-presence OR binary-on-PATH if (project_root_path / ".claude").is_dir() or (shutil.which("claude") is not None): installed_runtimes.append("claude") diff --git a/src/apm_cli/integration/targets.py b/src/apm_cli/integration/targets.py index 187f65ead..f8940e725 100644 --- a/src/apm_cli/integration/targets.py +++ b/src/apm_cli/integration/targets.py @@ -131,6 +131,46 @@ class TargetProfile: 
``copilot-instructions.md`` file. """ + # -- subsystem-specific metadata (single source of truth) ----------------- + # + # The four fields below centralize per-target knowledge that previously + # lived in scattered module-local dicts and ``if/elif`` chains + # (see ``bundle/lockfile_enrichment.py``, ``core/conflict_detector.py``, + # ``commands/compile/cli.py``, ``install/services.py``). Adding a new + # target now requires only a single ``KNOWN_TARGETS`` entry. + + pack_prefixes: tuple[str, ...] = () + """Path prefixes that identify this target's deployed files when packing. + + When empty, ``bundle.lockfile_enrichment`` derives ``(f"{root_dir}/",)`` + from :attr:`root_dir`. Override only when the target deploys to multiple + top-level directories (e.g. Codex deploys both ``.codex/`` and + ``.agents/``). + """ + + compile_family: str | None = None + """Compiler family this target belongs to for ``apm compile`` routing. + + Recognised values: + + * ``"vscode"`` -- emits ``.github/copilot-instructions.md`` *and* AGENTS.md. + * ``"claude"`` -- emits ``CLAUDE.md`` and ``.claude/rules/`` files. + * ``"gemini"`` -- emits ``GEMINI.md``. + * ``"agents"`` -- emits AGENTS.md only (cursor, opencode, codex, windsurf). + * ``None`` -- target has no compile output (agent-skills, copilot-cowork). + + Used by :func:`apm_cli.commands.compile.cli._resolve_compile_target` to + derive multi-target routing from the registry instead of hard-coded sets. + """ + + hooks_config_display: str | None = None + """Human-readable path shown in the install log for hooks integration. + + e.g. ``".claude/settings.json"`` for Claude (hooks merge into a settings + file rather than landing in their own subdir). When ``None``, the + install log falls back to the generic ``"{root}/{subdir}/"`` formula. + """ + @property def prefix(self) -> str: """Return the path prefix for this target (e.g. ``".github/"``). 
@@ -139,6 +179,15 @@ def prefix(self) -> str: """ return f"{self.root_dir}/" + @property + def effective_pack_prefixes(self) -> tuple[str, ...]: + """Return the path prefixes used by pack-time file filtering. + + Falls back to ``(self.prefix,)`` when :attr:`pack_prefixes` is empty, + so most targets need not override the field explicitly. + """ + return self.pack_prefixes if self.pack_prefixes else (self.prefix,) + def supports(self, primitive: str) -> bool: """Return ``True`` if this target accepts *primitive*.""" return primitive in self.primitives @@ -299,6 +348,7 @@ def for_scope(self, user_scope: bool = False) -> TargetProfile | None: user_root_dir=".copilot", unsupported_user_primitives=("prompts", "instructions"), generated_files=("copilot-instructions.md",), + compile_family="vscode", ), # Claude Code -- the user-level config directory is whatever # ``CLAUDE_CONFIG_DIR`` points to (default ``~/.claude``). The env @@ -320,6 +370,8 @@ def for_scope(self, user_scope: bool = False) -> TargetProfile | None: auto_create=False, detect_by_dir=True, user_supported=True, + compile_family="claude", + hooks_config_display=".claude/settings.json", ), # Cursor -- at user scope, ~/.cursor/ supports skills, agents, hooks, # and MCP. Rules/instructions are managed via Cursor Settings UI only @@ -344,6 +396,8 @@ def for_scope(self, user_scope: bool = False) -> TargetProfile | None: user_supported="partial", user_root_dir=".cursor", unsupported_user_primitives=("instructions",), + compile_family="agents", + hooks_config_display=".cursor/hooks.json", ), # OpenCode -- at user scope, ~/.config/opencode/ supports skills, agents, # and commands. OpenCode has no hooks concept, so "hooks" is excluded. 
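The fallback in `effective_pack_prefixes` is what lets most registry entries omit `pack_prefixes` entirely. A stripped-down stand-in illustrating just that behaviour (field names match the diff; everything else is elided):

```python
from dataclasses import dataclass


@dataclass
class TargetProfile:
    root_dir: str
    pack_prefixes: tuple[str, ...] = ()

    @property
    def prefix(self) -> str:
        return f"{self.root_dir}/"

    @property
    def effective_pack_prefixes(self) -> tuple[str, ...]:
        # Most targets deploy under a single root directory; only
        # multi-root targets like Codex set an explicit override.
        return self.pack_prefixes if self.pack_prefixes else (self.prefix,)
```

So `TargetProfile(".windsurf")` packs under `(".windsurf/",)` with no extra configuration, while the Codex entry's explicit `(".codex/", ".agents/")` override wins when present.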
@@ -365,6 +419,7 @@ def for_scope(self, user_scope: bool = False) -> TargetProfile | None: user_supported="partial", user_root_dir=".config/opencode", unsupported_user_primitives=("hooks",), + compile_family="agents", ), # Gemini CLI -- ~/.gemini/ is the documented user-level config directory. # Instructions are compile-only (GEMINI.md) -- Gemini CLI does not read @@ -390,6 +445,8 @@ def for_scope(self, user_scope: bool = False) -> TargetProfile | None: detect_by_dir=True, user_supported=True, user_root_dir=".gemini", + compile_family="gemini", + hooks_config_display=".gemini/settings.json", ), # Codex CLI: skills use the cross-tool .agents/ dir (agent skills standard), # agents are TOML under .codex/agents/, hooks merge into .codex/hooks.json. @@ -410,6 +467,40 @@ def for_scope(self, user_scope: bool = False) -> TargetProfile | None: auto_create=False, detect_by_dir=True, user_supported="partial", + pack_prefixes=(".codex/", ".agents/"), + compile_family="agents", + hooks_config_display=".codex/hooks.json", + ), + # Windsurf/Cascade -- .windsurf/ is the workspace config directory. + # Rules are markdown files with trigger/globs frontmatter under .windsurf/rules/. + # Agents are deployed as skills under .windsurf/skills/<name>/SKILL.md + # (Cascade auto-invokes them when the description matches the task). + # Skills use the standard SKILL.md format under .windsurf/skills/. + # Workflows (~= commands) are markdown files under .windsurf/workflows/. + # Hooks are configured in .windsurf/hooks.json. + # At user scope, ~/.codeium/windsurf/ is used. Global rules use a single + # file (~/.codeium/windsurf/memories/global_rules.md) with a different + # format, so "instructions" is excluded from user scope. + # MCP config: ~/.codeium/windsurf/mcp_config.json (mcpServers JSON format). 
+ # Ref: https://docs.windsurf.com/windsurf/cascade/memories + # Ref: https://docs.windsurf.com/windsurf/cascade/mcp + "windsurf": TargetProfile( + name="windsurf", + root_dir=".windsurf", + primitives={ + "instructions": PrimitiveMapping("rules", ".md", "windsurf_rules"), + "agents": PrimitiveMapping("skills", "/SKILL.md", "windsurf_agent_skill"), + "skills": PrimitiveMapping("skills", "/SKILL.md", "skill_standard"), + "commands": PrimitiveMapping("workflows", ".md", "windsurf_workflow"), + "hooks": PrimitiveMapping("", "hooks.json", "windsurf_hooks"), + }, + auto_create=False, + detect_by_dir=True, + user_supported="partial", + user_root_dir=".codeium/windsurf", + unsupported_user_primitives=("instructions",), + compile_family="agents", + hooks_config_display=".windsurf/hooks.json", ), # Agent-skills: cross-client shared skills directory (.agents/skills/). # Skills primitive only -- no agents, hooks, or commands. diff --git a/tests/unit/compilation/test_compile_target_flag.py b/tests/unit/compilation/test_compile_target_flag.py index 7cc6d75d7..d39c93863 100644 --- a/tests/unit/compilation/test_compile_target_flag.py +++ b/tests/unit/compilation/test_compile_target_flag.py @@ -1518,7 +1518,7 @@ def test_list_copilot_only_returns_vscode(self): assert _resolve_compile_target(["copilot"]) == "vscode" def test_list_agents_md_only_family_preserves_bare_target(self): - """cursor/opencode/codex must NOT collapse to 'vscode' (which would + """cursor/opencode/codex/windsurf must NOT collapse to 'vscode' (which would wrongly route copilot-instructions.md). 
Single-element lists keep the bare target name; multi-element lists pick the first present.""" from apm_cli.commands.compile.cli import _resolve_compile_target @@ -1526,10 +1526,26 @@ def test_list_agents_md_only_family_preserves_bare_target(self): assert _resolve_compile_target(["cursor"]) == "cursor" assert _resolve_compile_target(["opencode"]) == "opencode" assert _resolve_compile_target(["codex"]) == "codex" + assert _resolve_compile_target(["windsurf"]) == "windsurf" # Multi-element AGENTS.md-only list collapses to a representative bare # target (cursor wins by deterministic ordering). assert _resolve_compile_target(["cursor", "opencode"]) == "cursor" assert _resolve_compile_target(["opencode", "codex"]) == "opencode" + assert _resolve_compile_target(["codex", "windsurf"]) == "codex" + + def test_windsurf_routes_via_agents_family(self): + """Regression: windsurf must route through the 'agents' compile_family + the same way cursor/opencode/codex do. Before the registry-driven + refactor, windsurf was missing from agents_md_family and would have + silently collapsed to the 'vscode' fallback (emitting + copilot-instructions.md by mistake).""" + from apm_cli.commands.compile.cli import _resolve_compile_target + + # Combined with claude/gemini -> frozenset of families + assert _resolve_compile_target(["windsurf", "claude"]) == frozenset({"agents", "claude"}) + assert _resolve_compile_target(["windsurf", "gemini"]) == frozenset({"agents", "gemini"}) + # Combined with copilot/vscode -> 'vscode' wins (which already includes agents) + assert _resolve_compile_target(["windsurf", "copilot"]) == "vscode" def test_list_cursor_and_claude_returns_agents_claude_set(self): from apm_cli.commands.compile.cli import _resolve_compile_target diff --git a/tests/unit/core/test_scope.py b/tests/unit/core/test_scope.py index 92ab14252..a892f3a72 100644 --- a/tests/unit/core/test_scope.py +++ b/tests/unit/core/test_scope.py @@ -164,6 +164,7 @@ def test_all_known_targets_present(self): 
"opencode", "codex", "gemini", + "windsurf", "copilot-cowork", "agent-skills", } @@ -235,6 +236,25 @@ def test_supports_at_user_scope_opencode_partial(self): assert KNOWN_TARGETS["opencode"].supports_at_user_scope("commands") is True assert KNOWN_TARGETS["opencode"].supports_at_user_scope("hooks") is False + def test_windsurf_is_partially_supported(self): + assert KNOWN_TARGETS["windsurf"].user_supported == "partial" + assert KNOWN_TARGETS["windsurf"].user_root_dir == ".codeium/windsurf" + assert "instructions" in KNOWN_TARGETS["windsurf"].unsupported_user_primitives + + def test_supports_at_user_scope_windsurf_partial(self): + # Windsurf supports skills, commands, hooks, agents at user scope but not instructions + assert KNOWN_TARGETS["windsurf"].supports_at_user_scope("skills") is True + assert KNOWN_TARGETS["windsurf"].supports_at_user_scope("commands") is True + assert KNOWN_TARGETS["windsurf"].supports_at_user_scope("hooks") is True + assert KNOWN_TARGETS["windsurf"].supports_at_user_scope("agents") is True + assert KNOWN_TARGETS["windsurf"].supports_at_user_scope("instructions") is False + + def test_windsurf_effective_root_project_scope(self): + assert KNOWN_TARGETS["windsurf"].effective_root(user_scope=False) == ".windsurf" + + def test_windsurf_effective_root_user_scope(self): + assert KNOWN_TARGETS["windsurf"].effective_root(user_scope=True) == ".codeium/windsurf" + def test_unsupported_targets_have_no_user_root(self): for name, profile in KNOWN_TARGETS.items(): if profile.user_supported is False: @@ -280,6 +300,8 @@ def test_warn_message_includes_unsupported_primitives(self): assert "cursor (instructions)" in msg # OpenCode excludes hooks assert "opencode (hooks)" in msg + # Windsurf excludes instructions + assert "windsurf (instructions)" in msg # --------------------------------------------------------------------------- diff --git a/tests/unit/core/test_target_detection.py b/tests/unit/core/test_target_detection.py index 2663e1a96..d94d840cb 100644 
--- a/tests/unit/core/test_target_detection.py +++ b/tests/unit/core/test_target_detection.py @@ -420,6 +420,77 @@ def test_opencode_no_compile_claude_md(self): assert should_compile_claude_md("opencode") is False +class TestDetectTargetWindsurf: + """Tests for auto-detection and explicit windsurf target.""" + + def test_explicit_target_windsurf(self, tmp_path): + """Explicit --target windsurf always wins.""" + target, reason = detect_target( + project_root=tmp_path, + explicit_target="windsurf", + ) + assert target == "windsurf" + assert reason == "explicit --target flag" + + def test_config_target_windsurf(self, tmp_path): + """Config target windsurf is used when no explicit target.""" + target, reason = detect_target( + project_root=tmp_path, + explicit_target=None, + config_target="windsurf", + ) + assert target == "windsurf" + assert reason == "apm.yml target" + + def test_auto_detect_windsurf_only(self, tmp_path): + """Auto-detect windsurf when only .windsurf/ exists.""" + (tmp_path / ".windsurf").mkdir() + target, reason = detect_target( + project_root=tmp_path, + explicit_target=None, + config_target=None, + ) + assert target == "windsurf" + assert ".windsurf/" in reason + + def test_auto_detect_windsurf_plus_github(self, tmp_path): + """Auto-detect all when .windsurf/ and .github/ exist.""" + (tmp_path / ".github").mkdir() + (tmp_path / ".windsurf").mkdir() + target, _ = detect_target( + project_root=tmp_path, + explicit_target=None, + config_target=None, + ) + assert target == "all" + + def test_windsurf_compile_agents_md(self): + """Windsurf target should compile AGENTS.md (reads it natively).""" + assert should_compile_agents_md("windsurf") is True + + def test_windsurf_no_compile_claude_md(self): + """Windsurf target should NOT compile CLAUDE.md.""" + assert should_compile_claude_md("windsurf") is False + + def test_windsurf_no_compile_gemini_md(self): + """Windsurf target should NOT compile GEMINI.md.""" + assert should_compile_gemini_md("windsurf") 
is False + + def test_windsurf_description(self): + """Description for windsurf target.""" + desc = get_target_description("windsurf") + assert "AGENTS.md" in desc + assert ".windsurf/" in desc + + def test_windsurf_in_all_canonical_targets(self): + """Windsurf must appear in ALL_CANONICAL_TARGETS.""" + assert "windsurf" in ALL_CANONICAL_TARGETS + + def test_windsurf_in_valid_target_values(self): + """Windsurf must be accepted by the --target parser.""" + assert "windsurf" in VALID_TARGET_VALUES + + # --------------------------------------------------------------------------- # TargetParamType tests # --------------------------------------------------------------------------- diff --git a/tests/unit/integration/test_agent_integrator.py b/tests/unit/integration/test_agent_integrator.py index 0594bddde..e682086b5 100644 --- a/tests/unit/integration/test_agent_integrator.py +++ b/tests/unit/integration/test_agent_integrator.py @@ -1138,3 +1138,259 @@ def test_codex_agent_target_filename_is_toml(self): source = Path("/fake/test.agent.md") filename = integrator.get_target_filename_for_target(source, "pkg", codex) assert filename == "test.toml" + + +# ================================================================== +# Windsurf agent tests (agents -> .windsurf/skills/<name>/SKILL.md) +# ================================================================== + + +class TestWindsurfAgentSkillConversion: + """Test _write_windsurf_agent_skill static method.""" + + def setup_method(self): + self.temp_dir = tempfile.mkdtemp() + self.root = Path(self.temp_dir) + + def teardown_method(self): + import shutil + + shutil.rmtree(self.temp_dir, ignore_errors=True) + + def test_generates_skill_frontmatter(self): + """Agent file gets name + description frontmatter in SKILL.md format.""" + source = self.root / "design-reviewer.agent.md" + target = self.root / "design-reviewer" / "SKILL.md" + source.write_text( + '---\ndescription: "A design review specialist"\n---\n\n# Design Reviewer\n' + ) + + 
AgentIntegrator()._write_windsurf_agent_skill(source, target) + + assert target.exists() + content = target.read_text() + assert "name: design-reviewer" in content + assert "description: A design review specialist" in content + assert "# Design Reviewer" in content + assert "trigger:" not in content + + def test_preserves_name_from_frontmatter(self): + """Name from agent frontmatter is preserved.""" + source = self.root / "architect.agent.md" + target = self.root / "architect" / "SKILL.md" + source.write_text( + "---\ndescription: Context architect\nmodel: GPT-5\n" + "tools: ['search/codebase']\nname: Context Architect\n---\n\n# Body" + ) + + AgentIntegrator()._write_windsurf_agent_skill(source, target) + + content = target.read_text() + assert "name: Context Architect" in content + assert "description: Context architect" in content + assert "model:" not in content + assert "tools:" not in content + assert "# Body" in content + + def test_no_frontmatter_uses_stem(self): + """Agent without frontmatter derives name from filename stem.""" + source = self.root / "simple.agent.md" + target = self.root / "simple" / "SKILL.md" + source.write_text("# Simple agent\n\nJust some instructions.") + + AgentIntegrator()._write_windsurf_agent_skill(source, target) + + content = target.read_text() + assert "name: simple" in content + assert "# Simple agent" in content + + def test_creates_parent_directory(self): + """SKILL.md parent directory is created automatically.""" + source = self.root / "test.agent.md" + target = self.root / "skills" / "test" / "SKILL.md" + source.write_text("---\ndescription: test\n---\n\n# Test") + + AgentIntegrator()._write_windsurf_agent_skill(source, target) + + assert target.parent.is_dir() + assert target.exists() + + def test_body_preserved_verbatim(self): + """Markdown body is kept verbatim.""" + source = self.root / "test.agent.md" + target = self.root / "test" / "SKILL.md" + body = "\n# Agent\n\n## Expertise\n- Python\n- TypeScript\n\n## 
Approach\n1. Read first\n2. Then code\n" + source.write_text(f"---\ndescription: test\n---\n{body}") + + AgentIntegrator()._write_windsurf_agent_skill(source, target) + + content = target.read_text() + assert "## Expertise" in content + assert "- Python" in content + assert "## Approach" in content + + +class TestWindsurfAgentSkillIntegration: + """End-to-end: agents deploy to .windsurf/skills/<name>/SKILL.md via integrate_agents_for_target.""" + + def setup_method(self): + self.temp_dir = tempfile.mkdtemp() + self.project_root = Path(self.temp_dir) + self.integrator = AgentIntegrator() + + def teardown_method(self): + import shutil + + shutil.rmtree(self.temp_dir, ignore_errors=True) + + def _make_package_info(self, pkg_dir): + package = APMPackage(name="test-pkg", version="1.0.0", package_path=pkg_dir) + resolved_ref = ResolvedReference( + original_ref="main", + ref_type=GitReferenceType.BRANCH, + resolved_commit="abc123", + ref_name="main", + ) + return PackageInfo( + package=package, + install_path=pkg_dir, + resolved_reference=resolved_ref, + installed_at="2024-01-01T00:00:00", + ) + + def test_deploys_agent_as_windsurf_skill(self): + """Agent deploys to .windsurf/skills/<name>/SKILL.md with name + description.""" + from apm_cli.integration.targets import KNOWN_TARGETS + + (self.project_root / ".windsurf").mkdir() + + pkg = self.project_root / "package" + agents_dir = pkg / ".apm" / "agents" + agents_dir.mkdir(parents=True) + (agents_dir / "design-reviewer.agent.md").write_text( + '---\ndescription: "Design review specialist"\n---\n\n# Design Reviewer\n' + ) + + pkg_info = self._make_package_info(pkg) + windsurf = KNOWN_TARGETS["windsurf"] + result = self.integrator.integrate_agents_for_target(windsurf, pkg_info, self.project_root) + + assert result.files_integrated == 1 + deployed = self.project_root / ".windsurf" / "skills" / "design-reviewer" / "SKILL.md" + assert deployed.exists() + content = deployed.read_text() + assert "name: design-reviewer" in content + assert 
"description: Design review specialist" in content + assert "# Design Reviewer" in content + + def test_skips_when_no_windsurf_dir(self): + """Does not deploy if .windsurf/ doesn't exist (auto_create=False).""" + from apm_cli.integration.targets import KNOWN_TARGETS + + pkg = self.project_root / "package" + agents_dir = pkg / ".apm" / "agents" + agents_dir.mkdir(parents=True) + (agents_dir / "test.agent.md").write_text("# Test") + + pkg_info = self._make_package_info(pkg) + windsurf = KNOWN_TARGETS["windsurf"] + result = self.integrator.integrate_agents_for_target(windsurf, pkg_info, self.project_root) + + assert result.files_integrated == 0 + + def test_filename_produces_skill_path(self): + """design-reviewer.agent.md -> design-reviewer/SKILL.md""" + from apm_cli.integration.targets import KNOWN_TARGETS + + integrator = AgentIntegrator() + windsurf = KNOWN_TARGETS["windsurf"] + source = Path("/fake/design-reviewer.agent.md") + filename = integrator.get_target_filename_for_target(source, "pkg", windsurf) + assert filename == "design-reviewer/SKILL.md" + + def test_multiple_agents(self): + """Multiple agents deploy to separate .windsurf/skills/<name>/ dirs.""" + from apm_cli.integration.targets import KNOWN_TARGETS + + (self.project_root / ".windsurf").mkdir() + + pkg = self.project_root / "package" + agents_dir = pkg / ".apm" / "agents" + agents_dir.mkdir(parents=True) + (agents_dir / "reviewer.agent.md").write_text("# Reviewer") + (agents_dir / "architect.agent.md").write_text("# Architect") + + pkg_info = self._make_package_info(pkg) + windsurf = KNOWN_TARGETS["windsurf"] + result = self.integrator.integrate_agents_for_target(windsurf, pkg_info, self.project_root) + + assert result.files_integrated == 2 + skills_dir = self.project_root / ".windsurf" / "skills" + assert (skills_dir / "reviewer" / "SKILL.md").exists() + assert (skills_dir / "architect" / "SKILL.md").exists() + + +class TestWindsurfAgentSkillDiagnostics: + """Diagnostics emitted when agent frontmatter 
contains unsupported fields.""" + + def setup_method(self): + self.temp_dir = tempfile.mkdtemp() + self.root = Path(self.temp_dir) + + def teardown_method(self): + import shutil + + shutil.rmtree(self.temp_dir, ignore_errors=True) + + def test_warns_on_dropped_tools_and_model(self): + """DiagnosticCollector records a warning when tools/model are dropped.""" + from apm_cli.utils.diagnostics import CATEGORY_WARNING, DiagnosticCollector + + source = self.root / "smart.agent.md" + source.write_text( + "---\n" + "name: smart-agent\n" + "description: An agent with unsupported fields\n" + "tools:\n" + " - foo\n" + " - bar\n" + "model: gpt-4o\n" + "---\n\n" + "# Smart Agent Body\n" + ) + target = self.root / "smart-agent" / "SKILL.md" + + diagnostics = DiagnosticCollector() + AgentIntegrator()._write_windsurf_agent_skill(source, target, diagnostics=diagnostics) + + # Exactly one warning expected + by_cat = diagnostics.by_category() + warnings = by_cat.get(CATEGORY_WARNING, []) + assert len(warnings) == 1 + assert "tools" in warnings[0].message + assert "model" in warnings[0].message + + # SKILL.md must NOT contain the dropped fields + content = target.read_text() + # Split into frontmatter vs body to check frontmatter only + parts = content.split("---") + # parts[0] is empty, parts[1] is frontmatter, parts[2+] is body + fm_block = parts[1] if len(parts) >= 3 else "" + assert "tools:" not in fm_block + assert "model:" not in fm_block + + def test_no_warning_without_tools_or_model(self): + """No diagnostic when frontmatter has only name + description.""" + from apm_cli.utils.diagnostics import CATEGORY_WARNING, DiagnosticCollector + + source = self.root / "clean.agent.md" + source.write_text( + "---\nname: clean-agent\ndescription: A clean agent\n---\n\n# Clean Agent\n" + ) + target = self.root / "clean-agent" / "SKILL.md" + + diagnostics = DiagnosticCollector() + AgentIntegrator()._write_windsurf_agent_skill(source, target, diagnostics=diagnostics) + + warnings = 
diagnostics.by_category().get(CATEGORY_WARNING, []) + assert len(warnings) == 0 diff --git a/tests/unit/integration/test_data_driven_dispatch.py b/tests/unit/integration/test_data_driven_dispatch.py index 2c46627b6..c54e9f925 100644 --- a/tests/unit/integration/test_data_driven_dispatch.py +++ b/tests/unit/integration/test_data_driven_dispatch.py @@ -296,10 +296,13 @@ def test_partition_parity_with_old_buckets(self): "agents_cursor", "agents_opencode", "agents_codex", + "agents_windsurf", "commands", # was commands_claude, aliased "commands_gemini", "commands_opencode", + "commands_windsurf", "instructions", # was instructions_copilot, aliased + "instructions_windsurf", "rules_cursor", # was instructions_cursor, aliased "rules_claude", # was instructions_claude, aliased "skills", # cross-target bucket diff --git a/tests/unit/integration/test_hook_integrator.py b/tests/unit/integration/test_hook_integrator.py index bf2e3f3d4..723fa4263 100644 --- a/tests/unit/integration/test_hook_integrator.py +++ b/tests/unit/integration/test_hook_integrator.py @@ -3170,3 +3170,100 @@ def test_reinstall_still_idempotent_with_routing(self, temp_project: Path) -> No assert len(entries) == 1, ( f"Entry count must remain constant across re-installs; got {len(entries)}" ) + + +# ================================================================== +# Windsurf hook path rewriting +# ================================================================== + + +class TestWindsurfHookPathRewriting: + """Tests for windsurf target branch in _rewrite_command_for_target.""" + + def test_rewrite_plugin_root_for_windsurf(self, tmp_path: Path) -> None: + """${CLAUDE_PLUGIN_ROOT}/scripts/foo.sh rewrites to .windsurf/hooks/<pkg>/...""" + pkg_dir = tmp_path / "pkg" + scripts_dir = pkg_dir / "scripts" + scripts_dir.mkdir(parents=True) + (scripts_dir / "foo.sh").write_bytes(b"#!/bin/bash\necho hello") + + integrator = HookIntegrator() + cmd, scripts = integrator._rewrite_command_for_target( + "bash 
${CLAUDE_PLUGIN_ROOT}/scripts/foo.sh", + pkg_dir, + "my-hooks", + "windsurf", + ) + + assert "${CLAUDE_PLUGIN_ROOT}" not in cmd + expected_rel = ".windsurf/hooks/my-hooks/scripts/foo.sh" + assert expected_rel in cmd + assert len(scripts) == 1 + assert scripts[0][1] == expected_rel + + def test_rewrite_relative_path_for_windsurf(self, tmp_path: Path) -> None: + """./scripts/bar.sh rewrites to .windsurf/hooks/<pkg>/...""" + pkg_dir = tmp_path / "pkg" + scripts_dir = pkg_dir / "scripts" + scripts_dir.mkdir(parents=True) + (scripts_dir / "bar.sh").write_bytes(b"#!/bin/bash\necho bar") + + integrator = HookIntegrator() + cmd, scripts = integrator._rewrite_command_for_target( + "./scripts/bar.sh", + pkg_dir, + "test-pkg", + "windsurf", + ) + + assert "./" not in cmd + expected_rel = ".windsurf/hooks/test-pkg/scripts/bar.sh" + assert expected_rel in cmd + assert len(scripts) == 1 + + def test_system_command_unchanged_for_windsurf(self, tmp_path: Path) -> None: + """System commands pass through unmodified for windsurf target.""" + pkg_dir = tmp_path / "pkg" + pkg_dir.mkdir(parents=True) + + integrator = HookIntegrator() + cmd, scripts = integrator._rewrite_command_for_target( + "npx prettier --check .", + pkg_dir, + "my-pkg", + "windsurf", + ) + + assert cmd == "npx prettier --check ." + assert len(scripts) == 0 + + +class TestWindsurfPathTraversalGuard: + """Regression: ensure_path_within guard on script copy prevents traversal. + + This test targets the ``ensure_path_within(target_script, project_root)`` + guard that the security specialist is adding around the ``shutil.copy2`` + calls in ``integrate_package_hooks`` / ``_integrate_merged_hooks``. 
+ """ + + def test_copy_rejects_traversal_target_rel(self, tmp_path: Path) -> None: + """PathTraversalError raised when target_rel escapes project root.""" + from apm_cli.utils.path_security import PathTraversalError + + project_root = tmp_path / "project" + project_root.mkdir() + (project_root / ".windsurf").mkdir() + + pkg_dir = tmp_path / "pkg" + scripts_dir = pkg_dir / "scripts" + scripts_dir.mkdir(parents=True) + (scripts_dir / "evil.sh").write_bytes(b"#!/bin/bash\necho pwned") + + # The rewrite stage already filters traversal (defence layer 1). + # ensure_path_within is defence-in-depth (layer 2) on the copy loop. + # We directly test that ensure_path_within rejects traversal. + with pytest.raises(PathTraversalError): + from apm_cli.utils.path_security import ensure_path_within + + target_script = project_root / ".." / ".." / "etc" / "passwd" + ensure_path_within(target_script, project_root) diff --git a/tests/unit/integration/test_instruction_integrator.py b/tests/unit/integration/test_instruction_integrator.py index e6b5355a9..76ea967ec 100644 --- a/tests/unit/integration/test_instruction_integrator.py +++ b/tests/unit/integration/test_instruction_integrator.py @@ -1127,3 +1127,151 @@ def test_sync_handles_missing_rules_dir(self): assert result["files_removed"] == 0 assert result["errors"] == 0 + + +# ================================================================== +# Windsurf Rules (.md with trigger/globs) tests +# ================================================================== + + +class TestConvertToWindsurfRules: + """Test the Windsurf frontmatter conversion helper.""" + + def test_maps_apply_to_to_trigger_glob(self): + content = "---\napplyTo: '**/*.py'\n---\n\n# Python rules" + result = InstructionIntegrator._convert_to_windsurf_rules(content) + assert "trigger: glob" in result + assert 'globs: "**/*.py"' in result + assert "applyTo" not in result + + def test_no_apply_to_becomes_always_on(self): + content = "---\ndescription: General 
rules\n---\n\n# Rules" + result = InstructionIntegrator._convert_to_windsurf_rules(content) + assert "trigger: always_on" in result + assert "globs" not in result + + def test_no_frontmatter_becomes_always_on(self): + content = "# Simple rules\n\nJust some guidelines." + result = InstructionIntegrator._convert_to_windsurf_rules(content) + assert result.startswith("---\n") + assert "trigger: always_on" in result + assert "# Simple rules" in result + + def test_body_preserved(self): + content = "---\napplyTo: '**/*.py'\n---\n\n# Python rules\n\nUse type hints." + result = InstructionIntegrator._convert_to_windsurf_rules(content) + assert "# Python rules" in result + assert "Use type hints." in result + + def test_quoted_apply_to_unquoted(self): + content = "---\napplyTo: 'src/**/*.ts'\n---\n\n# TS" + result = InstructionIntegrator._convert_to_windsurf_rules(content) + assert 'globs: "src/**/*.ts"' in result + + def test_double_quoted_apply_to(self): + content = '---\napplyTo: "src/**/*.ts"\n---\n\n# TS' + result = InstructionIntegrator._convert_to_windsurf_rules(content) + assert 'globs: "src/**/*.ts"' in result + + +class TestWindsurfRulesIntegration: + """Test end-to-end Windsurf rules deployment.""" + + def setup_method(self): + self.temp_dir = tempfile.mkdtemp() + self.project_root = Path(self.temp_dir) + self.integrator = InstructionIntegrator() + + def teardown_method(self): + shutil.rmtree(self.temp_dir, ignore_errors=True) + + def test_deploys_when_windsurf_dir_exists(self): + """Deploys .md rules when .windsurf/ exists.""" + from apm_cli.integration.targets import KNOWN_TARGETS + + (self.project_root / ".windsurf").mkdir() + + pkg = self.project_root / "package" + inst_dir = pkg / ".apm" / "instructions" + inst_dir.mkdir(parents=True) + (inst_dir / "python.instructions.md").write_text( + "---\napplyTo: '**/*.py'\n---\n\n# Python rules" + ) + + pkg_info = _make_package_info(pkg) + windsurf = KNOWN_TARGETS["windsurf"] + result = 
self.integrator.integrate_instructions_for_target( + windsurf, pkg_info, self.project_root + ) + + assert result.files_integrated == 1 + target = self.project_root / ".windsurf" / "rules" / "python.md" + assert target.exists() + content = target.read_text() + assert "trigger: glob" in content + assert 'globs: "**/*.py"' in content + assert "# Python rules" in content + + def test_filename_strips_instructions_md_suffix(self): + """Converts python.instructions.md -> python.md.""" + from apm_cli.integration.targets import KNOWN_TARGETS + + (self.project_root / ".windsurf").mkdir() + + pkg = self.project_root / "package" + inst_dir = pkg / ".apm" / "instructions" + inst_dir.mkdir(parents=True) + (inst_dir / "security.instructions.md").write_text("# Security") + + pkg_info = _make_package_info(pkg) + windsurf = KNOWN_TARGETS["windsurf"] + result = self.integrator.integrate_instructions_for_target( + windsurf, pkg_info, self.project_root + ) + + assert len(result.target_paths) == 1 + assert result.target_paths[0].name == "security.md" + + def test_no_apply_to_gets_always_on_trigger(self): + """Instructions without applyTo get trigger: always_on.""" + from apm_cli.integration.targets import KNOWN_TARGETS + + (self.project_root / ".windsurf").mkdir() + + pkg = self.project_root / "package" + inst_dir = pkg / ".apm" / "instructions" + inst_dir.mkdir(parents=True) + (inst_dir / "general.instructions.md").write_text("# General guidelines") + + pkg_info = _make_package_info(pkg) + windsurf = KNOWN_TARGETS["windsurf"] + result = self.integrator.integrate_instructions_for_target( + windsurf, pkg_info, self.project_root + ) + + assert result.files_integrated == 1 + content = (self.project_root / ".windsurf" / "rules" / "general.md").read_text() + assert "trigger: always_on" in content + + def test_multiple_files(self): + """Integrates multiple instruction files as .md rules.""" + from apm_cli.integration.targets import KNOWN_TARGETS + + (self.project_root / ".windsurf").mkdir() + 
+ pkg = self.project_root / "package" + inst_dir = pkg / ".apm" / "instructions" + inst_dir.mkdir(parents=True) + (inst_dir / "python.instructions.md").write_text("# Python") + (inst_dir / "testing.instructions.md").write_text("# Testing") + + pkg_info = _make_package_info(pkg) + windsurf = KNOWN_TARGETS["windsurf"] + result = self.integrator.integrate_instructions_for_target( + windsurf, pkg_info, self.project_root + ) + + assert result.files_integrated == 2 + rules_dir = self.project_root / ".windsurf" / "rules" + assert (rules_dir / "python.md").exists() + assert (rules_dir / "testing.md").exists() diff --git a/tests/unit/integration/test_scope_integration.py b/tests/unit/integration/test_scope_integration.py index 170b01d6e..0f0dd3370 100644 --- a/tests/unit/integration/test_scope_integration.py +++ b/tests/unit/integration/test_scope_integration.py @@ -322,6 +322,8 @@ def test_unsupported_primitives_filtered_at_user_scope(self): assert "instructions" not in t.primitives if t.name == "opencode": assert "hooks" not in t.primitives + if t.name == "windsurf": + assert "instructions" not in t.primitives def test_project_scope_preserves_all_primitives(self): with tempfile.TemporaryDirectory() as tmp: @@ -331,6 +333,74 @@ def test_project_scope_preserves_all_primitives(self): assert "instructions" in copilot.primitives +# -- Windsurf scope resolution ------------------------------------------------ + + +class TestWindsurfScopeResolution: + """Verify Windsurf deploys to .windsurf at project scope, .codeium/windsurf at user scope.""" + + def setup_method(self): + self.temp_dir = tempfile.mkdtemp() + self.project_root = Path(self.temp_dir) + + def teardown_method(self): + shutil.rmtree(self.temp_dir, ignore_errors=True) + + def test_project_scope_uses_windsurf_root(self): + windsurf = KNOWN_TARGETS["windsurf"] + resolved = windsurf.for_scope(user_scope=False) + assert resolved.root_dir == ".windsurf" + assert "instructions" in resolved.primitives + assert "agents" in 
resolved.primitives + + def test_user_scope_uses_codeium_windsurf_root(self): + windsurf = KNOWN_TARGETS["windsurf"] + resolved = windsurf.for_scope(user_scope=True) + assert resolved.root_dir == ".codeium/windsurf" + + def test_user_scope_filters_instructions(self): + """At user scope, instructions are filtered out (unsupported).""" + windsurf = KNOWN_TARGETS["windsurf"] + resolved = windsurf.for_scope(user_scope=True) + assert "instructions" not in resolved.primitives + + def test_user_scope_keeps_skills_and_commands(self): + windsurf = KNOWN_TARGETS["windsurf"] + resolved = windsurf.for_scope(user_scope=True) + assert "skills" in resolved.primitives + assert "commands" in resolved.primitives + assert "hooks" in resolved.primitives + assert "agents" in resolved.primitives + + def test_project_scope_deploys_instructions(self): + """At project scope, instructions deploy to .windsurf/rules/.""" + (self.project_root / ".windsurf").mkdir() + windsurf = KNOWN_TARGETS["windsurf"] + resolved = windsurf.for_scope(user_scope=False) + + pkg = self.project_root / "apm_modules" / "test-pkg" + inst_dir = pkg / ".apm" / "instructions" + inst_dir.mkdir(parents=True) + (inst_dir / "python.instructions.md").write_text( + "---\napplyTo: '**/*.py'\n---\n\n# Python rules" + ) + pkg_info = _make_package_info(pkg) + + integrator = InstructionIntegrator() + result = integrator.integrate_instructions_for_target( + resolved, + pkg_info, + self.project_root, + ) + + assert result.files_integrated == 1 + deployed = self.project_root / ".windsurf" / "rules" / "python.md" + assert deployed.exists() + content = deployed.read_text() + assert "trigger: glob" in content + assert 'globs: "**/*.py"' in content + + # -- Skill deploy at user scope ---------------------------------------------- diff --git a/tests/unit/integration/test_targets.py b/tests/unit/integration/test_targets.py index cb548288a..f7dfd4e0f 100644 --- a/tests/unit/integration/test_targets.py +++ 
b/tests/unit/integration/test_targets.py @@ -147,11 +147,11 @@ def test_gemini_and_claude_returns_both(self): names = {t.name for t in targets} assert names == {"gemini", "claude"} - def test_all_six_dirs_returns_all_six(self): - for d in (".github", ".claude", ".cursor", ".opencode", ".codex", ".gemini"): + def test_all_seven_dirs_returns_all_seven(self): + for d in (".github", ".claude", ".cursor", ".opencode", ".codex", ".gemini", ".windsurf"): (self.root / d).mkdir() targets = active_targets(self.root) - assert len(targets) == 6 + assert len(targets) == 7 def test_all_five_dirs_returns_all_five(self): for d in (".github", ".claude", ".cursor", ".opencode", ".codex"): @@ -159,6 +159,24 @@ def test_all_five_dirs_returns_all_five(self): targets = active_targets(self.root) assert len(targets) == 5 + # -- windsurf detection -- + + def test_only_windsurf_returns_windsurf(self): + (self.root / ".windsurf").mkdir() + targets = active_targets(self.root) + assert [t.name for t in targets] == ["windsurf"] + + def test_explicit_windsurf(self): + targets = active_targets(self.root, explicit_target="windsurf") + assert [t.name for t in targets] == ["windsurf"] + + def test_windsurf_and_github_returns_both(self): + (self.root / ".windsurf").mkdir() + (self.root / ".github").mkdir() + targets = active_targets(self.root) + names = {t.name for t in targets} + assert names == {"windsurf", "copilot"} + # -- explicit list of targets -- def test_explicit_list_single_target(self): diff --git a/tests/unit/integration/test_targets_registry_completeness.py b/tests/unit/integration/test_targets_registry_completeness.py new file mode 100644 index 000000000..1164c2c05 --- /dev/null +++ b/tests/unit/integration/test_targets_registry_completeness.py @@ -0,0 +1,212 @@ +"""Registry-completeness guard for ``KNOWN_TARGETS``. 
+ +When a new target is added (or an existing one gains a primitive), the +subsystems that depend on per-target metadata -- pack-time file filtering, +MCP conflict detection, compile family routing, and install hooks display -- +must all be updated. Historically each of those lived in a module-local +dict or ``if/elif`` chain, and adding a target meant updating N files. + +This file turns "forgot to wire up new target" from a silent runtime bug +into a hard CI failure. Each test is parametrised over ``KNOWN_TARGETS`` +so that failures pinpoint the exact entry that drifted. +""" + +from __future__ import annotations + +import pytest + +from apm_cli.adapters.client.base import MCPClientAdapter +from apm_cli.adapters.client.claude import ClaudeClientAdapter +from apm_cli.adapters.client.codex import CodexClientAdapter +from apm_cli.adapters.client.copilot import CopilotClientAdapter +from apm_cli.adapters.client.cursor import CursorClientAdapter +from apm_cli.adapters.client.gemini import GeminiClientAdapter +from apm_cli.adapters.client.opencode import OpenCodeClientAdapter +from apm_cli.adapters.client.vscode import VSCodeClientAdapter +from apm_cli.adapters.client.windsurf import WindsurfClientAdapter +from apm_cli.integration.targets import KNOWN_TARGETS, TargetProfile + +# Recognised values for ``TargetProfile.compile_family``. Adding a new family +# requires touching ``apm_cli.commands.compile.cli._resolve_compile_target`` +# AND this set. Any other value would make the compile router silently +# misroute the target. +_KNOWN_COMPILE_FAMILIES = {"vscode", "claude", "gemini", "agents"} + +# Recognised values for ``MCPClientAdapter.mcp_servers_key``. Adding a new +# key means a new MCP config schema; ``MCPConflictDetector`` must learn how +# to parse it (today only ``mcp_servers`` needs the codex-style flattened- +# key fallback -- the others are plain top-level dicts). 
+_KNOWN_MCP_KEYS = {"mcpServers", "mcp_servers", "servers"} + +# Adapter target_names that are MCP-only pseudo-targets (no entry in +# KNOWN_TARGETS). Code that joins adapter -> profile must tolerate misses +# for these. +_MCP_ONLY_ADAPTER_NAMES = {"vscode"} + +# All adapter subclasses that ship in the repo. The ``target_name`` on each +# must round-trip to a ``KNOWN_TARGETS`` entry so ``MCPConflictDetector`` +# can resolve config metadata without sniffing class names. +_ADAPTER_CLASSES = ( + CopilotClientAdapter, + ClaudeClientAdapter, + CursorClientAdapter, + CodexClientAdapter, + GeminiClientAdapter, + OpenCodeClientAdapter, + VSCodeClientAdapter, + WindsurfClientAdapter, +) + + +@pytest.mark.parametrize("name,profile", sorted(KNOWN_TARGETS.items())) +def test_pack_prefixes_are_resolvable(name: str, profile: TargetProfile) -> None: + """Every target must yield a non-empty pack-prefix tuple. + + ``effective_pack_prefixes`` falls back to ``(profile.prefix,)`` when + ``pack_prefixes`` is empty, so this test fails only when both the + field AND the fallback are degenerate. + """ + prefixes = profile.effective_pack_prefixes + assert prefixes, f"target {name!r} has no pack prefixes" + for p in prefixes: + assert p.endswith("/"), ( + f"target {name!r} pack prefix {p!r} must end with '/' so startswith() filtering works" + ) + assert "\\" not in p, ( + f"target {name!r} pack prefix {p!r} contains a backslash; " + "pack prefixes are POSIX-style and must use forward slashes" + ) + + +@pytest.mark.parametrize("name,profile", sorted(KNOWN_TARGETS.items())) +def test_compile_family_is_recognised(name: str, profile: TargetProfile) -> None: + """A target's ``compile_family`` must be ``None`` or a recognised family. + + ``None`` means the target produces no compile output (e.g. agent-skills, + copilot-cowork). Any other value is routed by + ``_resolve_compile_target`` and must be in ``_KNOWN_COMPILE_FAMILIES``; + otherwise the router would silently fall through. 
+ """ + if profile.compile_family is None: + return + assert profile.compile_family in _KNOWN_COMPILE_FAMILIES, ( + f"target {name!r} declares unknown compile_family " + f"{profile.compile_family!r}; expected one of {_KNOWN_COMPILE_FAMILIES}" + ) + + +@pytest.mark.parametrize("adapter_cls", _ADAPTER_CLASSES, ids=lambda c: c.__name__) +def test_adapter_mcp_servers_key_is_recognised( + adapter_cls: type[MCPClientAdapter], +) -> None: + """Every shipped adapter must declare a known ``mcp_servers_key``. + + The conflict detector reads this directly off the adapter to pull + existing servers out of the on-disk config. An empty value would + silently make ``get_existing_server_configs`` return ``{}``. + """ + key = adapter_cls.mcp_servers_key + assert key, ( + f"{adapter_cls.__name__} does not override mcp_servers_key; " + "MCPConflictDetector would silently return no existing servers" + ) + assert key in _KNOWN_MCP_KEYS, ( + f"{adapter_cls.__name__} declares mcp_servers_key={key!r}; " + f"expected one of {_KNOWN_MCP_KEYS}" + ) + + +@pytest.mark.parametrize("name,profile", sorted(KNOWN_TARGETS.items())) +def test_hooks_display_matches_root(name: str, profile: TargetProfile) -> None: + """When a target sets ``hooks_config_display``, it must live under its root. + + Catches typos such as ``.codex/hooks.json`` on the windsurf entry. + Targets without ``hooks_config_display`` fall back to the generic + ``"{root}/{subdir}/"`` install-log formula and are exempt. + """ + if profile.hooks_config_display is None: + return + display = profile.hooks_config_display + assert display.startswith(profile.prefix) or display.startswith(profile.root_dir), ( + f"target {name!r} hooks_config_display {display!r} must live under " + f"its root_dir {profile.root_dir!r}" + ) + + +def test_every_target_with_hooks_primitive_has_explicit_or_generic_display() -> None: + """Every target whose primitives include 'hooks' has a coherent display path. 
+ + Either an explicit ``hooks_config_display`` (Claude/Cursor/Codex/Windsurf + style: hooks land in a single config file) OR a primitive mapping with + a non-empty ``subdir`` so the generic ``{root}/{subdir}/`` formula is + not degenerate. + """ + offenders: list[str] = [] + for name, profile in KNOWN_TARGETS.items(): + hooks_pm = profile.primitives.get("hooks") + if hooks_pm is None: + continue + if profile.hooks_config_display is not None: + continue + # No explicit display -- the generic formula must not be degenerate + if not hooks_pm.subdir: + offenders.append( + f"{name}: hooks subdir is empty AND hooks_config_display is " + f"None; install log will print just {profile.prefix!r}" + ) + assert not offenders, "\n".join(offenders) + + +@pytest.mark.parametrize("adapter_cls", _ADAPTER_CLASSES, ids=lambda c: c.__name__) +def test_adapter_target_name_is_set(adapter_cls: type[MCPClientAdapter]) -> None: + """Every shipped adapter must declare a non-empty ``target_name``. + + The base class provides ``target_name = ""`` purely as a typing default; + every concrete subclass must override it so + ``MCPConflictDetector`` can resolve per-target config metadata. + """ + assert adapter_cls.target_name, ( + f"{adapter_cls.__name__} does not override target_name; " + "MCPConflictDetector cannot route its config without it" + ) + + +@pytest.mark.parametrize("adapter_cls", _ADAPTER_CLASSES, ids=lambda c: c.__name__) +def test_adapter_target_name_resolves_to_known_target( + adapter_cls: type[MCPClientAdapter], +) -> None: + """Each adapter's ``target_name`` must map to a ``KNOWN_TARGETS`` entry, + except for documented MCP-only pseudo-targets (``vscode``). + + This prevents Gap 2-style silent breakage when a new adapter is added: + target-aware code that joins on ``adapter.target_name -> profile`` will + raise here if the registry entry is missing. 
+ """ + name = adapter_cls.target_name + if name in _MCP_ONLY_ADAPTER_NAMES: + return + assert name in KNOWN_TARGETS, ( + f"{adapter_cls.__name__} declares target_name={name!r} but no such " + f"entry exists in KNOWN_TARGETS (and {name!r} is not in the documented " + f"MCP-only allowlist {_MCP_ONLY_ADAPTER_NAMES})" + ) + + +def test_client_factory_supported_clients_matches_adapter_set() -> None: + """``ClientFactory.supported_clients()`` must enumerate exactly the + adapter classes registered in ``_MCP_CLIENT_REGISTRY`` and exposed + through the ``MCPClientAdapter`` subclass list. + + Closes the N+1 site at ``mcp_integrator.py`` runtime loops: + callers iterate this set instead of hand-maintaining parallel lists. + A missing adapter here means a freshly-added MCP target would be + silently skipped by cleanup loops and availability probes. + """ + from apm_cli.factory import ClientFactory + + supported = ClientFactory.supported_clients() + expected = {cls.target_name for cls in _ADAPTER_CLASSES} + assert supported == expected, ( + f"ClientFactory.supported_clients() drift: registered={expected}, " + f"factory={supported}. Update _MCP_CLIENT_REGISTRY in factory.py." 
+ ) diff --git a/tests/unit/test_conflict_detection.py b/tests/unit/test_conflict_detection.py index 5cda1d2cc..9dd52a53b 100644 --- a/tests/unit/test_conflict_detection.py +++ b/tests/unit/test_conflict_detection.py @@ -14,7 +14,8 @@ def setUp(self): """Set up test fixtures.""" # Create a mock adapter self.mock_adapter = Mock(spec=MCPClientAdapter) - self.mock_adapter.__class__.__name__ = "CopilotClientAdapter" + self.mock_adapter.target_name = "copilot" + self.mock_adapter.mcp_servers_key = "mcpServers" self.mock_adapter.registry_client = Mock() # Mock existing configuration with UUIDs @@ -128,7 +129,8 @@ def test_handles_registry_lookup_failure(self): def test_get_existing_server_configs_copilot(self): """Test extraction of existing server configs for Copilot.""" - self.mock_adapter.__class__.__name__ = "CopilotClientAdapter" + self.mock_adapter.target_name = "copilot" + self.mock_adapter.mcp_servers_key = "mcpServers" configs = self.detector.get_existing_server_configs() expected = { @@ -147,7 +149,8 @@ def test_get_existing_server_configs_copilot(self): def test_get_existing_server_configs_codex(self): """Test extraction of existing server configs for Codex.""" - self.mock_adapter.__class__.__name__ = "CodexClientAdapter" + self.mock_adapter.target_name = "codex" + self.mock_adapter.mcp_servers_key = "mcp_servers" # Mock TOML-style config toml_config = { @@ -206,3 +209,89 @@ def mock_find_server(server_ref): if __name__ == "__main__": unittest.main() + + +class TestMCPConflictDetectionByTargetName(unittest.TestCase): + """Regression suite covering the per-target dispatch contract. + + Before the targets-registry refactor, ``get_existing_server_configs`` + sniffed adapter class names and silently returned ``{}`` for cursor, + gemini, opencode, and windsurf -- conflict detection was broken for + all four. These tests pin the new contract: dispatch is by + ``adapter.mcp_servers_key``, so any adapter declaring a recognised + key works uniformly. 
+ """ + + def _make_detector(self, target_name: str, key: str, config: dict) -> MCPConflictDetector: + adapter = Mock(spec=MCPClientAdapter) + adapter.target_name = target_name + adapter.mcp_servers_key = key + adapter.registry_client = Mock() + adapter.get_current_config.return_value = config + return MCPConflictDetector(adapter) + + def test_windsurf_extracts_mcp_servers(self): + detector = self._make_detector( + "windsurf", + "mcpServers", + {"mcpServers": {"my-server": {"command": "node", "args": ["server.js"]}}}, + ) + configs = detector.get_existing_server_configs() + self.assertIn("my-server", configs) + + def test_cursor_extracts_mcp_servers(self): + detector = self._make_detector( + "cursor", + "mcpServers", + {"mcpServers": {"cursor-srv": {"command": "x"}}}, + ) + self.assertEqual(detector.get_existing_server_configs(), {"cursor-srv": {"command": "x"}}) + + def test_gemini_extracts_mcp_servers(self): + detector = self._make_detector( + "gemini", + "mcpServers", + {"mcpServers": {"g": {"command": "x"}}}, + ) + self.assertEqual(detector.get_existing_server_configs(), {"g": {"command": "x"}}) + + def test_opencode_extracts_mcp_servers(self): + detector = self._make_detector( + "opencode", + "mcpServers", + {"mcpServers": {"o": {"command": "x"}}}, + ) + self.assertEqual(detector.get_existing_server_configs(), {"o": {"command": "x"}}) + + def test_vscode_extracts_servers_key(self): + detector = self._make_detector( + "vscode", + "servers", + {"servers": {"v": {"command": "x"}}}, + ) + self.assertEqual(detector.get_existing_server_configs(), {"v": {"command": "x"}}) + + def test_empty_mcp_servers_key_returns_empty(self): + """Adapter with no mcp_servers_key (defensive) yields no configs.""" + adapter = Mock(spec=MCPClientAdapter) + adapter.target_name = "unknown" + adapter.mcp_servers_key = "" + adapter.registry_client = Mock() + adapter.get_current_config.return_value = {"mcpServers": {"x": {"command": "y"}}} + detector = MCPConflictDetector(adapter) + 
self.assertEqual(detector.get_existing_server_configs(), {}) + + def test_codex_flat_keys_combine_with_nested_table(self): + """Codex must merge nested mcp_servers table AND mcp_servers. flat keys.""" + detector = self._make_detector( + "codex", + "mcp_servers", + { + "mcp_servers": {"nested": {"command": "n"}}, + "mcp_servers.flat": {"command": "f"}, + 'mcp_servers."quoted-name"': {"command": "q"}, + "mcp_servers.flat.env": {"X": "Y"}, # not a server -- no command/args + }, + ) + configs = detector.get_existing_server_configs() + self.assertEqual(set(configs), {"nested", "flat", "quoted-name"}) diff --git a/tests/unit/test_lockfile_enrichment.py b/tests/unit/test_lockfile_enrichment.py index 5d5ee674d..95c47e9cc 100644 --- a/tests/unit/test_lockfile_enrichment.py +++ b/tests/unit/test_lockfile_enrichment.py @@ -419,3 +419,126 @@ def test_list_target_single_element_equivalent_to_string(self): parsed_list["dependencies"][0]["deployed_files"] == parsed_str["dependencies"][0]["deployed_files"] ) + + +class TestWindsurfTargetParity: + """Regression: --target windsurf must filter and cross-map correctly. + + Before the targets-registry refactor, ``_TARGET_PREFIXES`` and + ``_CROSS_TARGET_MAPS`` both omitted ``"windsurf"``, so + ``apm pack --target windsurf`` silently dropped every ``.windsurf/`` + file from the bundle lockfile. 
+ """ + + def _lockfile_with(self, files: list[str]) -> LockFile: + lf = LockFile() + lf.add_dependency( + LockedDependency( + repo_url="owner/repo", + resolved_commit="abc123", + version="1.0.0", + deployed_files=files, + ) + ) + return lf + + def test_windsurf_target_includes_windsurf_prefix(self): + lf = self._lockfile_with( + [ + ".windsurf/skills/x/SKILL.md", + ".unrelated/foo", + ] + ) + result = enrich_lockfile_for_pack(lf, fmt="apm", target="windsurf") + deployed = yaml.safe_load(result)["dependencies"][0]["deployed_files"] + assert ".windsurf/skills/x/SKILL.md" in deployed + assert ".unrelated/foo" not in deployed + + def test_windsurf_cross_map_skills_from_github(self): + """``.github/skills/`` files are remapped under ``.windsurf/skills/``.""" + lf = self._lockfile_with([".github/skills/x/SKILL.md"]) + result = enrich_lockfile_for_pack(lf, fmt="apm", target="windsurf") + deployed = yaml.safe_load(result)["dependencies"][0]["deployed_files"] + assert ".windsurf/skills/x/SKILL.md" in deployed + + def test_windsurf_cross_map_agents_collapse_to_skills(self): + """``.github/agents/`` is intentionally remapped to ``.windsurf/skills/`` + because windsurf has no native agent surface (lossy conversion). 
+ """ + lf = self._lockfile_with([".github/agents/a.md"]) + result = enrich_lockfile_for_pack(lf, fmt="apm", target="windsurf") + deployed = yaml.safe_load(result)["dependencies"][0]["deployed_files"] + assert ".windsurf/skills/a.md" in deployed + + def test_target_all_includes_windsurf_files(self): + """``--target all`` must include ``.windsurf/`` files (was missing pre-refactor).""" + lf = self._lockfile_with( + [ + ".windsurf/skills/x/SKILL.md", + ".github/agents/a.md", + ".gemini/extensions/GEMINI.md", + ] + ) + result = enrich_lockfile_for_pack(lf, fmt="apm", target="all") + deployed = yaml.safe_load(result)["dependencies"][0]["deployed_files"] + assert ".windsurf/skills/x/SKILL.md" in deployed + assert ".github/agents/a.md" in deployed + # Gemini was also missing from the legacy "all" list -- registry derivation fixes that + assert ".gemini/extensions/GEMINI.md" in deployed + + def test_multi_target_windsurf_plus_claude(self): + lf = self._lockfile_with( + [ + ".windsurf/skills/x/SKILL.md", + ".claude/commands/c.md", + ".cursor/rules/r.md", + ] + ) + result = enrich_lockfile_for_pack(lf, fmt="apm", target=["windsurf", "claude"]) + deployed = yaml.safe_load(result)["dependencies"][0]["deployed_files"] + assert ".windsurf/skills/x/SKILL.md" in deployed + assert ".claude/commands/c.md" in deployed + assert ".cursor/rules/r.md" not in deployed + + def test_existing_targets_unchanged(self): + """Regression: every legacy single-target prefix still works.""" + cases = [ + ("copilot", ".github/agents/a.md"), + ("claude", ".claude/commands/c.md"), + ("cursor", ".cursor/rules/r.md"), + ("opencode", ".opencode/agents/a.md"), + ("codex", ".codex/agents/a.md"), + ("agent-skills", ".agents/skills/x/SKILL.md"), + ] + for target, path in cases: + lf = self._lockfile_with([path, ".unrelated/foo"]) + result = enrich_lockfile_for_pack(lf, fmt="apm", target=target) + deployed = yaml.safe_load(result)["dependencies"][0]["deployed_files"] + assert path in deployed, f"{target}: 
{path} dropped after refactor" + assert ".unrelated/foo" not in deployed, f"{target}: leaked unrelated file" + + def test_target_all_includes_every_deployable_target_prefix(self): + """Structural guard: ``--target all`` must include the prefixes for + every deployable target in KNOWN_TARGETS, not a hard-coded subset. + + This is the general-pattern guard for the silent-drop class of + bug that originally hid ``.windsurf/`` and ``.gemini/`` from + ``--target all``. Adding a new deployable target (one with + ``detect_by_dir or auto_create``) automatically extends this + assertion -- if a future target's prefix is not picked up by + ``_all_target_prefixes()``, this test fails immediately at + registration time rather than silently in user output. + """ + from apm_cli.bundle.lockfile_enrichment import _all_target_prefixes + from apm_cli.integration.targets import KNOWN_TARGETS + + all_prefixes = _all_target_prefixes() + for name, profile in KNOWN_TARGETS.items(): + if not (profile.detect_by_dir or profile.auto_create): + continue + for expected in profile.effective_pack_prefixes: + assert expected in all_prefixes, ( + f"target {name!r} prefix {expected!r} missing from " + f"_all_target_prefixes(); --target all would silently drop " + f"its files" + ) diff --git a/tests/unit/test_windsurf_adapter.py b/tests/unit/test_windsurf_adapter.py new file mode 100644 index 000000000..e6a44fb4a --- /dev/null +++ b/tests/unit/test_windsurf_adapter.py @@ -0,0 +1,34 @@ +"""Unit tests for WindsurfClientAdapter. 
+ +Covers: +- get_config_path() returns the expected global config location +- Class attributes: supports_user_scope, _client_label +""" + +from pathlib import Path + +from apm_cli.adapters.client.windsurf import WindsurfClientAdapter + + +class TestWindsurfClientAdapterConfigPath: + """WindsurfClientAdapter.get_config_path returns the global Codeium path.""" + + def test_config_path_equals_codeium_windsurf(self, monkeypatch, tmp_path: Path) -> None: + """Config path must be ~/.codeium/windsurf/mcp_config.json.""" + fake_home = tmp_path / "fakehome" + fake_home.mkdir() + monkeypatch.setattr(Path, "home", staticmethod(lambda: fake_home)) + + adapter = WindsurfClientAdapter(project_root=tmp_path) + result = adapter.get_config_path() + + expected = str(fake_home / ".codeium" / "windsurf" / "mcp_config.json") + assert result == expected + + def test_supports_user_scope_is_true(self) -> None: + """Windsurf uses a global config path, so user-scope is supported.""" + assert WindsurfClientAdapter.supports_user_scope is True + + def test_client_label_is_windsurf(self) -> None: + """The user-facing label should be 'Windsurf'.""" + assert WindsurfClientAdapter._client_label == "Windsurf" diff --git a/uv.lock b/uv.lock index 4ed47fa7a..fc5c8648a 100644 --- a/uv.lock +++ b/uv.lock @@ -1995,4 +1995,4 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/24/fd/725b8e73ac2a50e78a4534ac43c6addf5c1c2d65380dd48a9169cc6739a9/yarl-1.20.1-cp313-cp313t-win32.whl", hash = "sha256:b121ff6a7cbd4abc28985b6028235491941b9fe8fe226e6fdc539c977ea1739d", size = 86591, upload-time = "2025-06-10T00:45:25.793Z" }, { url = "https://files.pythonhosted.org/packages/94/c3/b2e9f38bc3e11191981d57ea08cab2166e74ea770024a646617c9cddd9f6/yarl-1.20.1-cp313-cp313t-win_amd64.whl", hash = "sha256:541d050a355bbbc27e55d906bc91cb6fe42f96c01413dd0f4ed5a5240513874f", size = 93003, upload-time = "2025-06-10T00:45:27.752Z" }, { url = 
"https://files.pythonhosted.org/packages/b4/2d/2345fce04cfd4bee161bf1e7d9cdc702e3e16109021035dbb24db654a622/yarl-1.20.1-py3-none-any.whl", hash = "sha256:83b8eb083fe4683c6115795d9fc1cfaf2cbbefb19b3a1cb68f6527460f483a77", size = 46542, upload-time = "2025-06-10T00:46:07.521Z" }, -] +] \ No newline at end of file