feat: dep watchdog — pncli deps command group (#7)
Conversation
Implements pncli deps with frisk as the primary command and scan, diff, outdated, license-check, and connectivity as auxiliary commands. Replaces the artifactory stub.

- deps frisk: scans all deps for CVEs via OSV.dev querybatch, returns structured remediation paths in JSON for agent consumption (Tier 3)
- deps scan: local-only dependency inventory across npm, NuGet, Maven
- deps diff: dep changes between two git refs using git show
- deps outdated: latest versions via Artifactory REST (Tier 2)
- deps license-check: license data per package via Artifactory (Tier 2)
- deps connectivity: diagnoses which tier is available

Parsers handle package-lock.json (v2/v3), yarn.lock, pnpm-lock.yaml, .csproj/packages.lock.json/Directory.Packages.props/packages.config, pom.xml, build.gradle, and gradle.lockfile.

Artifactory config uses flat npmRepo/nugetRepo/mavenRepo fields. Each ecosystem repo is independently optional — missing repos are skipped silently. config init updated with opt-in Artifactory wizard section.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
The build step regenerates copilot-instructions.md — stage it automatically so it's never left out of a commit. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Pull request overview
Adds a new pncli deps command group to support dependency inventory, CVE scanning via OSV.dev, and Artifactory-backed “outdated” + license reporting, replacing the previous artifactory stub.
Changes:
- Introduces deps subcommands (frisk, scan, diff, outdated, license-check, connectivity) and shared output types.
- Adds repo parsers for npm/NuGet/Maven family files and clients for OSV.dev + Artifactory.
- Extends global config to include optional per-ecosystem Artifactory repo settings and updates config init + pre-commit.
Reviewed changes
Copilot reviewed 21 out of 21 changed files in this pull request and generated 11 comments.
| File | Description |
|---|---|
| src/types/config.ts | Adds ArtifactoryConfig and wires it into GlobalConfig/ResolvedConfig. |
| src/lib/config.ts | Loads Artifactory config from env/config file and masks token in output. |
| src/services/config/commands.ts | Adds an opt-in Artifactory wizard section to config init. |
| src/cli.ts | Registers deps commands and updates CLI help text. |
| src/services/artifactory/commands.ts | Removes the old no-op artifactory command stub. |
| src/services/deps/types.ts | Defines data shapes for scan/diff/frisk/outdated/license/connectivity JSON output. |
| src/services/deps/commands.ts | Implements the pncli deps ... command group wiring and options. |
| src/services/deps/scan.ts | Implements local-only repo scan entrypoint. |
| src/services/deps/frisk.ts | Implements OSV-backed vulnerability scan flow. |
| src/services/deps/diff.ts | Implements dependency diff between git refs. |
| src/services/deps/outdated.ts | Implements Artifactory-backed outdated check flow. |
| src/services/deps/license-check.ts | Implements Artifactory-backed license reporting flow. |
| src/services/deps/connectivity.ts | Implements OSV/Artifactory reachability checks and tier reporting. |
| src/services/deps/parsers/index.ts | Provides git-backed file discovery + per-ecosystem parsing orchestration. |
| src/services/deps/parsers/npm.ts | Adds package.json + lockfile parsers for npm/yarn/pnpm. |
| src/services/deps/parsers/nuget.ts | Adds NuGet manifest/lock parsing including central package version support. |
| src/services/deps/parsers/maven.ts | Adds Maven/Gradle parsing for pom.xml/build.gradle/gradle.lockfile. |
| src/services/deps/clients/osv.ts | Implements OSV.dev batch query + vulnerability mapping. |
| src/services/deps/clients/artifactory.ts | Implements Artifactory connectivity, latest-version, and license queries. |
| copilot-instructions.md | Updates documented CLI surface area to include deps commands. |
| .husky/pre-commit | Auto-stages regenerated copilot-instructions.md after build. |
```typescript
function readFileAtRef(repoRoot: string, ref: string, relPath: string): string | null {
  try {
    return execSync(`git show "${ref}":"${relPath}"`, {
      encoding: 'utf8',
      cwd: repoRoot,
      stdio: ['pipe', 'pipe', 'pipe']
    });
```
execSync is invoked with a shell command that interpolates ref and relPath directly into a template string. Since ref comes from CLI input (e.g., deps diff --from ...), this allows shell injection via quotes/escapes. Use execFileSync/spawnSync with argument arrays (no shell) to run git show safely, and pass ${ref}:${relPath} as a single argument.
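A minimal sketch of the no-shell shape (illustrative only, not the PR's final code; assumes Node's built-in child_process):

```typescript
import { execFileSync } from 'node:child_process';

// Sketch: execFileSync takes an argument array and spawns git directly,
// so quotes and metacharacters in `ref`/`relPath` are passed literally
// instead of being interpreted by a shell.
function readFileAtRef(repoRoot: string, ref: string, relPath: string): string | null {
  try {
    // `${ref}:${relPath}` travels as a single argv entry, so no quoting is needed.
    return execFileSync('git', ['show', `${ref}:${relPath}`], {
      encoding: 'utf8',
      cwd: repoRoot,
      stdio: ['pipe', 'pipe', 'pipe']
    });
  } catch {
    return null; // bad ref, missing path, or git unavailable
  }
}
```

The same pattern covers the ls-tree call in the next comment: `execFileSync('git', ['ls-tree', '-r', '--name-only', ref], ...)`.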
```typescript
  let files: string[] = [];
  try {
    const out = execSync(`git ls-tree -r --name-only "${ref}"`, {
      encoding: 'utf8',
      cwd: repoRoot,
      stdio: ['pipe', 'pipe', 'pipe']
    });
    files = out.trim().split('\n').filter(Boolean);
```
Same issue as above: execSync is called with a shell command that interpolates the user-controlled ref into the command string. This is vulnerable to shell injection. Switch to execFileSync('git', ['ls-tree','-r','--name-only', ref], ...) (or equivalent) to avoid invoking a shell.
```typescript
function parseYarnLock(
  content: string,
  lockFilePath: string,
  opts: ScanOptions,
  packageJsonContent: string
): Package[] {
  let pkgJson: PackageJson;
  try {
    pkgJson = JSON.parse(packageJsonContent) as PackageJson;
  } catch {
    pkgJson = {};
  }

  const devDeps = new Set(Object.keys(pkgJson.devDependencies ?? {}));
  const packages: Package[] = [];
  const seen = new Set<string>();

  // Matches: "express@^4.21.0", "express@^4.0.0, express@^4.1.0":
  //   version "4.21.0"
  const blockRegex = /^"?([^@"\n][^@"\n]*)@[^:]+:?\n\s+version "([^"]+)"/gm;

  for (const match of content.matchAll(blockRegex)) {
    const name = match[1].trim().replace(/^"/, '');
    const version = match[2];
    const key = `${name}@${version}`;
    if (seen.has(key)) continue;
    seen.add(key);

    const isDev = devDeps.has(name);
    if (isDev && !opts.includeDev) continue;

    packages.push({
      name,
      version,
      ecosystem: 'npm',
      source: lockFilePath,
      type: 'direct',
      scope: isDev ? 'dev' : 'production'
    });
  }
```
parseYarnLock marks every entry as type: 'direct' and never applies opts.includeTransitive. This breaks --direct-only (it will still include transitive packages from yarn.lock) and makes ScanData.summary.byType inaccurate. Consider classifying entries as direct vs transitive using package.json dependencies/devDependencies (and filtering when includeTransitive is false), or falling back to package.json parsing when direct-only is requested.
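One way to do that classification, sketched with a hypothetical helper (classifyEntry is not in the PR; it cross-references package.json dep lists, which only identifies direct dependencies):

```typescript
// Hypothetical sketch: decide type/scope for a lockfile entry and whether
// the scan options keep it. A name listed in package.json dependencies or
// devDependencies is direct; everything else in the lockfile is transitive.
interface DepLists {
  dependencies?: Record<string, string>;
  devDependencies?: Record<string, string>;
}

function classifyEntry(
  name: string,
  pkgJson: DepLists,
  opts: { includeDev: boolean; includeTransitive: boolean }
): { type: 'direct' | 'transitive'; scope: 'dev' | 'production'; keep: boolean } {
  const isDev = name in (pkgJson.devDependencies ?? {});
  const isDirect = isDev || name in (pkgJson.dependencies ?? {});
  const keep = (isDirect || opts.includeTransitive) && (!isDev || opts.includeDev);
  return {
    type: isDirect ? 'direct' : 'transitive',
    scope: isDev ? 'dev' : 'production',
    keep
  };
}
```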
```typescript
function parsePnpmLock(content: string, lockFilePath: string, opts: ScanOptions): Package[] {
  const packages: Package[] = [];
  const seen = new Set<string>();

  // pnpm-lock.yaml v6+ format:
  // packages:
  //   express@4.21.0:
  //     dev: false
  const blockRegex = /^\s{2}(\/?@?[^@\s/][^@\s]*)@(\d[^:\s]*):\s*\n((?:\s{4}[^\n]+\n)*)/gm;

  for (const match of content.matchAll(blockRegex)) {
    const name = match[1].replace(/^\//, '');
    const version = match[2];
    const attrs = match[3] ?? '';
    const key = `${name}@${version}`;
    if (seen.has(key)) continue;
    seen.add(key);

    const isDev = /^\s+dev:\s*true/m.test(attrs);
    if (isDev && !opts.includeDev) continue;

    packages.push({
      name,
      version,
      ecosystem: 'npm',
      source: lockFilePath,
      type: 'direct',
      scope: isDev ? 'dev' : 'production'
    });
```
parsePnpmLock also returns every lock entry as type: 'direct' and ignores opts.includeTransitive, so --direct-only won’t work with pnpm-lock.yaml and type counts will be wrong. Similar to the yarn.lock case, either classify/filter based on package.json direct deps (preferred) or avoid using the lockfile path when includeTransitive is false.
```typescript
  const propsFiles = new Map<string, string>(); // dir → Directory.Packages.props path

  for (const f of files) {
    if (path.basename(f) === 'packages.lock.json') {
      lockFiles.set(path.dirname(f), f);
    }
    if (path.basename(f) === 'Directory.Packages.props') {
      propsFiles.set(path.dirname(f), f);
    }
```
propsFiles is populated but never used. With the repo’s ESLint config (@typescript-eslint/no-unused-vars via recommended rules), this will fail npm run lint. Remove propsFiles (or use it) to avoid unused-variable errors.
Suggested change:
```diff
- const propsFiles = new Map<string, string>(); // dir → Directory.Packages.props path
  for (const f of files) {
    if (path.basename(f) === 'packages.lock.json') {
      lockFiles.set(path.dirname(f), f);
    }
-   if (path.basename(f) === 'Directory.Packages.props') {
-     propsFiles.set(path.dirname(f), f);
-   }
```
```typescript
  // Build maps keyed by ecosystem:name (collapse source differences)
  type PkgKey = string;
  const fromMap = new Map<PkgKey, string>(); // key → version
  const toMap = new Map<PkgKey, string>();
  const sourceMap = new Map<PkgKey, string>();

  for (const pkg of fromScan.packages) {
    const key = `${pkg.ecosystem}:${pkg.name}`;
    fromMap.set(key, pkg.version);
    sourceMap.set(key, pkg.source);
  }
  for (const pkg of toScan.packages) {
    const key = `${pkg.ecosystem}:${pkg.name}`;
    toMap.set(key, pkg.version);
    sourceMap.set(key, pkg.source);
  }
```
The diff collapses packages into a single entry per ecosystem:name (overwriting versions) which loses information when multiple versions of the same transitive dependency exist (common in npm). This can produce incorrect upgrade/downgrade results and misleading output for deps diff (which enables transitive deps). Consider diffing on a key that preserves multiple versions (e.g., per ecosystem:name@version multiset, or ecosystem:name → set of versions) and computing changes accordingly.
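A sketch of the multiset idea, using hypothetical names (buildVersionSets is not in the PR): keep a set of versions per ecosystem:name so duplicate transitive versions survive, then classify changes by comparing the two sets per key.

```typescript
// Hypothetical sketch: instead of Map<key, version> (last write wins),
// collect Map<key, Set<version>> per side of the diff.
type Pkg = { ecosystem: string; name: string; version: string };

function buildVersionSets(packages: Pkg[]): Map<string, Set<string>> {
  const map = new Map<string, Set<string>>();
  for (const pkg of packages) {
    const key = `${pkg.ecosystem}:${pkg.name}`;
    let versions = map.get(key);
    if (!versions) {
      versions = new Set<string>();
      map.set(key, versions);
    }
    versions.add(pkg.version);
  }
  return map;
}
```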
```typescript
  const summary = { added: 0, removed: 0, upgraded: 0, downgraded: 0, unchanged: 0 };
  for (const c of changes) summary[c.change]++;
  summary.unchanged = fromScan.packages.length - changes.filter(c => c.change !== 'added').length;
```
summary.unchanged is derived from fromScan.packages.length and a filtered changes count, but changes is computed from the collapsed maps (one version per ecosystem:name). This makes unchanged inconsistent when there are duplicates/multiple versions or when map-collapsing changes the cardinality. If you keep the current model, compute unchanged directly from the same key space used to generate changes (e.g., count keys where both sides exist and versions match).
Suggested change:
```diff
- summary.unchanged = fromScan.packages.length - changes.filter(c => c.change !== 'added').length;
+ summary.unchanged = [...allKeys].filter((key) => {
+   const fromVer = fromMap.get(key);
+   const toVer = toMap.get(key);
+   return fromVer !== undefined && toVer !== undefined && fromVer === toVer;
+ }).length;
```
```typescript
      if (!groupId || !artifactId) continue;

      const rawVersion = /<version>\s*([^<\s]+)\s*<\/version>/i.exec(inner)?.[1];
      const resolvedVersion = resolveProperty(rawVersion ?? null, props)
```
resolveProperty returns the original placeholder (e.g. ${missing.prop}) when a property can’t be resolved, which then gets treated as a real version and included in scan output. This can lead to invalid versions being emitted/queried. Consider returning null (or detecting unresolved placeholders after substitution) so the caller can fall back to dependencyManagement versions or skip the dependency when the version can’t be resolved.
Suggested change:
```diff
- const resolvedVersion = resolveProperty(rawVersion ?? null, props)
+ const directVersion = resolveProperty(rawVersion ?? null, props);
+ const resolvedDirectVersion = directVersion && !/^\$\{[^}]+\}$/.test(directVersion)
+   ? directVersion
+   : null;
+ const resolvedVersion = resolvedDirectVersion
```
```typescript
  // Matches: "express@^4.21.0", "express@^4.0.0, express@^4.1.0":
  //   version "4.21.0"
  const blockRegex = /^"?([^@"\n][^@"\n]*)@[^:]+:?\n\s+version "([^"]+)"/gm;

  for (const match of content.matchAll(blockRegex)) {
    const name = match[1].trim().replace(/^"/, '');
    const version = match[2];
```
parseYarnLock’s blockRegex excludes package names starting with @, so scoped packages like @types/node won’t be detected from yarn.lock. Adjust the regex to allow scoped names (and / in the name) so yarn-based repos don’t silently miss a large portion of dependencies.
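One possible adjustment, shown as a sketch rather than a final regex: capture an optional leading `@scope/` segment before the name, so scoped entries match while unscoped ones are unaffected.

```typescript
// Sketch: the added non-capturing group (?:@[^@"\n/]+\/)? lets names like
// @types/node through; the original [^@"\n] first-character class rejected
// the leading @.
const blockRegex = /^"?((?:@[^@"\n/]+\/)?[^@"\n]+)@[^:]+:?\n\s+version "([^"]+)"/gm;

// Illustrative yarn.lock fragment with one scoped and one unscoped entry.
const yarnLock = `"@types/node@^20.0.0":
  version "20.11.5"

express@^4.21.0:
  version "4.21.0"
`;

const names = [...yarnLock.matchAll(blockRegex)].map(m => m[1].replace(/^"/, ''));
```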
```typescript
  // pnpm-lock.yaml v6+ format:
  // packages:
  //   express@4.21.0:
  //     dev: false
  const blockRegex = /^\s{2}(\/?@?[^@\s/][^@\s]*)@(\d[^:\s]*):\s*\n((?:\s{4}[^\n]+\n)*)/gm;

  for (const match of content.matchAll(blockRegex)) {
    const name = match[1].replace(/^\//, '');
    const version = match[2];
```
parsePnpmLock’s blockRegex doesn’t allow / in the captured package name, so scoped packages like @types/node (which appear as /@types/node@... in pnpm-lock.yaml) won’t be parsed. Update the regex (or switch to a YAML parser) so scoped packages are included.
Security:
- Replace execSync string interpolation with execFileSync + arg arrays in
readFileAtRef and scanRepoAtRef to prevent shell injection from
user-controlled --from/--to refs
Parser fixes:
- yarn.lock: fix blockRegex to match scoped packages (@scope/name);
classify direct vs transitive using package.json dep lists and respect
opts.includeTransitive
- pnpm-lock.yaml: same direct/transitive classification fix; pass
packageJsonContent through for cross-referencing
- nuget: remove dead propsFiles map that was populated but never read
- maven: resolveProperty now returns null when a ${placeholder} remains
unresolved, so callers correctly fall through to dependencyManagement
versions instead of emitting invalid version strings
Diff fixes:
- Key on ecosystem:name:source instead of ecosystem:name to preserve
multiple versions of the same transitive dep (common in npm)
- summary.unchanged now computed directly from the key set rather than
mixing collapsed/uncollapsed counts
Directory.Packages.props: walk up parent directories to repo root
instead of only checking the manifest's own directory (props files are
typically at repo root in central package management)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Pull request overview
Copilot reviewed 21 out of 21 changed files in this pull request and generated 9 comments.
```typescript
  manifests: ManifestInfo[],
  opts: ScanOptions,
  readFn: (relPath: string) => string | null
): Package[] {
```
parseManifests takes a repoRoot parameter but never uses it. With the repo’s ESLint recommended rules, this will be flagged as an unused variable and fail npm run lint. Rename it to _repoRoot, remove the parameter, or use it inside the function (and update the call sites accordingly).
Suggested change:
```diff
  ): Package[] {
+   void repoRoot;
```
```typescript
  // prefer it over the individual project files
  return manifests;
```
The comment says duplicate lock-backed entries are removed/preferred, but the function currently returns manifests without any de-duplication. Either implement the described filtering (to avoid repeated parsing of the same packages.lock.json via multiple project files) or adjust/remove the comment so it matches behavior.
Suggested change:
```diff
- // prefer it over the individual project files
- return manifests;
+ // prefer a single representative manifest over the individual project files.
+ const dedupedManifests: ManifestInfo[] = [];
+ const lockBackedIndex = new Map<string, number>();
+ for (const manifest of manifests) {
+   if (!manifest.lockFile) {
+     dedupedManifests.push(manifest);
+     continue;
+   }
+   const existingIndex = lockBackedIndex.get(manifest.lockFile);
+   if (existingIndex === undefined) {
+     lockBackedIndex.set(manifest.lockFile, dedupedManifests.length);
+     dedupedManifests.push(manifest);
+     continue;
+   }
+   const existing = dedupedManifests[existingIndex];
+   const existingBase = path.basename(existing.file);
+   const manifestBase = path.basename(manifest.file);
+   if (
+     existingBase !== 'Directory.Packages.props' &&
+     manifestBase === 'Directory.Packages.props'
+   ) {
+     dedupedManifests[existingIndex] = manifest;
+   }
+ }
+ return dedupedManifests;
```
```typescript
function parseBuildGradle(content: string, filePath: string, opts: ScanOptions): Package[] {
  const packages: Package[] = [];
  const seen = new Set<string>();

  // String notation: implementation 'group:artifact:version' or "group:artifact:version"
  const stringNotation = /\b(\w+)\s+['"]([^:'"]+):([^:'"]+):([^'"]+)['"]/g;
```
parseBuildGradle doesn’t handle common Kotlin DSL dependency declarations like implementation("group:artifact:version") / testImplementation("..."), even though findMavenManifests includes build.gradle.kts. As a result, many Gradle projects will scan as empty unless they have a lockfile or use the exact Groovy string/map forms handled here. Consider adding support for the parenthesized notation (both Groovy and Kotlin DSL).
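A sketch of an additional pattern for the parenthesised form (the regex here is illustrative, not the PR's final one); it accepts both Kotlin DSL and Groovy calls with parentheses:

```typescript
// Sketch: match configuration("group:artifact:version") with either quote
// style and optional whitespace around the parentheses.
const parenNotation = /\b(\w+)\s*\(\s*['"]([^:'"]+):([^:'"]+):([^'"]+?)['"]\s*\)/g;

// Illustrative build.gradle.kts fragment.
const gradleKts = `dependencies {
    implementation("com.google.guava:guava:33.0.0-jre")
    testImplementation("org.junit.jupiter:junit-jupiter:5.10.1")
}`;

const coords = [...gradleKts.matchAll(parenNotation)].map(m => `${m[2]}:${m[3]}:${m[4]}`);
```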
```typescript
  fix_available: boolean;
  fixed_versions: string[];
```
The remediation fields use snake_case (fix_available, fixed_versions) while the rest of the CLI data models in this repo use camelCase (e.g., duration_ms is confined to meta). Unless there’s a strong compatibility reason, consider switching these to camelCase (fixAvailable, fixedVersions) to keep the JSON shape consistent across commands.
Suggested change:
```diff
- fix_available: boolean;
- fixed_versions: string[];
+ fixAvailable: boolean;
+ fixedVersions: string[];
```
```typescript
  for (const pkg of packages) {
    const repoName = repoForEcosystem(config, pkg.ecosystem);
    if (!repoName) continue; // ecosystem repo not configured — skip silently
```
In getOutdatedPackages, if an ecosystem repo isn’t configured you continue and silently drop the package from results. This makes deps outdated --ecosystem <x> confusing: it can report zero outdated packages even though it never checked that ecosystem. Consider returning an entry with availableInArtifactory: false (and/or a separate warning/error when the selected ecosystem’s repo is missing).
Suggested change:
```diff
- for (const pkg of packages) {
-   const repoName = repoForEcosystem(config, pkg.ecosystem);
-   if (!repoName) continue; // ecosystem repo not configured — skip silently
+ const warnedMissingRepos = new Set<Ecosystem>();
+ for (const pkg of packages) {
+   const repoName = repoForEcosystem(config, pkg.ecosystem);
+   if (!repoName) {
+     if (!warnedMissingRepos.has(pkg.ecosystem)) {
+       warnedMissingRepos.add(pkg.ecosystem);
+       console.warn(
+         `Artifactory repository is not configured for ecosystem "${pkg.ecosystem}", skipping outdated checks for that ecosystem.`
+       );
+     }
+     continue;
+   }
```
```typescript
      source: pkg.source,
      availableInArtifactory: true
```
availableInArtifactory is always set to true for every returned entry, and packages that aren’t checkable are skipped entirely. This makes the field effectively meaningless for consumers. Either populate it accurately (including false entries) or remove it from the type/output to avoid misleading downstream tooling.
Suggested change:
```diff
- source: pkg.source,
- availableInArtifactory: true
+ source: pkg.source
```
```typescript
function parseSemver(v: string): [number, number, number] {
  const clean = v.replace(/[^0-9.]/g, '');
  const parts = clean.split('.').map(Number);
  return [parts[0] ?? 0, parts[1] ?? 0, parts[2] ?? 0];
}
```
The parseSemver implementation strips non-numeric characters and compares only three numeric components. This can misclassify changes for pre-releases (e.g. 1.0.0-rc.1), 4-part versions, and Maven/NuGet version formats, which can flip upgraded vs downgraded. Consider using a real SemVer parser when the version is SemVer-valid (and falling back to a changed state or string comparison when it isn’t).
```typescript
function parseSemver(v: string): [number, number, number] {
  const clean = v.replace(/[^0-9.]/g, '');
  const parts = clean.split('.').map(Number);
  return [parts[0] ?? 0, parts[1] ?? 0, parts[2] ?? 0];
}

function getUpdateType(current: string, latest: string): 'major' | 'minor' | 'patch' {
  const [cMaj, cMin] = parseSemver(current);
  const [lMaj, lMin] = parseSemver(latest);
  if (lMaj > cMaj) return 'major';
  if (lMin > cMin) return 'minor';
  return 'patch';
}
```
This semver parsing/comparison logic is very lossy (it strips non-numeric characters), which can incorrectly decide whether a version is newer and what the update type is (major/minor/patch), especially for pre-release versions and Maven-style versions. Consider using a proper SemVer comparison for SemVer inputs and a safer fallback strategy for non-SemVer ecosystems.
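For reference, a pre-release-aware comparison per the SemVer spec (semver.org §11) fits in a few dozen lines. This is a sketch under stated assumptions: the localeCompare fallback for non-SemVer strings is an assumption of this example, not the PR's behavior, and build metadata is ignored as the spec requires.

```typescript
// Sketch: compare two version strings per SemVer precedence rules.
// Returns <0, 0, or >0. Numeric pre-release identifiers compare numerically;
// a version with a pre-release sorts below its plain release.
function compareSemver(a: string, b: string): number {
  const parse = (v: string) => {
    const m = /^(\d+)\.(\d+)\.(\d+)(?:-([0-9A-Za-z.-]+))?/.exec(v.trim());
    if (!m) return null;
    return { core: [Number(m[1]), Number(m[2]), Number(m[3])], pre: m[4]?.split('.') ?? [] };
  };
  const pa = parse(a);
  const pb = parse(b);
  if (!pa || !pb) return a.localeCompare(b); // non-SemVer fallback (assumption)
  for (let i = 0; i < 3; i++) {
    if (pa.core[i] !== pb.core[i]) return pa.core[i] - pb.core[i];
  }
  if (pa.pre.length === 0 && pb.pre.length === 0) return 0;
  if (pa.pre.length === 0) return 1;  // release > pre-release
  if (pb.pre.length === 0) return -1;
  for (let i = 0; i < Math.max(pa.pre.length, pb.pre.length); i++) {
    const x = pa.pre[i];
    const y = pb.pre[i];
    if (x === undefined) return -1;   // fewer identifiers sorts lower
    if (y === undefined) return 1;
    const nx = /^\d+$/.test(x);
    const ny = /^\d+$/.test(y);
    if (nx && ny) {
      if (Number(x) !== Number(y)) return Number(x) - Number(y);
    } else if (nx !== ny) {
      return nx ? -1 : 1;             // numeric identifiers < alphanumeric
    } else if (x !== y) {
      return x < y ? -1 : 1;
    }
  }
  return 0;
}
```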
```typescript
  process.stderr.write('\n── Artifactory ───────────────────────────────────\n');
  const useArtifactory = await confirm({
    message: 'Configure Artifactory for dependency scanning (deps frisk, outdated, license-check)?',
```
The Artifactory prompt text implies deps frisk depends on Artifactory configuration, but deps frisk uses OSV.dev and does not require Artifactory. Consider rewording this prompt so it only mentions the commands that actually depend on Artifactory (e.g. deps outdated, deps license-check).
Suggested change:
```diff
- message: 'Configure Artifactory for dependency scanning (deps frisk, outdated, license-check)?',
+ message: 'Configure Artifactory for dependency commands (deps outdated, deps license-check)?',
```
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* fix: Jira error deserialization, Connection header, and exit codes

  - Parse Jira 400 responses correctly: read errorMessages (string[]) and errors (Record<string,string>) instead of broken array indexing
  - Add Connection: close header to Jira and Bitbucket requests
  - Introduce src/lib/exitCodes.ts with sysexits-style codes (69, 77, 78)
  - fail() now maps HTTP 401/403 → 77, network failures → 69, general → 1
  - Replace all hardcoded exit code literals with named constants

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: address PR feedback — array errors shape and ExitCode return type

  - Handle errors-as-array (other APIs) alongside errors-as-object (Jira)
  - Tighten exitCodeFromStatus return type to ExitCode union

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Sunny Kolattukudy <sunny@imagile.dev>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
- Add shared semver utility (semver.ts) with pre-release/build-metadata
aware comparison, replacing lossy strip-non-numeric logic in diff and
artifactory client
- Rename remediation snake_case fields to camelCase (fixAvailable,
fixedVersions) to match rest of CLI output shape
- Remove always-true availableInArtifactory field; add uncheckedEcosystems
to OutdatedData so callers know which ecosystems had no repo configured
- Drop unused repoRoot parameter from parseManifests
- Implement NuGet manifest deduplication: when multiple project files share
the same packages.lock.json, keep only one representative to avoid
inflated package counts
- Add Kotlin DSL / Groovy parenthesised form support to parseBuildGradle
(handles both implementation("g:a:v") and implementation 'g:a:v')
- Fix Artifactory config init prompt — deps frisk uses OSV.dev, not
Artifactory; prompt now only mentions outdated and license-check
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…on Windows

process.exit() called synchronously after process.stdout/stderr.write() triggers a libuv assertion on Windows because the write handle is torn down before the kernel flushes the buffer. Move process.exit() into the write callback in output.ts fail() and http.ts dry-run paths.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Summary
- pncli deps command group replacing the no-op artifactory stub
- deps frisk is the primary command: scans all dependencies for CVEs via OSV.dev and returns structured remediation paths in JSON for agent consumption
- deps scan, deps diff, deps outdated, deps license-check, deps connectivity as auxiliary commands
- npmRepo/nugetRepo/mavenRepo fields — each ecosystem repo is independently optional
- config init updated with opt-in Artifactory wizard section
- copilot-instructions.md auto-staged after build regenerates it

Test plan
- pncli deps frisk — scans deps and queries OSV.dev, returns CVEs + remediation
- pncli deps scan — inventories deps locally with no network
- pncli deps diff --from <ref> — shows dep changes between refs
- pncli deps connectivity — reports tier and what's reachable
- pncli deps outdated without Artifactory configured — returns friendly error with setup instructions
- pncli config init — Artifactory section appears, per-ecosystem repos are individually opt-in

🤖 Generated with Claude Code