
refactor: replace SW + beacon monkey-patch with AST-based API rewriting#614

Merged
harlan-zw merged 4 commits into main from feat/ast-api-rewriting on Mar 2, 2026
Conversation

@harlan-zw (Collaborator) commented Mar 1, 2026

🔗 Linked issue

Related to #615

❓ Type of change

  • 📖 Documentation
  • 🐞 Bug fix
  • 👌 Enhancement
  • ✨ New feature
  • 🧹 Chore
  • ⚠️ Breaking change

📚 Description

Proxy interception previously spanned three runtime layers (a Service Worker, a navigator.sendBeacon monkey-patch, and runtime JS URL rewriting). Since AST rewriting already transforms URL strings in bundled scripts, extending it to also transform API calls moves interception from the network/global level to the source-code level, affecting only analytics scripts rather than the entire page.

Core changes:

  • Extend the AST rewriter with CallExpression visitors that transform navigator.sendBeacon() → __nuxtScripts.sendBeacon() and fetch()/window.fetch()/self.fetch()/globalThis.fetch() → __nuxtScripts.fetch()
  • Replace SW + beacon monkey-patch with a single __nuxtScripts client plugin providing sendBeacon/fetch wrappers
  • Remove runtime JS rewriting from proxy-handler.ts — URLs rewritten at build time
  • Delete SW files: proxy-sw.template.js, sw-register.client.ts, sw-handler.ts
  • Scope client intercept rules to only configured scripts (was generating rules for all vendors, causing 404s for unconfigured ones)
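The call-site transform above can be sketched as follows. This is a string-level stand-in for illustration only: the actual rewriter visits CallExpression nodes in the parsed AST, and `rewriteApiCalls` is a hypothetical helper name, not the module's API.

```typescript
// String-level stand-in for the AST CallExpression visitors, for illustration
// only. The real rewriter operates on parsed AST nodes; a naive regex like
// this would also rewrite member calls such as `obj.fetch(...)`, which the
// AST visitor can avoid by inspecting the callee node.
function rewriteApiCalls(code: string): string {
  return code
    // navigator.sendBeacon(...) -> __nuxtScripts.sendBeacon(...)
    .replace(/\bnavigator\.sendBeacon\(/g, '__nuxtScripts.sendBeacon(')
    // fetch(...) / window.fetch(...) / self.fetch(...) / globalThis.fetch(...)
    // -> __nuxtScripts.fetch(...)
    .replace(/\b(?:window\.|self\.|globalThis\.)?fetch\(/g, '__nuxtScripts.fetch(')
}

console.log(rewriteApiCalls('navigator.sendBeacon(u, d); window.fetch(u); fetch(u)'))
```

Because the transform runs only over bundled analytics scripts, application code keeps the native `fetch` and `sendBeacon` globals untouched.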

Test plan

  • Unit: 82 tests pass (71 proxy-configs + 11 self-closing tags)
  • Typecheck: clean
  • Lint: clean
  • Build: clean
  • E2E: 24 first-party proxy tests pass (proxy endpoint, script bundling, privacy stripping across all providers)

Move fetch/sendBeacon interception from runtime (Service Worker + global
monkey-patch) to build-time AST rewriting. The AST visitor transforms
`navigator.sendBeacon()` → `__nuxtScripts.sendBeacon()` and
`fetch()`/`window.fetch()`/etc → `__nuxtScripts.fetch()` in bundled
analytics scripts. A lightweight `__nuxtScripts` client plugin provides
the wrappers that route matching URLs through the first-party proxy.

- Extend rewrite-ast.ts with CallExpression visitor rules
- Replace beacon intercept + SW registration with __nuxtScripts plugin
- Remove runtime JS rewriting from proxy-handler.ts
- Remove rewriteScriptUrls regex from pure.ts, switch transform.ts to AST
- Delete SW files (proxy-sw.template.js, sw-register.client.ts, sw-handler.ts)
- Rename getSWInterceptRules → getInterceptRules
- Add comprehensive tests for API call rewriting and runtime rewriteUrl
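A minimal sketch of the wrapper side, assuming the `InterceptRule` shape (`pattern`, `pathPrefix`, `target`) introduced by `getInterceptRules`. The `createNuxtScripts` factory name and the example proxy target path are illustrative, and the real plugin resolves URLs against the page origin rather than a fixed base.

```typescript
// Minimal sketch of the __nuxtScripts client wrappers. The InterceptRule
// shape mirrors this PR; createNuxtScripts and the target path are
// hypothetical. Matching URLs are routed to the first-party proxy; anything
// else falls through to the native APIs unchanged.
interface InterceptRule { pattern: string, pathPrefix: string, target: string }

function createNuxtScripts(rules: InterceptRule[]) {
  const rewriteUrl = (url: string): string => {
    try {
      const u = new URL(url, 'http://localhost') // real plugin would use the page origin
      const rule = rules.find(r => u.hostname.endsWith(r.pattern) && u.pathname.startsWith(r.pathPrefix))
      if (rule)
        return rule.target + u.pathname.slice(rule.pathPrefix.length) + u.search
    }
    catch {
      // non-URL strings pass through unchanged
    }
    return url
  }
  return {
    rewriteUrl,
    fetch: (input: string, init?: any) => (globalThis as any).fetch(rewriteUrl(input), init),
    sendBeacon: (url: string, data?: unknown) =>
      (globalThis as any).navigator?.sendBeacon?.(rewriteUrl(url), data),
  }
}

const scripts = createNuxtScripts([
  { pattern: 'google-analytics.com', pathPrefix: '/g/collect', target: '/_scripts/ga' },
])
console.log(scripts.rewriteUrl('https://www.google-analytics.com/g/collect?v=2'))
console.log(scripts.rewriteUrl('https://example.com/unrelated'))
```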
vercel bot (Contributor) commented Mar 1, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project Deployment Actions Updated (UTC)
scripts-docs Ready Ready Preview, Comment Mar 2, 2026 2:08am
scripts-playground Ready Ready Preview, Comment Mar 2, 2026 2:08am

pkg-pr-new bot commented Mar 1, 2026

Open in StackBlitz

npm i https://pkg.pr.new/@nuxt/scripts@614

commit: d9b6174

coderabbitai bot commented Mar 1, 2026

📝 Walkthrough

Walkthrough

Replaces the service-worker interception flow with an intercept/proxy-first approach: removes SW registration, generation, and template files; introduces InterceptRule and getInterceptRules, deferring rule computation until first-party routes are known; adds a client interceptor plugin and proxy handler wiring. Adds a build-time AST transformer rewriteScriptUrlsAST for script URL rewriting (replacing the previous string-based rewrite) and removes server-side script rewrite/caching. Adds logic to expand self-closing Script* tags and integrates it into Nuxt setup and HMR. Updates registry entries, templates, runtime config surface, and extensive tests/fixtures to reflect the new flow.

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

Check name Status Explanation Resolution
Docstring Coverage ⚠️ Warning Docstring coverage is 50.00% which is insufficient. The required threshold is 80.00%. Write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (2 passed)
Check name Status Explanation
Title check ✅ Passed The title 'refactor: replace SW + beacon monkey-patch with AST-based API rewriting' is clear, specific, and accurately summarizes the main architectural change: replacing a multi-layer runtime interception approach with build-time AST transformation.
Description check ✅ Passed The PR description is comprehensive and follows the required template structure with all major sections completed.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing Touches
  • 📝 Generate docstrings (stacked PR)
  • 📝 Generate docstrings (commit on current branch)
🧪 Generate unit tests (beta)
  • Create PR with unit tests
  • Post copyable unit tests in a comment
  • Commit unit tests in branch feat/ast-api-rewriting

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.

❤️ Share

Comment @coderabbitai help to get the list of available commands and usage tips.

@coderabbitai bot left a comment

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
test/unit/third-party-proxy-replacements.test.ts (1)

214-218: ⚠️ Potential issue | 🟡 Minor

Add assertion for config.rewrite existence before use.

Unlike the first test block (lines 76-80) which explicitly asserts that rewrite is defined, this test only checks that config exists. If a proxy config lacks rewrite rules, the non-null assertion would throw with an unclear error.

🛡️ Proposed fix
     it.each(Object.entries(syntheticScripts))('%s synthetic script rewrites correctly', (key, content) => {
       const config = proxyConfigs[key]
       expect(config, `Missing config for ${key}`).toBeDefined()
+      expect(config.rewrite, `Missing rewrite rules for ${key}`).toBeDefined()

       const rewritten = rewriteScriptUrlsAST(content, 'script.js', config.rewrite!)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/unit/third-party-proxy-replacements.test.ts` around lines 214 - 218, The
test calls rewriteScriptUrlsAST(content, 'script.js', config.rewrite!) without
ensuring config.rewrite exists; add an assertion like expect(config.rewrite,
`Missing rewrite for ${key}`).toBeDefined() (or similar) before invoking
rewriteScriptUrlsAST so the test fails with a clear message; reference
variables: syntheticScripts, proxyConfigs, config, and the function
rewriteScriptUrlsAST.
🧹 Nitpick comments (3)
test/unit/proxy-configs.test.ts (1)

455-468: The empty catch block on line 466 mirrors runtime behavior.

The rewriteUrl function in tests matches the actual runtime implementation embedded in the intercept plugin. The empty catch is intentional - non-URL strings should pass through unchanged rather than throwing.

Consider adding a brief comment for clarity:

       catch {}
+      // Non-URL strings pass through unchanged
       return url
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/unit/proxy-configs.test.ts` around lines 455 - 468, Add a short
clarifying comment above the empty catch in the rewriteUrl function to explain
that the empty catch is intentional: non-URL strings are expected to pass
through unchanged and should not throw, so errors from new URL(...) are
swallowed to mirror runtime intercept plugin behavior; reference the rewriteUrl
function and its URL parsing block so reviewers know exactly where the comment
belongs.
src/plugins/rewrite-ast.ts (1)

149-161: Consider anchoring the regex or using single-line mode for the GA dynamic URL pattern.

The .*? in the regex is non-greedy but could still span across concatenation boundaries in minified code. Consider whether this pattern is robust enough for all gtag.js outputs.

// Current pattern matches: "https://"+(...)+".google-analytics.com/g/collect"
/"https:\/\/"\+\(.*?\)\+"\.google-analytics\.com\/g\/collect"/g

This looks intentional for catching the specific gtag.js dynamic URL construction pattern. If real-world testing confirms it works correctly, this is acceptable.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/plugins/rewrite-ast.ts` around lines 149 - 161, The GA rewrite regex can
overrun concatenation boundaries because the inner pattern uses (.*?), so update
both regexes in rewrite-ast.ts where you build output (referencing output and
gaRewrite) to restrict the capture inside the parentheses (e.g. replace
"\(.*?\)" with "\([^)]*?\)" to stop at the closing paren) or enable single-line
dot-matches-newline and anchor the pattern; change the two occurrences of
/"https:\/\/"\+\(.*?\)\+"\.google-analytics\.com\/g\/collect"/g to a safer form
such as /"https:\/\/"\+\([^)]*?\)\+"\.google-analytics\.com\/g\/collect"/g (and
the analytics.google.com variant) so the regex can’t span unrelated code.
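The suggested tightening can be checked directly. The sample strings below are illustrative stand-ins for minified gtag.js output, not actual gtag source.

```typescript
// Compare the loose and tightened GA rewrite patterns. [^)]*? cannot cross a
// closing paren, so the tight pattern stops at the concatenation boundary.
// Sample strings are illustrative, not real gtag.js output.
const loose = /"https:\/\/"\+\(.*?\)\+"\.google-analytics\.com\/g\/collect"/g
const tight = /"https:\/\/"\+\([^)]*?\)\+"\.google-analytics\.com\/g\/collect"/g

// Intended target: dynamic host concatenation
const dynamicUrl = '"https://"+(a?"region1.":"")+".google-analytics.com/g/collect"'
// Pathological case: the first ")" closes an unrelated call
const spanning = '"https://"+(x)+y;f("z")+".google-analytics.com/g/collect"'

console.log(dynamicUrl.match(tight) !== null) // the tight pattern still matches the real target
console.log(spanning.match(loose) !== null)   // loose overruns into unrelated code
console.log(spanning.match(tight) !== null)   // tight rejects the spanning case
```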
src/module.ts (1)

96-98: Potential regex backtracking on malformed input.

The regex [^>]*? followed by \s* can cause polynomial backtracking on pathological inputs. While Vue files are typically well-formed, consider using an atomic group or possessive quantifier if available, or restructuring the pattern.

// Current pattern
/<((?:Script[A-Z]|script-)\w[\w-]*)\b([^>]*?)\s*\/\s*>/g

// The [^>]*? and \s* can backtrack against each other

For typical Vue component usage this is unlikely to be exploited, but malformed templates could trigger slow parsing during dev HMR.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/module.ts` around lines 96 - 98, The SELF_CLOSING_SCRIPT_RE may suffer
pathological backtracking because the attribute capture uses ([^>]*?) followed
by \s*; change the pattern to avoid adjacent overlapping quantifiers by folding
trailing whitespace into the attribute matcher and by using a safer attribute
token that treats quoted strings as atomic. Update SELF_CLOSING_SCRIPT_RE (the
regex constant) to something like a pattern that captures attributes with
((?:[^>"']|"[^"]*"|'[^']*')*) and then directly matches /\s*\/>/ (i.e. replace
([^>]*?)\s*\/\s*> with ((?:[^>"']|"[^"]*"|'[^']*')*)\s*\/\s*>) so quoted
attributes don't trigger backtracking while still matching the same self-closing
Script/script- tags.
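The safer attribute pattern can be exercised against sample templates. The `expandSelfClosing` helper below is illustrative, mirroring what the module does when expanding self-closing Script* tags.

```typescript
// Self-closing Script* tag expansion using the reviewer's safer attribute
// matcher: quoted strings are treated as atomic, so characters like "/>"
// inside an attribute value cannot confuse the match. expandSelfClosing is
// an illustrative helper, not the module's actual API.
const SELF_CLOSING_SCRIPT_RE = /<((?:Script[A-Z]|script-)\w[\w-]*)\b((?:[^>"']|"[^"]*"|'[^']*')*)\s*\/\s*>/g

function expandSelfClosing(template: string): string {
  return template.replace(SELF_CLOSING_SCRIPT_RE, '<$1$2></$1>')
}

console.log(expandSelfClosing('<ScriptGoogleAnalytics id="G-1" />'))
console.log(expandSelfClosing('<div />')) // non-Script tags are left alone
```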
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Outside diff comments:
In `@test/unit/third-party-proxy-replacements.test.ts`:
- Around line 214-218: The test calls rewriteScriptUrlsAST(content, 'script.js',
config.rewrite!) without ensuring config.rewrite exists; add an assertion like
expect(config.rewrite, `Missing rewrite for ${key}`).toBeDefined() (or similar)
before invoking rewriteScriptUrlsAST so the test fails with a clear message;
reference variables: syntheticScripts, proxyConfigs, config, and the function
rewriteScriptUrlsAST.

---

Nitpick comments:
In `@src/module.ts`:
- Around line 96-98: The SELF_CLOSING_SCRIPT_RE may suffer pathological
backtracking because the attribute capture uses ([^>]*?) followed by \s*; change
the pattern to avoid adjacent overlapping quantifiers by folding trailing
whitespace into the attribute matcher and by using a safer attribute token that
treats quoted strings as atomic. Update SELF_CLOSING_SCRIPT_RE (the regex
constant) to something like a pattern that captures attributes with
((?:[^>"']|"[^"]*"|'[^']*')*) and then directly matches /\s*\/>/ (i.e. replace
([^>]*?)\s*\/\s*> with ((?:[^>"']|"[^"]*"|'[^']*')*)\s*\/\s*>) so quoted
attributes don't trigger backtracking while still matching the same self-closing
Script/script- tags.

In `@src/plugins/rewrite-ast.ts`:
- Around line 149-161: The GA rewrite regex can overrun concatenation boundaries
because the inner pattern uses (.*?), so update both regexes in rewrite-ast.ts
where you build output (referencing output and gaRewrite) to restrict the
capture inside the parentheses (e.g. replace "\(.*?\)" with "\([^)]*?\)" to stop
at the closing paren) or enable single-line dot-matches-newline and anchor the
pattern; change the two occurrences of
/"https:\/\/"\+\(.*?\)\+"\.google-analytics\.com\/g\/collect"/g to a safer form
such as /"https:\/\/"\+\([^)]*?\)\+"\.google-analytics\.com\/g\/collect"/g (and
the analytics.google.com variant) so the regex can’t span unrelated code.

In `@test/unit/proxy-configs.test.ts`:
- Around line 455-468: Add a short clarifying comment above the empty catch in
the rewriteUrl function to explain that the empty catch is intentional: non-URL
strings are expected to pass through unchanged and should not throw, so errors
from new URL(...) are swallowed to mirror runtime intercept plugin behavior;
reference the rewriteUrl function and its URL parsing block so reviewers know
exactly where the comment belongs.

ℹ️ Review info

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c861945 and cef5ef7.

📒 Files selected for processing (13)
  • src/module.ts
  • src/plugins/rewrite-ast.ts
  • src/plugins/transform.ts
  • src/proxy-configs.ts
  • src/runtime/plugins/sw-register.client.ts
  • src/runtime/server/proxy-handler.ts
  • src/runtime/server/sw-handler.ts
  • src/runtime/sw/proxy-sw.template.js
  • src/runtime/utils/pure.ts
  • test/e2e/first-party.test.ts
  • test/fixtures/first-party/nuxt.config.ts
  • test/unit/proxy-configs.test.ts
  • test/unit/third-party-proxy-replacements.test.ts
💤 Files with no reviewable changes (4)
  • src/runtime/sw/proxy-sw.template.js
  • src/runtime/plugins/sw-register.client.ts
  • src/runtime/utils/pure.ts
  • src/runtime/server/sw-handler.ts

@coderabbitai bot left a comment

Actionable comments posted: 8

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/proxy-configs.ts (1)

346-364: ⚠️ Potential issue | 🟠 Major

Intercept rules are incomplete when derived from routes only.

getInterceptRules() ignores host/path aliases present only in rewrite. That can skip interception for valid outbound URLs and bypass first-party proxying/privacy for those calls.

Suggested patch
 export function getInterceptRules(collectPrefix: string): InterceptRule[] {
   const configs = buildProxyConfig(collectPrefix)
   const rules: InterceptRule[] = []
+  const seen = new Set<string>()
+  const pushRule = (pattern: string, pathPrefix: string, target: string) => {
+    const key = `${pattern}|${pathPrefix}|${target}`
+    if (!seen.has(key)) {
+      seen.add(key)
+      rules.push({ pattern, pathPrefix, target })
+    }
+  }

   // Extract unique domain -> target mappings from route rules
   for (const config of Object.values(configs)) {
+    for (const r of config.rewrite || []) {
+      const [rawHost, ...pathParts] = r.from
+        .replace(/^https?:\/\//, '')
+        .replace(/^\./, '')
+        .split('/')
+      const pathPrefix = pathParts.length ? `/${pathParts.join('/')}` : ''
+      const target = r.to.replace(/\/\*\*$/, '')
+      pushRule(rawHost, pathPrefix, target)
+    }
+
     if (!config.routes)
       continue
     for (const [localPath, { proxy }] of Object.entries(config.routes)) {
       const match = proxy.match(/^https?:\/\/([^/]+)(\/.*)?\/\*\*$/)
       if (match?.[1]) {
         const domain = match[1]
         const pathPrefix = match[2] || ''
         const target = localPath.replace(/\/\*\*$/, '')
-        rules.push({ pattern: domain, pathPrefix, target })
+        pushRule(domain, pathPrefix, target)
       }
     }
   }

   return rules
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/proxy-configs.ts` around lines 346 - 364, getInterceptRules currently
builds intercept rules by scanning config.routes only, missing host/path aliases
defined solely in config.rewrite; update the rule extraction to also iterate
Object.entries(config.rewrite) for each config (in the same loop over
Object.values(configs)) and apply the same proxy URL parsing logic (the
/^https?:\/\/([^/]+)(\/.*)?\/\*\*\$/ match) to push entries into the rules array
with keys pattern, pathPrefix and target (use the same target derivation as for
routes: strip trailing "/**" from the local rewrite key). Ensure you reference
and handle config.rewrite the same way as config.routes so aliases in rewrite
are included in the rules returned by getInterceptRules.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/module.ts`:
- Around line 112-118: The fixFile function currently unconditionally deletes
nuxt.vfs[filePath] when expandTags returns a falsy value, which can remove VFS
entries owned by other modules; change this to an ownership-safe cleanup by
tracking which entries this helper wrote (e.g. maintain a module-scoped Set like
ownedVfsEntries or store the last written value) and only delete
nuxt.vfs[filePath] if the entry is one this helper previously created
(ownedVfsEntries.has(filePath) or nuxt.vfs[filePath] === lastFixedValue),
otherwise leave it untouched; update the code paths where you assign
nuxt.vfs[filePath] = fixed to also record ownership and remove that record when
you legitimately delete the entry.

In `@test/e2e/__snapshots__/proxy/metaPixel.json`:
- Around line 11-12: The snapshot JSON contains trailing commas before closing
braces (e.g. the object with "v": "2.9.272") which makes the file invalid; open
test/e2e/__snapshots__/proxy/metaPixel.json and remove any trailing commas
immediately before a closing } or ] across the affected objects (instances
around the "v" key and other object endings), ensuring each object/array ends
with the last element not followed by a comma, then validate the file with a
JSON parser (jq or Python json.load) to confirm the file is valid JSON.

In `@test/e2e/__snapshots__/proxy/redditPixel.json`:
- Around line 37-44: The snapshot JSON contains a trailing comma in the
"privacy" object that makes the file invalid; open the snapshot
test/e2e/__snapshots__/proxy/redditPixel.json, locate the "privacy" object
(keys: hardware, ip, language, screen, timezone, userAgent) and remove the
trailing comma after "userAgent" so the JSON parses correctly.

In `@test/e2e/__snapshots__/proxy/segment.json`:
- Around line 9-16: Remove trailing commas from the JSON snapshot so it becomes
valid JSON: edit the "privacy" object (remove the trailing comma after
"userAgent": false), the "query" object (remove the trailing comma after "{}"),
and any other trailing commas in this snapshot file (e.g., the comma noted on
line 22) so all objects/arrays end without a comma on the last entry; then
validate the file with a JSON linter or by parsing it to ensure no other
trailing commas remain.

In `@test/e2e/__snapshots__/proxy/xPixel/t.co~1~i~adsct.diff.json`:
- Around line 5-20: The snapshot JSON contains invalid trailing commas (e.g.,
after the "original" and "anonymized" entries under "x-forwarded-for", inside
"dv" under "query", and before the closing object for "target"), so remove any
trailing commas that appear immediately before a closing brace or bracket;
specifically audit keys like "x-forwarded-for", "original", "anonymized",
"query", "dv", and "target" in this diff and delete the extraneous commas so the
file becomes valid JSON (no trailing commas anywhere).

In `@test/e2e/first-party.test.ts`:
- Line 1096: The RegExp assigned to thirdPartyUrlPattern uses a capturing group
that is never read; update the pattern in the test (thirdPartyUrlPattern) to use
a non-capturing group (?:...) for the domain alternatives so it satisfies the
regexp/no-unused-capturing-group rule while preserving the same matching
behavior.
- Around line 1000-1006: The console error handler currently calls
isKnownNoise(text) before checking for proxy-related failures, which causes
messages containing "/_proxy/" plus "Failed to load resource" or "net::ERR_*" to
be dropped; update the page.on('console', ...) handler so proxy-related messages
are classified/recorded before applying the isKnownNoise filter — e.g., inspect
msg.text() for "/_proxy/" and "Failed to load resource"/"net::ERR_" (or add that
logic into isKnownNoise) and push those into consoleErrors (or mark them as
non-noise) prior to returning early; reference the existing page.on('console',
(msg) => { ... }), isKnownNoise, and consoleErrors symbols when making the
change.

In `@test/unit/proxy-configs.test.ts`:
- Line 611: Replace the empty catch block in test/unit/proxy-configs.test.ts
with an explicit passthrough that documents intent: change "catch {}" to "catch
(err) { /* passthrough: intentionally ignore error in this test */ }" (or use
"catch (err) { void err; }" if you prefer a minimal no-op), so the no-empty rule
is satisfied and the error-path intent is clear; locate the empty catch in the
test code and update it accordingly.

---

Outside diff comments:
In `@src/proxy-configs.ts`:
- Around line 346-364: getInterceptRules currently builds intercept rules by
scanning config.routes only, missing host/path aliases defined solely in
config.rewrite; update the rule extraction to also iterate
Object.entries(config.rewrite) for each config (in the same loop over
Object.values(configs)) and apply the same proxy URL parsing logic (the
/^https?:\/\/([^/]+)(\/.*)?\/\*\*\$/ match) to push entries into the rules array
with keys pattern, pathPrefix and target (use the same target derivation as for
routes: strip trailing "/**" from the local rewrite key). Ensure you reference
and handle config.rewrite the same way as config.routes so aliases in rewrite
are included in the rules returned by getInterceptRules.

ℹ️ Review info

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between cef5ef7 and 22f5df1.

📒 Files selected for processing (32)
  • src/module.ts
  • src/proxy-configs.ts
  • src/registry.ts
  • src/templates.ts
  • test/e2e/__snapshots__/proxy/googleAnalytics.json
  • test/e2e/__snapshots__/proxy/googleAnalytics/google-analytics.com~g~collect.diff.json
  • test/e2e/__snapshots__/proxy/googleAnalytics/google-analytics.com~g~collect~2.diff.json
  • test/e2e/__snapshots__/proxy/googleAnalytics/google-analytics.com~g~collect~3.diff.json
  • test/e2e/__snapshots__/proxy/googleTagManager.json
  • test/e2e/__snapshots__/proxy/googleTagManager/googletagmanager.com~gtag~js.diff.json
  • test/e2e/__snapshots__/proxy/metaPixel.json
  • test/e2e/__snapshots__/proxy/redditPixel.json
  • test/e2e/__snapshots__/proxy/segment.json
  • test/e2e/__snapshots__/proxy/segment/cdn.segment.com~v1~projects~KBXOGxgqMFjm2mxtJDJg0iDn5AnGYb9C~settings.diff.json
  • test/e2e/__snapshots__/proxy/snapchatPixel.json
  • test/e2e/__snapshots__/proxy/snapchatPixel/tr.snapchat.com~p.diff.json
  • test/e2e/__snapshots__/proxy/tiktokPixel.json
  • test/e2e/__snapshots__/proxy/tiktokPixel/analytics.tiktok.com~api~v2~pixel~act.diff.json
  • test/e2e/__snapshots__/proxy/xPixel.json
  • test/e2e/__snapshots__/proxy/xPixel/t.co~1~i~adsct.diff.json
  • test/e2e/first-party.test.ts
  • test/fixtures/first-party/nuxt.config.ts
  • test/fixtures/first-party/pages/cfwa.vue
  • test/fixtures/first-party/pages/crisp-test.vue
  • test/fixtures/first-party/pages/databuddy.vue
  • test/fixtures/first-party/pages/fathom.vue
  • test/fixtures/first-party/pages/intercom-test.vue
  • test/fixtures/first-party/pages/plausible.vue
  • test/fixtures/first-party/pages/rybbit.vue
  • test/fixtures/first-party/pages/umami.vue
  • test/unit/proxy-configs.test.ts
  • test/unit/self-closing-script-tags.test.ts
💤 Files with no reviewable changes (3)
  • test/e2e/__snapshots__/proxy/googleAnalytics/google-analytics.com~g~collect~2.diff.json
  • test/e2e/__snapshots__/proxy/googleAnalytics/google-analytics.com~g~collect~3.diff.json
  • test/e2e/__snapshots__/proxy/googleAnalytics/google-analytics.com~g~collect.diff.json

Comment on lines +11 to 12
"v": "2.9.272",
},

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

# First, let's read the JSON file to inspect the structure
head -150 test/e2e/__snapshots__/proxy/metaPixel.json | cat -n

Repository: nuxt/scripts

Length of output: 8608


🏁 Script executed:

# Validate the JSON file using jq
jq empty test/e2e/__snapshots__/proxy/metaPixel.json 2>&1 && echo "Valid JSON" || echo "Invalid JSON"

Repository: nuxt/scripts

Length of output: 134


🏁 Script executed:

# Also try Python's json parser
python3 -m json.tool test/e2e/__snapshots__/proxy/metaPixel.json > /dev/null 2>&1 && echo "Valid JSON (Python)" || echo "Invalid JSON (Python)"

Repository: nuxt/scripts

Length of output: 77


Remove trailing commas — snapshot is currently invalid JSON.

The file contains trailing commas before closing braces across multiple objects, which breaks JSON parsing. Both jq and Python's json module confirm parse errors:

parse error: Expected another key-value pair at line 12, column 7

Affected lines: 11–12, 21–22, 30–31, 55–56, 65–66, 85–86, 112–113, 122–123, 144–145, 33, 88, 147.

Fix pattern
-        "v": "2.9.272",
+        "v": "2.9.272"

-      "userAgent": true,
+      "userAgent": true

-    "targetUrl": "...",
+    "targetUrl": "..."
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
"v": "2.9.272",
},
"v": "2.9.272"
},
🧰 Tools
🪛 Biome (2.4.4)

[error] 12-12: Expected a property but instead found '}'.

(parse)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/e2e/__snapshots__/proxy/metaPixel.json` around lines 11 - 12, The
snapshot JSON contains trailing commas before closing braces (e.g. the object
with "v": "2.9.272") which makes the file invalid; open
test/e2e/__snapshots__/proxy/metaPixel.json and remove any trailing commas
immediately before a closing } or ] across the affected objects (instances
around the "v" key and other object endings), ensuring each object/array ends
with the last element not followed by a comma, then validate the file with a
JSON parser (jq or Python json.load) to confirm the file is valid JSON.

Comment on lines +37 to +44
"privacy": {
"hardware": true,
"ip": true,
"language": true,
"screen": true,
"timezone": true,
"userAgent": true,
},

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

head -50 test/e2e/__snapshots__/proxy/redditPixel.json | tail -20

Repository: nuxt/scripts

Length of output: 770


🏁 Script executed:

jq . test/e2e/__snapshots__/proxy/redditPixel.json > /dev/null 2>&1 && echo "Valid JSON" || echo "Invalid JSON"

Repository: nuxt/scripts

Length of output: 68


🏁 Script executed:

sed -n '37,44p' test/e2e/__snapshots__/proxy/redditPixel.json

Repository: nuxt/scripts

Length of output: 216


Invalid JSON syntax in snapshot (privacy trailing comma).

The trailing comma after the last property makes this .json file invalid and parse/lint will fail.

Minimal fix
     "privacy": {
       "hardware": true,
       "ip": true,
       "language": true,
       "screen": true,
       "timezone": true,
-      "userAgent": true,
+      "userAgent": true
     },
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
"privacy": {
"hardware": true,
"ip": true,
"language": true,
"screen": true,
"timezone": true,
"userAgent": true,
},
"privacy": {
"hardware": true,
"ip": true,
"language": true,
"screen": true,
"timezone": true,
"userAgent": true
},
🧰 Tools
🪛 Biome (2.4.4)

[error] 44-44: Expected a property but instead found '}'.

(parse)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/e2e/__snapshots__/proxy/redditPixel.json` around lines 37 - 44, The
snapshot JSON contains a trailing comma in the "privacy" object that makes the
file invalid; open the snapshot test/e2e/__snapshots__/proxy/redditPixel.json,
locate the "privacy" object (keys: hardware, ip, language, screen, timezone,
userAgent) and remove the trailing comma after "userAgent" so the JSON parses
correctly.

Comment on lines +9 to +16
"privacy": {
"hardware": false,
"ip": false,
"language": false,
"screen": false,
"timezone": false,
"userAgent": false,
},

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

cat -n test/e2e/__snapshots__/proxy/segment.json | head -30

Repository: nuxt/scripts

Length of output: 732


Invalid JSON syntax in snapshot (privacy object and others have trailing commas).

The file contains multiple trailing commas that break strict JSON parsing, including the trailing comma after "userAgent": false on line 15. Trailing commas are not valid in JSON.

Minimal fix
     "privacy": {
       "hardware": false,
       "ip": false,
       "language": false,
       "screen": false,
       "timezone": false,
-      "userAgent": false,
+      "userAgent": false
     },

Note: Similar trailing comma issues exist elsewhere in the file (e.g., line 7 after "query": {} and line 22) that should also be addressed.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
"privacy": {
"hardware": false,
"ip": false,
"language": false,
"screen": false,
"timezone": false,
"userAgent": false,
},
"privacy": {
"hardware": false,
"ip": false,
"language": false,
"screen": false,
"timezone": false,
"userAgent": false
},
🧰 Tools
🪛 Biome (2.4.4)

[error] 16-16: Expected a property but instead found '}'.

(parse)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/e2e/__snapshots__/proxy/segment.json` around lines 9 - 16, Remove
trailing commas from the JSON snapshot so it becomes valid JSON: edit the
"privacy" object (remove the trailing comma after "userAgent": false), the
"query" object (remove the trailing comma after "{}"), and any other trailing
commas in this snapshot file (e.g., the comma noted on line 22) so all
objects/arrays end without a comma on the last entry; then validate the file
with a JSON linter or by parsing it to ensure no other trailing commas remain.

Comment on lines +5 to +20
      "original": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/145.0.7632.6 Safari/537.36",
    },
    "x-forwarded-for": {
      "anonymized": "127.0.0.0",
      "original": "<absent>",
    },
  },
  "method": "GET",
  "query": {
    "dv": {
      "anonymized": "UTC&en-GB&Google Inc.&Linux x86_64&255&1920&1080&24&24&1920&1080&0&na",
      "original": "Australia/Melbourne&en-GB&Google Inc.&Linux x86_64&255&1280&720&24&24&1280&720&0&na",
    },
  },
  "target": "t.co/1/i/adsct",
}
\ No newline at end of file

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

# First, read the snapshot file to inspect its content
cat test/e2e/__snapshots__/proxy/xPixel/t.co~1~i~adsct.diff.json

Repository: nuxt/scripts

Length of output: 681


🏁 Script executed:

# Validate the JSON using jq
jq empty test/e2e/__snapshots__/proxy/xPixel/t.co~1~i~adsct.diff.json 2>&1 || true

Repository: nuxt/scripts

Length of output: 120


Fix invalid JSON syntax caused by trailing commas.

This snapshot is not valid JSON due to trailing commas before closing braces (e.g., lines 6, 10, 11, 20, 21, 23, 24, 25). The jq parser reports: "Expected another key-value pair at line 6, column 5". Remove all trailing commas from property values and closing braces.

🛠️ Proposed fix
 {
   "headers": {
     "user-agent": {
       "anonymized": "Mozilla/5.0 (compatible; Chrome/145.0)",
-      "original": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/145.0.7632.6 Safari/537.36",
+      "original": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/145.0.7632.6 Safari/537.36"
     },
     "x-forwarded-for": {
       "anonymized": "127.0.0.0",
-      "original": "<absent>",
-    },
-  },
+      "original": "<absent>"
+    }
+  },
   "method": "GET",
   "query": {
     "dv": {
       "anonymized": "UTC&en-GB&Google Inc.&Linux x86_64&255&1920&1080&24&24&1920&1080&0&na",
-      "original": "Australia/Melbourne&en-GB&Google Inc.&Linux x86_64&255&1280&720&24&24&1280&720&0&na",
-    },
-  },
-  "target": "t.co/1/i/adsct",
+      "original": "Australia/Melbourne&en-GB&Google Inc.&Linux x86_64&255&1280&720&24&24&1280&720&0&na"
+    }
+  },
+  "target": "t.co/1/i/adsct"
 }
🧰 Tools
🪛 Biome (2.4.4)

[error] 6-6: Expected a property but instead found '}'. (parse)
[error] 10-10: Expected a property but instead found '}'. (parse)
[error] 11-11: Expected a property but instead found '}'. (parse)
[error] 17-17: Expected a property but instead found '}'. (parse)
[error] 18-18: Expected a property but instead found '}'. (parse)
[error] 20-20: Expected a property but instead found '}'. (parse)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/e2e/__snapshots__/proxy/xPixel/t.co~1~i~adsct.diff.json` around lines 5
- 20, The snapshot JSON contains invalid trailing commas (e.g., after the
"original" and "anonymized" entries under "x-forwarded-for", inside "dv" under
"query", and before the closing object for "target"), so remove any trailing
commas that appear immediately before a closing brace or bracket; specifically
audit keys like "x-forwarded-for", "original", "anonymized", "query", "dv", and
"target" in this diff and delete the extraneous commas so the file becomes valid
JSON (no trailing commas anywhere).

Comment on lines +1000 to +1006
page.on('console', (msg) => {
  const type = msg.type()
  if (type === 'error') {
    const text = msg.text()
    if (!isKnownNoise(text)) {
      consoleErrors.push({ type, text })
    }

⚠️ Potential issue | 🟠 Major

Proxy-related failures can be silently filtered as noise.

At Line 1004, isKnownNoise runs before proxy classification, so messages that include /_proxy/ plus Failed to load resource/net::ERR_* can be dropped and never asserted.

Proposed fix
       page.on('console', (msg) => {
         const type = msg.type()
         if (type === 'error') {
           const text = msg.text()
-          if (!isKnownNoise(text)) {
+          if (isProxyRelated(text) || !isKnownNoise(text)) {
             consoleErrors.push({ type, text })
           }
         }
       })
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/e2e/first-party.test.ts` around lines 1000 - 1006, The console error
handler currently calls isKnownNoise(text) before checking for proxy-related
failures, which causes messages containing "/_proxy/" plus "Failed to load
resource" or "net::ERR_*" to be dropped; update the page.on('console', ...)
handler so proxy-related messages are classified/recorded before applying the
isKnownNoise filter — e.g., inspect msg.text() for "/_proxy/" and "Failed to
load resource"/"net::ERR_" (or add that logic into isKnownNoise) and push those
into consoleErrors (or mark them as non-noise) prior to returning early;
reference the existing page.on('console', (msg) => { ... }), isKnownNoise, and
consoleErrors symbols when making the change.
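The ordering fix can be captured in a small predicate. A sketch under the diff's assumptions — `isProxyRelated` is the hypothetical helper the suggestion introduces, and the noise pattern here is illustrative, not the test suite's real list:

```typescript
// Proxy failures must always be recorded, even when they would
// otherwise match a known-noise pattern, so classify them first.
function isProxyRelated(text: string): boolean {
  return text.includes('/_proxy/')
    && (text.includes('Failed to load resource') || /net::ERR_/.test(text))
}

// Illustrative stand-in for the suite's real noise filter.
function isKnownNoise(text: string): boolean {
  return text.includes('favicon.ico')
}

function shouldRecord(text: string): boolean {
  return isProxyRelated(text) || !isKnownNoise(text)
}
```

With this ordering, a `net::ERR_*` failure on a `/_proxy/` URL is pushed to `consoleErrors` even if a broad noise pattern would otherwise have matched it.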

}
}
}
catch {}

⚠️ Potential issue | 🟡 Minor

Empty catch block triggers no-empty and hides intent.

Make the passthrough explicit in the error path.

Proposed fix
-      catch {}
+      catch {
+        return url
+      }
       return url
🧰 Tools
🪛 ESLint

[error] 611-611: Empty block statement. (no-empty)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/unit/proxy-configs.test.ts` at line 611, Replace the empty catch block
in test/unit/proxy-configs.test.ts with an explicit passthrough that documents
intent: change "catch {}" to "catch (err) { /* passthrough: intentionally ignore
error in this test */ }" (or use "catch (err) { void err; }" if you prefer a
minimal no-op), so the no-empty rule is satisfied and the error-path intent is
clear; locate the empty catch in the test code and update it accordingly.

@harlan-zw
Collaborator Author

Code review

Found 2 issues:

  1. Client intercept rules are generated for ALL vendors, but server routes are only registered for user-configured scripts. getInterceptRules() at line 471 calls buildProxyConfig() which returns all 14+ hardcoded proxy configs. The generated client plugin will intercept fetch/sendBeacon calls to domains like analytics.tiktok.com, tr.snapchat.com, etc. even if the user only configured googleAnalytics. Those rewritten URLs hit proxy paths that don't exist on the server, returning 404s and silently dropping analytics data. Fix: compute intercept rules inside modules:done after neededRoutes is known, passing only rules for configured scripts.

scripts/src/module.ts

Lines 470 to 472 in 22f5df1

if (firstPartyEnabled) {
  const interceptRules = getInterceptRules(firstPartyCollectPrefix)

export function getInterceptRules(collectPrefix: string): InterceptRule[] {
  const configs = buildProxyConfig(collectPrefix)
  const rules: InterceptRule[] = []
  // Extract unique domain -> target mappings from route rules
  for (const config of Object.values(configs)) {

vs. server routes scoped to configured scripts only:

scripts/src/module.ts

Lines 563 to 578 in 22f5df1

const proxyConfigs = getAllProxyConfigs(firstPartyCollectPrefix)
const registryKeys = Object.keys(config.registry || {})
// Collect routes for all configured registry scripts that support proxying
const neededRoutes: Record<string, { proxy: string }> = {}
const routePrivacyOverrides: Record<string, ProxyPrivacyInput> = {}
const unsupportedScripts: string[] = []
for (const key of registryKeys) {
  // Find the registry script definition
  const script = registryScriptsWithImport.find(s => s.import.name.toLowerCase() === `usescript${key.toLowerCase()}`)
  // Only proxy scripts that explicitly opt in with a proxy field
  const proxyKey = script?.proxy || undefined
  if (proxyKey) {
    const proxyConfig = proxyConfigs[proxyKey]
    if (proxyConfig?.routes) {
      Object.assign(neededRoutes, proxyConfig.routes)

  2. InterceptRule.pattern JSDoc claims wildcard support (*.google-analytics.com) that is not implemented. The runtime rewriteUrl function does exact match + suffix match (hostname.endsWith('.' + rule.pattern)), not glob/wildcard expansion. A pattern containing * would never match. The JSDoc should document the actual behavior (plain domain with implicit subdomain matching).

export interface InterceptRule {
  /** Domain pattern to match (supports wildcards like *.google-analytics.com) */
  pattern: string
  /** Path prefix to match and strip from the original URL (e.g., /tr for www.facebook.com/tr) */

🤖 Generated with Claude Code

- If this code review was useful, please react with 👍. Otherwise, react with 👎.
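The matching behavior described in point 2 can be sketched as follows. The rule shape follows the `InterceptRule` interface; the function body is an illustrative approximation of the runtime `rewriteUrl`, not the module's exact code:

```typescript
interface InterceptRule {
  /** Plain domain to match; subdomains match implicitly (no glob expansion) */
  pattern: string
  /** Path prefix to match and strip from the original URL */
  pathPrefix: string
  /** First-party proxy base the matched path is rewritten onto */
  proxyBase: string
}

// Exact-host or implicit-subdomain match: a '*' in the pattern can
// never match, since matching is endsWith('.' + pattern), not glob.
function rewriteUrl(url: string, rules: InterceptRule[]): string {
  try {
    const parsed = new URL(url)
    for (const rule of rules) {
      const hostMatches = parsed.hostname === rule.pattern
        || parsed.hostname.endsWith('.' + rule.pattern)
      if (hostMatches && parsed.pathname.startsWith(rule.pathPrefix)) {
        const rest = parsed.pathname.slice(rule.pathPrefix.length)
        return rule.proxyBase + rest + parsed.search
      }
    }
  }
  catch {
    // Relative or unparsable URLs pass through untouched.
  }
  return url
}
```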

@coderabbitai coderabbitai bot left a comment

♻️ Duplicate comments (1)
src/module.ts (1)

105-119: ⚠️ Potential issue | 🟠 Major

Avoid deleting unowned nuxt.vfs entries.

The cleanup still deletes dynamically by path when no replacement is produced. This can remove entries owned by other transforms/modules.

Ownership-safe cleanup
 function fixSelfClosingScriptComponents(nuxt: any) {
+  const ownedVfsEntries = new Set<string>()
+
   function expandTags(content: string): string | null {
@@
   function fixFile(filePath: string) {
@@
-    if (fixed) nuxt.vfs[filePath] = fixed
-    else delete nuxt.vfs[filePath]
+    if (fixed) {
+      nuxt.vfs[filePath] = fixed
+      ownedVfsEntries.add(filePath)
+    }
+    else if (ownedVfsEntries.has(filePath)) {
+      Reflect.deleteProperty(nuxt.vfs, filePath)
+      ownedVfsEntries.delete(filePath)
+    }
   }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/module.ts` around lines 105 - 119, The current fixFile in
fixSelfClosingScriptComponents mutates nuxt.vfs by deleting entries when
expandTags returns null, which can remove files owned by other transforms;
change fixFile to only set nuxt.vfs[filePath] = fixed when expandTags produced a
replacement and do not delete nuxt.vfs[filePath] when there is no change. If you
need to remove entries, guard the deletion by checking module ownership (e.g., a
vfs ownership map like nuxt.vfsOwners[filePath] or compare the existing vfs
content to the original file content) and only delete when this module is the
recorded owner; update expandTags/fixFile to preserve unowned entries.
🧹 Nitpick comments (1)
src/module.ts (1)

507-509: Normalize fetch inputs beyond strings to preserve interception coverage.

fetch accepts RequestInfo (string | URL | Request). Rewriting only string inputs can bypass proxying for URL/Request call sites.

Suggested hardening
+    function rewriteRequestInfo(input) {
+      if (typeof input === 'string') return rewriteUrl(input);
+      if (input instanceof URL) return rewriteUrl(input.toString());
+      if (input instanceof Request) return new Request(rewriteUrl(input.url), input);
+      return input;
+    }
+
     globalThis.__nuxtScripts = {
       sendBeacon: (url, data) => origBeacon(rewriteUrl(url), data),
-      fetch: (url, opts) => origFetch(typeof url === 'string' ? rewriteUrl(url) : url, opts),
+      fetch: (url, opts) => origFetch(rewriteRequestInfo(url), opts),
     };
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/module.ts` around lines 507 - 509, The fetch wrapper rewrites only string
URLs which misses Request and URL inputs; update the fetch handler (the fetch
property that currently calls origFetch) to detect URL and Request types and
rewrite their underlying URL before delegating to origFetch: for URL inputs call
rewriteUrl(url.href) or rewriteUrl(url.toString()), for Request inputs construct
a new Request that uses rewriteUrl(request.url) while preserving the original
Request's method, headers, body, credentials, mode, cache, redirect, referrer,
referrerPolicy, integrity, and signal, then pass that new Request to origFetch;
keep the existing string branch (rewriteUrl for strings) so all RequestInfo
variants are proxied.
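The normalization the nitpick asks for can be exercised in isolation. A sketch with a stub `rewriteUrl` (the real rewriter lives in the generated client plugin; the stub here only maps one hard-coded host):

```typescript
// Stub standing in for the client plugin's real URL rewriter.
function rewriteUrl(url: string): string {
  return url.replace('https://www.google-analytics.com', '/_proxy/ga')
}

// Normalize RequestInfo (string | URL | Request) before rewriting,
// so URL and Request call sites are proxied, not just strings.
function rewriteRequestInfo(input: string | URL | Request): string | Request {
  if (typeof input === 'string') return rewriteUrl(input)
  if (input instanceof URL) return rewriteUrl(input.toString())
  if (input instanceof Request) return new Request(rewriteUrl(input.url), input)
  return input
}
```

In the browser, the `Request` branch resolves a rewritten relative path against the page origin; in Node, `new Request()` requires an absolute URL, so the branch is shown here only for the string and `URL` cases.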

ℹ️ Review info

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 22f5df1 and dd29cec.

📒 Files selected for processing (2)
  • src/module.ts
  • src/proxy-configs.ts

@coderabbitai coderabbitai bot left a comment

♻️ Duplicate comments (1)
test/e2e/first-party.test.ts (1)

1000-1006: ⚠️ Potential issue | 🟠 Major

Proxy-related errors are still filtered out as noise.

At Line 1004, noise filtering happens before proxy classification, so messages like /_proxy/... + Failed to load resource / net::ERR_* can be dropped and never asserted.

🔧 Proposed fix
       page.on('console', (msg) => {
         const type = msg.type()
         if (type === 'error') {
           const text = msg.text()
-          if (!isKnownNoise(text)) {
+          if (isProxyRelated(text) || !isKnownNoise(text)) {
             consoleErrors.push({ type, text })
           }
         }
       })
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/e2e/first-party.test.ts` around lines 1000 - 1006, The console handler
currently filters out noise via isKnownNoise(text) before checking for
proxy-related errors, causing proxy failures (e.g. messages containing
"/_proxy/" with "Failed to load resource" or "net::ERR_*") to be dropped; update
the page.on('console', ...) logic so proxy-classification runs first (or add an
explicit check for proxy patterns in the console text) and if the message
matches proxy-related patterns always push to consoleErrors (or mark it for
assertion) even if isKnownNoise(text) returns true; locate the handler that uses
isKnownNoise and consoleErrors and modify it to detect proxy messages (e.g.,
text.includes('/_proxy/') || /net::ERR_/.test(text) || text.includes('Failed to
load resource')) before applying the noise filter.
🧹 Nitpick comments (1)
test/unit/self-closing-script-tags.test.ts (1)

3-10: Consider sharing the expansion helper with src/module.ts to prevent drift.

Line 3 duplicates core transformation logic in tests. Extracting the regex/helper into a shared internal utility would keep test and runtime behavior locked together.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/unit/self-closing-script-tags.test.ts` around lines 3 - 10, The test
duplicates the runtime helper; extract the SELF_CLOSING_SCRIPT_RE and expandTags
logic into a shared internal utility used by both test and src/module.ts to
avoid drift. Create a non-exported helper (e.g., in an internal utils file) that
exposes expandTags and the SELF_CLOSING_SCRIPT_RE for import from tests and have
src/module.ts and test/unit/self-closing-script-tags.test.ts import and call
that same expandTags function so behavior stays consistent.
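What the shared helper might look like, as a sketch — the regex is an illustrative approximation, and the module's real `SELF_CLOSING_SCRIPT_RE` may differ:

```typescript
// Expand self-closing component tags like <ScriptFoo ... /> into
// <ScriptFoo ...></ScriptFoo>, tolerating slashes inside quoted
// attribute values.
const SELF_CLOSING_SCRIPT_RE = /<(Script[A-Za-z]*)((?:[^\/>"']|"[^"]*"|'[^']*')*)\/>/g

function expandTags(content: string): string | null {
  SELF_CLOSING_SCRIPT_RE.lastIndex = 0
  if (!SELF_CLOSING_SCRIPT_RE.test(content)) return null // no change needed
  return content.replace(SELF_CLOSING_SCRIPT_RE, '<$1$2></$1>')
}
```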

ℹ️ Review info

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between dd29cec and a3f6b86.

📒 Files selected for processing (4)
  • src/module.ts
  • test/e2e/first-party.test.ts
  • test/unit/proxy-configs.test.ts
  • test/unit/self-closing-script-tags.test.ts

@harlan-zw harlan-zw merged commit 64b91f8 into main Mar 2, 2026
9 of 10 checks passed
@harlan-zw harlan-zw deleted the feat/ast-api-rewriting branch March 2, 2026 02:45