
fix: Bump fast-xml-parser to ^5.5.10 to fix entity expansion limit on large JUnit XML#77

Open

mydea wants to merge 1 commit into `main` from `fn/bump-fast-xml-parser`

Conversation

Member

@mydea mydea commented Apr 13, 2026

Summary

  • Bumps fast-xml-parser from ^5.5.7 (locked at 5.5.7) to ^5.5.10 (resolves to 5.5.12)

Problem

JUnit XML parsing fails for large test suites with:

Failed to parse .../vitest.junit.xml: Entity expansion limit exceeded: 1001 > 1000

Timeline of upstream changes:

| Version | Change |
| --- | --- |
| v5.3.6 (Feb 2026) | Added entity expansion limits (CVE-2026-26278), default `maxTotalExpansions: 1000` |
| v5.5.6 (Mar 2026) | Standard/numeric entities (`<`, `>`, `A`) also count toward the limit (CVE-2026-33036) |
| v5.5.10 (Apr 2026) | Changed the default to `Infinity` after widespread complaints (fast-xml-parser#813) |

Vitest's JUnit XML output naturally contains many standard XML entity references in test names, classnames, and failure content. Large test suites easily exceed 1000 total entity expansions — but standard entities (<, &, etc.) pose no security risk, which is why the upstream default was reverted to Infinity.
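The failure mode can be sketched in a few lines. This is an illustrative model, not fast-xml-parser's actual implementation: a single global counter is incremented for every entity reference, so a large JUnit file full of harmless escaped `<` characters trips the cap.

```typescript
// Sketch only: models how a total-expansion cap turns many harmless
// standard entities into a hard failure. Not fast-xml-parser's real code.
const STANDARD_ENTITIES: Record<string, string> = {
  "&lt;": "<",
  "&gt;": ">",
  "&amp;": "&",
  "&quot;": '"',
  "&apos;": "'",
};

function decodeWithLimit(xml: string, maxTotalExpansions: number): string {
  let expansions = 0;
  return xml.replace(/&(?:lt|gt|amp|quot|apos);/g, (entity) => {
    expansions += 1;
    if (expansions > maxTotalExpansions) {
      throw new Error(
        `Entity expansion limit exceeded: ${expansions} > ${maxTotalExpansions}`,
      );
    }
    return STANDARD_ENTITIES[entity];
  });
}

// A large JUnit file: 1001 escaped "<" characters across test names,
// classnames, and failure content.
const bigJunit = "&lt;".repeat(1001);

let message = "";
try {
  decodeWithLimit(bigJunit, 1000);
} catch (e) {
  message = (e as Error).message;
}
console.log(message); // "Entity expansion limit exceeded: 1001 > 1000"
```

Since each `&lt;` expands to a single character, no amplification is possible, which is why the upstream default was relaxed.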

Affected

This currently breaks JUnit XML parsing in getsentry/sentry-javascript CI for these packages: browser, core, node, node-core, nextjs, react, replay-internal, sveltekit, cloudflare.

Test plan

  • All 194 existing tests pass
  • dist/index.js rebuilt

🤖 Generated with Claude Code

… large JUnit XML

`fast-xml-parser` v5.3.6 introduced entity expansion limits (CVE-2026-26278)
with a default `maxTotalExpansions` of 1000. In v5.5.6, standard XML entities
(`<`, `>`, `&`, etc.) also started counting toward this limit
(CVE-2026-33036).

This causes JUnit XML parsing to fail for large test suites because vitest's
JUnit XML output naturally contains many standard entity references in test
names, classnames, and failure content. Suites with >1000 total entity
expansions fail with: "Entity expansion limit exceeded: 100X > 1000"

In v5.5.10, the default was changed to `Infinity` (fast-xml-parser#813) after
widespread complaints, since standard/numeric entities pose no security risk.

This bump resolves the issue without any code changes needed.

Affected consumers: getsentry/sentry-javascript CI (browser, core, node,
node-core, nextjs, react, replay-internal, sveltekit, cloudflare test suites)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@github-actions

Codecov Results 📊

194 passed | Total: 194 | Pass Rate: 100% | Execution Time: 282ms

📊 Comparison with Base Branch

| Metric | Change |
| --- | --- |
| Total Tests | — |
| Passed Tests | — |
| Failed Tests | — |
| Skipped Tests | — |

✨ No test changes detected

All tests are passing successfully.

✅ Patch coverage is 100.00%. Project has 759 uncovered lines.
✅ Project coverage is 56.82%. Comparing base (base) to head (head).

Coverage diff
@@            Coverage Diff             @@
##          main       #PR       +/-##
==========================================
+ Coverage    56.82%    56.82%        —%
==========================================
  Files           24        24         —
  Lines         1753      1753         —
  Branches      1257      1257         —
==========================================
+ Hits           994       994         —
- Misses         759       759         —
- Partials        97        97         —

Generated by Codecov Action


@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.


Bugbot Autofix prepared a fix for the issue found in the latest run.

  • ✅ Fixed: Library bump alone doesn't fix entity expansion limit
    • Configured processEntities as an object to use the new Infinity default for maxTotalExpansions instead of the hardcoded 1000 limit.


Or push these changes by commenting:

@cursor push 56ab79b44c
Preview (56ab79b44c)
diff --git a/.nano-staged.js b/.nano-staged.js
--- a/.nano-staged.js
+++ b/.nano-staged.js
@@ -1,4 +1,7 @@
-export default {
-  "src/**/*.{js,ts}": (api) =>
-    [`pnpm dlx @biomejs/biome format --write ${api.filenames.join(" ")}`, "pnpm run build", "git add dist"],
-};
+export default {
+  "src/**/*.{js,ts}": (api) => [
+    `pnpm dlx @biomejs/biome format --write ${api.filenames.join(" ")}`,
+    "pnpm run build",
+    "git add dist",
+  ],
+};

diff --git a/dist/index.js b/dist/index.js
--- a/dist/index.js
+++ b/dist/index.js
@@ -36921,6 +36921,7 @@
             attributeNamePrefix: "@_",
             textNodeName: "#text",
             parseAttributeValue: true,
+            processEntities: { enabled: true },
         });
     }
     /**

diff --git a/src/__tests__/cobertura-parser.test.ts b/src/__tests__/cobertura-parser.test.ts
--- a/src/__tests__/cobertura-parser.test.ts
+++ b/src/__tests__/cobertura-parser.test.ts
@@ -60,7 +60,7 @@
     // Total: 10 statements, 7 covered (from both files)
     expect(result.metrics.statements).toBe(10);
     expect(result.metrics.coveredStatements).toBe(7);
-    
+
     // Total: 4 conditionals (2+2), 3 covered (1+2)
     expect(result.metrics.conditionals).toBe(4);
     expect(result.metrics.coveredConditionals).toBe(3);

diff --git a/src/__tests__/codecov-parser.test.ts b/src/__tests__/codecov-parser.test.ts
--- a/src/__tests__/codecov-parser.test.ts
+++ b/src/__tests__/codecov-parser.test.ts
@@ -164,7 +164,7 @@
     expect(parser.canParse(sampleCodecovJSON)).toBe(true);
     expect(parser.canParse(sampleCodecovJSON, "codecov.json")).toBe(true);
     expect(parser.canParse(sampleCodecovJSON, "coverage/codecov.json")).toBe(
-      true
+      true,
     );
 
     // Should not match non-Codecov JSON
@@ -189,11 +189,11 @@
     const parser = new CodecovParser();
 
     expect(parser.canParse("<coverage></coverage>", "coverage.xml")).toBe(
-      false
+      false,
     );
-    expect(parser.canParse("SF:/path\nDA:1,5\nend_of_record", "lcov.info")).toBe(
-      false
-    );
+    expect(
+      parser.canParse("SF:/path\nDA:1,5\nend_of_record", "lcov.info"),
+    ).toBe(false);
   });
 
   it("should handle empty coverage", async () => {
@@ -254,7 +254,7 @@
     const parser = new CodecovParser();
 
     await expect(parser.parseContent("not json")).rejects.toThrow(
-      "Invalid Codecov JSON"
+      "Invalid Codecov JSON",
     );
   });
 
@@ -262,7 +262,7 @@
     const parser = new CodecovParser();
 
     await expect(parser.parseContent('{"files": {}}')).rejects.toThrow(
-      "Invalid Codecov JSON: missing 'coverage' key"
+      "Invalid Codecov JSON: missing 'coverage' key",
     );
   });
 
@@ -315,7 +315,7 @@
     // Even with minimal valid JSON, codecov.json filename should be detected
     expect(parser.canParse('{"coverage": {}}', "codecov.json")).toBe(true);
     expect(parser.canParse('{"coverage": {}}', "/path/to/codecov.json")).toBe(
-      true
+      true,
     );
   });
 });

diff --git a/src/__tests__/go-parser.test.ts b/src/__tests__/go-parser.test.ts
--- a/src/__tests__/go-parser.test.ts
+++ b/src/__tests__/go-parser.test.ts
@@ -39,18 +39,14 @@
     const parser = new GoParser();
     const result = await parser.parseContent(sampleGoCoverage);
 
-    const mathFile = result.files.find((f) =>
-      f.path.includes("pkg/math.go")
-    );
+    const mathFile = result.files.find((f) => f.path.includes("pkg/math.go"));
     expect(mathFile).toBeDefined();
     expect(mathFile!.name).toBe("math.go");
     expect(mathFile!.statements).toBe(4);
     expect(mathFile!.coveredStatements).toBe(3);
     expect(mathFile!.lineRate).toBe(75);
 
-    const utilsFile = result.files.find((f) =>
-      f.path.includes("pkg/utils.go")
-    );
+    const utilsFile = result.files.find((f) => f.path.includes("pkg/utils.go"));
     expect(utilsFile).toBeDefined();
     expect(utilsFile!.statements).toBe(4);
     expect(utilsFile!.coveredStatements).toBe(3);
@@ -60,9 +56,7 @@
     const parser = new GoParser();
     const result = await parser.parseContent(sampleGoCoverage);
 
-    const mathFile = result.files.find((f) =>
-      f.path.includes("pkg/math.go")
-    );
+    const mathFile = result.files.find((f) => f.path.includes("pkg/math.go"));
 
     // Lines 5-7 should be covered (block 1)
     const line5 = mathFile!.lines.find((l) => l.lineNumber === 5);
@@ -86,18 +80,18 @@
     expect(parser.canParse(sampleGoCoverage, "coverage.out")).toBe(true);
     expect(parser.canParse(sampleGoCoverage, "cover.out")).toBe(true);
     expect(parser.canParse(sampleGoCoverage, "profile.coverprofile")).toBe(
-      true
+      true,
     );
 
     // Should not match XML
     expect(parser.canParse("<coverage></coverage>", "coverage.xml")).toBe(
-      false
+      false,
     );
 
     // Should not match LCOV
-    expect(parser.canParse("SF:/path\nDA:1,5\nend_of_record", "lcov.info")).toBe(
-      false
-    );
+    expect(
+      parser.canParse("SF:/path\nDA:1,5\nend_of_record", "lcov.info"),
+    ).toBe(false);
   });
 
   it("should handle count mode", async () => {

diff --git a/src/__tests__/istanbul-parser.test.ts b/src/__tests__/istanbul-parser.test.ts
--- a/src/__tests__/istanbul-parser.test.ts
+++ b/src/__tests__/istanbul-parser.test.ts
@@ -104,7 +104,7 @@
     expect(mathFile!.lineRate).toBe(50);
 
     const helperFile = result.files.find(
-      (f) => f.path === "/src/utils/helper.ts"
+      (f) => f.path === "/src/utils/helper.ts",
     );
     expect(helperFile).toBeDefined();
     expect(helperFile!.statements).toBe(3);
@@ -140,7 +140,7 @@
 
     expect(parser.canParse(sampleIstanbulJSON)).toBe(true);
     expect(parser.canParse(sampleIstanbulJSON, "coverage-final.json")).toBe(
-      true
+      true,
     );
     expect(parser.canParse(sampleIstanbulJSON, "coverage.json")).toBe(true);
 
@@ -149,13 +149,13 @@
 
     // Should not match XML
     expect(parser.canParse("<coverage></coverage>", "coverage.xml")).toBe(
-      false
+      false,
     );
 
     // Should not match LCOV
-    expect(parser.canParse("SF:/path\nDA:1,5\nend_of_record", "lcov.info")).toBe(
-      false
-    );
+    expect(
+      parser.canParse("SF:/path\nDA:1,5\nend_of_record", "lcov.info"),
+    ).toBe(false);
   });
 
   it("should handle empty coverage", async () => {
@@ -191,7 +191,10 @@
         fnMap: {},
         branchMap: {
           "0": {
-            loc: { start: { line: 1, column: 0 }, end: { line: 1, column: 25 } },
+            loc: {
+              start: { line: 1, column: 0 },
+              end: { line: 1, column: 25 },
+            },
             type: "if",
             locations: [
               { start: { line: 1, column: 0 }, end: { line: 1, column: 12 } },
@@ -229,7 +232,7 @@
     const parser = new IstanbulParser();
 
     await expect(parser.parseContent("not json")).rejects.toThrow(
-      "Invalid Istanbul JSON"
+      "Invalid Istanbul JSON",
     );
   });
 

diff --git a/src/__tests__/lcov-parser.test.ts b/src/__tests__/lcov-parser.test.ts
--- a/src/__tests__/lcov-parser.test.ts
+++ b/src/__tests__/lcov-parser.test.ts
@@ -79,7 +79,7 @@
     expect(mathFile!.coveredMethods).toBe(1);
 
     const helperFile = result.files.find(
-      (f) => f.path === "/src/utils/helper.ts"
+      (f) => f.path === "/src/utils/helper.ts",
     );
     expect(helperFile).toBeDefined();
     expect(helperFile!.statements).toBe(4);
@@ -117,13 +117,13 @@
     expect(parser.canParse(sampleLcov, "report.info")).toBe(true);
 
     // Should not match XML formats
-    expect(
-      parser.canParse(`<coverage><project></project></coverage>`)
-    ).toBe(false);
+    expect(parser.canParse(`<coverage><project></project></coverage>`)).toBe(
+      false,
+    );
 
     // Should not match JSON
     expect(parser.canParse(`{"statementMap": {}}`, "coverage.json")).toBe(
-      false
+      false,
     );
   });
 

diff --git a/src/__tests__/parser-factory.test.ts b/src/__tests__/parser-factory.test.ts
--- a/src/__tests__/parser-factory.test.ts
+++ b/src/__tests__/parser-factory.test.ts
@@ -114,61 +114,61 @@
   describe("detectFormatFromPath", () => {
     it("should detect Clover from path", () => {
       expect(CoverageParserFactory.detectFormatFromPath("clover.xml")).toBe(
-        "clover"
+        "clover",
       );
       expect(
-        CoverageParserFactory.detectFormatFromPath("coverage/clover.xml")
+        CoverageParserFactory.detectFormatFromPath("coverage/clover.xml"),
       ).toBe("clover");
     });
 
     it("should detect Cobertura from path", () => {
       expect(CoverageParserFactory.detectFormatFromPath("cobertura.xml")).toBe(
-        "cobertura"
+        "cobertura",
       );
       expect(
-        CoverageParserFactory.detectFormatFromPath("cobertura-coverage.xml")
+        CoverageParserFactory.detectFormatFromPath("cobertura-coverage.xml"),
       ).toBe("cobertura");
       expect(
-        CoverageParserFactory.detectFormatFromPath("coverage.cobertura.xml")
+        CoverageParserFactory.detectFormatFromPath("coverage.cobertura.xml"),
       ).toBe("cobertura");
     });
 
     it("should detect JaCoCo from path", () => {
       expect(CoverageParserFactory.detectFormatFromPath("jacoco.xml")).toBe(
-        "jacoco"
+        "jacoco",
       );
       expect(
-        CoverageParserFactory.detectFormatFromPath("build/jacoco/test.xml")
+        CoverageParserFactory.detectFormatFromPath("build/jacoco/test.xml"),
       ).toBe("jacoco");
     });
 
     it("should detect LCOV from path", () => {
       expect(CoverageParserFactory.detectFormatFromPath("lcov.info")).toBe(
-        "lcov"
+        "lcov",
       );
-      expect(
-        CoverageParserFactory.detectFormatFromPath("coverage.lcov")
-      ).toBe("lcov");
+      expect(CoverageParserFactory.detectFormatFromPath("coverage.lcov")).toBe(
+        "lcov",
+      );
     });
 
     it("should detect Istanbul from path", () => {
       expect(
-        CoverageParserFactory.detectFormatFromPath("coverage-final.json")
+        CoverageParserFactory.detectFormatFromPath("coverage-final.json"),
       ).toBe("istanbul");
     });
 
     it("should detect Go from path", () => {
       expect(CoverageParserFactory.detectFormatFromPath("coverage.out")).toBe(
-        "go"
+        "go",
       );
       expect(CoverageParserFactory.detectFormatFromPath("cover.out")).toBe(
-        "go"
+        "go",
       );
     });
 
     it("should return null for unknown paths", () => {
       expect(
-        CoverageParserFactory.detectFormatFromPath("unknown.txt")
+        CoverageParserFactory.detectFormatFromPath("unknown.txt"),
       ).toBeNull();
     });
   });
@@ -177,20 +177,20 @@
     it("should return correct parser for each format", () => {
       expect(CoverageParserFactory.getParser("clover").format).toBe("clover");
       expect(CoverageParserFactory.getParser("cobertura").format).toBe(
-        "cobertura"
+        "cobertura",
       );
       expect(CoverageParserFactory.getParser("jacoco").format).toBe("jacoco");
       expect(CoverageParserFactory.getParser("lcov").format).toBe("lcov");
       expect(CoverageParserFactory.getParser("istanbul").format).toBe(
-        "istanbul"
+        "istanbul",
       );
       expect(CoverageParserFactory.getParser("go").format).toBe("go");
     });
 
     it("should throw for unknown format", () => {
-      expect(() =>
-        CoverageParserFactory.getParser("unknown" as never)
-      ).toThrow("Unsupported coverage format");
+      expect(() => CoverageParserFactory.getParser("unknown" as never)).toThrow(
+        "Unsupported coverage format",
+      );
     });
   });
 
@@ -204,7 +204,7 @@
       const result = await CoverageParserFactory.parseContent(
         lcovContent,
         undefined,
-        "lcov"
+        "lcov",
       );
       expect(result.files).toHaveLength(1);
     });
@@ -213,14 +213,14 @@
       const result = await CoverageParserFactory.parseContent(
         lcovContent,
         "lcov.info",
-        "auto"
+        "auto",
       );
       expect(result.files).toHaveLength(1);
     });
 
     it("should throw for undetectable format", async () => {
       await expect(
-        CoverageParserFactory.parseContent("unknown content")
+        CoverageParserFactory.parseContent("unknown content"),
       ).rejects.toThrow("Unable to detect coverage format");
     });
   });
@@ -254,7 +254,7 @@
 
       // Combined metrics
       expect(aggregated.totalStatements).toBe(
-        result1.metrics.statements + result2.metrics.statements
+        result1.metrics.statements + result2.metrics.statements,
       );
     });
 

diff --git a/src/parsers/clover-parser.ts b/src/parsers/clover-parser.ts
--- a/src/parsers/clover-parser.ts
+++ b/src/parsers/clover-parser.ts
@@ -90,22 +90,22 @@
     const statements = Number.parseInt(metricsAttrs.statements || "0", 10);
     const coveredStatements = Number.parseInt(
       metricsAttrs.coveredstatements || "0",
-      10
+      10,
     );
     const conditionals = Number.parseInt(metricsAttrs.conditionals || "0", 10);
     const coveredConditionals = Number.parseInt(
       metricsAttrs.coveredconditionals || "0",
-      10
+      10,
     );
     const methods = Number.parseInt(metricsAttrs.methods || "0", 10);
     const coveredMethods = Number.parseInt(
       metricsAttrs.coveredmethods || "0",
-      10
+      10,
     );
     const elements = Number.parseInt(metricsAttrs.elements || "0", 10);
     const coveredElements = Number.parseInt(
       metricsAttrs.coveredelements || "0",
-      10
+      10,
     );
 
     return {
@@ -127,7 +127,7 @@
    */
   private parseFileElement(fileElement: Record<string, unknown>): FileCoverage {
     const metrics = this.parseMetrics(
-      fileElement.metrics as Record<string, string>
+      fileElement.metrics as Record<string, string>,
     );
 
     // Parse lines
@@ -203,7 +203,7 @@
    * Aggregate multiple coverage results into a single result
    */
   static aggregateResults(
-    results: CoverageResults[]
+    results: CoverageResults[],
   ): AggregatedCoverageResults {
     let totalStatements = 0;
     let coveredStatements = 0;
@@ -228,13 +228,13 @@
     const lineRate =
       totalStatements > 0
         ? Number.parseFloat(
-            ((coveredStatements / totalStatements) * 100).toFixed(2)
+            ((coveredStatements / totalStatements) * 100).toFixed(2),
           )
         : 0;
     const branchRate =
       totalConditionals > 0
         ? Number.parseFloat(
-            ((coveredConditionals / totalConditionals) * 100).toFixed(2)
+            ((coveredConditionals / totalConditionals) * 100).toFixed(2),
           )
         : 0;
 

diff --git a/src/parsers/cobertura-parser.ts b/src/parsers/cobertura-parser.ts
--- a/src/parsers/cobertura-parser.ts
+++ b/src/parsers/cobertura-parser.ts
@@ -139,7 +139,7 @@
     const missingLines: number[] = [];
     const partialLines: number[] = [];
     const classLines = this.ensureArray(
-      (classElement.lines as Record<string, unknown>)?.line
+      (classElement.lines as Record<string, unknown>)?.line,
     );
 
     let statements = 0;
@@ -200,7 +200,7 @@
 
     // Parse methods if available
     const methods = this.ensureArray(
-      (classElement.methods as Record<string, unknown>)?.method
+      (classElement.methods as Record<string, unknown>)?.method,
     );
     let methodCount = methods.length;
     let coveredMethodCount = 0;
@@ -208,7 +208,7 @@
     for (const methodData of methods) {
       const method = methodData as Record<string, unknown>;
       const methodLines = this.ensureArray(
-        (method.lines as Record<string, unknown>)?.line
+        (method.lines as Record<string, unknown>)?.line,
       );
       const hasHits = methodLines.some((l) => {
         const lineObj = l as Record<string, unknown>;

diff --git a/src/parsers/codecov-parser.ts b/src/parsers/codecov-parser.ts
--- a/src/parsers/codecov-parser.ts
+++ b/src/parsers/codecov-parser.ts
@@ -108,7 +108,7 @@
       data = JSON.parse(content);
     } catch (error) {
       throw new Error(
-        `Invalid Codecov JSON: ${error instanceof Error ? error.message : "parse error"}`
+        `Invalid Codecov JSON: ${error instanceof Error ? error.message : "parse error"}`,
       );
     }
 
@@ -157,7 +157,7 @@
    */
   private parseFileCoverage(
     filePath: string,
-    lineCoverage: { [lineNumber: string]: number | string | null }
+    lineCoverage: { [lineNumber: string]: number | string | null },
   ): FileCoverage {
     const fileName = filePath.split("/").pop() || filePath;
 

diff --git a/src/parsers/go-parser.ts b/src/parsers/go-parser.ts
--- a/src/parsers/go-parser.ts
+++ b/src/parsers/go-parser.ts
@@ -146,22 +146,20 @@
    * Parse a single coverage line
    * Format: file.go:startLine.startCol,endLine.endCol numStmts count
    */
-  private parseCoverageLine(line: string):
-    | {
-        file: string;
-        block: {
-          startLine: number;
-          startCol: number;
-          endLine: number;
-          endCol: number;
-          numStmts: number;
-          count: number;
-        };
-      }
-    | null {
+  private parseCoverageLine(line: string): {
+    file: string;
+    block: {
+      startLine: number;
+      startCol: number;
+      endLine: number;
+      endCol: number;
+      numStmts: number;
+      count: number;
+    };
+  } | null {
     // Match: file.go:10.2,12.16 1 5
     const match = line.match(
-      /^(.+\.go):(\d+)\.(\d+),(\d+)\.(\d+) (\d+) (\d+)$/
+      /^(.+\.go):(\d+)\.(\d+),(\d+)\.(\d+) (\d+) (\d+)$/,
     );
     if (!match) {
       return null;
@@ -192,7 +190,7 @@
       endCol: number;
       numStmts: number;
       count: number;
-    }>
+    }>,
   ): FileCoverage {
     const fileName = filePath.split("/").pop() || filePath;
 

diff --git a/src/parsers/istanbul-parser.ts b/src/parsers/istanbul-parser.ts
--- a/src/parsers/istanbul-parser.ts
+++ b/src/parsers/istanbul-parser.ts
@@ -114,7 +114,7 @@
       data = JSON.parse(content);
     } catch (error) {
       throw new Error(
-        `Invalid Istanbul JSON: ${error instanceof Error ? error.message : "parse error"}`
+        `Invalid Istanbul JSON: ${error instanceof Error ? error.message : "parse error"}`,
       );
     }
 
@@ -170,7 +170,7 @@
     const statementHits = fileCoverage.s || {};
     const totalStatements = Object.keys(statementMap).length;
     const coveredStatements = Object.values(statementHits).filter(
-      (h) => h > 0
+      (h) => h > 0,
     ).length;
 
     // Parse functions

diff --git a/src/parsers/jacoco-parser.ts b/src/parsers/jacoco-parser.ts
--- a/src/parsers/jacoco-parser.ts
+++ b/src/parsers/jacoco-parser.ts
@@ -91,7 +91,7 @@
 
     // Get report-level counters
     const reportCounters = this.parseCounters(
-      this.ensureArray(report.counter) as Array<Record<string, unknown>>
+      this.ensureArray(report.counter) as Array<Record<string, unknown>>,
     );
 
     const metrics: CoverageMetrics = {
@@ -106,11 +106,11 @@
         reportCounters.line.covered + reportCounters.branch.covered,
       lineRate: this.calculateRate(
         reportCounters.line.covered,
-        reportCounters.line.total
+        reportCounters.line.total,
       ),
       branchRate: this.calculateRate(
         reportCounters.branch.covered,
-        reportCounters.branch.total
+        reportCounters.branch.total,
       ),
     };
 
@@ -126,7 +126,7 @@
    */
   private parseSourceFile(
     sourceFile: Record<string, unknown>,
-    packageName: string
+    packageName: string,
   ): FileCoverage {
     const fileName = (sourceFile.name as string) || "";
     const packagePath = packageName.replace(/\./g, "/");
@@ -143,7 +143,7 @@
       const lineNum = Number.parseInt((line.nr as string) || "0", 10);
       const coveredInstructions = Number.parseInt(
         (line.ci as string) || "0",
-        10
+        10,
       );
       const missedBranches = Number.parseInt((line.mb as string) || "0", 10);
       const coveredBranches = Number.parseInt((line.cb as string) || "0", 10);
@@ -175,7 +175,7 @@
 
     // Parse counters for this source file
     const counters = this.parseCounters(
-      this.ensureArray(sourceFile.counter) as Array<Record<string, unknown>>
+      this.ensureArray(sourceFile.counter) as Array<Record<string, unknown>>,
     );
 
     return {
@@ -190,7 +190,7 @@
       lineRate: this.calculateRate(counters.line.covered, counters.line.total),
       branchRate: this.calculateRate(
         counters.branch.covered,
-        counters.branch.total
+        counters.branch.total,
       ),
       lines,
       missingLines,

diff --git a/src/parsers/junit-parser.ts b/src/parsers/junit-parser.ts
--- a/src/parsers/junit-parser.ts
+++ b/src/parsers/junit-parser.ts
@@ -16,6 +16,7 @@
       attributeNamePrefix: "@_",
       textNodeName: "#text",
       parseAttributeValue: true,
+      processEntities: { enabled: true },
     });
   }
 

diff --git a/src/parsers/parser-factory.ts b/src/parsers/parser-factory.ts
--- a/src/parsers/parser-factory.ts
+++ b/src/parsers/parser-factory.ts
@@ -117,7 +117,7 @@
    */
   async parseFile(
     filePath: string,
-    format?: CoverageFormat | "auto"
+    format?: CoverageFormat | "auto",
   ): Promise<CoverageResults> {
     const content = await fs.readFile(filePath, "utf-8");
     return CoverageParserFactory.parseContent(content, filePath, format);
@@ -133,7 +133,7 @@
   async parseContent(
     content: string,
     filePath?: string,
-    format?: CoverageFormat | "auto"
+    format?: CoverageFormat | "auto",
   ): Promise<CoverageResults> {
     let parser: ICoverageParser | null = null;
 
@@ -151,8 +151,8 @@
         `Unable to detect coverage format${hint}. ` +
           "Please specify format explicitly or ensure the file is in a supported format. " +
           `Supported formats: ${CoverageParserFactory.getSupportedFormats().join(
-            ", "
-          )}`
+            ", ",
+          )}`,
       );
     }
 
@@ -221,13 +221,13 @@
     const lineRate =
       totalStatements > 0
         ? Number.parseFloat(
-            ((coveredStatements / totalStatements) * 100).toFixed(2)
+            ((coveredStatements / totalStatements) * 100).toFixed(2),
           )
         : 0;
     const branchRate =
       totalConditionals > 0
         ? Number.parseFloat(
-            ((coveredConditionals / totalConditionals) * 100).toFixed(2)
+            ((coveredConditionals / totalConditionals) * 100).toFixed(2),
           )
         : 0;
 

diff --git a/src/reporters/status-check.ts b/src/reporters/status-check.ts
--- a/src/reporters/status-check.ts
+++ b/src/reporters/status-check.ts
@@ -15,24 +15,29 @@
     context: string,
     state: "success" | "failure" | "pending",
     description: string,
-    targetUrl?: string
+    targetUrl?: string,
   ): Promise<void> {
     try {
       // Use the exposed octokit instance and context info from GitHubClient
       // We need to access private properties or extend GitHubClient to support this.
       // Since GitHubClient wraps octokit and doesn't expose it directly in the current implementation,
       // we'll rely on a new method we'll need to add to GitHubClient, or use the existing patterns.
-      
+
       // Checking GitHubClient implementation first...
       // It seems we need to extend GitHubClient to support createCommitStatus
-      await this.client.createCommitStatus(context, state, description, targetUrl);
-      
+      await this.client.createCommitStatus(
+        context,
+        state,
+        description,
+        targetUrl,
+      );
+
       core.info(`✅ Reported status '${context}': ${state} - ${description}`);
     } catch (error) {
       core.warning(
         `Failed to report status '${context}': ${
           error instanceof Error ? error.message : String(error)
-        }`
+        }`,
       );
     }
   }

diff --git a/src/utils/comparison.ts b/src/utils/comparison.ts
--- a/src/utils/comparison.ts
+++ b/src/utils/comparison.ts
@@ -12,7 +12,7 @@
 function generateTestKey(
   suiteName: string,
   classname: string,
-  testName: string
+  testName: string,
 ): string {
   return `${suiteName}::${classname}::${testName}`;
 }
@@ -21,7 +21,7 @@
  * Build a map of all tests from aggregated results
  */
 function buildTestMap(
-  results: AggregatedTestResults
+  results: AggregatedTestResults,
 ): Map<
   string,
   { identifier: TestIdentifier; testCase: TestCase; isPassing: boolean }
@@ -58,7 +58,7 @@
    */
   compareResults(
     baseResults: AggregatedTestResults,
-    currentResults: AggregatedTestResults
+    currentResults: AggregatedTestResults,
   ): TestComparison {
     const baseTestMap = buildTestMap(baseResults);
     const currentTestMap = buildTestMap(currentResults);

diff --git a/src/utils/github-client.ts b/src/utils/github-client.ts
--- a/src/utils/github-client.ts
+++ b/src/utils/github-client.ts
@@ -62,7 +62,7 @@
       const existingComment = comments.find(
         (comment) =>
           comment.body?.includes(identifier) ||
-          comment.body?.includes(legacyIdentifier)
+          comment.body?.includes(legacyIdentifier),
       );
 
       const fullCommentBody = `${identifier}\n${commentBody}`;
@@ -90,9 +90,7 @@
       }
     } catch (error) {
       const message = error instanceof Error ? error.message : "Unknown error";
-      core.warning(
-        `Failed to post/update PR comment: ${message}`
-      );
+      core.warning(`Failed to post/update PR comment: ${message}`);
       // Don't throw - comment posting failure shouldn't fail the action
       // This commonly happens on fork PRs where GITHUB_TOKEN has limited permissions
     }
@@ -110,7 +108,7 @@
       core.warning(
         `Failed to detect default branch, falling back to 'main': ${
           error instanceof Error ? error.message : String(error)
... diff truncated: showing 800 of 2736 lines

This Bugbot Autofix run was free. To enable autofix for future PRs, go to the Cursor dashboard.

Reviewed by Cursor Bugbot for commit 5394de0. Configure here.

@@ -34072,10 +34072,10 @@ function normalizeProcessEntities(value) {
     return {
         enabled: value.enabled !== false,
         maxEntitySize: Math.max(1, value.maxEntitySize ?? 10000),
-        maxExpansionDepth: Math.max(1, value.maxExpansionDepth ?? 10),
-        maxTotalExpansions: Math.max(1, value.maxTotalExpansions ?? 1000),
+        maxExpansionDepth: Math.max(1, value.maxExpansionDepth ?? 10000),
+        maxTotalExpansions: Math.max(1, value.maxTotalExpansions ?? Infinity),

Library bump alone doesn't fix entity expansion limit

High Severity

The JUnitParser doesn't set processEntities, so it gets the library default processEntities: true (boolean). The normalizeProcessEntities boolean path still hardcodes maxTotalExpansions: 1000 at line 34062 — unchanged from the old version. Only the object path (line 34076) now defaults to Infinity. The bump alone doesn't fix the "Entity expansion limit exceeded: 1001 > 1000" error the PR targets. The parser configuration needs to pass processEntities as an object (e.g. { enabled: true }) to benefit from the new Infinity default.
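The two normalization paths Bugbot describes can be sketched as follows. This is a hypothetical reconstruction based on the excerpt and defaults quoted in the review comment, not the library's verbatim code:

```typescript
// Hypothetical reconstruction of the normalization described in the review:
// the boolean path keeps the old hardcoded cap, while only the object path
// picks up the new Infinity default introduced in v5.5.10.
interface EntityOptions {
  enabled: boolean;
  maxTotalExpansions: number;
}

function normalizeProcessEntities(
  value: boolean | { enabled?: boolean; maxTotalExpansions?: number },
): EntityOptions {
  if (typeof value === "boolean") {
    // Boolean path: still hardcodes the old 1000 cap.
    return { enabled: value, maxTotalExpansions: 1000 };
  }
  // Object path: defaults to Infinity per the review comment.
  return {
    enabled: value.enabled !== false,
    maxTotalExpansions: Math.max(1, value.maxTotalExpansions ?? Infinity),
  };
}

// The library default `processEntities: true` keeps the 1000 cap...
console.log(normalizeProcessEntities(true).maxTotalExpansions); // 1000
// ...while passing an object opts into the new Infinity default.
console.log(normalizeProcessEntities({ enabled: true }).maxTotalExpansions); // Infinity
```

If this reading is correct, the version bump alone leaves the default-configured parser on the 1000 cap, which is why the autofix passes `processEntities: { enabled: true }`.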



Member Author


Let's see in a follow-up if this is really needed; for now we can just bump this a bit and see if the problem goes away.

@mydea mydea requested a review from MathurAditya724 April 13, 2026 12:28