📝 Walkthrough

Adds an automated docs sync: a Node.js script and GitHub Actions workflow that pull versioned docs from external GitHub repositories and sync them into local Docusaurus version directories (including live refs), plus a manifest/config, README notes, and removal of an older workflow.
Sequence Diagram

```mermaid
sequenceDiagram
    participant GHA as GitHub Actions (Workflow)
    participant Script as Sync Script (Node.js)
    participant GHAPI as GitHub API
    participant FS as File System
    participant Docusaurus as Docusaurus CLI
    participant Git as Git

    GHA->>GHA: Checkout repo & setup env
    GHA->>Script: Execute sync-external-docs.mjs
    Script->>Script: Load manifest & init caches
    Script->>GHAPI: Fetch remote tags for each repo
    GHAPI-->>Script: Return tag refs
    Script->>Script: Parse tags into families/versions
    Script->>FS: Read local <section>_versions.json
    loop For each pending tag
        Script->>GHAPI: Download tarball for repo@tag
        GHAPI-->>Script: Tarball bytes
        Script->>FS: Write & extract tarball to temp cache
        FS-->>Script: Extraction path
        Script->>FS: Validate docs_path exists
        Script->>FS: rsync docs_path -> local section dir
        Script->>Docusaurus: Run docusaurus docs:version:<section> <tag> (unless dry-run)
        Docusaurus-->>Script: Versioning result
    end
    Script->>GHAPI: Materialize live_ref snapshot
    Script->>FS: Sync live docs_path -> live section
    Script-->>GHA: Exit (status)
    GHA->>Git: git status --porcelain
    Git-->>GHA: Changes present?
    alt Changes detected
        GHA->>Git: Create signed commit with bot identity
        Git-->>GHA: Commit pushed
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~40 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
| Device | URL |
|---|---|
| desktop | http://localhost:3000/ |
| mobile | http://localhost:3000/ |
🧹 Nitpick comments (2)
.github/scripts/sync-external-docs.mjs (2)
11-14: Consider cleaning up temporary directory on exit.

The script creates a temp directory at `tempRoot` but never cleans it up. While the OS will eventually purge `/tmp`, explicit cleanup prevents disk accumulation during repeated local runs or long-lived CI runners.

♻️ Proposed cleanup handler:

```diff
 const tempRoot = mkdtempSync(path.join(os.tmpdir(), 'docs-sync-'))
+
+process.on('exit', () => {
+  try {
+    fs.rm(tempRoot, { recursive: true, force: true })
+  } catch {
+    // Ignore cleanup errors
+  }
+})
```

Note: Since `fs` is imported as promises, you'd need to use `rmSync` from the sync API or handle this differently.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/sync-external-docs.mjs around lines 11 - 14, The temp directory created by mkdtempSync and stored in tempRoot is never removed; add a cleanup handler that removes tempRoot on process exit/signals (e.g., 'exit', 'SIGINT', 'SIGTERM', 'uncaughtException') to avoid accumulating temp files. Implement it by calling a synchronous removal (fs.rmSync or fs.rmdirSync) or safely awaiting fs.promises.rm in async handlers, ensure the handler checks that tempRoot is set and exists and ignores ENOENT, and register it near where tempRoot is created so mkdtempSync/tempRoot are the referenced symbols.
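For illustration, the cleanup asked for above could look like the following sketch. It assumes `tempRoot` is created with `mkdtempSync` as in the reviewed script, and uses the synchronous `rmSync` because the promise-based `fs.rm` cannot complete inside an `'exit'` handler; the signal list and `cleanup` helper name are illustrative.

```javascript
// Sketch of a temp-directory cleanup handler (assumed shape, not the
// script's actual code). Uses rmSync since async fs.rm cannot finish
// inside an 'exit' handler.
import { mkdtempSync, rmSync, existsSync } from 'node:fs'
import os from 'node:os'
import path from 'node:path'

const tempRoot = mkdtempSync(path.join(os.tmpdir(), 'docs-sync-'))

function cleanup() {
  try {
    // Guard against double removal; ignore ENOENT and friends.
    if (tempRoot && existsSync(tempRoot)) {
      rmSync(tempRoot, { recursive: true, force: true })
    }
  } catch {
    // Ignore cleanup errors
  }
}

process.on('exit', cleanup)
for (const signal of ['SIGINT', 'SIGTERM']) {
  process.on(signal, () => {
    cleanup()
    process.exit(1)
  })
}
```

Registering the handler right next to the `mkdtempSync` call keeps the lifetime of `tempRoot` obvious at a glance.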
55-68: Family prefix matching with `startsWith` is fragile for multi-family scenarios.

The `version.startsWith(family)` filter at line 57 works correctly with current data, but is brittle if contract families are added with overlapping prefixes. For example, if families `"axone-objectarium-v"` and `"axone-objectarium-v-utils-v"` both existed, `startsWith("axone-objectarium-v")` would incorrectly match versions from both families. Currently only one contract family (`"axone-gov-v"`) exists in tracked versions, so this doesn't manifest, but consider using more precise matching (e.g., splitting on the version part) to prevent issues when new contract families are added.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/sync-external-docs.mjs around lines 55 - 68, The filtering using trackedVersions.filter(version => version.startsWith(family)) is fragile; instead derive each tracked version's family the same way remoteTags does and compare equality. Update the block (variables remoteTags, family, trackedVersions, trackedForFamily) to compute trackedVersionFamily for each tracked version (e.g., split or parse the version string the same way you compute tag.family from remoteTags) and use trackedVersions.filter(version => trackedVersionFamily === family) so only exact family matches are selected before building trackedSet and missingForFamily.
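A sketch of the exact-match approach the prompt describes, assuming tag names of the form `<family-prefix>X.Y.Z` where the family prefix ends in `-v` (as in `axone-gov-v` + `7.0.0`); the `familyOf` helper and its regex are illustrative, not part of the reviewed script.

```javascript
// Hypothetical helper: derive a version's family the same way the
// remote-tag parser would, then compare families for equality.
function familyOf(version) {
  // Strip the trailing semver, so "axone-objectarium-v5.0.0"
  // yields the family "axone-objectarium-v".
  const match = version.match(/^(.*-v)\d+\.\d+\.\d+$/)
  return match ? match[1] : version
}

function trackedForFamily(trackedVersions, family) {
  // Equality on the derived family means
  // "axone-objectarium-v-utils-v1.0.0" can never leak into
  // the "axone-objectarium-v" family, unlike startsWith.
  return trackedVersions.filter(version => familyOf(version) === family)
}
```

The greedy `.*-v` prefix in the regex makes the longest family prefix win, which is exactly what disambiguates overlapping names.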
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 4e2f6268-73b8-4e3f-a698-a03d76bf48dc
📒 Files selected for processing (5)
- .github/scripts/sync-external-docs.mjs
- .github/sync-external-docs-sources.json
- .github/workflows/sync-external-docs.yml
- .github/workflows/update-versioned-docs.yml
- README.md
💤 Files with no reviewable changes (1)
- .github/workflows/update-versioned-docs.yml
Actionable comments posted: 2
🧹 Nitpick comments (2)
.github/scripts/sync-external-docs.mjs (2)
261-271: Dry-run still performs downloads and extractions.

The `tar` exception on line 262 means dry-run mode still downloads tarballs and extracts them; only `rsync` and `docusaurus` commands are skipped. This is reasonable for validation purposes but worth documenting if not already clear to users.

Consider adding a timeout to `execFileSync` for commands like `rsync` that could potentially hang.

📝 Proposed enhancement:

```diff
 execFileSync(command, commandArgs, {
   cwd: workspace,
-  stdio: 'inherit'
+  stdio: 'inherit',
+  timeout: 300000 // 5 minutes for rsync operations
 })
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/sync-external-docs.mjs around lines 261 - 271, The run function currently bypasses dry-run for only the 'tar' command (dryRun variable and run(command, commandArgs)), causing downloads/extractions to still occur; update run to either skip 'tar' as well when dryRun is true (so nothing is executed) or explicitly document this behavior if intentional, and add a timeout option to the execFileSync invocation to prevent long-running hangs for commands like 'rsync' (modify the execFileSync call site to include { cwd: workspace, stdio: 'inherit', timeout: <reasonable-ms> } or conditionally set timeouts for specific commands).
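One way the suggested change could look, as a sketch: a `run()` wrapper that skips every command (including `tar`) in dry-run mode and applies per-command timeouts. The options object, the timeout table, and the values in it are illustrative assumptions, not the script's actual interface.

```javascript
// Hypothetical dry-run-aware command runner with per-command timeouts.
import { execFileSync } from 'node:child_process'

const TIMEOUTS_MS = {
  rsync: 300000, // 5 minutes for large doc trees
  tar: 120000,
  default: 60000
}

function run(command, commandArgs, { dryRun = false, workspace = process.cwd() } = {}) {
  if (dryRun) {
    // Nothing executes in dry-run mode, so no downloads or
    // extractions happen during validation.
    console.log(`[dry-run] ${command} ${commandArgs.join(' ')}`)
    return
  }
  execFileSync(command, commandArgs, {
    cwd: workspace,
    stdio: 'inherit',
    // execFileSync throws if the command exceeds the timeout.
    timeout: TIMEOUTS_MS[command] ?? TIMEOUTS_MS.default
  })
}
```

Centralizing the timeout table keeps the policy in one place instead of scattering magic numbers across call sites.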
106-109: Consider adding a timeout to prevent indefinite hangs.

The `execFileSync` call has no timeout, which could cause the script (and CI) to hang indefinitely if the git command stalls due to network issues.

⏱️ Proposed fix to add timeout:

```diff
 const output = execFileSync('git', ['ls-remote', '--tags', '--refs', repositoryUrl], {
   cwd: workspace,
-  encoding: 'utf8'
+  encoding: 'utf8',
+  timeout: 60000 // 60 seconds
 })
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/sync-external-docs.mjs around lines 106 - 109, The execFileSync call that populates the output variable can hang without a timeout; update the execFileSync invocation (the call using execFileSync with repositoryUrl and workspace) to include a sensible timeout option (e.g., 2–5 minutes, configurable via env var) and wrap the call in a try/catch so timeouts throw a clear error and the script exits non‑zero with a helpful log message; ensure you reference the same execFileSync call that assigns output and preserve encoding and cwd options when adding the timeout.
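A sketch of the timeout-plus-error-wrapping pattern the prompt describes; `listRemoteTags` and the `SYNC_GIT_TIMEOUT_MS` environment variable are hypothetical names introduced for illustration, not part of the reviewed script.

```javascript
// Hypothetical wrapper around `git ls-remote` with a configurable
// timeout and a clear failure message instead of a silent hang.
import { execFileSync } from 'node:child_process'

function listRemoteTags(repositoryUrl, workspace) {
  // Assumed env var; defaults to 2 minutes.
  const timeout = Number(process.env.SYNC_GIT_TIMEOUT_MS ?? 120000)
  let output
  try {
    output = execFileSync('git', ['ls-remote', '--tags', '--refs', repositoryUrl], {
      cwd: workspace,
      encoding: 'utf8',
      timeout
    })
  } catch (error) {
    // Timeouts (ETIMEDOUT) and non-zero exits both land here.
    console.error(`git ls-remote failed for ${repositoryUrl}: ${error.message}`)
    process.exitCode = 1
    throw error
  }
  return output.split('\n').filter(Boolean)
}
```

Keeping `encoding` and `cwd` intact while adding `timeout` means the surrounding parsing code needs no changes.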
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/scripts/sync-external-docs.mjs:
- Around line 190-199: readTrackedVersions currently calls JSON.parse on the
file at versionsPath without handling invalid JSON; wrap the read/parse in a
try-catch in readTrackedVersions to catch JSON parsing errors and either return
an empty array or rethrow a new, clearer Error that includes versionsPath and
the original error message; ensure parsed is still validated with
Array.isArray(parsed) before returning and reference the readTrackedVersions
function, versionsPath variable and parsed value when adding the error handling.
- Line 14: The temp directory created by mkdtempSync and assigned to tempRoot is
never removed; update the main() flow to remove tempRoot when finished (or on
error) by adding cleanup logic that deletes the directory (e.g., using fs.rmSync
or fs.rmdirSync with recursive:true) in a finally block or at the end of main();
ensure references to tempRoot are used for the removal and that cleanup runs
even on exceptions so temporary archives are not left behind.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 53e8d14b-43a9-44ab-a49b-b7a5b4d90c85
📒 Files selected for processing (5)
- .editorconfig
- .github/scripts/sync-external-docs.mjs
- .github/sync-external-docs-sources.json
- .github/workflows/sync-external-docs.yml
- package.json
✅ Files skipped from review due to trivial changes (4)
- .editorconfig
- package.json
- .github/sync-external-docs-sources.json
- .github/workflows/sync-external-docs.yml
@coderabbitai pause

✅ Actions performed: Reviews paused.
@coderabbitai resolve

✅ Actions performed: Comments resolved. Auto-approval is disabled; enable
Changes the external documentation update model from a mix of upstream-triggered and notification-based mechanisms to an exclusively pull-based approach.
Summary by CodeRabbit

- New Features
- Documentation
- Chores