Conversation
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the CodeRabbit settings.
Walkthrough
Adds a flat ESLint config and a CI lint workflow; updates Prettier plugins/ignores and lint scripts; stabilizes Svelte templates with keys and normalized event bindings; hardens the FileEditWidget save/get flow (Authorization header, async handling, notifications); applies C/C++ safety initializers, safer casts, static-analysis configs, and many minor UI/class fixes.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant UI as FileEditWidget (UI)
    participant Field as FieldRenderer
    participant API as Files API
    participant Notif as Notification System
    participant Store as Local Store
    UI->>Field: gather edited content & metadata
    UI->>API: POST /files/state (with Authorization header)
    API-->>UI: 200 + JSON / error
    alt success
        UI->>Store: update editableFile (size/time/contents)
        UI->>Notif: show success notification
    else error
        UI->>Notif: show error notification
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 1 | ❌ 2
❌ Failed checks (1 warning, 1 inconclusive)
✅ Passed checks (1 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Actionable comments posted: 4
🧹 Nitpick comments (1)
interface/src/lib/components/moonbase/EditRowWidget.svelte (1)
61-72: Avoid index-based keys in keyed `{#each}` blocks.

Lines 61 and 72 use `(i)` and `(j)` as keys, which can cause state mix-ups when array order changes. Replace with stable identifiers:
- Line 61: use `propertyN.name` instead of `(i)`, consistent with similar loops elsewhere in RowRenderer.svelte.
- Line 72: use `control.p` (with fallbacks to `control.name` or `j`) instead of `(j)`; the `.p` field is a unique identifier per control.

♻️ Suggested change

```diff
-{#each property.n as propertyN, i (i)}
+{#each property.n as propertyN (propertyN.name)}
@@
-  {#each dataEditable[propertyN.name] as control, j (j)}
+  {#each dataEditable[propertyN.name] as control (control.p ?? control.name ?? control.id)}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/lib/components/moonbase/EditRowWidget.svelte` around lines 61 - 72, The keyed {`#each`} blocks use unstable numeric indexes causing potential state mix-ups; update the first loop that iterates over property.n (variable propertyN) to key by propertyN.name instead of (i), and update the inner controls loop that iterates over dataEditable[propertyN.name] (variables control, j) to key by control.p with fallbacks to control.name or j if p is missing; adjust the {`#each` ... as ... (key)} expressions accordingly around the RowRenderer and controls iteration so Svelte uses stable identifiers (propertyN.name and control.p || control.name || j).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@interface/src/lib/components/moonbase/FileEditWidget.svelte`:
- Around line 100-106: The folder string is being built naively (folder = path +
'/') which turns '/' into '//' and breaks onSave() path composition; update the
folder normalization in FileEditWidget.svelte so you canonicalize trailing
slashes (e.g. if path === '/' set folder to '/' or to empty string depending on
how onSave() expects root, otherwise strip any trailing slashes and append a
single '/') and ensure the same normalization is applied when deriving folder
from parts (used with editableFile.name and onSave()) so root-level creates do
not produce '//' paths.
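A minimal sketch of the normalization this prompt describes, as a standalone helper (the name `normalizeFolder` and the root-as-`'/'` convention are assumptions for illustration, not the actual FileEditWidget code):

```javascript
// Hypothetical helper: canonicalize a folder path so root never becomes '//'.
// Assumes onSave() expects root as a single '/' — adjust if it expects ''.
function normalizeFolder(path) {
  if (!path || path === '/') return '/'; // root stays a single slash
  return path.replace(/\/+$/, '') + '/'; // strip trailing slashes, append one
}
```

With a helper like this, `folder + editableFile.name` yields `/file.txt` at root rather than `//file.txt`.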
In `@interface/src/routes/moonbase/filemanager/FileManager.svelte`:
- Line 284: Remove the stray trailing hyphen after the FileEditWidget component
invocation: locate the JSX/Svelte line containing FileEditWidget with props
newItem, path and isFile={editableFile.isFile} and delete the accidental "-"
character immediately following the component so it no longer renders a stray
hyphen in the UI.
In `@interface/src/routes/moonbase/module/Module.svelte`:
- Around line 129-142: The array-handling branch in updateRecursive treats every
element as an object and always calls updateRecursive(oldData[key][i],
newData[key][i]); change it to detect primitive elements
(string/number/boolean/null/undefined) and directly assign oldData[key][i] =
newData[key][i] when the new value is primitive or the types differ, otherwise
recurse for plain objects/arrays; ensure this logic lives inside the loop that
iterates over oldData[key] / newData[key] so primitive array items get replaced
instead of silently ignored.
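The per-element rule described above can be sketched as a simplified, standalone function (the real `updateRecursive` in Module.svelte carries more state; `updateArrayItem` is an illustrative name):

```javascript
// Assign primitives directly; only recurse when both sides are objects.
function updateArrayItem(oldArr, newArr, i) {
  const oldItem = oldArr[i];
  const newItem = newArr[i];
  const bothObjects =
    oldItem !== null && typeof oldItem === 'object' &&
    newItem !== null && typeof newItem === 'object';
  if (bothObjects) {
    Object.assign(oldItem, newItem); // stand-in for the recursive merge
  } else if (oldItem !== newItem) {
    oldArr[i] = newItem; // primitive or type flip: replace, don't recurse
  }
}
```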
In `@misc/upstream-prs/001-lint-fixes.md`:
- Around line 18-29: The example explanation has mismatched delays: the text
states an intended 2000 ms delay but the code block uses 5000 ms; update the
example so both the prose and code use the same delay (change the code's 5000 to
2000 or vice versa). Specifically, modify the setInterval example and its
accompanying "Fix" snippet to use the consistent delay value (reference the
setInterval call and the updateData usage) so the guidance is unambiguous.
---
Nitpick comments:
In `@interface/src/lib/components/moonbase/EditRowWidget.svelte`:
- Around line 61-72: The keyed {`#each`} blocks use unstable numeric indexes
causing potential state mix-ups; update the first loop that iterates over
property.n (variable propertyN) to key by propertyN.name instead of (i), and
update the inner controls loop that iterates over dataEditable[propertyN.name]
(variables control, j) to key by control.p with fallbacks to control.name or j
if p is missing; adjust the {`#each` ... as ... (key)} expressions accordingly
around the RowRenderer and controls iteration so Svelte uses stable identifiers
(propertyN.name and control.p || control.name || j).
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (20)
- CLAUDE.md
- interface/.eslintignore
- interface/.eslintrc.cjs
- interface/.prettierignore
- interface/.prettierrc
- interface/eslint.config.js
- interface/package.json
- interface/src/lib/components/moonbase/EditRowWidget.svelte
- interface/src/lib/components/moonbase/FieldRenderer.svelte
- interface/src/lib/components/moonbase/FileEditWidget.svelte
- interface/src/lib/components/moonbase/RowRenderer.svelte
- interface/src/lib/stores/moonbase_utilities.ts
- interface/src/lib/types/moonbase_models.ts
- interface/src/routes/moonbase/filemanager/+page.svelte
- interface/src/routes/moonbase/filemanager/+page.ts
- interface/src/routes/moonbase/filemanager/FileManager.svelte
- interface/src/routes/moonbase/module/+page.svelte
- interface/src/routes/moonbase/module/Module.svelte
- interface/src/routes/moonbase/monitor/monitor.ts
- misc/upstream-prs/001-lint-fixes.md
💤 Files with no reviewable changes (4)
- interface/.eslintignore
- interface/.eslintrc.cjs
- interface/src/routes/moonbase/module/+page.svelte
- interface/src/routes/moonbase/filemanager/+page.svelte
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
interface/src/lib/components/moonbase/FileEditWidget.svelte (1)
149-171: ⚠️ Potential issue | 🟠 Major: Don't close/reset the editor before save completion.

Line 169 fires `postFilesState(response)` without awaiting it, while Lines 170-171 immediately close the editor and clear dirty state. Failed saves can silently discard unsaved edits.

Suggested fix

```diff
-function onSave() {
+async function onSave() {
   console.log('onSave', editableFile.isFile);
   let valid = true;
@@
-  postFilesState(response);
-  showEditor = false;
-  changed = false;
+  const saved = await postFilesState(response);
+  if (saved !== null) {
+    showEditor = false;
+    changed = false;
+  }
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/lib/components/moonbase/FileEditWidget.svelte` around lines 149 - 171, The code closes the editor and clears the dirty flag immediately after calling postFilesState(response) without awaiting its completion; update the flow in the block that handles newItem vs update so you await postFilesState(response) (or its returned promise) and wrap it in a try/catch, only setting showEditor = false and changed = false after the awaited save succeeds, and on error keep the editor open and surface/log the error (e.g., using the existing logger or a user-visible error state), referencing postFilesState, showEditor, changed, and editableFile.
♻️ Duplicate comments (1)
interface/src/routes/moonbase/module/Module.svelte (1)
131-143: ⚠️ Potential issue | 🟠 Major: Handle type transitions safely in `updateRecursive` array updates.

Line 131 only normalizes when falsy, so a truthy non-array can make Line 132 compute `Math.max(undefined, ...)` and skip the loop. Also, Line 141 can recurse into a primitive old item when element types flip (primitive → object), which is runtime-unsafe.

💡 Proposed fix

```diff
 if (Array.isArray(newData[key])) {
   //loop over array
-  if (!oldData[key]) oldData[key] = []; //create an empty array
+  if (!Array.isArray(oldData[key])) oldData[key] = []; //normalize to array
   for (let i = 0; i < Math.max(oldData[key].length, newData[key].length); i++) {
     if (oldData[key][i] == undefined) {
       // console.log("add row", key, i, newData[key][i]);
       oldData[key][i] = newData[key][i]; //create new row if not existed, trigger reactiveness
     } else if (newData[key][i] == undefined) {
       // console.log("remove remaining rows", key, i, oldData[key][i]);
       oldData[key].splice(i);
     } else {
       // console.log("change row", key, i, oldData[key][i], newData[key][i]);
-      if (newData[key][i] !== null && typeof newData[key][i] === 'object') {
-        updateRecursive(oldData[key][i], newData[key][i]);
-      } else if (oldData[key][i] !== newData[key][i]) {
-        oldData[key][i] = newData[key][i];
-      }
+      const oldItem = oldData[key][i];
+      const newItem = newData[key][i];
+      const bothObjects =
+        oldItem !== null &&
+        typeof oldItem === 'object' &&
+        newItem !== null &&
+        typeof newItem === 'object';
+      if (bothObjects) {
+        updateRecursive(oldItem, newItem);
+      } else if (oldItem !== newItem) {
+        oldData[key][i] = newItem;
+      }
     }
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/moonbase/module/Module.svelte` around lines 131 - 143, In updateRecursive, defensive-harden the array branch: ensure oldData[key] is an array (replace non-array/truthy values with [] or with Array.from(newData[key]) as appropriate) before using Math.max on lengths; guard the loop by using (Array.isArray(oldData[key]) ? oldData[key].length : 0) and (Array.isArray(newData[key]) ? newData[key].length : 0); and before recursing with updateRecursive(oldData[key][i], newData[key][i]) verify oldData[key][i] is an object (if it’s a primitive or wrong type, replace it with {} or directly assign newData[key][i] when types differ) so you never recurse into a non-object. Ensure these changes are applied around the array handling for key in updateRecursive to safely handle type transitions.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@interface/src/lib/components/moonbase/FileEditWidget.svelte`:
- Around line 223-224: The cancel handler currently logs the wrong label
("Save") before calling onCancel(); update the console.log in the cancel flow to
a correct message like "Cancel" (or remove the log entirely) so the output
matches the action; locate the cancel button/handler in FileEditWidget.svelte
and change the console.log('Save') to console.log('Cancel') (or drop it)
adjacent to the onCancel() invocation.
- Around line 79-90: The save flag `changed` is being set immediately after
`reader.readAsText(file)` which can enable saving before `reader.onload` assigns
`editableFile.contents`; move the `changed = true` assignment into the
`reader.onload` callback after `editableFile.contents` is set and after the
`showEditor = false; await tick(); showEditor = true;` sequence completes so the
UI state and contents are fully updated before allowing saves; update references
in the `reader.onload` handler (reader.onload, editableFile.contents,
showEditor, changed) accordingly.
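The intended ordering can be sketched with a stub standing in for `FileReader` (names and state shape are illustrative; the real handler also toggles `showEditor` around `tick()`):

```javascript
// Stub for FileReader.readAsText: delivers contents via an onload callback.
function readFileAsText(fileText, onload) {
  onload(fileText);
}

const state = { contents: '', changed: false };

function uploadFile(fileText) {
  readFileAsText(fileText, (result) => {
    state.contents = result; // contents land first...
    state.changed = true;    // ...then the Save flag flips, never before
  });
}
```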
In `@interface/src/routes/moonbase/module/Module.svelte`:
- Around line 187-197: Replace the invalid Tailwind utility "lex-shrink-0" with
the correct "shrink-0" on the icon component class attributes: update the
<Router ... class="lex-shrink-0 mr-2 h-6 w-6 self-end" /> and the <Help ...
class="lex-shrink-0 mr-2 h-6 w-6 self-end" /> occurrences so both use "shrink-0
mr-2 h-6 w-6 self-end" instead.
---
Outside diff comments:
In `@interface/src/lib/components/moonbase/FileEditWidget.svelte`:
- Around line 149-171: The code closes the editor and clears the dirty flag
immediately after calling postFilesState(response) without awaiting its
completion; update the flow in the block that handles newItem vs update so you
await postFilesState(response) (or its returned promise) and wrap it in a
try/catch, only setting showEditor = false and changed = false after the awaited
save succeeds, and on error keep the editor open and surface/log the error
(e.g., using the existing logger or a user-visible error state), referencing
postFilesState, showEditor, changed, and editableFile.
---
Duplicate comments:
In `@interface/src/routes/moonbase/module/Module.svelte`:
- Around line 131-143: In updateRecursive, defensive-harden the array branch:
ensure oldData[key] is an array (replace non-array/truthy values with [] or with
Array.from(newData[key]) as appropriate) before using Math.max on lengths; guard
the loop by using (Array.isArray(oldData[key]) ? oldData[key].length : 0) and
(Array.isArray(newData[key]) ? newData[key].length : 0); and before recursing
with updateRecursive(oldData[key][i], newData[key][i]) verify oldData[key][i] is
an object (if it’s a primitive or wrong type, replace it with {} or directly
assign newData[key][i] when types differ) so you never recurse into a
non-object. Ensure these changes are applied around the array handling for key
in updateRecursive to safely handle type transitions.
📒 Files selected for processing (6)
- interface/src/lib/components/moonbase/EditRowWidget.svelte
- interface/src/lib/components/moonbase/FileEditWidget.svelte
- interface/src/routes/moonbase/filemanager/FileManager.svelte
- interface/src/routes/moonbase/module/Module.svelte
- lib/framework/WWWData.h
- misc/upstream-prs/001-lint-fixes.md
🚧 Files skipped from review as they are similar to previous changes (3)
- interface/src/routes/moonbase/filemanager/FileManager.svelte
- interface/src/lib/components/moonbase/EditRowWidget.svelte
- misc/upstream-prs/001-lint-fixes.md
fix typo lex-shrink-0 -> shrink-0; Module.svelte: updateRecursive tweaks; pio.ini: add cppcheck
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
interface/src/routes/system/metrics/BatteryMetrics.svelte (1)
138-140: ⚠️ Potential issue | 🔴 Critical: Critical bug: `setInterval` missing delay argument.

The `5000` is inside the callback as part of a comma expression, not the interval delay. This causes the interval to fire continuously (~4 ms) instead of every 5 seconds, potentially freezing the browser.

🐛 Proposed fix

```diff
 setInterval(() => {
-  updateData(), 5000;
-});
+  updateData();
+}, 5000);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/system/metrics/BatteryMetrics.svelte` around lines 138 - 140, The setInterval call in BatteryMetrics.svelte is incorrect: the delay 5000 is placed inside the callback via a comma expression so the interval fires at the default ~4ms; fix by moving 5000 out of the callback and passing it as the second argument to setInterval so it reads setInterval(() => updateData(), 5000); keep the existing updateData reference and (optionally) capture the returned interval id so you can clear it on component destroy if needed.
♻️ Duplicate comments (1)
interface/src/lib/components/moonbase/FileEditWidget.svelte (1)
214-217: ⚠️ Potential issue | 🟠 Major: Don't enable Save before file contents finish loading.

Line 216 sets `changed = true` immediately after triggering the async file read, so Save can be enabled before `editableFile.contents` is populated.

💡 Suggested fix

```diff
 <FieldRenderer
   property={{ name: 'Upload', type: 'file' }}
   bind:value={editableFile.contents}
   onChange={(event) => {
     uploadFile(event);
-    changed = true;
   }}
 ></FieldRenderer>
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/lib/components/moonbase/FileEditWidget.svelte` around lines 214 - 217, The Save flag is flipped too early: do not set changed = true immediately in the onChange handler; instead set changed only after the async file read completes and editableFile.contents is populated. Update the onChange handler to await or handle the Promise returned by uploadFile (or move the changed assignment into the uploadFile completion callback) so changed is set after uploadFile has finished and editableFile.contents is assigned (references: uploadFile, changed, editableFile.contents, FileEditWidget.svelte).
🧹 Nitpick comments (2)
interface/src/routes/connections/ntp/NTP.svelte (1)
162: Class fix is correct; consider defensive coding for the URL construction.

The `shrink-0` class fix is good. Optionally, consider guarding against `page.data.github` being undefined or malformed to prevent runtime errors:

```svelte
{#if page.data.github}
  <a href="https://{page.data.github.split('/')[0]}.github.io/{page.data.github.split('/')[1]}/network/ntp" ...>
{/if}
```

Or extract the split result to avoid calling it twice.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/connections/ntp/NTP.svelte` at line 162, The anchor's href uses page.data.github.split(...) directly which can throw if page.data.github is undefined or malformed; in NTP.svelte locate the anchor that renders the Help component and guard the value by extracting page.data.github into a local variable (e.g., github = page?.data?.github || '') and compute parts once (split result or defaults) before building the URL, then conditionally render the <a> only when the parts are valid (or fall back to a safe URL) to avoid runtime errors from calling split twice or on undefined.
interface/src/routes/system/update/GithubFirmwareManager.svelte (1)
130: Consider adding a key to the `{#each}` loop.

The PR summary mentions adding Svelte keyed iterations. This loop over `githubReleases` could benefit from a key (e.g., `release.id` or `release.tag_name`) for optimal DOM reconciliation when releases are updated or reordered.

♻️ Suggested change

```diff
-{#each githubReleases as release}
+{#each githubReleases as release (release.id)}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/system/update/GithubFirmwareManager.svelte` at line 130, Add a Svelte key to the {`#each` githubReleases as release} loop to improve DOM reconciliation when items change; update the loop to use a stable unique identifier from each release (e.g., release.id or release.tag_name) so Svelte can track items correctly—look for the githubReleases iteration in GithubFirmwareManager.svelte and change the loop to a keyed form using release.id or release.tag_name as the key.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@interface/src/routes/moonbase/module/Module.svelte`:
- Around line 155-159: The current branch in updateRecursive replaces non-array
objects wholesale (oldData[key] = newData[key]), which drops sibling fields on
partial patches; instead, detect plain objects and recursively merge: when
encountering an object at newData[key] and oldData[key] is also an object, call
the recursive updater (e.g., updateRecursive(oldData[key], newData[key]) or loop
over newData[key] keys and assign/merge per-key) rather than assigning the whole
object; keep the direct assignment only for arrays or non-object primitives and
ensure compareRecursive/updateRecursive are used consistently to preserve
sibling fields.
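A minimal sketch of merge-instead-of-replace under the rule described above (hypothetical `deepMerge`; the component's own recursive updater differs in detail):

```javascript
// Recurse into plain objects so sibling fields survive partial patches;
// arrays and primitives are still assigned wholesale.
function deepMerge(oldData, newData) {
  for (const key of Object.keys(newData)) {
    const oldVal = oldData[key];
    const newVal = newData[key];
    const bothPlainObjects =
      oldVal !== null && typeof oldVal === 'object' && !Array.isArray(oldVal) &&
      newVal !== null && typeof newVal === 'object' && !Array.isArray(newVal);
    if (bothPlainObjects) {
      deepMerge(oldVal, newVal); // preserve untouched sibling keys
    } else {
      oldData[key] = newVal; // arrays and primitives replace directly
    }
  }
  return oldData;
}
```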
In `@interface/src/routes/system/update/GithubFirmwareManager.svelte`:
- Line 107: The anchor that renders the documentation link (the <a ...
target="_blank" ...> using page.data.github.split and page.url.pathname around
the Help component) lacks rel="noopener noreferrer"; update that <a> element to
include rel="noopener noreferrer" to prevent tabnabbing and match the other
links in this component.
---
Outside diff comments:
In `@interface/src/routes/system/metrics/BatteryMetrics.svelte`:
- Around line 138-140: The setInterval call in BatteryMetrics.svelte is
incorrect: the delay 5000 is placed inside the callback via a comma expression
so the interval fires at the default ~4ms; fix by moving 5000 out of the
callback and passing it as the second argument to setInterval so it reads
setInterval(() => updateData(), 5000); keep the existing updateData reference
and (optionally) capture the returned interval id so you can clear it on
component destroy if needed.
---
Duplicate comments:
In `@interface/src/lib/components/moonbase/FileEditWidget.svelte`:
- Around line 214-217: The Save flag is flipped too early: do not set changed =
true immediately in the onChange handler; instead set changed only after the
async file read completes and editableFile.contents is populated. Update the
onChange handler to await or handle the Promise returned by uploadFile (or move
the changed assignment into the uploadFile completion callback) so changed is
set after uploadFile has finished and editableFile.contents is assigned
(references: uploadFile, changed, editableFile.contents, FileEditWidget.svelte).
---
Nitpick comments:
In `@interface/src/routes/connections/ntp/NTP.svelte`:
- Line 162: The anchor's href uses page.data.github.split(...) directly which
can throw if page.data.github is undefined or malformed; in NTP.svelte locate
the anchor that renders the Help component and guard the value by extracting
page.data.github into a local variable (e.g., github = page?.data?.github || '')
and compute parts once (split result or defaults) before building the URL, then
conditionally render the <a> only when the parts are valid (or fall back to a
safe URL) to avoid runtime errors from calling split twice or on undefined.
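The guard could be a small pure helper along these lines (`buildDocsUrl` is a hypothetical name; the component inlines the expression instead):

```javascript
// Returns a docs URL only when the "owner/repo" value is well-formed,
// otherwise null so the caller can skip rendering the link.
function buildDocsUrl(github, section) {
  const parts = (github ?? '').split('/');
  if (parts.length < 2 || !parts[0] || !parts[1]) return null;
  return `https://${parts[0]}.github.io/${parts[1]}/${section}`;
}
```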
In `@interface/src/routes/system/update/GithubFirmwareManager.svelte`:
- Line 130: Add a Svelte key to the {`#each` githubReleases as release} loop to
improve DOM reconciliation when items change; update the loop to use a stable
unique identifier from each release (e.g., release.id or release.tag_name) so
Svelte can track items correctly—look for the githubReleases iteration in
GithubFirmwareManager.svelte and change the loop to a keyed form using
release.id or release.tag_name as the key.
📒 Files selected for processing (23)
- docs/components.md
- interface/src/lib/components/Collapsible.svelte
- interface/src/lib/components/SettingsCard.svelte
- interface/src/lib/components/moonbase/FileEditWidget.svelte
- interface/src/lib/components/moonbase/RowRenderer.svelte
- interface/src/routes/connections/mqtt/MQTT.svelte
- interface/src/routes/connections/mqtt/MQTTConfig.svelte
- interface/src/routes/connections/ntp/NTP.svelte
- interface/src/routes/ethernet/Ethernet.svelte
- interface/src/routes/moonbase/filemanager/FileManager.svelte
- interface/src/routes/moonbase/module/Module.svelte
- interface/src/routes/moonbase/monitor/Monitor.svelte
- interface/src/routes/system/coredump/+page.svelte
- interface/src/routes/system/metrics/BatteryMetrics.svelte
- interface/src/routes/system/metrics/SystemMetrics.svelte
- interface/src/routes/system/status/SystemStatus.svelte
- interface/src/routes/system/update/GithubFirmwareManager.svelte
- interface/src/routes/system/update/UploadFirmware.svelte
- interface/src/routes/user/+page.svelte
- interface/src/routes/wifi/ap/Accesspoint.svelte
- interface/src/routes/wifi/sta/Wifi.svelte
- lib/framework/WWWData.h
- platformio.ini
✅ Files skipped from review due to trivial changes (2)
- docs/components.md
- interface/src/routes/user/+page.svelte
Actionable comments posted: 3
🧹 Nitpick comments (2)
interface/src/routes/connections/ntp/NTP.svelte (1)
162: Add an explicit accessible name to the icon-only docs link.

`title` is not a reliable accessible name for all assistive tech. Please add `aria-label` on the `<a>` element.

♿ Suggested patch

```diff
-<div class="absolute right-5"><a href="https://{page.data.github.split("/")[0]}.github.io/{page.data.github.split("/")[1]}/network/ntp" target="_blank" rel="noopener noreferrer" title="Documentation"><Help class="shrink-0 mr-2 h-6 w-6 self-end" /></a></div> <!-- 🌙 link to docs - {page.url.pathname} hardcoded -->
+<div class="absolute right-5"><a href="https://{page.data.github.split("/")[0]}.github.io/{page.data.github.split("/")[1]}/network/ntp" target="_blank" rel="noopener noreferrer" title="Documentation" aria-label="Open NTP documentation"><Help class="shrink-0 mr-2 h-6 w-6 self-end" /></a></div> <!-- 🌙 link to docs - {page.url.pathname} hardcoded -->
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/connections/ntp/NTP.svelte` at line 162, The docs link anchor in NTP.svelte (the <a> wrapping the Help icon) lacks an explicit accessible name; add an aria-label attribute on that anchor (e.g., aria-label="Open NTP documentation (opens in new tab)" or similar descriptive text) so assistive technologies get a reliable name for the icon-only link while keeping existing target and rel attributes and the Help component intact.
interface/src/routes/system/coredump/+page.svelte (1)
76: Migrate to Svelte 5 event syntax for consistency with modern conventions.

Svelte 5 uses `onclick` instead of the legacy `on:click` directive. While `on:click` still works, update to `onclick` to align with Svelte 5 conventions.

Suggested change

```diff
-<button class="btn btn-primary mt-4" on:click={downloadCoreDump}>
+<button class="btn btn-primary mt-4" onclick={downloadCoreDump}>
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/system/coredump/`+page.svelte at line 76, The button uses legacy Svelte event directive on:click; update it to Svelte 5 syntax by replacing on:click={downloadCoreDump} with onclick={downloadCoreDump} on the button element so the existing downloadCoreDump handler is invoked using the modern attribute-style event binding in +page.svelte.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@interface/src/lib/components/moonbase/FileEditWidget.svelte`:
- Around line 94-123: getFileContents is asynchronous and can have out-of-order
responses that overwrite editableFile.contents for a newer path; fix by
capturing the current path (or a request id) at the start of getFileContents
(e.g. const current = path) and, before assigning editableFile.contents (and any
other path-dependent fields), verify that current === path (or the request id
matches the latest token); update getFileContents and the $effect trigger to use
this snapshot/token check so only the latest load writes to editableFile.
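The token check splits naturally into two small functions (illustrative names; the Svelte component would keep the token in component scope and call these from `getFileContents`):

```javascript
// Monotonic token: each new request invalidates responses from older ones.
let latestToken = 0;

function beginRequest() {
  return ++latestToken; // snapshot taken when the load starts
}

// Apply a response only if no newer request has started since.
function applyIfCurrent(token, state, contents) {
  if (token !== latestToken) return false; // stale response: drop it
  state.contents = contents;
  return true;
}
```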
In `@interface/src/routes/moonbase/filemanager/FileManager.svelte`:
- Line 223: The JSON.parse(bc) call can throw on corrupted localStorage; wrap
the parse in a try/catch around the existing if (bc) branch so that on parse
error you log or warn, reset breadCrumbs to a safe default (e.g., [] or existing
default state) and optionally remove the corrupted localStorage entry;
specifically modify the block that references breadCrumbs and JSON.parse(bc) to
try { breadCrumbs = JSON.parse(bc) } catch (err) { console.warn('Invalid
breadcrumb JSON', err); breadCrumbs = [] ; localStorage.removeItem(/* the
breadcrumb key variable or string */) }.
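Extracted as a pure function, the hardened parse might look like this (hypothetical `parseBreadcrumbs`; the `localStorage` read and removal stay in the caller):

```javascript
// Returns a safe default instead of throwing on corrupted JSON.
function parseBreadcrumbs(raw) {
  if (!raw) return [];
  try {
    const parsed = JSON.parse(raw);
    return Array.isArray(parsed) ? parsed : [];
  } catch (err) {
    console.warn('Invalid breadcrumb JSON', err);
    return []; // caller may also remove the corrupted localStorage entry
  }
}
```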
In `@interface/src/routes/moonbase/module/Module.svelte`:
- Around line 132-139: The loop that syncs arrays in Module.svelte uses
oldData[key].splice(i) to truncate remaining entries but continues iterating to
Math.max(...), which lets later iterations recreate undefined slots; after
removing the tail you should stop iterating. Update the loop that touches
oldData and newData for a given key so that when newData[key][i] == undefined
and you call oldData[key].splice(i) you immediately break out of the for-loop
(or otherwise update the loop bound to use the new shorter length), ensuring no
further iterations re-add undefined elements to oldData[key].
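The truncate-and-stop behavior in isolation (a simplified sketch; the real loop also distinguishes the add and change branches):

```javascript
// Copy rows until newArr runs out, then drop the old tail and stop:
// continuing past the splice would re-add undefined slots.
function syncArray(oldArr, newArr) {
  for (let i = 0; i < Math.max(oldArr.length, newArr.length); i++) {
    if (newArr[i] === undefined) {
      oldArr.splice(i); // remove remaining old rows
      break;            // the loop bound is stale now, so bail out
    }
    oldArr[i] = newArr[i];
  }
  return oldArr;
}
```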
---
Nitpick comments:
In `@interface/src/routes/connections/ntp/NTP.svelte`:
- Line 162: The docs link anchor in NTP.svelte (the <a> wrapping the Help icon)
lacks an explicit accessible name; add an aria-label attribute on that anchor
(e.g., aria-label="Open NTP documentation (opens in new tab)" or similar
descriptive text) so assistive technologies get a reliable name for the
icon-only link while keeping existing target and rel attributes and the Help
component intact.
In `@interface/src/routes/system/coredump/`+page.svelte:
- Line 76: The button uses legacy Svelte event directive on:click; update it to
Svelte 5 syntax by replacing on:click={downloadCoreDump} with
onclick={downloadCoreDump} on the button element so the existing
downloadCoreDump handler is invoked using the modern attribute-style event
binding in +page.svelte.
📒 Files selected for processing (14)
- interface/src/lib/components/moonbase/FileEditWidget.svelte
- interface/src/routes/connections/mqtt/MQTT.svelte
- interface/src/routes/connections/ntp/NTP.svelte
- interface/src/routes/moonbase/filemanager/FileManager.svelte
- interface/src/routes/moonbase/module/Module.svelte
- interface/src/routes/system/coredump/+page.svelte
- interface/src/routes/system/metrics/BatteryMetrics.svelte
- interface/src/routes/system/metrics/SystemMetrics.svelte
- interface/src/routes/system/status/SystemStatus.svelte
- interface/src/routes/system/update/GithubFirmwareManager.svelte
- interface/src/routes/user/+page.svelte
- interface/src/routes/wifi/ap/Accesspoint.svelte
- interface/src/routes/wifi/sta/Wifi.svelte
- lib/framework/WWWData.h
🚧 Files skipped from review as they are similar to previous changes (5)
- interface/src/routes/connections/mqtt/MQTT.svelte
- interface/src/routes/user/+page.svelte
- interface/src/routes/wifi/sta/Wifi.svelte
- interface/src/routes/system/status/SystemStatus.svelte
- interface/src/routes/system/metrics/BatteryMetrics.svelte
frontend
=======
- general: on:X to onX
- FileEdit: getFileContents: fix path
- File manager: fix breadCrumbs
- Module: fix updateRecursive
- SystemMetrics: fix setInterval, updateData

backend
=======
- pio.ini: cppcheck tweaks
- init variables
- WowiMoveEffect: fix range slider
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
src/MoonBase/pal_espidf.cpp (1)
139-154: ⚠️ Potential issue | 🟡 Minor

Add the `override` keyword to the `recv_from` method declaration.

The `recv_from` method overrides a pure virtual function from the `UdpSocket` base class but is missing the `override` specifier. All other virtual methods in this class (`open`, `send_to`, `close`, and the destructor) correctly use `override`. Add it for consistency and compile-time safety.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/MoonBase/pal_espidf.cpp` around lines 139 - 154, The recv_from method in this class overrides the pure virtual from UdpSocket but lacks the override specifier; update the recv_from declaration (the function named recv_from that currently returns int and takes uint8_t*, size_t, char*, uint16_t*) to append the override keyword (matching how open, send_to, close and the destructor are declared) so the compiler enforces correct overriding and catches signature mismatches.

interface/src/routes/system/metrics/SystemMetrics.svelte (1)
175-177: ⚠️ Potential issue | 🟠 Major

Initialize PSRAM chart when data appears after mount.

At Line [176], chart creation is one-time in onMount. If PSRAM data becomes available later, Line [396] can run before psramChart exists and throw.

💡 Proposed fix

+ function initPsramChart() {
+   if (!psramChartElement || psramChart) return;
+   psramChart = new Chart(psramChartElement, {
+     // keep the existing PSRAM chart config unchanged
+   });
+ }
+
  onMount(() => {
-   if (hasPsramData) {
-     psramChart = new Chart(psramChartElement, {
-       // ...
-     });
-   }
+   if (hasPsramData) initPsramChart();
    // ...
  });
+
+ $effect(() => {
+   if (hasPsramData) initPsramChart();
+ });

- if (hasPsramData) {
+ if (hasPsramData && psramChart) {
    psramChart.data.labels = $analytics.uptime;
    psramChart.data.datasets[0].data = $analytics.used_psram;
    psramChart.update('none');
    // ...
  }

Also applies to: 396-403
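The guard at the heart of this fix can be exercised on its own. The following is a minimal sketch of the lazy-init pattern, with a hypothetical createChart factory standing in for the real Chart.js constructor; variable names are illustrative, not the component's:

```typescript
// Sketch of the lazy-init guard, assuming a hypothetical `createChart`
// factory in place of the real Chart.js constructor.
let chart: { destroyed: boolean } | null = null;
let creations = 0;

function createChart(): { destroyed: boolean } {
  creations++; // count constructions to prove idempotence
  return { destroyed: false };
}

function ensureChart(hasData: boolean): void {
  if (!hasData || chart) return; // no data yet, or chart already built
  chart = createChart();         // built exactly once, whenever data first appears
}

// Safe to call from both the mount path and a later data-arrival path:
ensureChart(false); // data not ready yet, nothing happens
ensureChart(true);  // first data, chart is created
ensureChart(true);  // later ticks, no duplicate chart
```

Because the guard is idempotent, it can be wired into both onMount and a reactive effect without risking duplicate Chart instances.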
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/system/metrics/SystemMetrics.svelte` around lines 175 - 177, The PSRAM Chart is only created inside the onMount block when hasPsramData is true, but updates later (code around lines 396-403) assume psramChart exists and may throw if data appears after mount; change the update path to lazily initialize psramChart when new PSRAM data arrives: in the code that runs when PSRAM data or hasPsramData changes, check if psramChart is falsy and if so create it using psramChartElement (same options used in the original onMount creation), otherwise update the existing psramChart; ensure the creation uses the same constructor/options and reference to psramChartElement so both initial mount and later data paths produce the chart.

interface/src/routes/moonbase/module/Module.svelte (1)
173-176: ⚠️ Potential issue | 🟠 Major

Do not delete keys missing from incremental patches.

Line [173]-Line [176] removes fields not present in newData. For patch-style updates, this can drop sibling state that was intentionally omitted from the payload. Based on learnings: In MoonModules/MoonLight Module::update() + compareRecursive, passing a partial object acts as a patch — only provided keys are visited/updated and missing siblings are preserved.

💡 Proposed fix

- //remove properties that are not in newData (e.g. control min and max)
- for (let key in oldData) {
-   if (newData[key] == null) {
-     delete oldData[key]; //remove property if not in newData
-   }
- }
+ // Keep missing keys: incoming websocket/module updates are partial patches.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/moonbase/module/Module.svelte` around lines 173 - 176, The loop in compareRecursive (inside Module.svelte) currently deletes properties from oldData when newData[key] is null, which removes sibling state on partial/patch updates; instead, change the logic to preserve keys missing from the patch by not deleting properties when the key is absent from newData: only process keys that exist in newData (use Object.prototype.hasOwnProperty.call(newData, key) or key in newData) and, if you still need to support explicit removal, treat an explicit sentinel (e.g., newData[key] === DELETE_SENTINEL or newData.hasOwnProperty(key) && newData[key] === null) for deletion—remove the delete oldData[key] path and ensure compareRecursive and Module.update only update or recurse into provided keys.
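The patch semantics described above can be shown with a small standalone merge. This is an illustrative sketch, not the component's actual updateRecursive: applyPatch is a hypothetical name, omitted keys survive, and only an explicit null acts as a delete sentinel:

```typescript
// Illustrative patch merge: keys omitted from the patch are preserved,
// and only an explicit `null` acts as a delete sentinel.
type Obj = Record<string, unknown>;

function applyPatch(target: Obj, patch: Obj): void {
  for (const key of Object.keys(patch)) {
    const value = patch[key];
    const current = target[key];
    if (value === null) {
      delete target[key]; // explicit null removes the property
    } else if (
      typeof value === 'object' && !Array.isArray(value) &&
      typeof current === 'object' && current !== null && !Array.isArray(current)
    ) {
      applyPatch(current as Obj, value as Obj); // recurse into nested objects
    } else {
      target[key] = value; // primitives and arrays are overwritten
    }
  }
}

const control: Obj = { name: 'speed', min: 0, max: 255 };
applyPatch(control, { max: 128 }); // partial patch: `min` and `name` survive
```

The key difference from the buggy cleanup pass is that deletion is driven by keys present in the patch, never by keys absent from it.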
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@interface/src/routes/system/metrics/SystemMetrics.svelte`:
- Around line 374-376: The onMount block in SystemMetrics.svelte starts a
polling interval with setInterval calling updateData() but does not return a
cleanup function, causing intervals to accumulate on remount and Chart contexts
to leak; update the onMount handler that creates the interval and the Chart
instance (look for onMount, setInterval, updateData and the chart variable/Chart
constructor usage) to store the interval id and chart reference and return a
cleanup function that calls clearInterval(intervalId) and, if present, calls
chart.destroy() (or the Chart instance's appropriate teardown method) to
properly remove listeners and free resources.
---
Outside diff comments:
In `@interface/src/routes/moonbase/module/Module.svelte`:
- Around line 173-176: The loop in compareRecursive (inside Module.svelte)
currently deletes properties from oldData when newData[key] is null, which
removes sibling state on partial/patch updates; instead, change the logic to
preserve keys missing from the patch by not deleting properties when the key is
absent from newData: only process keys that exist in newData (use
Object.prototype.hasOwnProperty.call(newData, key) or key in newData) and, if
you still need to support explicit removal, treat an explicit sentinel (e.g.,
newData[key] === DELETE_SENTINEL or newData.hasOwnProperty(key) && newData[key]
=== null) for deletion—remove the delete oldData[key] path and ensure
compareRecursive and Module.update only update or recurse into provided keys.
In `@interface/src/routes/system/metrics/SystemMetrics.svelte`:
- Around line 175-177: The PSRAM Chart is only created inside the onMount block
when hasPsramData is true, but updates later (code around lines 396-403) assume
psramChart exists and may throw if data appears after mount; change the update
path to lazily initialize psramChart when new PSRAM data arrives: in the code
that runs when PSRAM data or hasPsramData changes, check if psramChart is falsy
and if so create it using psramChartElement (same options used in the original
onMount creation), otherwise update the existing psramChart; ensure the creation
uses the same constructor/options and reference to psramChartElement so both
initial mount and later data paths produce the chart.
In `@src/MoonBase/pal_espidf.cpp`:
- Around line 139-154: The recv_from method in this class overrides the pure
virtual from UdpSocket but lacks the override specifier; update the recv_from
declaration (the function named recv_from that currently returns int and takes
uint8_t*, size_t, char*, uint16_t*) to append the override keyword (matching how
open, send_to, close and the destructor are declared) so the compiler enforces
correct overriding and catches signature mismatches.
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (14)
- interface/src/lib/components/moonbase/FieldRenderer.svelte
- interface/src/lib/components/moonbase/FileEditWidget.svelte
- interface/src/routes/moonbase/filemanager/FileManager.svelte
- interface/src/routes/moonbase/module/Module.svelte
- interface/src/routes/system/coredump/+page.svelte
- interface/src/routes/system/metrics/SystemMetrics.svelte
- lib/framework/WWWData.h
- platformio.ini
- src/MoonBase/NodeManager.h
- src/MoonBase/pal_espidf.cpp
- src/MoonLight/Layers/PhysicalLayer.cpp
- src/MoonLight/Layers/PhysicalLayer.h
- src/MoonLight/Layers/VirtualLayer.h
- src/MoonLight/Nodes/Effects/E_MovingHeads.h
CI
==
- add .clang-tidy
- add lint.yml frontend and backend

frontend
=======
- remove nothing from :then nothing
- battery metrics: rename heapChartElement to batteryChartElement, cleanup poller
- System metrics: cleanup poller, use initPsramChart and add hasPsramData $effect
@coderabbitai resume

✅ Actions performed

Reviews resumed.

@coderabbitai, please review 6ee4536
Tip: For best results, initiate chat on the files or code changes.

🧩 Analysis chain

🏁 Script executed: Length of output: 19539
✅ Highlights

1. CI/CD Infrastructure
@coderabbitai, regarding
where should I add this?
Tip: For best results, initiate chat on the files or code changes.

Option 1: Using Husky (Recommended for Node.js projects)

Husky is the standard tool for managing git hooks in Node.js projects. Here's how to set it up:
cd interface
npm install --save-dev husky
npx husky init
npx husky add .husky/pre-commit "cd interface && npm run lint"
chmod +x .husky/pre-commit
{
"scripts": {
"prepare": "cd .. && husky install interface/.husky"
}
}

The pre-commit hook:

#!/usr/bin/env sh
. "$(dirname -- "$0")/_/husky.sh"
cd interface && npm run lint

Option 2: Manual Git Hook (Simpler, but not committed)

If you want a simpler approach without dependencies, create the hook manually:
#!/bin/sh
cd interface && npm run lint
if [ $? -ne 0 ]; then
echo "❌ Linting failed. Please fix errors before committing."
exit 1
fi
chmod +x .git/hooks/pre-commit
My Recommendation

Use Option 1 (Husky) because:
With husky, every developer on your team will automatically have the pre-commit checks running! 🎯
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
interface/src/lib/components/moonbase/FieldRenderer.svelte (1)
375-395: ⚠️ Potential issue | 🟡 Minor

Clear hover/click timeouts on destroy to prevent post-unmount callbacks.

Pad interactions create hoverTimeout/clickTimeout, but teardown currently only clears interval and the slider throttle timer.

💡 Proposed fix

  onDestroy(() => {
    clearInterval(interval);
+   if (hoverTimeout) clearTimeout(hoverTimeout);
+   if (clickTimeout) clearTimeout(clickTimeout);
    if (throttleTimer) {
      clearTimeout(throttleTimer);
      // Send final pending update if any
      if (pendingSliderEvent) {
        onChange(pendingSliderEvent);
      }
    }
  });
Verify each finding against the current code and only fix it if needed. In `@interface/src/lib/components/moonbase/FieldRenderer.svelte` around lines 375 - 395, The component teardown currently clears only interval and the slider throttle timer but not the pad interaction timers; update the cleanup (onDestroy/cleanup block in FieldRenderer.svelte) to also clear hoverTimeout and clickTimeout using clearTimeout to prevent callbacks after unmount. Locate variables named hoverTimeout and clickTimeout (and the existing interval/slider throttle clear code) and add clearTimeout(hoverTimeout) and clearTimeout(clickTimeout) (guarded if needed) to that same teardown sequence so all timers are canceled on destroy.
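The teardown idea generalizes to collecting every handle in one place so nothing is forgotten. A hedged sketch follows; the registry and names below are illustrative, not the component's actual variables:

```typescript
// Illustrative timer registry: every setTimeout handle is recorded so a
// single teardown call can cancel hover, click, and throttle timers alike.
const pending = new Set<ReturnType<typeof setTimeout>>();
let fired = 0;

function schedule(fn: () => void, ms: number): void {
  const id = setTimeout(() => {
    pending.delete(id); // a timer that fires removes itself from the registry
    fired++;
    fn();
  }, ms);
  pending.add(id);
}

function teardown(): void {
  for (const id of pending) clearTimeout(id); // nothing runs after unmount
  pending.clear();
}

schedule(() => {}, 1000); // e.g. a hover timeout
schedule(() => {}, 2000); // e.g. a click timeout
teardown();               // onDestroy: both are cancelled, fired stays 0
```

A registry like this makes it impossible to add a new timer and forget its matching clearTimeout in onDestroy.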
♻️ Duplicate comments (1)
interface/src/routes/moonbase/module/Module.svelte (1)
155-175: ⚠️ Potential issue | 🟠 Major

Partial-patch behavior is still broken by the cleanup pass.

The merge at Line [155] preserves nested objects, but Line [173] deletes any key omitted from newData. With partial updates, this drops sibling fields that should remain.

💡 Proposed fix

-function updateRecursive(oldData: any, newData: any) {
+function updateRecursive(oldData: any, newData: any, pruneMissing = false) {
   //loop over properties
   for (let key in newData) {
     if (Array.isArray(newData[key])) {
       if (!Array.isArray(oldData[key])) oldData[key] = [];
       for (let i = 0; i < newData[key].length; i++) {
         if (oldData[key][i] == undefined) {
           oldData[key][i] = newData[key][i];
         } else {
           const oldItem = oldData[key][i];
           const newItem = newData[key][i];
           const bothObjects =
             oldItem !== null && typeof oldItem === 'object' &&
             newItem !== null && typeof newItem === 'object';
           if (bothObjects) {
-            updateRecursive(oldItem, newItem);
+            updateRecursive(oldItem, newItem, pruneMissing);
           } else if (oldItem !== newItem) {
             oldData[key][i] = newItem;
           }
         }
       }
       if (oldData[key].length > newData[key].length) {
         oldData[key].splice(newData[key].length);
       }
     } else if (newData[key] !== null && typeof newData[key] === 'object') {
       if (
         oldData[key] === null ||
         typeof oldData[key] !== 'object' ||
         Array.isArray(oldData[key])
       ) {
         oldData[key] = {};
       }
-      updateRecursive(oldData[key], newData[key]);
+      updateRecursive(oldData[key], newData[key], pruneMissing);
     } else if (newData[key] !== oldData[key]) {
       oldData[key] = newData[key];
     }
   }
-  for (let key in oldData) {
-    if (newData[key] == null) {
-      delete oldData[key];
-    }
-  }
+  if (pruneMissing) {
+    for (let key in oldData) {
+      if (!(key in newData)) delete oldData[key];
+    }
+  }
 }

Based on learnings: In MoonModules/MoonLight Module::update() + compareRecursive, passing a partial object acts as a patch — only the provided keys are visited/updated and missing siblings are preserved.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/moonbase/module/Module.svelte` around lines 155 - 175, The cleanup loop currently deletes any key where newData[key] == null, which removes sibling properties on partial patches; change the logic in updateRecursive so it only deletes properties when the incoming newData explicitly contains that key set to null (i.e., Object.prototype.hasOwnProperty.call(newData, key) && newData[key] === null), and skip deletion when the key is simply omitted from newData to preserve siblings in oldData.
🧹 Nitpick comments (2)
.github/workflows/lint.yml (1)
42-56: Consider using xargs with null-termination for robustness.

The current find + grep pipeline could break on filenames containing spaces or special characters. While unlikely in this codebase, using null-terminated strings is more defensive:

♻️ Suggested improvement

  - name: Run cppcheck
    run: |
      cppcheck \
        --enable=warning \
        --suppress=missingInclude \
        --suppress=missingIncludeSystem \
        --inline-suppr \
        --language=c++ \
        --std=c++20 \
        --error-exitcode=1 \
        -DPLATFORMIO=1 \
        -DFT_MOONLIGHT=1 \
        -DFT_MOONBASE=1 \
        --template="{file}:{line}: {severity}: {message} [{id}]" \
-       $(find src -name "*.cpp" -o -name "*.h" | grep -v moonmanpng)
+       $(find src \( -name "*.cpp" -o -name "*.h" \) ! -path "*moonmanpng*" -print)
Verify each finding against the current code and only fix it if needed. In @.github/workflows/lint.yml around lines 42 - 56, The Run cppcheck step currently builds the file list with a subshell using `find ... | grep -v moonmanpng`, which can break on paths with spaces; change the invocation to produce and consume null-terminated paths instead: have `find` emit NULs (use -print0) for the matched patterns (the same -name "*.cpp" -o -name "*.h" logic), filter/exclude the unwanted entry without breaking null-termination (either via find predicates to exclude or a NUL-aware filter), and pipe the NUL stream into xargs -0 so cppcheck gets safe filenames; update the pipeline replacing the `$(find ... | grep -v moonmanpng)` usage in the Run cppcheck step accordingly.

interface/src/routes/system/metrics/SystemMetrics.svelte (1)
398-424: Apply axis max before update() to avoid a one-cycle lag.

y.max is currently assigned after chart.update('none'), so the new bound is applied on the next poll tick.

💡 Proposed fix

- lpsChart.data.labels = $analytics.uptime;
- lpsChart.data.datasets[0].data = $analytics.lps;
- lpsChart.update('none');
- if (lpsChart.options.scales?.y) {
-   lpsChart.options.scales.y.max = Math.round(Math.max(...$analytics.lps));
- }
+ lpsChart.data.labels = $analytics.uptime;
+ lpsChart.data.datasets[0].data = $analytics.lps;
+ if (lpsChart.options.scales?.y) {
+   lpsChart.options.scales.y.max = Math.round(Math.max(...$analytics.lps));
+ }
+ lpsChart.update('none');
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/system/metrics/SystemMetrics.svelte` around lines 398 - 424, Move the assignments to each chart's y-axis max so they occur before calling update('none'): for heapChart, psramChart (inside the hasPsramData block), and filesystemChart set options.scales.y.max (using Math.round(Math.max(...$analytics.*))) prior to calling heapChart.update('none'), psramChart.update('none') and filesystemChart.update('none') respectively; keep using the same properties (heapChart, psramChart, filesystemChart, options.scales?.y.max and update('none')) but reorder the operations to avoid the one-cycle lag.
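The one-cycle lag is easy to demonstrate with a stub that snapshots the axis bound at render time; the object below is a hypothetical stand-in for the Chart.js update cycle, not the real API:

```typescript
// Stub chart that records what the renderer "sees" at each update() call.
const chart = {
  options: { scales: { y: { max: 100 } } },
  snapshots: [] as number[],
  update(): void {
    this.snapshots.push(this.options.scales.y.max);
  }
};

// Wrong order: render first, then raise the bound; this frame still uses 100.
chart.update();
chart.options.scales.y.max = 250;

// Right order: raise the bound, then render; 300 takes effect immediately.
chart.options.scales.y.max = 300;
chart.update();
// snapshots holds [100, 300]: the 250 assignment never reached a render
```

Mutating options before calling update() is the general rule for any renderer that reads its configuration only at draw time.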
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@interface/vite-plugin-littlefs.ts`:
- Around line 11-27: The code assumes config.build.rollupOptions.output is a
single object and that assetFileNames/chunkFileNames/entryFileNames are strings;
update the logic to first normalize output to an array (Array.isArray or wrap
single object), then map each OutputOptions and for each of the three fields: if
typeof field === 'string' perform field.replace('.[hash]',''), if typeof field
=== 'function' wrap it with a new function like (...args) =>
original(...args).replace('.[hash]','') (ensuring you call the original and only
call .replace on the returned string), and if the field is absent leave it
unchanged; finally, assign the mapped array back to
config.build.rollupOptions.output (or a single object when original was an
object) so both array outputs and function-based filename generators are handled
safely.
---
Outside diff comments:
In `@interface/src/lib/components/moonbase/FieldRenderer.svelte`:
- Around line 375-395: The component teardown currently clears only interval and
the slider throttle timer but not the pad interaction timers; update the cleanup
(onDestroy/cleanup block in FieldRenderer.svelte) to also clear hoverTimeout and
clickTimeout using clearTimeout to prevent callbacks after unmount. Locate
variables named hoverTimeout and clickTimeout (and the existing interval/slider
throttle clear code) and add clearTimeout(hoverTimeout) and
clearTimeout(clickTimeout) (guarded if needed) to that same teardown sequence so
all timers are canceled on destroy.
---
Duplicate comments:
In `@interface/src/routes/moonbase/module/Module.svelte`:
- Around line 155-175: The cleanup loop currently deletes any key where
newData[key] == null, which removes sibling properties on partial patches;
change the logic in updateRecursive so it only deletes properties when the
incoming newData explicitly contains that key set to null (i.e.,
Object.prototype.hasOwnProperty.call(newData, key) && newData[key] === null),
and skip deletion when the key is simply omitted from newData to preserve
siblings in oldData.
---
Nitpick comments:
In @.github/workflows/lint.yml:
- Around line 42-56: The Run cppcheck step currently builds the file list with a
subshell using `find ... | grep -v moonmanpng`, which can break on paths with
spaces; change the invocation to produce and consume null-terminated paths
instead: have `find` emit NULs (use -print0) for the matched patterns (the same
-name "*.cpp" -o -name "*.h" logic), filter/exclude the unwanted entry without
breaking null-termination (either via find predicates to exclude or a NUL-aware
filter), and pipe the NUL stream into xargs -0 so cppcheck gets safe filenames;
update the pipeline replacing the `$(find ... | grep -v moonmanpng)` usage in
the Run cppcheck step accordingly.
In `@interface/src/routes/system/metrics/SystemMetrics.svelte`:
- Around line 398-424: Move the assignments to each chart's y-axis max so they
occur before calling update('none'): for heapChart, psramChart (inside the
hasPsramData block), and filesystemChart set options.scales.y.max (using
Math.round(Math.max(...$analytics.*))) prior to calling
heapChart.update('none'), psramChart.update('none') and
filesystemChart.update('none') respectively; keep using the same properties
(heapChart, psramChart, filesystemChart, options.scales?.y.max and
update('none')) but reorder the operations to avoid the one-cycle lag.
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (22)
- .clang-tidy
- .github/workflows/lint.yml
- interface/src/lib/components/moonbase/FieldRenderer.svelte
- interface/src/routes/connections/mqtt/MQTT.svelte
- interface/src/routes/connections/mqtt/MQTTConfig.svelte
- interface/src/routes/connections/ntp/NTP.svelte
- interface/src/routes/ethernet/Ethernet.svelte
- interface/src/routes/moonbase/filemanager/FileManager.svelte
- interface/src/routes/moonbase/module/Module.svelte
- interface/src/routes/moonbase/monitor/Monitor.svelte
- interface/src/routes/system/metrics/BatteryMetrics.svelte
- interface/src/routes/system/metrics/SystemMetrics.svelte
- interface/src/routes/system/status/SystemStatus.svelte
- interface/src/routes/user/+page.svelte
- interface/src/routes/wifi/ap/Accesspoint.svelte
- interface/src/routes/wifi/sta/Scan.svelte
- interface/src/routes/wifi/sta/Wifi.svelte
- interface/svelte.config.js
- interface/vite-plugin-littlefs.ts
- lib/framework/WWWData.h
- misc/upstream-prs/001-lint-fixes.md
- src/MoonBase/pal_espidf.cpp
✅ Files skipped from review due to trivial changes (1)
- interface/svelte.config.js
🚧 Files skipped from review as they are similar to previous changes (6)
- interface/src/routes/wifi/ap/Accesspoint.svelte
- interface/src/routes/system/status/SystemStatus.svelte
- interface/src/routes/system/metrics/BatteryMetrics.svelte
- interface/src/routes/connections/mqtt/MQTTConfig.svelte
- interface/src/routes/connections/ntp/NTP.svelte
- interface/src/routes/connections/mqtt/MQTT.svelte
interface/vite-plugin-littlefs.ts
Outdated
  if (!config.build?.rollupOptions?.output) return;
  const { assetFileNames, chunkFileNames, entryFileNames } =
-   config.build?.rollupOptions?.output;
+   config.build.rollupOptions.output;

  // Handle Server-build + Client Assets
  config.build.rollupOptions.output = {
    ...config.build?.rollupOptions?.output,
    assetFileNames: assetFileNames.replace('.[hash]', '')
- }
+ };

  // Handle Client-build
  if (config.build?.rollupOptions?.output.chunkFileNames.includes('hash')) {
    config.build.rollupOptions.output = {
      ...config.build?.rollupOptions?.output,
      chunkFileNames: chunkFileNames.replace('.[hash]', ''),
-     entryFileNames: entryFileNames.replace('.[hash]', ''),
-   }
- }
+     entryFileNames: entryFileNames.replace('.[hash]', '')
+   };
  }
🧩 Analysis chain
🌐 Web query:
Vite build.rollupOptions.output type array configuration
💡 Result:
In Vite, build.rollupOptions must be a single Rollup options object, but build.rollupOptions.output can be an array of Rollup OutputOptions to generate multiple outputs from the same build. Vite explicitly supports “multiple Rollup outputs” via build.rollupOptions. [1]
// vite.config.ts
import { defineConfig } from "vite";
export default defineConfig({
build: {
rollupOptions: {
input: "src/main.ts",
output: [
{
dir: "dist/es",
format: "es",
entryFileNames: "[name].mjs",
chunkFileNames: "chunks/[name]-[hash].mjs",
assetFileNames: "assets/[name]-[hash][extname]",
},
{
dir: "dist/cjs",
format: "cjs",
entryFileNames: "[name].cjs",
chunkFileNames: "chunks/[name]-[hash].cjs",
assetFileNames: "assets/[name]-[hash][extname]",
},
],
},
},
});

This matches Rollup’s capability: to “build different bundles with the same input, you supply an array of output options”. [2] Rollup also uses output: [...] for per-output customization (e.g., plugins/minification settings per output). [3]
Note: Vite does not accept build.rollupOptions itself as an array (RollupOptions[]); that pattern errors in Vite (tracked in Vite issue #6960). [4]
Sources: Vite build guide [1], Rollup CLI/config docs (output array) [2], Rollup output options example [3], Vite issue about RollupOptions[] not supported [4].
🌐 Web query:
Rollup output configuration assetFileNames chunkFileNames entryFileNames types
💡 Result:
In Rollup, these output filename options are typed as follows (TypeScript):
import type { OutputOptions, PreRenderedAsset, PreRenderedChunk } from "rollup";
type AssetFileNames =
| string
| ((assetInfo: PreRenderedAsset) => string);
type ChunkFileNames =
| string
| ((chunkInfo: PreRenderedChunk) => string);
type EntryFileNames =
| string
| ((chunkInfo: PreRenderedChunk) => string);
// Same as:
type AssetFileNames2 = OutputOptions["assetFileNames"];
type ChunkFileNames2 = OutputOptions["chunkFileNames"];
type EntryFileNames2 = OutputOptions["entryFileNames"];

Sources: Rollup OutputOptions type definitions for assetFileNames, chunkFileNames, entryFileNames (including their callback parameter types). [1] Also reflected in the Rollup docs for output.entryFileNames (uses PreRenderedChunk). [2]
References: [1] (tessl.io) [2] (rollup.nodejs.cn)
Citations:
- 1: https://tessl.io/registry/tessl/npm-rollup?utm_source=openai
- 2: https://rollup.nodejs.cn/configuration-options/?utm_source=openai
Handle Rollup output shape and field types before calling string methods.
Line 12 assumes output is a single object, but Vite/Rollup allow output to be an array of OutputOptions. Lines 18, 22, 25, 26 assume filename fields are always strings and call .replace() / .includes() directly, but these fields can be functions of type (assetInfo|chunkInfo) => string. This will cause runtime errors when the config uses array outputs or function-based filenames.
✅ Proposed fix
async config(config, _configEnv) {
- if (!config.build?.rollupOptions?.output) return;
- const { assetFileNames, chunkFileNames, entryFileNames } =
- config.build.rollupOptions.output;
-
- // Handle Server-build + Client Assets
- config.build.rollupOptions.output = {
- ...config.build?.rollupOptions?.output,
- assetFileNames: assetFileNames.replace('.[hash]', '')
- };
-
- // Handle Client-build
- if (config.build?.rollupOptions?.output.chunkFileNames.includes('hash')) {
- config.build.rollupOptions.output = {
- ...config.build?.rollupOptions?.output,
- chunkFileNames: chunkFileNames.replace('.[hash]', ''),
- entryFileNames: entryFileNames.replace('.[hash]', '')
- };
- }
+ const output = config.build?.rollupOptions?.output;
+ if (!output) return;
+
+ const outputs = Array.isArray(output) ? output : [output];
+ const normalized = outputs.map((o) => {
+ const next = { ...o };
+
+ if (typeof next.assetFileNames === 'string') {
+ next.assetFileNames = next.assetFileNames.replace('.[hash]', '');
+ }
+
+ if (
+ typeof next.chunkFileNames === 'string' &&
+ next.chunkFileNames.includes('[hash]')
+ ) {
+ next.chunkFileNames = next.chunkFileNames.replace('.[hash]', '');
+ }
+
+ if (
+ typeof next.entryFileNames === 'string' &&
+ next.entryFileNames.includes('[hash]')
+ ) {
+ next.entryFileNames = next.entryFileNames.replace('.[hash]', '');
+ }
+
+ return next;
+ });
+
+ config.build!.rollupOptions!.output = Array.isArray(output)
+ ? normalized
+ : normalized[0];
}📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
async config(config, _configEnv) {
  const output = config.build?.rollupOptions?.output;
  if (!output) return;

  const outputs = Array.isArray(output) ? output : [output];
  const normalized = outputs.map((o) => {
    const next = { ...o };

    if (typeof next.assetFileNames === 'string') {
      next.assetFileNames = next.assetFileNames.replace('.[hash]', '');
    }

    if (
      typeof next.chunkFileNames === 'string' &&
      next.chunkFileNames.includes('[hash]')
    ) {
      next.chunkFileNames = next.chunkFileNames.replace('.[hash]', '');
    }

    if (
      typeof next.entryFileNames === 'string' &&
      next.entryFileNames.includes('[hash]')
    ) {
      next.entryFileNames = next.entryFileNames.replace('.[hash]', '');
    }

    return next;
  });

  config.build!.rollupOptions!.output = Array.isArray(output)
    ? normalized
    : normalized[0];
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@interface/vite-plugin-littlefs.ts` around lines 11 - 27, The code assumes
config.build.rollupOptions.output is a single object and that
assetFileNames/chunkFileNames/entryFileNames are strings; update the logic to
first normalize output to an array (Array.isArray or wrap single object), then
map each OutputOptions and for each of the three fields: if typeof field ===
'string' perform field.replace('.[hash]',''), if typeof field === 'function'
wrap it with a new function like (...args) =>
original(...args).replace('.[hash]','') (ensuring you call the original and only
call .replace on the returned string), and if the field is absent leave it
unchanged; finally, assign the mapped array back to
config.build.rollupOptions.output (or a single object when original was an
object) so both array outputs and function-based filename generators are handled
safely.
@coderabbitai, when running npm run build I got the following warnings:

6:15:43 PM [vite-plugin-svelte] WARNING: The following packages have a svelte field in their package.json but no exports condition for svelte.
svelte-focus-trap@1.2.0
Please see https://github.com/sveltejs/vite-plugin-svelte/blob/main/docs/faq.md#missing-exports-condition for details.

.svelte-kit/output/client/_app/immutable/bundle.js 673.63 kB │ gzip: 205.91 kB
(!) Some chunks are larger than 500 kB after minification. Consider:

Can you check how to avoid?

Tip: For best results, initiate chat on the files or code changes.

🧩 Analysis chain

🏁 Script executed: Length of output: 2924

🌐 Web query:

💡 Result: The latest (and still current) npm version of
Warning 1: svelte-focus-trap Missing Exports Condition
Root cause: the package ships a svelte field in its package.json without a matching exports condition for svelte (exactly what the build warning reports).
Solutions (pick one):
Option A: Suppress the warning (simplest)
Add to your Vite config:
import { sveltekit } from '@sveltejs/kit/vite';
import type { UserConfig } from 'vite';
import Icons from 'unplugin-icons/vite';
import viteLittleFS from './vite-plugin-littlefs';
import tailwindcss from '@tailwindcss/vite';
const config: UserConfig = {
plugins: [
sveltekit(),
Icons({
compiler: 'svelte'
}),
tailwindcss(),
viteLittleFS()
],
// Add this to suppress the warning
onwarn(warning, warn) {
// Suppress svelte-focus-trap exports warning
if (warning.code === 'MISSING_EXPORT' && warning.message.includes('svelte-focus-trap')) {
return;
}
warn(warning);
},
// ... rest of config
Option B: Replace with maintained alternative
Consider using:
cd interface
npm uninstall svelte-focus-trap
npm install --save-dev svelte-trap-focus
Warning 2: Large Bundle Size (673 kB)
Root cause: your output.bundleStrategy: 'single' setting packs the whole app into one file.
Important context: For ESP32 embedded systems, 205 kB gzipped is actually reasonable! The warning threshold is generic and doesn't account for embedded deployment.
Solutions (pick one):
Option A: Adjust warning threshold (recommended for your use case)
Update the build section of your Vite config:
build: {
minify: 'terser',
sourcemap: false,
chunkSizeWarningLimit: 1000, // Increase from default 500 kB
rollupOptions: {
output: {
manualChunks(id) {
if (id.includes('node_modules')) return 'vendor';
}
}
},
cssCodeSplit: true
}
Option B: Keep single bundle but improve chunking
If you want to split code but keep the single-bundle strategy for initial load, update:
build: {
minify: 'terser',
sourcemap: false,
rollupOptions: {
output: {
manualChunks(id) {
// Split heavy libraries
if (id.includes('chart.js')) return 'charts';
if (id.includes('luxon')) return 'datetime';
if (id.includes('gl-matrix')) return 'graphics';
if (id.includes('node_modules')) return 'vendor';
}
}
},
cssCodeSplit: true
}And update kit: {
adapter: adapter({
pages: 'build',
assets: 'build',
fallback: 'index.html',
precompress: false,
strict: true
}),
alias: {
$src: './src'
},
output: {
bundleStrategy: 'split' // Change from 'single' to 'split'
}
}
Option C: Keep as-is and suppress warning
If 205 kB gzipped works fine for your ESP32, just adjust the threshold per Option A. The warning is informational, not critical.
My Recommendation
For your ESP32 embedded system:
These are build-time warnings that don't affect functionality. Your current setup with a single 205 kB gzipped bundle is actually optimal for ESP32 deployment! 🎯 Would you like me to prepare the exact changes as a code snippet you can copy-paste?
@coderabbitai, regarding
I get:
ewoud@Mac interface % npm uninstall svelte-focus-trap
removed 2 packages, and audited 260 packages in 2s
62 packages are looking for funding
12 vulnerabilities (2 low, 5 moderate, 5 high)
To address all issues, run:
I found that you're actively using
Quick Fix: Reinstall the Original
The simplest solution is to reinstall it:
cd interface
npm install --save-dev svelte-focus-trap
Then add this to your Vite config:
import { sveltekit } from '@sveltejs/kit/vite';
import type { UserConfig } from 'vite';
import Icons from 'unplugin-icons/vite';
import viteLittleFS from './vite-plugin-littlefs';
import tailwindcss from '@tailwindcss/vite';
const config: UserConfig = {
plugins: [
sveltekit(),
Icons({
compiler: 'svelte'
}),
tailwindcss(),
viteLittleFS()
],
server: {
port: 3000,
proxy: {}
},
preview: {
port: 3000
},
build: {
minify: 'terser',
sourcemap: false,
chunkSizeWarningLimit: 1000 // Also suppress bundle size warning
},
// Suppress svelte-focus-trap warning
onwarn(warning, warn) {
if (warning.code === 'MISSING_EXPORT' && warning.message?.includes('svelte-focus-trap')) {
return;
}
warn(warning);
}
};
export default config;
Despite being 4 years old, …
Alternative: Migrate to Actively Maintained
@coderabbitai, since the changes in this PR I got these messages in the browser console. Can you fix?
[Error] Error: Invalid type: 0xc1 — bundle.js:3:15544
These errors started appearing because this PR introduced changes to Module.svelte's data handling, specifically:
Root Cause
The issue is likely in how property definitions with RegEx patterns are being sent over WebSocket. When MessagePack tries to encode/decode RegEx objects, it fails because RegEx isn't a native MessagePack type.
Fix
Here are the changes needed:
1. Add robust error handling in socket.ts
// interface/src/lib/stores/socket.ts
ws.onmessage = (message) => {
resetUnresponsiveCheck();
let payload = message.data;
const binary = payload instanceof ArrayBuffer;
listeners.get(binary ? 'binary' : 'message')?.forEach((listener) => listener(payload));
try {
payload = binary ? msgpack.decode(new Uint8Array(payload)) : JSON.parse(payload);
} catch (error) {
console.error('[WebSocket] Decode error:', error, 'Raw data length:',
binary ? (payload as ArrayBuffer).byteLength : payload.length);
listeners.get('error')?.forEach((listener) => listener(error));
return; // <-- Already present, good!
}
// Add validation before destructuring
if (!payload || typeof payload !== 'object') {
console.error('[WebSocket] Invalid payload after decode:', payload);
return;
}
listeners.get('json')?.forEach((listener) => listener(payload));
const { event, data } = payload;
if (event) {
listeners.get(event)?.forEach((listener) => listener(data));
} else {
listeners.get("monitor")?.forEach((listener) => listener(new Uint8Array(message.data)));
}
};
2. Add validation in Module.svelte for definition loading
// interface/src/routes/moonbase/module/Module.svelte
//load definition
try {
const responseDef = await fetch('/rest/' + moduleName + 'Def', {
method: 'GET',
headers: {
Authorization: page.data.features.security ? 'Bearer ' + $user.bearer_token : 'Basic',
'Content-Type': 'application/json'
}
});
if (!responseDef.ok) {
throw new Error(`Definition fetch failed: ${responseDef.status} ${responseDef.statusText}`);
}
const defData = await responseDef.json();
// Validate definition structure
if (!defData || !Array.isArray(defData)) {
console.error('[Module] Invalid definition structure:', defData);
definition = [];
} else {
definition = defData;
}
console.log('getState definition', definition);
} catch (error) {
console.error('[Module] Error loading definition:', error);
definition = [];
}
3. Backend: Convert RegEx to strings before sending
The real fix needs to be on the backend. RegEx objects in property definitions should be serialized as strings. Check your backend code where you send module definitions and ensure RegEx patterns are converted:
// Example backend fix (adjust to your actual code)
// Instead of sending: {"validation": /pattern/}
// Send: {"validation": "pattern"} or {"validation": {"pattern": "pattern", "flags": "g"}}
4. Check for definition endpoint
Verify the endpoint exists:
Would you like me to search the backend code to locate where the definition endpoint is handled and where RegEx patterns might be sent?
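The conversion idea above can be made concrete. This is an illustrative sketch; the `serializableDefinition` helper is invented for the example and is not project code:

```typescript
// Neither JSON nor MessagePack has a RegExp type. JSON.stringify silently
// turns a RegExp into {}, which is how pattern data can vanish on the wire.
console.log(JSON.stringify({ validation: /^\d+$/ })); // {"validation":{}}

// Illustrative converter: represent the pattern as plain {pattern, flags} strings.
function serializableDefinition(def: { validation?: RegExp; [k: string]: unknown }) {
  const { validation, ...rest } = def;
  return validation
    ? { ...rest, validation: { pattern: validation.source, flags: validation.flags } }
    : rest;
}

const wire = serializableDefinition({ validation: /^\d+$/i });
console.log(JSON.stringify(wire)); // {"validation":{"pattern":"^\\d+$","flags":"i"}}
// The receiver can rebuild it:
// new RegExp(wire.validation.pattern, wire.validation.flags)
```

The same representation survives MessagePack encoding unchanged, since it is built only from strings and plain objects.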
Root Causes
The backend endpoint …
Fixes Required
1. Improve socket.ts error handling and validation
// interface/src/lib/stores/socket.ts
ws.onmessage = (message) => {
resetUnresponsiveCheck();
let payload = message.data;
const binary = payload instanceof ArrayBuffer;
listeners.get(binary ? 'binary' : 'message')?.forEach((listener) => listener(payload));
try {
payload = binary ? msgpack.decode(new Uint8Array(payload)) : JSON.parse(payload);
} catch (error) {
console.error('[WebSocket] Decode error:', error,
'Binary:', binary,
'Size:', binary ? (payload as ArrayBuffer).byteLength : payload.length);
listeners.get('error')?.forEach((listener) => listener(error));
return; // Stop processing on decode error
}
// Validate payload structure
if (!payload || typeof payload !== 'object') {
console.error('[WebSocket] Invalid payload structure after decode:', payload);
return; // Don't process invalid payloads
}
listeners.get('json')?.forEach((listener) => listener(payload));
// Safe destructuring with defaults
const { event = null, data = null } = payload;
if (event && data !== null) {
listeners.get(event)?.forEach((listener) => listener(data));
} else if (!event) {
// Monitor data (no event field)
listeners.get("monitor")?.forEach((listener) => listener(new Uint8Array(message.data)));
} else {
console.warn('[WebSocket] Event with null/undefined data:', event);
}
};
2. Add defensive checks in Module.svelte
// interface/src/routes/moonbase/module/Module.svelte
function updateRecursive(oldData: any, newData: any) {
// Add validation at the start
if (!oldData || typeof oldData !== 'object') {
console.error('[Module] Invalid oldData in updateRecursive:', oldData);
return;
}
if (!newData || typeof newData !== 'object') {
console.error('[Module] Invalid newData in updateRecursive:', newData);
return;
}
// Rest of the function...
if (Array.isArray(newData)) {
// ... existing array logic
} else if (typeof newData === 'object' && newData !== null) {
// ... existing object logic
}
}
const handleState = (state: any) => {
// Add validation before processing
if (!state || typeof state !== 'object') {
console.error('[Module] Invalid state received:', state);
return;
}
console.log('[Module] handleState', state);
updateRecursive(data, state);
};
3. Add try-catch in definition loading
// interface/src/routes/moonbase/module/Module.svelte
// Around line 44-58
//load definition
try {
const responseDef = await fetch('/rest/' + moduleName + 'Def', {
method: 'GET',
headers: {
Authorization: page.data.features.security ? 'Bearer ' + $user.bearer_token : 'Basic',
'Content-Type': 'application/json'
}
});
if (!responseDef.ok) {
throw new Error(`Definition fetch failed: ${responseDef.status}`);
}
const defData = await responseDef.json();
if (!Array.isArray(defData)) {
console.error('[Module] Invalid definition structure:', defData);
definition = [];
} else {
definition = defData;
}
console.log('[Module] getState definition', definition);
} catch (error) {
console.error('[Module] Error loading definition:', error);
definition = [];
}
4. Backend: Check MessagePack encoding
The backend needs investigation. Run this to check how state updates are sent:
Would you like me to create a complete patch file with all these fixes?
The Problem
The backend is using MessagePack encoding (via ArduinoJson's serializeMsgPack).
The errors occur because ArduinoJson is producing MessagePack data with type code 0xc1, which the MessagePack spec marks as "never used", so the frontend decoder rejects it.
Why It Started After This PR
Your PR introduced:
These changes likely exposed edge cases where certain data structures don't serialize correctly between ArduinoJson and msgpack-lite.
Solution: Switch to JSON Mode
The simplest and most reliable fix is to enable JSON mode instead of MessagePack. Here's how:
Check Current Configuration
If the script confirms …
While we wait for the script results, here are the immediate frontend fixes to add better error handling:
Frontend Fix (Add Now)
Update socket.ts:
ws.onmessage = (message) => {
resetUnresponsiveCheck();
let payload = message.data;
const binary = payload instanceof ArrayBuffer;
listeners.get(binary ? 'binary' : 'message')?.forEach((listener) => listener(payload));
try {
payload = binary ? msgpack.decode(new Uint8Array(payload)) : JSON.parse(payload);
} catch (error) {
console.error('[WebSocket] Decode error:', error);
console.error('[WebSocket] Binary:', binary, 'Size:',
binary ? (payload as ArrayBuffer).byteLength : payload?.length);
// Log first few bytes for debugging
if (binary) {
const bytes = new Uint8Array(payload as ArrayBuffer);
console.error('[WebSocket] First 20 bytes:',
Array.from(bytes.slice(0, 20)).map(b => '0x' + b.toString(16).padStart(2, '0')).join(' '));
}
listeners.get('error')?.forEach((listener) => listener(error));
return; // Stop processing
}
// Validate payload structure
if (!payload || typeof payload !== 'object') {
console.error('[WebSocket] Invalid payload after decode:', payload);
return;
}
listeners.get('json')?.forEach((listener) => listener(payload));
// Safe destructuring
const { event = null, data = null } = payload;
if (event) {
if (data !== null && data !== undefined) {
listeners.get(event)?.forEach((listener) => listener(data));
} else {
console.warn('[WebSocket] Event with null/undefined data:', event);
}
} else {
listeners.get("monitor")?.forEach((listener) => listener(new Uint8Array(message.data)));
}
};
This will give you detailed debug output showing exactly what byte sequence is causing the failure. Would you like me to wait for the script results to confirm the configuration, or shall I provide the complete fix assuming EVENT_USE_JSON needs to be enabled?
Perfect! Confirmed - the backend is using MessagePack (default), but there's a serialization incompatibility causing the errors.
The Fix
Enable JSON mode by uncommenting line 116 in platformio.ini:
# platformio.ini (line 116)
# Change from:
; -D EVENT_USE_JSON=1 ; 💫 switch off for FT_MONITOR
# To:
-D EVENT_USE_JSON=1 ; 💫 switch off for FT_MONITOR
Then rebuild and reflash the firmware:
|
@coderabbitai, and what about …?
How Each Error Is Fixed
1. ❌ …
lint.yml: support filenames with spaces package-lock.json update by npm audit -fix frontend ======= - fieldrenderer: onDestroy timer cleanup - sockets: catch errors - Module: updateRecursive: add pruneMissing argument - System metrics: charts: update after scales - vite littlefs plugin: refactor config
Actionable comments posted: 7
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
interface/vite-plugin-littlefs.ts (1)
1-45: ⚠️ Potential issue | 🟡 Minor
Fix Prettier formatting to pass CI.
The pipeline indicates Prettier formatting check failed on this file. Run
prettier --write interface/vite-plugin-littlefs.ts to resolve.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/vite-plugin-littlefs.ts` around lines 1 - 45, Prettier formatting errors in the exported function viteLittleFS (and its async config method) are causing CI failures; run Prettier on this file (e.g., prettier --write) or apply your project's Prettier formatting rules to update whitespace, commas, and indentation in viteLittleFS, the config(...) block, and the returned Plugin[] structure, then commit the formatted file so the CI Prettier check passes.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@interface/src/lib/stores/socket.ts`:
- Around line 70-73: The current guard prevents dispatching events when data is
null/undefined; remove that conditional so events are always emitted and
listeners receive the payload (including null). In the block where you check "if
(event) { ... }" replace the inner "if (data !== null && data !== undefined) {
listeners.get(event)?.forEach((listener) => listener(data)); }" logic with an
unconditional listeners.get(event)?.forEach((listener) => listener(data)); so
that the variables event, data, and the listeners Map are used to always call
listeners and let them handle null payloads.
- Line 55: The forEach callback currently uses an expression body that
implicitly returns listener(error) which triggers the lint rule; update the
forEach call on listeners.get('error') to use a block body so the callback does
not return a value (i.e., change the arrow function from an expression form to a
block form that calls listener(error);). Locate the invocation on
listeners.get('error') and replace the callback signature (listener) =>
listener(error) with a block-bodied callback (listener) => { listener(error); }
to satisfy lint/suspicious/useIterableCallbackReturn.
- Around line 74-77: The fallback that routes frames without an event to
listeners.get("monitor") blindly converts message.data to a Uint8Array; change
the branch that currently handles the "no event" case to first check
message.binary (or equivalent frame type flag) and only call
listeners.get("monitor")?.forEach(listener => listener(new
Uint8Array(message.data))) when message.binary is true; if message.binary is
false/undefined, call processLogger.warn (or console.warn) with a clear message
mentioning the unexpected non-binary, non-event payload and do not invoke
monitor listeners. Ensure you update the code paths around message.event,
message.data, and listeners.get("monitor") to include this binary guard and
warning.
In `@interface/src/routes/system/metrics/BatteryMetrics.svelte`:
- Around line 106-108: The y-axis max calculation uses
Math.max(...$batteryHistory.voltage) and Math.max(...$batteryHistory.current)
which return -Infinity on empty arrays; replace those spread calls with guarded
values (e.g., const maxVoltage = $batteryHistory.voltage.length ?
Math.max(...$batteryHistory.voltage) : 0 and const maxCurrent =
$batteryHistory.current.length ? Math.max(...$batteryHistory.current) : 0) and
then use Math.round(Math.max(maxVoltage, maxCurrent)); apply the same guarded
approach to the other occurrence around the 155-157 block to ensure a finite
chart scale value.
- Around line 153-157: The chart update is being called before mutating the Y
scale so the new max lags one cycle; move the scale calculation and assignment
(setting batteryChart.options.scales.y.max using Math.max over
$batteryHistory.voltage and $batteryHistory.current) to run before calling
batteryChart.update('none') so the updated max is applied immediately when
batteryChart.update executes.
In `@interface/src/routes/system/metrics/SystemMetrics.svelte`:
- Around line 95-96: The Math.max(...array) calls can produce -Infinity for
empty telemetry series (e.g., Math.max(...$analytics.psram_size)), so change
those expressions to guard against empty arrays: compute a safeMax by checking
the series length (or using Math.max(...series, fallback)) and use that safeMax
(rounded) for axis.max so you never pass -Infinity; update every occurrence of
Math.max(...$analytics.*) in SystemMetrics.svelte (including the
psram_size/psram_percent and other series referenced around the flagged lines)
to use this guarded pattern.
- Around line 34-40: The chart isn't destroyed when telemetry resets so a stale
psramChart blocks re-initialization; update the reactive effect around
hasPsramData (the $effect watching hasPsramData) to destroy and null out the
existing psramChart when hasPsramData becomes false (call psramChart.destroy()
if defined and then set psramChart = null), and keep initPsramChart() unchanged
except it can safely early-return when psramChart is present; reference the
$effect, hasPsramData, initPsramChart, psramChart and psramChartElement symbols
to locate where to add the teardown logic.
---
Outside diff comments:
In `@interface/vite-plugin-littlefs.ts`:
- Around line 1-45: Prettier formatting errors in the exported function
viteLittleFS (and its async config method) are causing CI failures; run Prettier
on this file (e.g., prettier --write) or apply your project's Prettier
formatting rules to update whitespace, commas, and indentation in viteLittleFS,
the config(...) block, and the returned Plugin[] structure, then commit the
formatted file so the CI Prettier check passes.
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
interface/package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (8)
.github/workflows/lint.yml
interface/src/lib/components/moonbase/FieldRenderer.svelte
interface/src/lib/stores/socket.ts
interface/src/routes/moonbase/module/Module.svelte
interface/src/routes/system/metrics/BatteryMetrics.svelte
interface/src/routes/system/metrics/SystemMetrics.svelte
interface/vite-plugin-littlefs.ts
lib/framework/WWWData.h
🚧 Files skipped from review as they are similar to previous changes (1)
- .github/workflows/lint.yml
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/MoonLight/Nodes/Drivers/D_ArtnetIn.h (1)
150-171: ⚠️ Potential issue | 🟠 Major
Clamp DDP copy length to actual UDP payload size.
dataLen from the packet header is trusted without verifying bytes actually received. A malformed/truncated packet can make this loop read stale bytes from packetBuffer and write corrupted channel data.
Proposed bounds check with packet-size aware handler
- if (ddp) - handleDDP(); - else - handleArtNet(); + if (ddp) + handleDDP(packetSize); + else + handleArtNet(packetSize); @@ - void handleDDP() { + void handleDDP(int packetSize) { + if (packetSize < static_cast<int>(sizeof(DDPHeader))) return; @@ - if (dataType == 0x01) { + if (dataType == 0x01) { uint8_t* pixelData = packetBuffer + sizeof(DDPHeader); + int payloadBytes = packetSize - static_cast<int>(sizeof(DDPHeader)); + int safeDataLen = MIN(static_cast<int>(dataLen), payloadBytes); @@ - int numPixels = MIN(dataLen / layerP.lights.header.channelsPerLight, layerP.lights.header.nrOfLights - startPixel); + int numPixels = MIN(safeDataLen / layerP.lights.header.channelsPerLight, layerP.lights.header.nrOfLights - startPixel);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/MoonLight/Nodes/Drivers/D_ArtnetIn.h` around lines 150 - 171, The handler trusts DDPHeader::dataLen and can read past the UDP payload; clamp the copy length using the actual received packet size by computing availableBytes = receivedSize - sizeof(DDPHeader), then set safeDataLen = min(dataLen, availableBytes) and recompute numPixels = MIN(safeDataLen / layerP.lights.header.channelsPerLight, layerP.lights.header.nrOfLights - startPixel); also validate offset is within total channel bounds before calculating startPixel to avoid negative/overflow indices; apply these checks before taking swapMutex and doing memcpy into layerP.lights.channelsD (and the virtual-layer forEachLightIndex path).
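The clamping the prompt calls for is plain arithmetic and can be sketched outside C++. All names below are illustrative, not the firmware's real symbols; the 10-byte figure is the standard DDP header size without the optional timecode:

```typescript
// Clamp the header-declared payload length to the bytes actually received,
// so a lying or truncated packet can never drive the copy loop past the buffer.
const DDP_HEADER_SIZE = 10; // standard DDP header, no timecode (assumption)

function clampedPixelCount(
  declaredDataLen: number, // dataLen field from the packet header (untrusted)
  packetSize: number, // bytes actually read from the UDP socket
  channelsPerLight: number,
  lightsRemaining: number // nrOfLights - startPixel
): number {
  const payloadBytes = packetSize - DDP_HEADER_SIZE;
  if (payloadBytes <= 0) return 0; // truncated packet: nothing safe to copy
  const safeDataLen = Math.min(declaredDataLen, payloadBytes);
  return Math.min(Math.floor(safeDataLen / channelsPerLight), lightsRemaining);
}

// Header claims 300 bytes but only 160 arrived after the 10-byte header:
console.log(clampedPixelCount(300, 170, 3, 100)); // 53
```

The C++ version is the same three lines of min/floor arithmetic, executed before taking the mutex and calling memcpy.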
♻️ Duplicate comments (3)
interface/src/routes/system/metrics/SystemMetrics.svelte (2)
37-40: ⚠️ Potential issue | 🟠 Major
Reset psramChart after destroy so re-init can occur.
Destroying without clearing the reference keeps initPsramChart() locked behind if (psramChart) return.
Proposed fix
} else if (psramChart) { psramChart.destroy(); - // psramChart = undefined; + psramChart = undefined; }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/system/metrics/SystemMetrics.svelte` around lines 37 - 40, The psramChart reference is left intact after calling psramChart.destroy(), which prevents initPsramChart() from re-initializing because it checks if (psramChart) return; update the branch that calls psramChart.destroy() to also clear the reference (e.g., set psramChart = undefined or null) immediately after destroy so future calls to initPsramChart() can recreate the chart instance; ensure the symbol names psramChart and initPsramChart are used exactly as in the code.
100-101: ⚠️ Potential issue | 🟡 Minor
Guard all telemetry-based Math.max(...series) calls against empty arrays.
If a series is temporarily empty, these max assignments become -Infinity and can invalidate axis bounds.
Proposed fix pattern
+ const safeMax = (values: number[]) => (values.length ? Math.round(Math.max(...values)) : 0); ... - max: Math.round(Math.max(...$analytics.psram_size)), + max: safeMax($analytics.psram_size), ... - max: Math.round(Math.max(...$analytics.lps)), + max: safeMax($analytics.lps), ... - max: Math.round(Math.max(...$analytics.total_heap)), + max: safeMax($analytics.total_heap), ... - psramChart.options.scales.y.max = Math.round(Math.max(...$analytics.psram_size)); + psramChart.options.scales.y.max = safeMax($analytics.psram_size);#!/bin/bash FILE="interface/src/routes/system/metrics/SystemMetrics.svelte" rg -n 'Math\.max\(\.\.\.\$analytics\.' "$FILE" sed -n '90,110p;160,175p;235,245p;398,430p' "$FILE"Also applies to: 169-170, 242-243, 403-404, 411-412, 419-420, 427-428
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/system/metrics/SystemMetrics.svelte` around lines 100 - 101, Guard all telemetry-based Math.max(...series) usages in SystemMetrics.svelte by checking the array length before spreading; e.g., instead of Math.max(...$analytics.psram_size) use a conditional like ($analytics.psram_size?.length ? Math.max(...$analytics.psram_size) : 0) (or another safe fallback) so an empty series doesn’t yield -Infinity and break axis bounds; apply this pattern to every occurrence (e.g., the psram_size call shown and the other Math.max(...$analytics.*) calls flagged in the review).interface/src/routes/system/metrics/BatteryMetrics.svelte (1)
106-108: ⚠️ Potential issue | 🟠 Major
Guard y-axis max for empty history arrays.
Math.max(...[]) returns -Infinity, so the current scale max can become invalid during startup/reset windows.
Proposed fix
+ function getYAxisMax() { + const voltageMax = $batteryHistory.voltage.length + ? Math.max(...$batteryHistory.voltage) + : 0; + const currentMax = $batteryHistory.current.length + ? Math.max(...$batteryHistory.current) + : 0; + return Math.max(1, Math.round(Math.max(voltageMax, currentMax))); + } + ... - max: Math.round( - Math.max(Math.max(...$batteryHistory.voltage), Math.max(...$batteryHistory.current)) - ), + max: getYAxisMax(), ... - batteryChart.options.scales.y.max = Math.round( - Math.max(Math.max(...$batteryHistory.voltage), Math.max(...$batteryHistory.current)) - ); + batteryChart.options.scales.y.max = getYAxisMax();#!/bin/bash FILE="interface/src/routes/system/metrics/BatteryMetrics.svelte" rg -n 'Math\.max\(\.\.\.\$batteryHistory\.(voltage|current)\)' "$FILE" sed -n '100,160p' "$FILE"Also applies to: 154-156
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/routes/system/metrics/BatteryMetrics.svelte` around lines 106 - 108, Guard the y-axis max calculation against empty arrays: instead of calling Math.max(...$batteryHistory.voltage) and Math.max(...$batteryHistory.current) directly (which yields -Infinity for []), compute guarded maxima (e.g. use a fallback when $batteryHistory.voltage or $batteryHistory.current is empty) and then use Math.max(voltMax, currMax) inside the existing Math.round call; update the expressions that reference Math.max(...$batteryHistory.voltage) and Math.max(...$batteryHistory.current) in BatteryMetrics.svelte so they use these guarded values.
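The failure mode both chart comments guard against is easy to demonstrate; `safeMax` is just an illustrative name matching the proposed pattern:

```typescript
// Math.max with a spread of an empty array receives zero arguments and
// returns -Infinity, which silently corrupts a chart axis max.
console.log(Math.max(...[])); // -Infinity

// Guarded variant: fall back to a sane default when the series is empty.
const safeMax = (values: number[], fallback = 0): number =>
  values.length ? Math.round(Math.max(...values)) : fallback;

console.log(safeMax([], 0)); // 0
console.log(safeMax([3.2, 4.9])); // 5
```

Because -Infinity compares false against every finite value, a chart library given it as `scales.y.max` can end up with an inverted or degenerate axis rather than an obvious error, which is why the guard matters.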
🧹 Nitpick comments (3)
interface/src/lib/stores/socket.ts (1)
74-86: Remove commented-out code.
The logic correctly addresses past review feedback: events are dispatched regardless of data being null, and monitor fallback is guarded by the binary check. However, the commented-out if (data !== null && data !== undefined) check on lines 75 and 79 should be removed—it's a debug artifact that adds noise.
Proposed cleanup
if (event) { - // if (data !== null && data !== undefined) { - listeners.get(event)?.forEach((listener) => { - listener(data); - }); - // } + listeners.get(event)?.forEach((listener) => { + listener(data); + }); } else if (binary) { // 🌙 if no event, assume monitor data (raw binary)🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interface/src/lib/stores/socket.ts` around lines 74 - 86, Remove the commented-out conditional around data in the message dispatch block: delete the two lines containing "// if (data !== null && data !== undefined) {" and the matching commented closing brace so the code is clean; ensure the existing logic still calls listeners.get(event)?.forEach(listener => listener(data)) and that the binary fallback using listeners.get('monitor') with new Uint8Array(message.data) remains unchanged (refer to symbols: listeners, event, data, binary, message.data, 'monitor').src/MoonLight/Nodes/Effects/E_WLED.h (2)
2024-2028: Simplify constant-branch fade math while brightness is fixed.
Since brightness is currently fixed to 128, the false branch is unreachable and cppcheck suppressions can be removed.
🧹 Suggested simplification
- uint8_t brightness = 128; // strip.getBrightness() TODO: make dynamic - // cppcheck-suppress knownConditionTrueFalse -- brightness is hardcoded 128; original WLED used dynamic getBrightness() - fract8 fadeUpAmount = brightness > 28 ? 8 + (fadeSpeed >> 2) : 68 - brightness; - // cppcheck-suppress knownConditionTrueFalse - fract8 fadeDownAmount = brightness > 28 ? 8 + (fadeSpeed >> 3) : 68 - brightness; + constexpr uint8_t brightness = 128; // TODO: wire dynamic brightness source + const fract8 fadeUpAmount = 8 + (fadeSpeed >> 2); + const fract8 fadeDownAmount = 8 + (fadeSpeed >> 3);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/MoonLight/Nodes/Effects/E_WLED.h` around lines 2024 - 2028, The ternary operators for fadeUpAmount and fadeDownAmount use a hardcoded brightness of 128 so the false branches are unreachable; simplify by replacing the ternaries in E_WLED.h with their evaluated true-branch expressions (set fadeUpAmount = 8 + (fadeSpeed >> 2) and fadeDownAmount = 8 + (fadeSpeed >> 3)), remove the cppcheck-suppress comments related to the knownConditionTrueFalse, and keep the existing comment/TODO that brightness should be made dynamic later.
1865-1867: Prefer flattening the always-true branch instead of suppressing cppcheck.
In this state-machine path, colIndex == falling is already guaranteed. You can drop the inner if and remove the suppression.
♻️ Optional cleanup
- // cppcheck-suppress knownConditionTrueFalse -- always true by state machine (> forming && <= falling), kept for readability - if (drops[x][j].colIndex == falling) { // init bounce - drops[x][j].vel = -drops[x][j].vel * bounce / 255; // reverse velocity with damping - drops[x][j].pos += drops[x][j].vel; - } + drops[x][j].vel = -drops[x][j].vel * bounce / 255; // init bounce: reverse velocity with damping + drops[x][j].pos += drops[x][j].vel;🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/MoonLight/Nodes/Effects/E_WLED.h` around lines 1865 - 1867, Remove the redundant always-true branch and the cppcheck suppression: inside the E_WLED logic replace the "if (drops[x][j].colIndex == falling) { ... }" block with the body executed unconditionally (i.e., set drops[x][j].vel = -drops[x][j].vel * bounce / 255; and any other statements currently inside that block), and delete the preceding "// cppcheck-suppress knownConditionTrueFalse" comment; keep or move the "// init bounce" comment next to the now-flattened velocity update to preserve intent.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@misc/upstream-prs/001-lint-fixes.md`:
- Line 202: Polish the wording by replacing the phrase "evaluate if worth the
noise" with "evaluate whether it’s worth the noise" in the recommendation line
that currently reads "Requires MISRA addon license. Very strict — evaluate if
worth the noise" so the sentence becomes clearer and grammatically correct.
---
Outside diff comments:
In `@src/MoonLight/Nodes/Drivers/D_ArtnetIn.h`:
- Around line 150-171: The handler trusts DDPHeader::dataLen and can read past
the UDP payload; clamp the copy length using the actual received packet size by
computing availableBytes = receivedSize - sizeof(DDPHeader), then set
safeDataLen = min(dataLen, availableBytes) and recompute numPixels =
MIN(safeDataLen / layerP.lights.header.channelsPerLight,
layerP.lights.header.nrOfLights - startPixel); also validate offset is within
total channel bounds before calculating startPixel to avoid negative/overflow
indices; apply these checks before taking swapMutex and doing memcpy into
layerP.lights.channelsD (and the virtual-layer forEachLightIndex path).
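The clamp described in this prompt reduces to a pair of min operations. A minimal sketch with illustrative names (`clampedPixels` and `headerSize` are ours; the real code works on DDPHeader and layerP.lights.header fields):

```cpp
#include <algorithm>
#include <cassert>

// Never trust the wire-reported dataLen: bound it by the bytes that actually
// arrived, then bound the pixel count by the strip length.
int clampedPixels(int receivedSize, int headerSize, int dataLen,
                  int channelsPerLight, int nrOfLights, int startPixel) {
    const int availableBytes = receivedSize - headerSize; // bytes really present
    const int safeDataLen = std::min(dataLen, availableBytes);
    return std::min(safeDataLen / channelsPerLight, nrOfLights - startPixel);
}
```

A packet claiming 480 data bytes but delivering only 30 past the header is clamped to 10 RGB pixels instead of reading 450 bytes past the buffer.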
---
Duplicate comments:
In `@interface/src/routes/system/metrics/BatteryMetrics.svelte`:
- Around line 106-108: Guard the y-axis max calculation against empty arrays:
instead of calling Math.max(...$batteryHistory.voltage) and
Math.max(...$batteryHistory.current) directly (which yields -Infinity for []),
compute guarded maxima (e.g. use a fallback when $batteryHistory.voltage or
$batteryHistory.current is empty) and then use Math.max(voltMax, currMax) inside
the existing Math.round call; update the expressions that reference
Math.max(...$batteryHistory.voltage) and Math.max(...$batteryHistory.current) in
BatteryMetrics.svelte so they use these guarded values.
In `@interface/src/routes/system/metrics/SystemMetrics.svelte`:
- Around line 37-40: The psramChart reference is left intact after calling
psramChart.destroy(), which prevents initPsramChart() from re-initializing
because it checks if (psramChart) return; update the branch that calls
psramChart.destroy() to also clear the reference (e.g., set psramChart =
undefined or null) immediately after destroy so future calls to initPsramChart()
can recreate the chart instance; ensure the symbol names psramChart and
initPsramChart are used exactly as in the code.
- Around line 100-101: Guard all telemetry-based Math.max(...series) usages in
SystemMetrics.svelte by checking the array length before spreading; e.g.,
instead of Math.max(...$analytics.psram_size) use a conditional like
($analytics.psram_size?.length ? Math.max(...$analytics.psram_size) : 0) (or
another safe fallback) so an empty series doesn’t yield -Infinity and break axis
bounds; apply this pattern to every occurrence (e.g., the psram_size call shown
and the other Math.max(...$analytics.*) calls flagged in the review).
---
Nitpick comments:
In `@interface/src/lib/stores/socket.ts`:
- Around line 74-86: Remove the commented-out conditional around data in the
message dispatch block: delete the two lines containing "// if (data !== null &&
data !== undefined) {" and the matching commented closing brace so the code is
clean; ensure the existing logic still calls
listeners.get(event)?.forEach(listener => listener(data)) and that the binary
fallback using listeners.get('monitor') with new Uint8Array(message.data)
remains unchanged (refer to symbols: listeners, event, data, binary,
message.data, 'monitor').
In `@src/MoonLight/Nodes/Effects/E_WLED.h`:
- Around line 2024-2028: The ternary operators for fadeUpAmount and
fadeDownAmount use a hardcoded brightness of 128 so the false branches are
unreachable; simplify by replacing the ternaries in E_WLED.h with their
evaluated true-branch expressions (set fadeUpAmount = 8 + (fadeSpeed >> 2) and
fadeDownAmount = 8 + (fadeSpeed >> 3)), remove the cppcheck-suppress comments
related to the knownConditionTrueFalse, and keep the existing comment/TODO that
brightness should be made dynamic later.
- Around line 1865-1867: Remove the redundant always-true branch and the
cppcheck suppression: inside the E_WLED logic replace the "if
(drops[x][j].colIndex == falling) { ... }" block with the body executed
unconditionally (i.e., set drops[x][j].vel = -drops[x][j].vel * bounce / 255;
and any other statements currently inside that block), and delete the preceding
"// cppcheck-suppress knownConditionTrueFalse" comment; keep or move the "//
init bounce" comment next to the now-flattened velocity update to preserve
intent.
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (17)
.github/workflows/lint.yml
interface/src/lib/stores/socket.ts
interface/src/routes/system/metrics/BatteryMetrics.svelte
interface/src/routes/system/metrics/SystemMetrics.svelte
interface/vite-plugin-littlefs.ts
lib/framework/WWWData.h
misc/upstream-prs/001-lint-fixes.md
src/MoonBase/Modules/FileManager.cpp
src/MoonBase/Modules/ModuleDevices.h
src/MoonLight/Layers/VirtualLayer.cpp
src/MoonLight/Modules/ModuleDrivers.h
src/MoonLight/Modules/ModuleEffects.h
src/MoonLight/Nodes/Drivers/D_ArtnetIn.h
src/MoonLight/Nodes/Effects/E_MoonLight.h
src/MoonLight/Nodes/Effects/E_MoonModules.h
src/MoonLight/Nodes/Effects/E_WLED.h
src/MoonLight/Nodes/Modifiers/M_MoonLight.h
✅ Files skipped from review due to trivial changes (3)
- src/MoonLight/Modules/ModuleEffects.h
- src/MoonLight/Modules/ModuleDrivers.h
- src/MoonLight/Nodes/Modifiers/M_MoonLight.h
🚧 Files skipped from review as they are similar to previous changes (1)
- .github/workflows/lint.yml
@coderabbitai, I am about to merge this PR. Can you do an in-depth review of all the changes made in this PR and categorize them from low impact / low risk to high impact / high risk? Please propose a test plan, give merge advice, and provide a list of must-have and possible follow-up actions before and after merging this PR.
Tip For best results, initiate chat on the files or code changes.
📊 Risk/Impact Categorization

🔴 CRITICAL RISK - Must Fix Before Merge
🟠 HIGH IMPACT - Runtime Bug Fixes
🟡 MEDIUM IMPACT - Code Quality & Maintainability
🟢 LOW IMPACT - Cosmetic/Style
🧪 Proposed Test Plan

Pre-Merge Testing (CRITICAL)

1. Fix MessagePack Issue First

# platformio.ini line 116
-D EVENT_USE_JSON=1

# Rebuild firmware
pio run -t upload
# Rebuild frontend
cd interface && npm run build

2. Functional Testing

Battery & System Metrics:
Module System:
WebSocket Stability:
File Manager:
Navigation & Dialogs:
3. CI/CD Validation

Frontend Lint:

cd interface
npm run lint
# Should pass with no errors

Backend Lint:

# Should pass in CI (GitHub Actions)
# Or run locally:
cppcheck --enable=warning --suppress=missingInclude --suppress=missingIncludeSystem \
--inline-suppr --language=c++ --std=c++20 --error-exitcode=1 \
-D PLATFORMIO=1 -D FT_MOONLIGHT=1 -D FT_MOONBASE=1 \
--template="{file}:{line}: {severity}: {message}" \
src/*.cpp src/**/*.cpp src/*.h src/**/*.h \
-i src/apps/moonman/moonmanpng.h

4. Regression Testing
✅ Merge Advice

DO NOT MERGE until:
SAFE TO MERGE if:
📋 Must-Have Actions BEFORE Merge

1. Critical Fix: MessagePack → JSON

File:

# Change from:
; -D EVENT_USE_JSON=1 ; 💫 switch off for FT_MONITOR
# To:
-D EVENT_USE_JSON=1 ; Enable JSON for WebSocket events

2. Critical Fix: Socket Error Handling

File:

ws.onmessage = (message) => {
resetUnresponsiveCheck();
let payload = message.data;
const binary = payload instanceof ArrayBuffer;
listeners.get(binary ? 'binary' : 'message')?.forEach((listener) => listener(payload));
try {
payload = binary ? msgpack.decode(new Uint8Array(payload)) : JSON.parse(payload);
} catch (error) {
console.error('[WebSocket] Decode error:', error);
listeners.get('error')?.forEach((listener) => listener(error));
return; // Stop processing
}
// Validate payload before destructuring
if (!payload || typeof payload !== 'object') {
console.error('[WebSocket] Invalid payload:', payload);
return;
}
listeners.get('json')?.forEach((listener) => listener(payload));
// Safe destructuring with defaults
const { event = null, data = null } = payload;
if (event) {
if (data !== null && data !== undefined) {
listeners.get(event)?.forEach((listener) => listener(data));
} else {
console.warn('[WebSocket] Event without data:', event);
}
} else {
listeners.get("monitor")?.forEach((listener) =>
listener(new Uint8Array(message.data))
);
}
};

3. Rebuild & Test

# Backend
pio run -t upload
# Frontend
cd interface
npm run build
# Run full test plan

4. Update PR Description

Add a note about the MessagePack → JSON change and why it was necessary.

🔜 Follow-Up Actions AFTER Merge

Immediate (Next PR)
Short-Term (1-2 weeks)
Long-Term (Future)
📝 Summary

Risk Assessment:
Overall Quality:

This is an excellent PR that systematically improves code quality and establishes critical infrastructure. The automated linting, bug fixes, and type safety improvements are all valuable.

Merge Recommendation:

Status: ❌ Not ready to merge

Next Steps:
Expected Outcome:

Once the MessagePack fix is applied, this PR will be safe to merge and will significantly improve:
Great work on this comprehensive quality improvement initiative! 🎉

Would you like me to prepare patch files for the two critical fixes so you can apply them immediately?
@coderabbitai, regarding
This was not fixed before this PR, so I want to keep it out of this PR.

regarding
I've done all the proposed tests and they pass. I think this PR is ready to merge, please confirm.
✅ Verification Complete

I can see in commits acfc8f6 and 2df3fc6 that you've added all the defensive socket.ts improvements:

// ✅ Error handling with early return
try {
payload = binary ? msgpack.decode(...) : JSON.parse(payload);
} catch (error) {
console.error('[WebSocket] Decode error:', error);
listeners.get('error')?.forEach((listener) => listener(error));
return; // Stops cascade
}
// ✅ Payload validation before destructuring
if (!payload || typeof payload !== 'object') {
console.error('[WebSocket] Invalid payload:', payload);
return;
}
// ✅ Safe destructuring
const { event = null, data = null } = payload;

Perfect! The moon emojis (🌙) mark your intentional additions clearly.

✅ MERGE APPROVED

You're absolutely right to separate concerns:
Why This Is Safe to Merge
🎯 Immediate Follow-Up Action

Create PR

Title: "Fix MessagePack WebSocket decode errors"

Changes:

# platformio.ini line 116
-D EVENT_USE_JSON=1 ; Fix MessagePack compatibility issues

Description:

Fixes pre-existing MessagePack decode errors in browser console:
- Error: Invalid type: 0xc1
- TypeError: Cannot destructure property 'event'
- TypeError: RegExp.apply must be Array-like
Root cause: ArduinoJson MessagePack ↔ msgpack-lite incompatibility
Solution: Enable JSON mode for WebSocket events. This is more reliable
and compatible, with negligible performance impact for module updates.
Note: Monitor feature (FT_MONITOR) may need separate evaluation if
MessagePack performance is critical for high-frequency data.

This keeps your PRs focused and traceable. 🎯

🎉 Summary

Status: ✅ READY TO MERGE

This PR delivers:
Excellent work on this comprehensive quality initiative! 🚀 Go ahead and merge! 🎉
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
src/MoonLight/Nodes/Drivers/D_ArtnetIn.h (2)
73-82: ⚠️ Potential issue | 🟠 Major

Use protocol-specific minimum packet-size checks in the receive loop.

In DDP mode, packets smaller than sizeof(ArtNetHeader) are dropped before handleDDP, which can reject valid small DDP payloads.

💡 Proposed fix
- while (int packetSize = artnetUdp.parsePacket()) {
-   if (packetSize < sizeof(ArtNetHeader) || packetSize > sizeof(packetBuffer)) {
+ while (int packetSize = artnetUdp.parsePacket()) {
+   const int minHeader = ddp ? static_cast<int>(sizeof(DDPHeader))
+                             : static_cast<int>(sizeof(ArtNetHeader));
+   if (packetSize < minHeader || packetSize > static_cast<int>(sizeof(packetBuffer))) {
      artnetUdp.clear(); // drains all available packets
      continue;
    }

Based on learnings: always validate packetSize against the expected protocol payload/header bounds before processing UDP packets on ESP32.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/MoonLight/Nodes/Drivers/D_ArtnetIn.h` around lines 73 - 82, The loop currently rejects any packet smaller than sizeof(ArtNetHeader) before handing off to handleDDP, which drops valid DDP packets; update the receive loop to perform protocol-specific size validation: call artnetUdp.parsePacket() into packetSize, then if (ddp) validate packetSize against DDP's minimum/maximum bounds (e.g., DDP header/payload size) and only drop/clear on invalid DDP sizes, else validate against sizeof(ArtNetHeader) for ArtNet processing; after the appropriate check, read into packetBuffer using MIN(packetSize, sizeof(packetBuffer)) and then call handleDDP(packetSize) or the ArtNet handler as appropriate.
165-171: ⚠️ Potential issue | 🔴 Critical

Validate DDP offset before deriving signed pixel indices.

offset is untrusted network input; converting directly to signed int startPixel can produce invalid/negative indexing and unsafe writes to channelsD.

💡 Proposed fix
- int startPixel = offset / layerP.lights.header.channelsPerLight;
- int numPixels = MIN(safeDataLen / layerP.lights.header.channelsPerLight, layerP.lights.header.nrOfLights - startPixel);
+ const uint32_t channelsPerLight = static_cast<uint32_t>(layerP.lights.header.channelsPerLight);
+ const uint32_t startPixelU = offset / channelsPerLight;
+ if (startPixelU >= static_cast<uint32_t>(layerP.lights.header.nrOfLights)) return;
+
+ const int startPixel = static_cast<int>(startPixelU);
+ const int maxWritablePixels = layerP.lights.header.nrOfLights - startPixel;
+ const int numPixels = MIN(safeDataLen / static_cast<int>(channelsPerLight), maxWritablePixels);
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@misc/upstream-prs/001-lint-fixes.md`:
- Line 17: The document mixes two different path conventions for frontend files
(e.g., references to interface/src/... vs src/routes/...) in the
BatteryMetrics.svelte doc; pick one convention (or explicitly label “MoonLight
path” vs “upstream path”) and update all occurrences in this document so they
are consistent, e.g., replace all interface/src/... instances with the chosen
convention (or add the label before each path) to ensure uniform references
throughout BatteryMetrics.svelte and the related sections.
- Line 204: Update the heading text that references CI to use the official
capitalization "GitHub Actions" instead of any lowercase or incorrect form;
specifically, in the line mentioning ".github/workflows/lint.yml" under the
"clang-tidy in CI" section, replace the platform name with "GitHub Actions" so
it matches the rest of the document.
In `@src/MoonLight/Nodes/Drivers/D_ArtnetIn.h`:
- Around line 82-84: The call site passes packetSize into handleArtNet but the
method signature is still handleArtNet() — update the handleArtNet declaration
and definition to accept the same parameter type as used at the call (e.g., int
packetSize or size_t packetSize) so it matches handleArtNet(packetSize); locate
handleArtNet and its prototype in D_ArtnetIn.h and corresponding implementation
(or inline definition) and change both to the new signature, ensuring any use of
packetSize inside the function compiles and no other callers are broken; keep
handleDDP(packetSize) behavior unchanged.
---
Outside diff comments:
In `@src/MoonLight/Nodes/Drivers/D_ArtnetIn.h`:
- Around line 73-82: The loop currently rejects any packet smaller than
sizeof(ArtNetHeader) before handing off to handleDDP, which drops valid DDP
packets; update the receive loop to perform protocol-specific size validation:
call artnetUdp.parsePacket() into packetSize, then if (ddp) validate packetSize
against DDP's minimum/maximum bounds (e.g., DDP header/payload size) and only
drop/clear on invalid DDP sizes, else validate against sizeof(ArtNetHeader) for
ArtNet processing; after the appropriate check, read into packetBuffer using
MIN(packetSize, sizeof(packetBuffer)) and then call handleDDP(packetSize) or the
ArtNet handler as appropriate.
♻️ Duplicate comments (1)
src/MoonLight/Nodes/Drivers/D_ArtnetIn.h (1)
74-75: ⚠️ Potential issue | 🔴 Critical

DDP/Art-Net header overlay is still unsafe, and DDP payload can be misaligned by struct padding.

reinterpret_cast over packetBuffer is UB-prone, and sizeof(DDPHeader) may be 12 (tail padding) while the DDP wire header is 10 bytes. That can shift pixelData by 2 bytes and decode/copy wrong channel data.

💡 Suggested fix
  struct ArtNetHeader {
    char id[8];       // "Art-Net\0"
    uint16_t opcode;  // 0x5000 for DMX data
    uint16_t version; // Protocol version
    uint8_t sequence;
    uint8_t physical;
    uint16_t universe;
    uint16_t length;  // DMX data length
- };
+ } __attribute__((packed));
+ static_assert(sizeof(ArtNetHeader) == 18, "ArtNetHeader wire size mismatch");

  struct DDPHeader {
    uint8_t flags;
    uint8_t sequence;
    uint8_t dataType;
    uint8_t id;
    uint32_t offset;
    uint16_t dataLen;
- };
+ } __attribute__((packed));
+ static_assert(sizeof(DDPHeader) == 10, "DDPHeader wire size mismatch");

- ArtNetHeader* header = reinterpret_cast<ArtNetHeader*>(packetBuffer);
- uint16_t opcode = header->opcode;
+ ArtNetHeader header{};
+ memcpy(&header, packetBuffer, sizeof(header));
+ uint16_t opcode = header.opcode;

- DDPHeader* header = reinterpret_cast<DDPHeader*>(packetBuffer);
+ DDPHeader header{};
+ memcpy(&header, packetBuffer, sizeof(header));

- uint8_t dataType = header->dataType;
- uint32_t offset = (header->offset >> 24) | ((header->offset >> 8) & 0xFF00) | ((header->offset << 8) & 0xFF0000) | (header->offset << 24);
- uint16_t dataLen = (header->dataLen >> 8) | (header->dataLen << 8);
+ uint8_t dataType = header.dataType;
+ uint32_t offset = (header.offset >> 24) | ((header.offset >> 8) & 0xFF00) | ((header.offset << 8) & 0xFF0000) | (header.offset << 24);
+ uint16_t dataLen = (header.dataLen >> 8) | (header.dataLen << 8);

#!/bin/bash
set -euo pipefail
# Verify unresolved unsafe overlays and missing wire-layout guards.
rg -nP 'reinterpret_cast<\s*(ArtNetHeader|DDPHeader)\s*\*>\s*\(\s*packetBuffer\s*\)|struct\s+(ArtNetHeader|DDPHeader)|__attribute__\(\(packed\)\)|static_assert\s*\(\s*sizeof\((ArtNetHeader|DDPHeader)\)|sizeof\(DDPHeader\)' src/MoonLight/Nodes/Drivers/D_ArtnetIn.h

Also applies to: 104-111, 118-118, 157-157, 167-168
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/MoonLight/Nodes/Drivers/D_ArtnetIn.h` around lines 74 - 75, The code currently reinterpret_casts packetBuffer to ArtNetHeader/DDPHeader (symbols: reinterpret_cast<ArtNetHeader*>, reinterpret_cast<DDPHeader*>, packetBuffer) which is UB and DDPHeader may have padding (sizeof(DDPHeader)==12) while the wire DDP header is 10 bytes, causing pixelData offset errors; fix by parsing the headers from packetBuffer using byte-wise reads (e.g., read uint8_t/uint16_t fields via memcpy or helpers that read little/big-endian) instead of overlaying structs, validate packetSize against the actual wire header lengths (10 for DDP, sizeof(ArtNetHeader) wire size), and replace any uses of header->pixelData or pointer arithmetic into packetBuffer with computed offsets into packetBuffer to safely copy channel data.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@src/MoonLight/Nodes/Drivers/D_ArtnetIn.h`:
- Around line 74-75: The code currently reinterpret_casts packetBuffer to
ArtNetHeader/DDPHeader (symbols: reinterpret_cast<ArtNetHeader*>,
reinterpret_cast<DDPHeader*>, packetBuffer) which is UB and DDPHeader may have
padding (sizeof(DDPHeader)==12) while the wire DDP header is 10 bytes, causing
pixelData offset errors; fix by parsing the headers from packetBuffer using
byte-wise reads (e.g., read uint8_t/uint16_t fields via memcpy or helpers that
read little/big-endian) instead of overlaying structs, validate packetSize
against the actual wire header lengths (10 for DDP, sizeof(ArtNetHeader) wire
size), and replace any uses of header->pixelData or pointer arithmetic into
packetBuffer with computed offsets into packetBuffer to safely copy channel
data.
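A byte-wise parse along these lines avoids both the overlay UB and the padding pitfall. Field order follows the DDPHeader struct quoted in the review; treat the big-endian layout of offset and dataLen as an assumption to double-check against the DDP specification:

```cpp
#include <cstdint>

// Decoded header fields; plain host-order values, no wire overlay.
struct DdpFields {
    uint8_t flags, sequence, dataType, id;
    uint32_t offset;
    uint16_t dataLen;
};

// Reads the 10-byte DDP wire header field by field, so struct padding and
// alignment never matter. Caller must guarantee buf holds >= 10 bytes.
DdpFields parseDdpHeader(const uint8_t *buf) {
    DdpFields f{};
    f.flags    = buf[0];
    f.sequence = buf[1];
    f.dataType = buf[2];
    f.id       = buf[3];
    f.offset   = (uint32_t(buf[4]) << 24) | (uint32_t(buf[5]) << 16)
               | (uint32_t(buf[6]) << 8)  |  uint32_t(buf[7]);
    f.dataLen  = uint16_t((buf[8] << 8) | buf[9]);
    return f;
}
```

Because each field is read at an explicit byte offset, sizeof(DdpFields) can be anything the compiler likes — the wire layout is decoupled from the in-memory layout.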
Summary by CodeRabbit
Refactor
Bug Fixes
Chores
Documentation
UX