Fix embedded audio fallback handling and audio diagnostics#309
webadderall wants to merge 3 commits into main from
Conversation
📝 Walkthrough

Adds Windows microphone fallback detection and wiring into the native recording startup; introduces utilities to resolve source vs. external audio fallback paths and media resource parsing; updates audio export to use resolved fallbacks; refines video editor preview/audio behavior and motion blur filter management.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Renderer as Renderer (UI)
    participant Main as Electron Main (IPC)
    participant Helper as Native Helper (wcProc)
    participant Browser as Browser Mic Fallback

    Renderer->>Main: start-native-screen-recording(options)
    Main->>Helper: spawn native helper (systemAudio/microphone paths)
    Helper-->>Main: stdout/stderr chunks (accumulate into captureOutput)
    Helper-->>Main: started signal
    Main->>Main: call shouldUseWindowsBrowserMicrophoneFallback(captureOutput, options)
    alt fallback required
        Main->>Main: clear microphonePath, set microphoneFallbackRequired = true
        Main-->>Renderer: { success: true, microphoneFallbackRequired: true }
        Main->>Browser: attempt browser mic fallback (if applicable)
    else no fallback
        Main-->>Renderer: { success: true, microphoneFallbackRequired: false }
    end
```
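The detection step in the diagram can be sketched as follows. This is a minimal illustration of the predicate's shape, not the PR's actual implementation; the exact warning substring and the `RecordingOptions` type here are assumptions.

```typescript
// Sketch of the fallback predicate invoked in the diagram above.
// The warning text and options shape are assumptions for illustration;
// the real helper lives in electron/ipc/recording/windowsFallbacks.ts.
const WINDOWS_MIC_CAPTURE_INIT_WARNING = "Failed to initialize WASAPI mic capture";

interface RecordingOptions {
  capturesMicrophone?: boolean;
}

function shouldUseWindowsBrowserMicrophoneFallback(
  captureOutput: string,
  options?: RecordingOptions,
): boolean {
  // Short-circuit: if no microphone was requested, never fall back,
  // regardless of what the native helper printed.
  if (!options?.capturesMicrophone) {
    return false;
  }
  return captureOutput.includes(WINDOWS_MIC_CAPTURE_INIT_WARNING);
}
```

The short-circuit matters: system-audio-only recordings can legitimately emit mic-related noise in the helper's output, and must not trigger a browser fallback.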
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~65 minutes

Possibly related PRs
Suggested labels
Poem
🚥 Pre-merge checks | ✅ 3 | ❌ 2

❌ Failed checks (1 warning, 1 inconclusive)
✅ Passed checks (3 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing Touches

📝 Generate docstrings
🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
src/components/video-editor/VideoEditor.tsx (1)
3491-3506: ⚠️ Potential issue | 🟡 Minor

Handle companion audio source resolution failures.

If `resolveMediaElementSource(audioPath)` rejects, this async task becomes an unhandled rejection and the preview silently misses companion audio despite the new diagnostics.

🛡️ Proposed fix
```diff
       void (async () => {
-        const resolved = await resolveMediaElementSource(audioPath);
-        const latestAudio = existing.get(audioPath);
-
-        if (
-          cancelled ||
-          latestAudio !== audio ||
-          sourceAudioElementResourcesRef.current.get(audioPath) !== audioPath
-        ) {
-          resolved.revoke();
-          return;
-        }
-
-        sourceAudioElementRevokersRef.current.set(audioPath, resolved.revoke);
-        latestAudio.src = resolved.src;
+        try {
+          const resolved = await resolveMediaElementSource(audioPath);
+          const latestAudio = existing.get(audioPath);
+
+          if (
+            cancelled ||
+            latestAudio !== audio ||
+            sourceAudioElementResourcesRef.current.get(audioPath) !== audioPath
+          ) {
+            resolved.revoke();
+            return;
+          }
+
+          sourceAudioElementRevokersRef.current.set(audioPath, resolved.revoke);
+          latestAudio.src = resolved.src;
+        } catch (error) {
+          if (!cancelled) {
+            sourceAudioElementResourcesRef.current.delete(audioPath);
+            toast.warning(
+              `Could not load companion audio source: ${summarizeErrorMessage(getErrorMessage(error))}`,
+              { id: SOURCE_AUDIO_FALLBACK_TOAST_ID, duration: 10000 },
+            );
+          }
+        }
       })();
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/video-editor/VideoEditor.tsx` around lines 3491 - 3506, The async IIFE that calls resolveMediaElementSource(audioPath) should catch rejections and clean up: wrap the await call in try/catch (or attach .catch) inside the IIFE so failures from resolveMediaElementSource(audioPath) are handled; on error, ensure you don't leave stale revokers/resources (use sourceAudioElementResourcesRef.current and sourceAudioElementRevokersRef.current to remove any partial entries for audioPath) and avoid further DOM updates to latestAudio; keep the existing guard checks (cancelled, latestAudio !== audio, resource mismatch) but ensure the catch block calls resolved.revoke if a partial resource exists or otherwise performs equivalent cleanup and logs or swallows the error to prevent an unhandled rejection.

src/lib/exporter/audioEncoder.ts (1)
160-186: ⚠️ Potential issue | 🟡 Minor

Resolve embedded fallbacks before choosing the sidecar fast path.

Line 184 still treats a one-item fallback list as a sidecar even when that item is the video file itself. That keeps the embedded-audio-as-sidecar behavior alive for trim-only exports with no edits.
🐛 Proposed fix
```diff
   const sortedSourceAudioFallbackPaths = sourceAudioFallbackPaths
     ? sourceAudioFallbackPaths.filter(
         (audioPath) => typeof audioPath === "string" && audioPath.trim().length > 0,
       )
     : [];
+  const {
+    hasEmbeddedSourceAudio,
+    externalAudioPaths,
+  } = resolveSourceAudioFallbackPaths(videoUrl, sortedSourceAudioFallbackPaths);
+  const needsSourceAudioMixing =
+    externalAudioPaths.length > 1 ||
+    (hasEmbeddedSourceAudio && externalAudioPaths.length > 0);

   // When speed edits, audio regions, or multiple audio sources need mixing, use offline AudioContext pipeline.
   if (
     sortedSpeedRegions.length > 0 ||
     sortedAudioRegions.length > 0 ||
-    sortedSourceAudioFallbackPaths.length > 1
+    needsSourceAudioMixing
   ) {
     await this.renderAndMuxOfflineAudio(
       videoUrl,
@@
   }

   // Single sidecar audio with no speed/audio edits: demux directly (skips slow real-time rendering).
-  if (sortedSourceAudioFallbackPaths.length === 1) {
+  if (!hasEmbeddedSourceAudio && externalAudioPaths.length === 1) {
     const sidecarDemuxer = await this.loadAudioFileDemuxer(
-      sortedSourceAudioFallbackPaths[0],
+      externalAudioPaths[0],
     );
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/lib/exporter/audioEncoder.ts` around lines 160 - 186, The fast-path erroneously treats a single fallback that points to the input video as a sidecar; before taking the sidecar branch, resolve embedded fallbacks by checking whether sortedSourceAudioFallbackPaths[0] references the video (compare against videoUrl or otherwise-detected embedded source) and only call loadAudioFileDemuxer when the single fallback is a true external sidecar; if it is the video (embedded), call renderAndMuxOfflineAudio (or fall through to the offline pipeline) instead. Ensure this change touches the conditional around sortedSourceAudioFallbackPaths, and references renderAndMuxOfflineAudio, loadAudioFileDemuxer, sortedSourceAudioFallbackPaths, and videoUrl.
🧹 Nitpick comments (4)
src/components/video-editor/VideoPlayback.tsx (2)
1511-1521: Cleanup: detach filters independently of `blurFilterRef`, and use `null` for consistency.

In the teardown, `videoContainer.filters = []` is only executed inside the `if (blurFilterRef.current)` branch. With the new lifecycle, the filters array — when attached — contains `motionBlurFilter`, not `blurFilter`. If `blurFilterRef` is ever nulled before cleanup (or the two refs ever drift apart), the motion blur filter would be destroyed while still referenced from `videoContainer.filters`, which can leave a dangling reference in the scene graph.

In practice both refs are created/cleared together today, so this isn't a live bug, but the coupling is fragile. Also, using `[]` here while the rest of the new code uses `null` (which is the idiom the PixiJS v8 filter docs recommend for "remove filters") is inconsistent.

♻️ Suggested cleanup ordering
```diff
-    if (blurFilterRef.current) {
-      videoContainer.filters = [];
-      blurFilterRef.current.destroy();
-      blurFilterRef.current = null;
-    }
-    if (motionBlurFilterRef.current) {
-      motionBlurFilterRef.current.destroy();
-      motionBlurFilterRef.current = null;
-    }
+    // Always detach before destroying so Pixi never holds a reference
+    // to a destroyed filter in its scene graph.
+    videoContainer.filters = null;
+    if (blurFilterRef.current) {
+      blurFilterRef.current.destroy();
+      blurFilterRef.current = null;
+    }
+    if (motionBlurFilterRef.current) {
+      motionBlurFilterRef.current.destroy();
+      motionBlurFilterRef.current = null;
+    }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/video-editor/VideoPlayback.tsx` around lines 1511 - 1521, The teardown currently clears videoContainer.filters only when blurFilterRef.current exists, which risks leaving motionBlurFilter referenced after it is destroyed; update the cleanup in VideoPlayback so you detach filters from videoContainer unconditionally (set videoContainer.filters = null), then separately check and destroy blurFilterRef.current and motionBlurFilterRef.current (destroy each if present and then set each ref to null), and also set maskGraphicsRef.current to null; ensure destruction happens before nulling the refs and use null (not []) to remove filters per PixiJS v8 guidance.
1745-1758: Conditional motion-blur attach/detach logic is sound — minor robustness enhancements suggested.

Hysteresis is in the correct direction (attach at > 0.008, release at > 0.002), preventing flicker near the threshold. A couple of small observations:

- `filtersActive` checks only whether the filters array is non-empty; it doesn't verify that the attached filter is specifically `motionBlurFilterRef.current`. While this is safe today (this component is the sole writer of `videoContainer.filters`), identity-checking against `motionBlurFilterRef.current` would be more defensive and prevent subtle bugs if other code paths ever attach additional filters to this container.
- `needsFilters` intentionally gates on `isPlayingRef.current`, so camera motion during pause/seek won't trigger the filter. This is consistent with the pause branch elsewhere, but a brief comment would help future readers understand this is deliberate.

Minor note: cleanup at line 1514 sets `filters = []` while this branch uses `filters = null`. Both are valid in PixiJS v8, but using `null` consistently throughout would be cleaner.

♻️ Optional tightening
```diff
-    const filtersActive = Array.isArray(videoContainer.filters) && videoContainer.filters.length > 0;
+    const motionBlur = motionBlurFilterRef.current;
+    const filtersActive =
+      Array.isArray(videoContainer.filters) &&
+      motionBlur !== null &&
+      videoContainer.filters.includes(motionBlur);
     const cameraIsMoving = filtersActive
       ? motionIntensity > 0.002
       : motionIntensity > 0.008;
     const needsFilters =
       zoomMotionBlurRef.current > 0 && isPlayingRef.current && cameraIsMoving;
-    if (needsFilters && !filtersActive && motionBlurFilterRef.current) {
-      videoContainer.filters = [motionBlurFilterRef.current];
+    if (needsFilters && !filtersActive && motionBlur) {
+      videoContainer.filters = [motionBlur];
     } else if (!needsFilters && filtersActive) {
       videoContainer.filters = null;
     }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/video-editor/VideoPlayback.tsx` around lines 1745 - 1758, Update the attach/detach logic to defensively ensure we only treat the motionBlur filter as active: change filtersActive to check that videoContainer.filters is an array whose sole (or included) element is identity-equal to motionBlurFilterRef.current (use Array.isArray(videoContainer.filters) && videoContainer.filters.includes(motionBlurFilterRef.current)). Keep the hysteresis and needsFilters logic but add a short inline comment by the needsFilters definition stating that isPlayingRef.current intentionally gates filter application during pause/seek. Finally, make filter-clearing consistent across the component by changing any cleanup that sets videoContainer.filters = [] to use videoContainer.filters = null so the two branches use the same sentinel (null) when no filters are present.electron/ipc/recording/windowsFallbacks.test.ts (1)
5-23: Add negative-path coverage.

The suite proves the short-circuit on `capturesMicrophone: false` but not the other branches of the predicate. Consider adding cases for `capturesMicrophone: true` with output that doesn't contain the warning, and for an omitted `options` argument, so regressions in the substring match or the optional-chaining are caught.

🧪 Proposed additional tests
```diff
   it("returns false when microphone capture was not requested", () => {
     expect(
       shouldUseWindowsBrowserMicrophoneFallback(
         "WARNING: Failed to initialize WASAPI mic capture\nRecording started",
         { capturesMicrophone: false },
       ),
     ).toBe(false);
   });
+
+  it("returns false when the warning is absent even if microphone capture was requested", () => {
+    expect(
+      shouldUseWindowsBrowserMicrophoneFallback(
+        "Recording started\nCapture running",
+        { capturesMicrophone: true },
+      ),
+    ).toBe(false);
+  });
+
+  it("returns false when options are omitted", () => {
+    expect(
+      shouldUseWindowsBrowserMicrophoneFallback(
+        "WARNING: Failed to initialize WASAPI mic capture",
+      ),
+    ).toBe(false);
+  });
 });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@electron/ipc/recording/windowsFallbacks.test.ts` around lines 5 - 23, Add two negative-path tests for shouldUseWindowsBrowserMicrophoneFallback: one where capturesMicrophone: true but the log string does not contain "Failed to initialize WASAPI mic capture" and assert it returns false, and one that calls shouldUseWindowsBrowserMicrophoneFallback with only the log string (omit the options argument) and assert the function safely handles undefined options and returns false; reference the function name shouldUseWindowsBrowserMicrophoneFallback to locate where to add these test cases.electron/ipc/recording/windowsFallbacks.ts (1)
1-11: LGTM — helper is small, focused, and matches the native emission site.

The warning string matches the `std::cerr` line at `electron/native/wgc-capture/src/main.cpp:301-305`, and the `capturesMicrophone` short-circuit avoids false positives when mic capture wasn't requested. Worth keeping in mind that the detection is a brittle substring match; if the native helper ever rewords the warning, fallback will silently stop engaging — consider emitting a structured token (e.g. `MICROPHONE_CAPTURE_UNAVAILABLE`, matching the macOS path on line 452 of `recording.ts`) from the Windows helper in a follow-up.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@electron/ipc/recording/windowsFallbacks.ts` around lines 1 - 11, Current Windows fallback checks for a brittle substring using WINDOWS_MIC_CAPTURE_INIT_WARNING inside shouldUseWindowsBrowserMicrophoneFallback; change the native helper to emit a structured token (e.g. "MICROPHONE_CAPTURE_UNAVAILABLE") and update shouldUseWindowsBrowserMicrophoneFallback to detect that token instead of relying on the std::cerr text match, preserving the capturesMicrophone short-circuit; reference WINDOWS_MIC_CAPTURE_INIT_WARNING and shouldUseWindowsBrowserMicrophoneFallback when making the native and JS changes so both sides agree on the new token.
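To make the follow-up concrete, a token-based variant of the predicate might look like this. The token name and the line-oriented matching are assumptions sketching the suggestion above, not existing code:

```typescript
// Hypothetical structured-token detection: the native helper would emit this
// exact token on stderr instead of (or alongside) the prose warning, and the
// JS side matches whole lines so ordinary log text cannot false-positive.
const MIC_CAPTURE_UNAVAILABLE_TOKEN = "MICROPHONE_CAPTURE_UNAVAILABLE";

function shouldFallBackOnToken(
  captureOutput: string,
  options?: { capturesMicrophone?: boolean },
): boolean {
  if (!options?.capturesMicrophone) {
    return false;
  }
  return captureOutput
    .split(/\r?\n/)
    .some((line) => line.trim() === MIC_CAPTURE_UNAVAILABLE_TOKEN);
}
```

Whole-line matching is the design point: a prose rewording on the native side can no longer silently disable the fallback, and an incidental mention of the token inside a longer log line does not trigger it.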
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@electron/ipc/register/recording.ts`:
- Around line 210-218: The current code sets config.micOutputPath and calls
setWindowsMicAudioPath before the native helper successfully starts, which can
leave an empty .mic.wav file if WASAPI init fails; change the flow so
setWindowsMicAudioPath (and persisting micOutputPath) only happens after the
helper reports successful start, or if you keep the current order, perform
best-effort cleanup in the fallback path by checking for and unlinking the
microphonePath when handling the mic-init failure (the code that nulls
config.captureMic/config.micOutputPath). Update the logic around
config.micOutputPath, setWindowsMicAudioPath, and the fallback cleanup block so
any created microphonePath file is removed when fallback occurs.
In `@src/lib/exporter/audioEncoder.test.ts`:
- Around line 11-24: The test currently uses unsafe "as never" casts to access
private AudioProcessor methods; instead declare a small test-harness type (e.g.,
TestAudioProcessor) that exposes decodeAudioFromUrl(url: string):
Promise<AudioBuffer | null>, getMediaDurationSec(): Promise<number>, and
prepareOfflineRender(...): Promise<...> and cast the processor to that type
(processor as unknown as TestAudioProcessor) in the test; then replace uses of
(processor as never) with the new typed alias so spies on decodeAudioFromUrl,
getMediaDurationSec and calls to prepareOfflineRender are properly typed and get
IDE/type-checking support.
In `@src/lib/exporter/sourceAudioFallback.ts`:
- Around line 7-20: Normalized path comparison is missing: before checking
hasEmbeddedSourceAudio and filtering externalAudioPaths in
sourceAudioFallback.ts, normalize both each entry in normalizedPaths and the
localVideoSourcePath (use getLocalFilePathFromResource output) to a canonical
form (e.g., convert file:// URIs to local paths, unify separators, and apply
case-insensitive comparison on Windows) and then perform the includes/filter
against those normalized strings (introduce a small helper like normalizePath
and use it when computing hasEmbeddedSourceAudio and when building
externalAudioPaths).
---
Outside diff comments:
In `@src/components/video-editor/VideoEditor.tsx`:
- Around line 3491-3506: The async IIFE that calls
resolveMediaElementSource(audioPath) should catch rejections and clean up: wrap
the await call in try/catch (or attach .catch) inside the IIFE so failures from
resolveMediaElementSource(audioPath) are handled; on error, ensure you don't
leave stale revokers/resources (use sourceAudioElementResourcesRef.current and
sourceAudioElementRevokersRef.current to remove any partial entries for
audioPath) and avoid further DOM updates to latestAudio; keep the existing guard
checks (cancelled, latestAudio !== audio, resource mismatch) but ensure the
catch block calls resolved.revoke if a partial resource exists or otherwise
performs equivalent cleanup and logs or swallows the error to prevent an
unhandled rejection.
In `@src/lib/exporter/audioEncoder.ts`:
- Around line 160-186: The fast-path erroneously treats a single fallback that
points to the input video as a sidecar; before taking the sidecar branch,
resolve embedded fallbacks by checking whether sortedSourceAudioFallbackPaths[0]
references the video (compare against videoUrl or otherwise-detected embedded
source) and only call loadAudioFileDemuxer when the single fallback is a true
external sidecar; if it is the video (embedded), call renderAndMuxOfflineAudio
(or fall through to the offline pipeline) instead. Ensure this change touches
the conditional around sortedSourceAudioFallbackPaths, and references
renderAndMuxOfflineAudio, loadAudioFileDemuxer, sortedSourceAudioFallbackPaths,
and videoUrl.
---
Nitpick comments:
In `@electron/ipc/recording/windowsFallbacks.test.ts`:
- Around line 5-23: Add two negative-path tests for
shouldUseWindowsBrowserMicrophoneFallback: one where capturesMicrophone: true
but the log string does not contain "Failed to initialize WASAPI mic capture"
and assert it returns false, and one that calls
shouldUseWindowsBrowserMicrophoneFallback with only the log string (omit the
options argument) and assert the function safely handles undefined options and
returns false; reference the function name
shouldUseWindowsBrowserMicrophoneFallback to locate where to add these test
cases.
In `@electron/ipc/recording/windowsFallbacks.ts`:
- Around line 1-11: Current Windows fallback checks for a brittle substring
using WINDOWS_MIC_CAPTURE_INIT_WARNING inside
shouldUseWindowsBrowserMicrophoneFallback; change the native helper to emit a
structured token (e.g. "MICROPHONE_CAPTURE_UNAVAILABLE") and update
shouldUseWindowsBrowserMicrophoneFallback to detect that token instead of
relying on the std::cerr text match, preserving the capturesMicrophone
short-circuit; reference WINDOWS_MIC_CAPTURE_INIT_WARNING and
shouldUseWindowsBrowserMicrophoneFallback when making the native and JS changes
so both sides agree on the new token.
In `@src/components/video-editor/VideoPlayback.tsx`:
- Around line 1511-1521: The teardown currently clears videoContainer.filters
only when blurFilterRef.current exists, which risks leaving motionBlurFilter
referenced after it is destroyed; update the cleanup in VideoPlayback so you
detach filters from videoContainer unconditionally (set videoContainer.filters =
null), then separately check and destroy blurFilterRef.current and
motionBlurFilterRef.current (destroy each if present and then set each ref to
null), and also set maskGraphicsRef.current to null; ensure destruction happens
before nulling the refs and use null (not []) to remove filters per PixiJS v8
guidance.
- Around line 1745-1758: Update the attach/detach logic to defensively ensure we
only treat the motionBlur filter as active: change filtersActive to check that
videoContainer.filters is an array whose sole (or included) element is
identity-equal to motionBlurFilterRef.current (use
Array.isArray(videoContainer.filters) &&
videoContainer.filters.includes(motionBlurFilterRef.current)). Keep the
hysteresis and needsFilters logic but add a short inline comment by the
needsFilters definition stating that isPlayingRef.current intentionally gates
filter application during pause/seek. Finally, make filter-clearing consistent
across the component by changing any cleanup that sets videoContainer.filters =
[] to use videoContainer.filters = null so the two branches use the same
sentinel (null) when no filters are present.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: c22287aa-c6d8-46c8-ab65-e85501fe7a54
📒 Files selected for processing (11)
- electron/ipc/recording/windowsFallbacks.test.ts
- electron/ipc/recording/windowsFallbacks.ts
- electron/ipc/register/recording.ts
- src/components/video-editor/VideoEditor.tsx
- src/components/video-editor/VideoPlayback.tsx
- src/lib/exporter/audioEncoder.test.ts
- src/lib/exporter/audioEncoder.ts
- src/lib/exporter/mediaResource.test.ts
- src/lib/exporter/mediaResource.ts
- src/lib/exporter/sourceAudioFallback.test.ts
- src/lib/exporter/sourceAudioFallback.ts
```diff
 if (options?.capturesMicrophone) {
-  const micPath = path.join(recordingsDir, `recording-${timestamp}.mic.wav`)
+  microphonePath = path.join(recordingsDir, `recording-${timestamp}.mic.wav`)
   config.captureMic = true
-  config.micOutputPath = micPath
+  config.micOutputPath = microphonePath
   if (options.microphoneLabel) {
     config.micDeviceName = options.microphoneLabel
   }
-  setWindowsMicAudioPath(micPath)
+  setWindowsMicAudioPath(microphonePath)
 }
```
Fallback may leave an orphan .mic.wav on disk.
Line 213 hands config.micOutputPath = microphonePath to the spawned native helper before it attempts WASAPI init. If the helper creates (or truncates) the file handle prior to failing init — even to 0 bytes — the fallback path on 278-281 only nulls the in-memory state and doesn't unlink the file. Downstream mux / sidecar discovery code that scans the recordings directory by filename pattern could then pick up an empty .mic.wav alongside the real browser-captured sidecar.
Consider best-effort removal of the file when fallback is required, and/or only persisting setWindowsMicAudioPath after a successful start.
🧹 Proposed cleanup on fallback

```diff
 if (microphoneFallbackRequired) {
+  const orphanMicPath = microphonePath
   microphonePath = null
   setWindowsMicAudioPath(null)
+  if (orphanMicPath) {
+    void fs.unlink(orphanMicPath).catch(() => { /* best-effort */ })
+  }
 }
```

Also applies to: 274-281
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@electron/ipc/register/recording.ts` around lines 210 - 218, The current code
sets config.micOutputPath and calls setWindowsMicAudioPath before the native
helper successfully starts, which can leave an empty .mic.wav file if WASAPI
init fails; change the flow so setWindowsMicAudioPath (and persisting
micOutputPath) only happens after the helper reports successful start, or if you
keep the current order, perform best-effort cleanup in the fallback path by
checking for and unlinking the microphonePath when handling the mic-init failure
(the code that nulls config.captureMic/config.micOutputPath). Update the logic
around config.micOutputPath, setWindowsMicAudioPath, and the fallback cleanup
block so any created microphonePath file is removed when fallback occurs.
```ts
const decodeAudioFromUrl = vi
  .spyOn(processor as never, "decodeAudioFromUrl")
  .mockImplementation(async (url: string) => {
    if (url === "file:///tmp/recording.mp4") {
      return mainBuffer;
    }
    if (url === "/tmp/recording.mic.wav") {
      return micBuffer;
    }
    return null;
  });
vi.spyOn(processor as never, "getMediaDurationSec").mockResolvedValue(10);

const prepared = await (processor as never).prepareOfflineRender(
```
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Description: Find `as never` casts used for private test access.
# Expected after the fix: no matches in the changed audio encoder test.
rg -n -C2 '\bas never\b' --glob '*.{ts,tsx}'

Repository: webadderall/Recordly
Length of output: 947
🏁 Script executed:
#!/bin/bash
# Check TypeScript configuration for test type checking
find . -name "tsconfig*.json" -o -name "vitest.config.*" -o -name "vite.config.*" | head -10

Repository: webadderall/Recordly
Length of output: 136
🏁 Script executed:
#!/bin/bash
# Search for other `as never` patterns in test files
rg -n '\bas never\b' --glob '*.test.ts' --glob '*.test.tsx' --glob '*.spec.ts'

Repository: webadderall/Recordly
Length of output: 383
🏁 Script executed:
#!/bin/bash
# Check the AudioProcessor class to understand if methods are private
rg -A 10 'class AudioProcessor' src/lib/exporter/audioEncoder.ts | head -50

Repository: webadderall/Recordly
Length of output: 339
🏁 Script executed:
#!/bin/bash
# Check if the methods being spied on are marked as private in AudioProcessor
rg -n '(private|protected).*?(decodeAudioFromUrl|getMediaDurationSec|prepareOfflineRender)' src/lib/exporter/audioEncoder.ts

Repository: webadderall/Recordly
Length of output: 258
🏁 Script executed:
#!/bin/bash
# Check the tsconfig.json for strict type checking settings
cat tsconfig.json | head -40

Repository: webadderall/Recordly
Length of output: 729
🏁 Script executed:
#!/bin/bash
# Get full context of the three methods being spied on
rg -B 2 -A 8 'decodeAudioFromUrl|getMediaDurationSec|prepareOfflineRender' src/lib/exporter/audioEncoder.ts | head -80

Repository: webadderall/Recordly
Length of output: 2558
Replace as never casts with an explicit test harness type.
The three private AudioProcessor methods—decodeAudioFromUrl, getMediaDurationSec, and prepareOfflineRender—are accessed via as never casts. While test files are excluded from type-checking in the main tsconfig.json config, using an explicit test harness type makes the intent clearer and provides better IDE support within the test file.
Proposed test typing fix
import { AudioProcessor } from "./audioEncoder";
+type OfflineRenderTestHarness = AudioProcessor & {
+ decodeAudioFromUrl(url: string): Promise<AudioBuffer | null>;
+ getMediaDurationSec(url: string): Promise<number>;
+ prepareOfflineRender(
+ videoUrl: string,
+ trimRegions: never[],
+ speedRegions: never[],
+ audioRegions: never[],
+ sourceAudioFallbackPaths: string[],
+ ): Promise<{
+ mainBuffer: AudioBuffer | null;
+ companionEntries: Array<{ buffer: AudioBuffer; startDelaySec: number }>;
+ }>;
+};
+
describe("AudioProcessor offline render preparation", () => {
it("keeps embedded source audio separate from external companion sidecars", async () => {
- const processor = new AudioProcessor();
+ const processor = new AudioProcessor() as OfflineRenderTestHarness;
const mainBuffer = { duration: 10, numberOfChannels: 2 } as AudioBuffer;
const micBuffer = { duration: 9.5, numberOfChannels: 1 } as AudioBuffer;
const decodeAudioFromUrl = vi
- .spyOn(processor as never, "decodeAudioFromUrl")
+ .spyOn(processor, "decodeAudioFromUrl")
.mockImplementation(async (url: string) => {
@@
return null;
});
- vi.spyOn(processor as never, "getMediaDurationSec").mockResolvedValue(10);
+ vi.spyOn(processor, "getMediaDurationSec").mockResolvedValue(10);
- const prepared = await (processor as never).prepareOfflineRender(
+ const prepared = await processor.prepareOfflineRender(

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```diff
 const decodeAudioFromUrl = vi
-  .spyOn(processor as never, "decodeAudioFromUrl")
+  .spyOn(processor, "decodeAudioFromUrl")
   .mockImplementation(async (url: string) => {
     if (url === "file:///tmp/recording.mp4") {
       return mainBuffer;
     }
     if (url === "/tmp/recording.mic.wav") {
       return micBuffer;
     }
     return null;
   });
-vi.spyOn(processor as never, "getMediaDurationSec").mockResolvedValue(10);
+vi.spyOn(processor, "getMediaDurationSec").mockResolvedValue(10);

-const prepared = await (processor as never).prepareOfflineRender(
+const prepared = await processor.prepareOfflineRender(
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/lib/exporter/audioEncoder.test.ts` around lines 11 - 24, The test
currently uses unsafe "as never" casts to access private AudioProcessor methods;
instead declare a small test-harness type (e.g., TestAudioProcessor) that
exposes decodeAudioFromUrl(url: string): Promise<AudioBuffer | null>,
getMediaDurationSec(): Promise<number>, and prepareOfflineRender(...):
Promise<...> and cast the processor to that type (processor as unknown as
TestAudioProcessor) in the test; then replace uses of (processor as never) with
the new typed alias so spies on decodeAudioFromUrl, getMediaDurationSec and
calls to prepareOfflineRender are properly typed and get IDE/type-checking
support.
```ts
const normalizedPaths = (sourceAudioFallbackPaths ?? []).filter(
  (audioPath) => typeof audioPath === "string" && audioPath.trim().length > 0,
);
const localVideoSourcePath = videoResource
  ? getLocalFilePathFromResource(videoResource)
  : null;
const hasEmbeddedSourceAudio =
  Boolean(localVideoSourcePath) && normalizedPaths.includes(localVideoSourcePath);

return {
  hasEmbeddedSourceAudio,
  externalAudioPaths: hasEmbeddedSourceAudio
    ? normalizedPaths.filter((audioPath) => audioPath !== localVideoSourcePath)
    : normalizedPaths,
```
Normalize local paths before comparing embedded audio.
Exact string matching will miss equivalent Windows paths such as file:///C:/tmp/recording.mp4 vs C:\tmp\recording.mp4, leaving the video file in externalAudioPaths and reintroducing embedded audio as a sidecar.
🐛 Proposed fix

```diff
 import { getLocalFilePathFromResource } from "./mediaResource";

+function getLocalPathComparisonKey(resource: string): string {
+  const localPath = getLocalFilePathFromResource(resource) ?? resource;
+  const normalized = localPath.trim().replace(/\\/g, "/");
+  return /^[A-Za-z]:\//.test(normalized) || normalized.startsWith("//")
+    ? normalized.toLowerCase()
+    : normalized;
+}
+
 export function resolveSourceAudioFallbackPaths(
   videoResource: string | null | undefined,
   sourceAudioFallbackPaths: string[] | null | undefined,
 ) {
-  const normalizedPaths = (sourceAudioFallbackPaths ?? []).filter(
-    (audioPath) => typeof audioPath === "string" && audioPath.trim().length > 0,
-  );
+  const normalizedPaths = (sourceAudioFallbackPaths ?? [])
+    .filter((audioPath) => typeof audioPath === "string")
+    .map((audioPath) => audioPath.trim())
+    .filter((audioPath) => audioPath.length > 0);
   const localVideoSourcePath = videoResource
     ? getLocalFilePathFromResource(videoResource)
     : null;
+  const localVideoSourceKey = localVideoSourcePath
+    ? getLocalPathComparisonKey(localVideoSourcePath)
+    : null;
+  const fallbackPathEntries = normalizedPaths.map((audioPath) => ({
+    audioPath,
+    comparisonKey: getLocalPathComparisonKey(audioPath),
+  }));
   const hasEmbeddedSourceAudio =
-    Boolean(localVideoSourcePath) && normalizedPaths.includes(localVideoSourcePath);
+    Boolean(localVideoSourceKey) &&
+    fallbackPathEntries.some(
+      ({ comparisonKey }) => comparisonKey === localVideoSourceKey,
+    );

   return {
     hasEmbeddedSourceAudio,
     externalAudioPaths: hasEmbeddedSourceAudio
-      ? normalizedPaths.filter((audioPath) => audioPath !== localVideoSourcePath)
+      ? fallbackPathEntries
+          .filter(({ comparisonKey }) => comparisonKey !== localVideoSourceKey)
+          .map(({ audioPath }) => audioPath)
       : normalizedPaths,
   };
 }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
function getLocalPathComparisonKey(resource: string): string {
  const localPath = getLocalFilePathFromResource(resource) ?? resource;
  const normalized = localPath.trim().replace(/\\/g, "/");
  return /^[A-Za-z]:\//.test(normalized) || normalized.startsWith("//")
    ? normalized.toLowerCase()
    : normalized;
}

export function resolveSourceAudioFallbackPaths(
  videoResource: string | null | undefined,
  sourceAudioFallbackPaths: string[] | null | undefined,
) {
  const normalizedPaths = (sourceAudioFallbackPaths ?? [])
    .filter((audioPath) => typeof audioPath === "string")
    .map((audioPath) => audioPath.trim())
    .filter((audioPath) => audioPath.length > 0);
  const localVideoSourcePath = videoResource
    ? getLocalFilePathFromResource(videoResource)
    : null;
  const localVideoSourceKey = localVideoSourcePath
    ? getLocalPathComparisonKey(localVideoSourcePath)
    : null;
  const fallbackPathEntries = normalizedPaths.map((audioPath) => ({
    audioPath,
    comparisonKey: getLocalPathComparisonKey(audioPath),
  }));
  const hasEmbeddedSourceAudio =
    Boolean(localVideoSourceKey) &&
    fallbackPathEntries.some(
      ({ comparisonKey }) => comparisonKey === localVideoSourceKey,
    );

  return {
    hasEmbeddedSourceAudio,
    externalAudioPaths: hasEmbeddedSourceAudio
      ? fallbackPathEntries
          .filter(({ comparisonKey }) => comparisonKey !== localVideoSourceKey)
          .map(({ audioPath }) => audioPath)
      : normalizedPaths,
  };
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/lib/exporter/sourceAudioFallback.ts` around lines 7 - 20, Normalized path
comparison is missing: before checking hasEmbeddedSourceAudio and filtering
externalAudioPaths in sourceAudioFallback.ts, normalize both each entry in
normalizedPaths and the localVideoSourcePath (use getLocalFilePathFromResource
output) to a canonical form (e.g., convert file:// URIs to local paths, unify
separators, and apply case-insensitive comparison on Windows) and then perform
the includes/filter against those normalized strings (introduce a small helper
like normalizePath and use it when computing hasEmbeddedSourceAudio and when
building externalAudioPaths).
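To see why the comparison key matters, here is a runnable sketch of the helper in isolation. `fileUrlToLocalPath` is a hypothetical stand-in for the project's `getLocalFilePathFromResource`, and the drive-letter/UNC heuristics are assumptions rather than the project's exact rules:

```typescript
// Hypothetical stand-in for getLocalFilePathFromResource: unwrap file:// URLs.
function fileUrlToLocalPath(resource: string): string | null {
  if (!resource.startsWith("file://")) return null;
  try {
    let pathname = decodeURIComponent(new URL(resource).pathname);
    // "/C:/tmp/clip.mp4" -> "C:/tmp/clip.mp4" (drop the slash before a drive letter)
    if (/^\/[A-Za-z]:\//.test(pathname)) pathname = pathname.slice(1);
    return pathname;
  } catch {
    return null;
  }
}

// Canonical key: forward slashes everywhere, case-folded only for Windows
// drive-letter and UNC paths (POSIX paths stay case-sensitive).
function getLocalPathComparisonKey(resource: string): string {
  const localPath = fileUrlToLocalPath(resource) ?? resource;
  const normalized = localPath.trim().replace(/\\/g, "/");
  return /^[A-Za-z]:\//.test(normalized) || normalized.startsWith("//")
    ? normalized.toLowerCase()
    : normalized;
}
```

With a key like this, `file:///C:/tmp/recording.mp4` and `C:\tmp\Recording.mp4` reduce to the same string, so the embedded-audio check matches and the video file is excluded from the external sidecar list.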
🧹 Nitpick comments (2)
src/hooks/useScreenRecorder.ts (2)
1000-1042: LGTM — browser-mic fallback path is clearly communicated.

Diagnostics logging plus a warning toast before attempting the fallback, and an error toast (with recording continuing) if `getUserMedia` fails, match the PR intent of making audio export issues easier to diagnose. Distinct toast ids for warning vs. error avoid overwriting each other.

One small note: if the user declines the mic permission prompt, `getUserMedia` rejects with a `NotAllowedError` whose `.message` is browser-specific and sometimes terse ("Permission denied"). Consider special-casing `error.name === "NotAllowedError"` here for a friendlier message — optional.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/hooks/useScreenRecorder.ts` around lines 1000 - 1042, The catch block for the browser fallback getUserMedia should special-case permission-denied errors to show a friendlier message; inside the catch for the getUserMedia call (located in the if block checking nativeResult.microphoneFallbackRequired && microphoneEnabled in useScreenRecorder.ts) inspect micError.name (e.g., "NotAllowedError" or "SecurityError") and, when matched, call toast.error with a tailored message like "Microphone permission denied. Recording will continue without microphone audio." (use the existing MICROPHONE_FALLBACK_ERROR_TOAST_ID and duration); otherwise continue to use getErrorMessage(micError) as before and keep logging the original micError to console for diagnostics.
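A minimal sketch of that special-casing, keyed on the rejection's `name` (getUserMedia rejects with DOMExceptions such as `NotAllowedError`); the exact copy and the extra `SecurityError`/`NotFoundError` branches are assumptions, not project code:

```typescript
// Map getUserMedia rejections to user-facing toast copy; anything
// unrecognized falls back to the raw error message.
function describeMicError(micError: unknown): string {
  const name =
    typeof micError === "object" && micError !== null && "name" in micError
      ? String((micError as { name: unknown }).name)
      : null;
  if (name === "NotAllowedError" || name === "SecurityError") {
    return "Microphone permission denied. Recording will continue without microphone audio.";
  }
  if (name === "NotFoundError") {
    return "No microphone was found. Recording will continue without microphone audio.";
  }
  return micError instanceof Error && micError.message
    ? micError.message
    : String(micError);
}
```

Checking the `name` property rather than `instanceof DOMException` keeps the helper testable outside a browser environment.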
83-89: Optional: handle non-Error object-like throws more gracefully.
`String(error)` on a plain object yields `"[object Object]"`, which would leak into user-facing toasts (e.g., the mic fallback error on Line 1038). Consider serializing object-like errors or falling back to a generic string.

♻️ Proposed refinement

```diff
 function getErrorMessage(error: unknown) {
   if (error instanceof Error && error.message) {
     return error.message;
   }
-
-  return String(error);
+  if (typeof error === "string") {
+    return error;
+  }
+  try {
+    const serialized = JSON.stringify(error);
+    if (serialized && serialized !== "{}") {
+      return serialized;
+    }
+  } catch {
+    /* ignore */
+  }
+  return "Unknown error";
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/hooks/useScreenRecorder.ts` around lines 83 - 89, getErrorMessage currently falls back to String(error) which produces "[object Object]" for plain objects and can leak useless text into user-facing toasts (e.g., the mic fallback toast that calls getErrorMessage). Update getErrorMessage to detect object-like throws (typeof error === 'object' && error !== null) and attempt a safe JSON.stringify with a try/catch (optionally limit length), falling back to error.toString() if JSON fails, and finally return a generic message like "An unexpected error occurred" when nothing useful can be derived; keep the existing Error instance branch intact.
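As a runnable sketch of that hardening (the length cap and the generic fallback string are assumptions):

```typescript
// Hardened getErrorMessage: never surfaces "[object Object]" in a toast.
function getErrorMessage(error: unknown): string {
  if (error instanceof Error && error.message) {
    return error.message;
  }
  if (typeof error === "string" && error.length > 0) {
    return error;
  }
  if (typeof error === "object" && error !== null) {
    try {
      const serialized = JSON.stringify(error);
      if (serialized && serialized !== "{}") {
        // Cap length so huge payloads do not flood the toast.
        return serialized.slice(0, 200);
      }
    } catch {
      // Circular structures fall through to the generic message.
    }
  }
  return "An unexpected error occurred";
}
```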
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@src/hooks/useScreenRecorder.ts`:
- Around line 1000-1042: The catch block for the browser fallback getUserMedia
should special-case permission-denied errors to show a friendlier message;
inside the catch for the getUserMedia call (located in the if block checking
nativeResult.microphoneFallbackRequired && microphoneEnabled in
useScreenRecorder.ts) inspect micError.name (e.g., "NotAllowedError" or
"SecurityError") and, when matched, call toast.error with a tailored message
like "Microphone permission denied. Recording will continue without microphone
audio." (use the existing MICROPHONE_FALLBACK_ERROR_TOAST_ID and duration);
otherwise continue to use getErrorMessage(micError) as before and keep logging
the original micError to console for diagnostics.
- Around line 83-89: getErrorMessage currently falls back to String(error) which
produces "[object Object]" for plain objects and can leak useless text into
user-facing toasts (e.g., the mic fallback toast that calls getErrorMessage).
Update getErrorMessage to detect object-like throws (typeof error === 'object'
&& error !== null) and attempt a safe JSON.stringify with a try/catch
(optionally limit length), falling back to error.toString() if JSON fails, and
finally return a generic message like "An unexpected error occurred" when
nothing useful can be derived; keep the existing Error instance branch intact.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 4920e75d-8ec3-42fd-9435-ea1d7dcb1fda
📒 Files selected for processing (1)
src/hooks/useScreenRecorder.ts
|
Quick triage pass: this still looks blocked by two real runtime concerns plus the rebase. The important ones are (1) orphan
Summary
Validation
Summary by CodeRabbit
New Features
Tests