Conversation
- Add zustand dependency
- Extract export format generators to lib/export-formats.ts
- Extract formatDuration/formatFileSize to lib/format-utils.ts
- Extract speaker colors to lib/speaker-colors.ts
- Add useDebounce hook
- Consolidate Firebase initialization (single app instance)
- Add AIFeatures, TranscriptionOptions, AudioSource to types
- Create types barrel export
- Delete dead code: App.tsx, v2-debug.ts, useV2Announcement.ts

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

- Create options-store with language, diarize, AI features state
- Create history-store with IndexedDB-backed persistence
- Both stores provide reactive state for the new route-based architecture

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

- New upload page (/) - thin, focused on upload + redirect
- New /transcribe/[id] page - processing, polling, results
- New /studio/[id] page - fetches from API, full studio
- New /history page - IndexedDB-backed history list
- New /about page - marketing features overview
- Unified Header with responsive nav + mobile drawer
- Simplified Footer with link-based navigation
- Delete god components: TranscriptionForm, TranscriptionProcessing, TranscriptionResult, TranscriptionError, SessionRecoveryPrompt
- Delete mobile duplicates: MobileHeader, MobileFooter, MobileTranscriptionResult
- Delete useSessionPersistence hook (replaced by URL routing)
- State now lives in the URL (prediction ID = route)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

Split the 1583-line god component into focused modules:
- AudioPlayer, FileDetails, TranscriptStatistics, ExportControls
- EnhancedTranscript, KeyboardShortcutsModal
- useAudioPlayer hook for keyboard shortcuts
- Merged ActionButtons into ExportControls
- Deleted dead MainLayout.tsx
- Fixed storage-service import for getStorage()

TranscriptionStudio.tsx: 1583 → 240 lines

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

- Rewrote Changelog.tsx as responsive accordion (no JS mobile detection)
- Deleted MobileChangelog.tsx, MobileFeedbackForm.tsx, MobileFeedbackModals.tsx
- Deleted unused mobile CSS files (mobile.css, mobile-changelog.css, mobile-feedback.css)

6 files deleted, 1 unified responsive component

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

- Updated CSS variables: warm off-white light, deep navy-black dark
- Added success/warning semantic tokens
- Fixed tailwind.config.js to preserve Tailwind default colors
- Deleted dead V3AnnouncementModal and ChangelogModal

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

- Removed verbose console.logs from client-side code
- Deleted dead TranscriptionHistory.tsx (replaced by /history page)
- Deleted dead sequential-reveal-list.tsx (no importers)
- Kept console.error for meaningful error tracing

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Return the transcript audio URL from the prediction API so completed jobs can restore the original audio source from AssemblyAI metadata instead of depending on a stale client cache. Update the transcribe and studio pages to prefer the backend-provided audio URL, persist it into localStorage as a fallback, and use it when rebuilding history and studio state. Make TranscriptionStudio prefer the passed audioSource URL over studioAudioUrl so one transcription cannot override another after navigation or refresh.
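The resolution order described above (backend-provided URL first, localStorage as fallback) could be sketched as follows. The per-id key scheme and the `StringStore` parameter are assumptions for illustration, not the actual implementation:

```typescript
// Sketch: resolve the audio source for a completed transcription.
// Prefers the backend-provided URL, falls back to a cached copy,
// and persists fresh URLs for later rebuilds of history/studio state.
type StringStore = {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
};

function resolveAudioUrl(
  predictionId: string,
  backendUrl: string | undefined,
  store: StringStore,
): string | undefined {
  const key = `studioAudioUrl:${predictionId}`; // per-id key is an assumption
  if (backendUrl) {
    store.setItem(key, backendUrl); // persist as a fallback for refresh/navigation
    return backendUrl;
  }
  return store.getItem(key) ?? undefined;
}
```

Scoping the cache key by prediction id is one way to keep one transcription's audio from overriding another after navigation or refresh.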
Replace the legacy feedback flow with an embedded Tally form that forwards the current page, query params, browser, and OS details, and mount the feedback modal from the root layout so it is available across the app. Swap the previous analytics stack for Vercel Web Analytics, preserve the browser opt-out control, and strip query strings from tracked URLs before events are sent. Add focused 404 and error code pages, simplify the docs and changelog layouts to match the rest of the product, and tighten the shortcuts hotkey handling in the studio audio player.
Rewrite the privacy and terms pages as text-first documents that use the shared site layout, static update dates, and copy aligned with the current product flow around AssemblyAI, Firebase, Tally, and Vercel Web Analytics. Refresh the project dependency set and lockfile, including the Next.js, React, lucide-react, and related tooling versions now reflected in package.json and bun.lock. Adjust the mobile navigation GitHub link icon for lucide-react compatibility and regenerate the tracked TypeScript build info after validating the app with bun install --frozen-lockfile, bun x tsc --noEmit --ignoreDeprecations 6.0, and bun run build.
Remove the in-form Tally disclosure copy, tighten the modal padding, and keep the feedback sheet inset from the viewport edges so it does not press against small screens. Scale the embedded Tally iframe against the available viewport height, preserve the action buttons below it, and keep the modal usable without clipping the content above the fold. Verified with bun x tsc --noEmit --ignoreDeprecations 6.0 and bun run build.
Add top mobile tabs with an active More menu for secondary routes. Rework documentation and legal pages into clearer, more readable reference layouts.
Document the route refactor, mobile navigation, support pages, documentation refresh, and fixes shipped since the previous changelog entry.
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review.
📝 Walkthrough

Migrates UI to Next.js App Router pages, introduces polling-driven transcription and studio flows, adds Zustand stores with IndexedDB-backed history, modularizes studio playback/transcript/export components, replaces Netlify/consent analytics with Tally + Vercel Analytics, moves Firebase to lazy getters, adds export utilities, and removes many legacy mobile/feedback/transcription modules.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant U as User
    participant B as Browser (UI)
    participant S as Server API
    participant H as IndexedDB (History)
    participant F as Firebase Storage
    U->>B: Upload file or provide audioUrl
    B->>S: POST /transcribe (audio data or audioUrl + options)
    S-->>B: { id } (prediction id)
    B->>H: add({ predictionId:id, status:"processing", audioSource })
    loop polling (initial delay then interval)
        B->>S: GET /prediction/{id}
        alt status == succeeded
            S-->>B: { status:"succeeded", output, audio_url? }
            alt audio_url present
                B->>F: resolve/download or cache audioUrl
                B->>localStorage: persist audioUrl_<id>
            end
            B->>H: patch(id, { status:"succeeded", result, audioSource })
            B->>B: navigate to /studio/{id} (render studio)
        else status == failed
            S-->>B: { status:"failed", error }
            B->>H: patch(id, { status:"failed", error })
            B->>B: show error UI / allow retry
        end
    end
```
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~120 minutes
Code Review Roast 🔥

Verdict: No Issues Found | Recommendation: Merge

Oh wait, this PR is actually clean. I need to sit down. I had my flamethrower warmed up and everything.

📊 Overall: Like finding a unicorn in production — I didn't think clean PRs existed anymore, but here we are.

Files Reviewed (1 file)
Reviewed by grok-code-fast-1:optimized:free · 378,021 tokens
Code Review
This pull request implements a major refactor of the application's routing, state management, and component architecture. It introduces a route-based flow, replaces legacy components with focused studio modules, and migrates to Zustand for state management. However, the review identified three critical issues: the history update logic in TranscribePage overwrites existing metadata, the IndexedDB migration lacks data preservation for existing users, and the reliance on a global localStorage key for audio URLs causes potential cross-tab collisions.
Pull request overview
Ground-up refactor of the Transcriptr frontend into a Next.js App Router, route-based workflow with unified layout/navigation, extracted Studio modules, refreshed design tokens, revamped analytics/feedback, and cleanup of legacy/mobile-duplicate components.
Changes:
- Migrates to route-based app flow (`/`, `/transcribe/[id]`, `/studio/[id]`, `/history`, `/about`, `/changelog`, `/errors/[code]`) with unified `Header`/`Footer`.
- Extracts Studio functionality into focused components/hooks and adds export/formatting utilities.
- Replaces custom analytics stack with Vercel Analytics (with opt-out) and removes legacy mobile-only UI/CSS and announcement/consent code.
Reviewed changes
Copilot reviewed 75 out of 79 changed files in this pull request and generated 8 comments.
Show a summary per file
| File | Description |
|---|---|
| tailwind.config.js | Adds semantic tokens (success/warning) and animations; simplifies plugins. |
| src/types/transcription.ts | Adds option/audio source types used across new route-based flow. |
| src/types/index.ts | Central export barrel for transcription-related types. |
| src/styles/mobile.css | Removes legacy mobile-only global CSS. |
| src/styles/mobile-feedback.css | Removes legacy mobile feedback CSS. |
| src/styles/mobile-changelog.css | Removes legacy mobile changelog styling (mobile UI now unified/responsive). |
| src/stores/options-store.ts | Adds Zustand store for transcription options + AI feature toggles. |
| src/stores/history-store.ts | Adds IndexedDB-backed history store for route-based history page. |
| src/lib/v2-debug.ts | Removes legacy announcement debug utility. |
| src/lib/storage-service.ts | Updates Firebase storage usage and removes verbose logging. |
| src/lib/speaker-colors.ts | Adds centralized speaker color mapping utility for Studio UI. |
| src/lib/format-utils.ts | Adds shared formatting helpers (duration/file size). |
| src/lib/firebase.ts | Refactors Firebase initialization into cached getApp/getStorage getters. |
| src/lib/firebase-utils.ts | Updates Firebase utils to use the new storage getter. |
| src/lib/export-formats.ts | Adds export generators (SRT/VTT/JSON/CSV/Markdown/DOCX). |
| src/lib/analytics.ts | Removes legacy GA/Clarity analytics implementation. |
| src/index.css | Refreshes design tokens and adds shared .legal-doc styling. |
| src/hooks/useV2Announcement.ts | Removes legacy announcement hook. |
| src/hooks/useTranscriptionPolling.ts | Cleans up client polling logs (no functional change intended). |
| src/hooks/useSessionPersistence.ts | Removes legacy session persistence hook (replaced by route/history approach). |
| src/hooks/useDebounce.ts | Adds shared debounce hook. |
| src/hooks/useAudioPlayer.ts | Adds shared Studio keyboard/audio controls hook. |
| src/data/changelog.ts | Adds new release entry + formatting/style consistency updates. |
| src/components/ui/sequential-reveal-list.tsx | Removes unused legacy animated list component. |
| src/components/ui/mobile-navigation.tsx | Updates icon usage to match unified nav approach. |
| src/components/ui/animated-backdrop.tsx | Adjusts padding to improve modal fit on small screens. |
| src/components/transcription/TranscriptionResult-new.tsx | Removes legacy alternate result component. |
| src/components/transcription/TranscriptionProcessing.tsx | Removes legacy processing UI component (handled by route pages). |
| src/components/transcription/TranscriptionHistory.tsx | Removes legacy history modal component (replaced by /history). |
| src/components/transcription/TranscriptionError.tsx | Removes legacy error component (replaced by route-based error states). |
| src/components/transcription/SessionRecoveryPrompt.tsx | Removes legacy session recovery UI. |
| src/components/transcription/MobileTranscriptionResult.tsx | Removes mobile-only result UI (now responsive/unified). |
| src/components/studio/TranscriptStatistics.tsx | Adds extracted Studio statistics card. |
| src/components/studio/KeyboardShortcutsModal.tsx | Adds extracted Studio keyboard shortcuts modal. |
| src/components/studio/FileDetails.tsx | Adds extracted Studio file details card. |
| src/components/studio/EnhancedTranscript.tsx | Adds extracted transcript UI with search/highlighting + karaoke words. |
| src/components/layout/MobileHeader.tsx | Removes legacy mobile header (replaced by unified Header). |
| src/components/layout/MobileFooter.tsx | Removes legacy mobile footer (replaced by unified Footer). |
| src/components/layout/MainLayout.tsx | Removes monolithic SPA layout wrapper (now App Router layout). |
| src/components/layout/Header.tsx | Replaces old header with unified responsive navigation + More menu. |
| src/components/layout/Footer.tsx | Replaces old footer with unified footer + feedback trigger. |
| src/components/feedback/MobileFeedbackModals.tsx | Removes mobile-only feedback modal implementation. |
| src/components/feedback/FeedbackModals.tsx | Updates global feedback modal mounting (root layout) + improves sizing/scroll. |
| src/components/errors/ErrorState.tsx | Adds reusable error-state presentation component. |
| src/components/analytics/VercelAnalytics.tsx | Adds Vercel Analytics with query stripping + local opt-out support. |
| src/components/analytics/ConsentManager.tsx | Removes legacy cookie consent + analytics manager hook. |
| src/components/analytics/AnalyticsOptOut.tsx | Updates opt-out UI to match Vercel analytics + simplifies behavior. |
| src/components/V3AnnouncementModal.tsx | Removes legacy V3 announcement modal. |
| src/components/MobileChangelog.tsx | Removes mobile-only changelog UI (now single responsive component). |
| src/components/CookieConsent.tsx | Removes legacy cookie consent banner/toast implementation. |
| src/components/ChangelogModal.tsx | Removes modal wrapper (changelog is now a route page). |
| src/components/Changelog.tsx | Refactors changelog UI into an accordion-style responsive component. |
| src/app/terms/page.tsx | Updates legal page to new layout + shared .legal-doc styling. |
| src/app/studio/page.tsx | Converts old query-param studio entry to a redirect/info page. |
| src/app/studio/[id]/page.tsx | Adds Studio route that loads prediction output and renders Studio UI. |
| src/app/page.tsx | Replaces monolithic SPA landing with upload route + route-based flow start. |
| src/app/not-found.tsx | Adds polished 404 route using shared ErrorState. |
| src/app/layout.tsx | Mounts FeedbackModals + VercelAnalytics app-wide; removes legacy scripts/forms. |
| src/app/history/page.tsx | Adds history page backed by IndexedDB store + search/delete UI. |
| src/app/global-error.tsx | Adds root error boundary page using shared ErrorState. |
| src/app/errors/[code]/page.tsx | Adds status-code error pages with static params + recovery actions. |
| src/app/error.tsx | Adds route-level error boundary page. |
| src/app/changelog/page.tsx | Adds changelog route page with summary cards and embedded Changelog. |
| src/app/api/prediction/[id]/route.ts | Returns audioUrl from AssemblyAI transcript audio_url when available. |
| src/app/about/page.tsx | Adds about page aligned with new architecture and UI tokens. |
| src/App.tsx | Removes legacy SPA entry component (App Router now owns rendering). |
| package.json | Updates deps (adds Vercel Analytics + Zustand, removes Clarity/GA stack) and bumps tooling versions. |
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: dd95822acf
Actionable comments posted: 34
Caution: Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
package.json (1)
12-12: ⚠️ Potential issue | 🟠 Major — Drop the stale `vite preview` script.

This repo has migrated to Next.js (currently using v16.2.4), but line 12 still references Vite. Since `vite` is not in the manifest, `npm run preview` will fail on a clean checkout.

Change it to use `next start` to match the framework:

Suggested fix

```diff
- "preview": "vite preview",
+ "preview": "next start",
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@package.json` at line 12, Update the package.json "preview" npm script to use Next.js instead of Vite: replace the stale "preview": "vite preview" entry with a Next-compatible command such as "preview": "next start" so npm run preview works on a clean checkout of the migrated Next.js app; locate the "preview" script key in package.json and modify its value accordingly.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/app/global-error.tsx`:
- Around line 9-15: The component props should be marked read-only; update the
parameter type for GlobalError so the object is Readonly—e.g. change the type
from { error: Error & { digest?: string }; reset: () => void } to Readonly<{
error: Error & { digest?: string }; reset: () => void }>, leaving the
destructured names (error, reset) and function body unchanged so the props are
treated as immutable like the other App Router entry points.
In `@src/app/history/page.tsx`:
- Around line 141-147: The status rendering treats "starting" as a failure;
update the conditional that currently checks entry.status === "processing" to
treat both "processing" and "starting" as in-progress so the Clock icon (or
in-progress styling) is used for queued jobs; locate the JSX where FileAudio,
Clock, and AlertCircle are rendered (the ternary using entry.status) and change
the second branch to something like entry.status === "processing" ||
entry.status === "starting" so "starting" no longer falls through to the
destructive AlertCircle; apply the same change to the other identical
status-rendering block that uses the same entry.status check.
- Around line 133-137: The Card elements rendered in src/app/history/page.tsx
are only clickable with a mouse and not keyboard-operable; update the Card usage
(the JSX that calls <Card key={entry.predictionId} ... onClick={() =>
handleOpen(entry)}>) to make it focusable and activatable via keyboard by adding
tabIndex={0}, role="button" (or an appropriate semantic element), and an
onKeyDown handler that calls handleOpen(entry) when Enter or Space is pressed,
and ensure any aria-label or descriptive text is present for screen readers;
also keep the existing onClick so both mouse and keyboard trigger the same
handleOpen function.
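The status fix described above could look like the following sketch. The status names come from the review comment; `statusIcon` is a hypothetical helper mirroring the ternary in the JSX, returning icon names rather than components so the logic stands alone:

```typescript
// Sketch: treat both queued and running jobs as in-progress so a
// "starting" entry doesn't fall through to the destructive error icon.
type HistoryStatus = "starting" | "processing" | "succeeded" | "failed";

function isInProgress(status: HistoryStatus): boolean {
  return status === "processing" || status === "starting";
}

// Hypothetical icon picker mirroring the ternary in the history list JSX.
function statusIcon(status: HistoryStatus): "FileAudio" | "Clock" | "AlertCircle" {
  if (status === "succeeded") return "FileAudio";
  if (isInProgress(status)) return "Clock";
  return "AlertCircle";
}
```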
In `@src/app/page.tsx`:
- Around line 20-27: handleUpload lacks a synchronous reentrancy guard so rapid
double-submits can run before setIsSubmitting re-renders; add an immediate guard
(e.g., a module/local ref like isSubmittingRef or a boolean flag checked at top
of handleUpload) that returns early if already submitting, set the guard true
before any async work, and clear it in a finally block alongside calling
setIsSubmitting(false) to ensure the flag is always reset; update references in
handleUpload, and ensure any upload/transcription POST logic is skipped when the
guard is set.
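A synchronous reentrancy guard of the kind suggested above can be sketched framework-agnostically. This is a minimal illustration, not the component's actual code; in React the `inFlight` flag would typically live in a ref:

```typescript
// Sketch: prevent double-submits by flipping a flag before any awaited
// work. An async function body runs synchronously up to its first await,
// so the second click sees the flag already set and returns early.
function createSubmitGuard<T>(fn: () => Promise<T>): () => Promise<T | undefined> {
  let inFlight = false; // plays the role of isSubmittingRef
  return async () => {
    if (inFlight) return undefined; // early return on reentry
    inFlight = true;
    try {
      return await fn();
    } finally {
      inFlight = false; // always reset, even if the upload throws
    }
  };
}
```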
In `@src/app/privacy/page.tsx`:
- Around line 47-58: The FeedbackForm component currently builds URLSearchParams
from window.location.search and forwards it to Tally via params.toString(),
which leaks sensitive query params; change FeedbackForm.tsx so instead of
initializing URLSearchParams with the raw search you create a new
URLSearchParams and explicitly set only non-sensitive, whitelisted keys (e.g.
"alignLeft","hideTitle","transparentBackground","dynamicHeight","feedbackType"
using the existing initialType, plus "browser" and "os"/"operatingSystem" using
the values you already compute), and use window.location.pathname (not full href
or query) for origin page; remove any logic that copies or appends
window.location.search or unknown query params before building the Tally embed
URL.
In `@src/app/studio/[id]/page.tsx`:
- Around line 96-101: The cached audio URL uses a global key "studioAudioUrl" so
audio for one transcription can be read for another; update the get/set to scope
the key by the transcription id (e.g., build the key as `studioAudioUrl:${id}`
or `studioAudioUrl:${result.id}`) when reading into `audioUrl` and when calling
localStorage.setItem, ensuring you reference the same id in both the read and
write paths in page.tsx around the `audioUrl` and `result.audioUrl` logic.
- Around line 109-123: The redirect via router.replace(`/transcribe/${id}`)
still lets the function reach the finally block which calls setIsLoading(false)
and setError, causing a flash; fix by introducing a local flag (e.g.,
didRedirect) set to true immediately before calling router.replace in the branch
that handles result.status === "processing" or "starting", and then in the
finally block only call setIsLoading(false) / setError when didRedirect is false
(or return early from the caller if you prefer); update references around
router.replace, setIsLoading, and setError to use this redirect guard so UI
teardown is skipped when a redirect occurred.
In `@src/app/studio/page.tsx`:
- Around line 14-22: Replace reading window.location.search inside the
mount-only useEffect with Next's useSearchParams from next/navigation: import
and call useSearchParams() in the component body, get session via
searchParams.get("session"), and if session exists call router.replace("/")
immediately and return early from the component to prevent the empty-state flash
(remove or no longer rely on the useEffect that uses window.location.search).
Update references to useEffect and router accordingly so the redirect happens
before rendering.
In `@src/app/transcribe/[id]/page.tsx`:
- Around line 190-197: The handleRetry function starts a new interval without
clearing any existing timer, which can create duplicate poll loops; modify
handleRetry to check pollRef.current and call clearInterval(pollRef.current)
(and set pollRef.current = null) before calling poll() and assigning
pollRef.current = setInterval(poll, 5000) so any previous interval is stopped
first.
- Around line 65-70: The code currently uses a shared "studioAudioUrl"
localStorage key which lets one prediction's cached URL leak into another;
change the storage key to be unique per prediction (e.g. include the prediction
id) and use that composed key for both the getItem and setItem calls so audioUrl
= data.audioUrl || localStorage.getItem(`studioAudioUrl:${id}`) || undefined and
only write localStorage.setItem(`studioAudioUrl:${id}`, data.audioUrl) when
data.audioUrl exists (references: the audioUrl variable and the
localStorage.getItem/setItem calls in page.tsx).
- Line 291: The word count expression using
result.transcription.split(/\s+/).length returns 1 for an empty string; update
the calculation to handle empty/whitespace-only transcripts by trimming or
filtering empty tokens — e.g. compute words as result.transcription.trim() ===
'' ? 0 : result.transcription.trim().split(/\s+/).length or use
result.transcription.split(/\s+/).filter(Boolean).length and replace the current
{result.transcription.split(/\s+/).length} usage in the component.
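The word-count fix above is small enough to show in full; this is the `filter(Boolean)` variant from the suggestion:

```typescript
// Sketch: word count that returns 0 for empty or whitespace-only text.
// A bare split(/\s+/) on "" yields [""], which counts as 1 word;
// dropping empty tokens fixes both the empty and padded cases.
function countWords(text: string): number {
  return text.split(/\s+/).filter(Boolean).length;
}
```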
In `@src/components/feedback/FeedbackForm.tsx`:
- Around line 55-66: The code in FeedbackForm.tsx currently seeds params from
window.location.search and sets originUrl to window.location.href, which can
leak sensitive query parameters; change the logic in the block that builds
params (the URLSearchParams creation and subsequent params.set calls) to not
read or forward the page query string and to send a sanitized origin URL (use
window.location.pathname or construct window.location.origin +
window.location.pathname without search/hash) instead of window.location.href;
ensure you remove the initial URLSearchParams(window.location.search) usage and
replace originUrl with the sanitized value while keeping other params (e.g.,
feedbackType, browser, operatingSystem) intact.
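A whitelist-based builder along the lines suggested above might look like this. The parameter names mirror those mentioned in the review comments but are assumptions about the actual Tally embed contract:

```typescript
// Sketch: build the Tally embed query from an explicit whitelist
// instead of forwarding window.location.search, so unknown (possibly
// sensitive) query params never reach the third-party form.
function buildTallyParams(opts: {
  feedbackType: string;
  browser: string;
  operatingSystem: string;
  pathname: string; // pass location.pathname, never the full href/query
}): URLSearchParams {
  const params = new URLSearchParams();
  params.set("alignLeft", "1");
  params.set("hideTitle", "1");
  params.set("transparentBackground", "1");
  params.set("dynamicHeight", "1");
  params.set("feedbackType", opts.feedbackType);
  params.set("browser", opts.browser);
  params.set("operatingSystem", opts.operatingSystem);
  params.set("originPage", opts.pathname);
  return params;
}
```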
In `@src/components/layout/Footer.tsx`:
- Around line 10-20: Replace the anchor used as a feedback trigger with a
semantic button in Footer.tsx: remove href="#" and the e.preventDefault() logic,
change the <a> to <button type="button">, keep the className
("hover:text-foreground") and the onClick handler that calls
window.openFeedbackModal("general") (guarding typeof window !== "undefined" &&
window.openFeedbackModal), and add an accessible label if the visible text isn't
descriptive (e.g., aria-label="Open feedback dialog") so the control correctly
represents UI state rather than navigation.
In `@src/components/studio/AudioPlayer.tsx`:
- Around line 243-251: The progress bar div uses role="slider" but lacks
keyboard interaction and focusability; make it keyboard-accessible by adding
tabIndex={0}, implementing an onKeyDown handler (e.g., handleProgressBarKeyDown)
that responds to ArrowLeft/ArrowRight (and optionally Home/End/PageUp/PageDown)
to decrement/increment/seek to start/end and calls the same seek logic as
handleProgressBarClick, and ensure aria-valuenow (currentTime) and
aria-valuetext/aria-valuemin/aria-valuemax remain updated; update the component
to export/define handleProgressBarKeyDown and reuse existing seek/update
functions so both mouse and keyboard controls change playback consistently.
- Around line 122-133: The togglePlay handler flips isPlaying regardless of
whether audioRef.current.play() succeeds, causing the UI to show "Pause" when
playback failed; update togglePlay so that when isPlaying is false you call
audioRef.current.play() and only setIsPlaying(true) in the play() promise
resolution (e.g., use play().then(() => setIsPlaying(true)).catch(...)), and
when isPlaying is true call audioRef.current.pause() and setIsPlaying(false)
immediately; reference togglePlay, audioRef.current.play(),
audioRef.current.pause(), isPlaying, and setIsPlaying to locate and adjust the
logic.
- Around line 371-390: The A/B loop controls are hidden when both loopStart and
loopEnd are null, preventing the user from ever setting the first loop point;
update the render logic in AudioPlayer.tsx so the Button group with A/B labels
(the elements using loopStart, loopEnd, setLoopPoint and formatDuration) is
always rendered (or at least rendered when audio is loaded) instead of gated by
(loopStart !== null || loopEnd !== null), and only disable the buttons when
there is no loaded audio/duration rather than hiding them—this ensures clicking
the A or B Button will call setLoopPoint("start") / setLoopPoint("end") to
create the initial loop points.
In `@src/components/studio/EnhancedTranscript.tsx`:
- Around line 183-197: The segment container in EnhancedTranscript.tsx is
currently a non-focusable <div>, so make it keyboard-accessible by either
converting it to a semantic <button> or adding role="button" and tabIndex={0} to
the existing container; add an onKeyDown handler that listens for Enter and
Space and calls onSegmentClick?.(segment.start) (same action as onClick),
preserve ref assignment (activeSegmentRef) and all className logic, and ensure
the title and aria-label (e.g., `aria-label={`Play from ${segment.start}s`}`)
are present to announce purpose to screen readers.
- Around line 231-233: The code constructs a RegExp from raw user input
(`searchTerm`) which can contain special regex characters and throw at render
time; fix by escaping `searchTerm` before building the regex (add a small helper
like `escapeRegExp(term: string) => term.replace(/[.*+?^${}()|[\]\\]/g,
'\\$&')`) and use the escaped value in the `new RegExp(...)` call in the split
expression in EnhancedTranscript (keep the "gi" flags and the capturing group
around the escaped term).
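The escaping helper described above is standard; a sketch with the capturing-group split the highlighter relies on:

```typescript
// Sketch: escape user input before building the highlight regex so
// search terms like "c++" or "(a)" don't throw at render time.
function escapeRegExp(term: string): string {
  return term.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}

function splitOnTerm(text: string, searchTerm: string): string[] {
  // The capturing group keeps each matched term in the output array,
  // so the transcript renderer can wrap matches in a highlight span.
  return text.split(new RegExp(`(${escapeRegExp(searchTerm)})`, "gi"));
}
```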
In `@src/components/studio/ExportControls.tsx`:
- Around line 22-37: The timestamp helpers formatTimeForSRT and formatTimeForVTT
can produce "1000" milliseconds due to rounding, creating invalid timestamps;
change both functions to compute total milliseconds (Math.round(seconds *
1000)), derive hours/minutes/seconds/milliseconds from that total (so
milliseconds is always 0-999 and seconds/minutes/hours carry correctly), and
replace duplicated logic by moving this shared formatter into the existing
export helper used by src/lib/export-formats.ts (e.g., extract into a single
formatTimeForSubtitle util and call it from formatTimeForSRT/formatTimeForVTT or
have them both reuse the util) so rounding and carry are handled consistently
across exports.
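The shared formatter proposed above can be sketched as follows; `formatTimeForSubtitle` is the hypothetical util name from the comment:

```typescript
// Sketch: round to total milliseconds first so carries propagate
// correctly. Naive per-field rounding can emit "1000" ms, producing
// invalid timestamps like 00:00:01,1000. SRT separates ms with ","
// and WebVTT with ".".
function formatTimeForSubtitle(seconds: number, msSeparator: "," | "."): string {
  const totalMs = Math.round(seconds * 1000);
  const ms = totalMs % 1000;                  // always 0-999
  const totalSec = Math.floor(totalMs / 1000); // carry absorbed here
  const s = totalSec % 60;
  const m = Math.floor(totalSec / 60) % 60;
  const h = Math.floor(totalSec / 3600);
  const pad = (n: number, w = 2) => String(n).padStart(w, "0");
  return `${pad(h)}:${pad(m)}:${pad(s)}${msSeparator}${pad(ms, 3)}`;
}
```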
In `@src/components/studio/FileDetails.tsx`:
- Around line 48-52: The Type row currently defaults the Badge to "URL" when
audioSource is missing; update FileDetails.tsx so the Badge displays an explicit
placeholder (e.g., "Unknown") or hide the entire Type row until audioSource is
defined: check the audioSource variable in the component and render either a
disabled/placeholder Badge with text "Unknown" (or similar) when audioSource is
null/undefined, or conditionally skip rendering the <div className="flex
items-center justify-between"> for Type; reference the audioSource prop/state
and the Badge component to implement this change.
In `@src/components/studio/KeyboardShortcutsModal.tsx`:
- Around line 34-72: The modal lacks proper dialog semantics and keyboard
handling: update the KeyboardShortcutsModal wrapper to include role="dialog",
aria-modal="true" and add an id to the title heading (e.g.,
keyboard-shortcuts-title) and set aria-labelledby on the dialog container so it
is announced; add an accessible name to the close Button (e.g.,
aria-label="Close") instead of relying on the icon; and implement
Escape-to-close by adding a useEffect in the KeyboardShortcutsModal component
that registers a keydown listener which calls the existing onClose prop when
event.key === "Escape" (and cleans up on unmount). Ensure the inner content
still stops propagation (e.stopPropagation()) so backdrop clicks close but
clicks inside do not.
In `@src/components/studio/TranscriptStatistics.tsx`:
- Around line 41-52: The current words.forEach block builds wordFreq by
stripping everything except ASCII a–z which drops accented and non-Latin
characters; update the cleaning step used inside words.forEach (the expression
that builds "word") to use a Unicode-aware regex with property escapes (e.g.,
remove characters not in \p{L} or \p{N} and use the u flag) and use a
locale-aware lowercasing (toLocaleLowerCase()) so words like "mañana" and
"résumé" are preserved; keep the existing length filter (>3) and the downstream
topWords sorting logic unchanged.
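The Unicode-aware cleaning step could be sketched like this, keeping the `>3` length filter and frequency-sort shape described above (the function boundary is illustrative, not the component's actual API):

```typescript
// Sketch: Unicode-aware word-frequency counting. Property escapes
// (\p{L}, \p{N} with the u flag) keep letters/digits in any script,
// so "mañana" and "résumé" survive where [^a-z] would strip them.
function topWords(text: string, limit = 5): [string, number][] {
  const freq = new Map<string, number>();
  for (const raw of text.split(/\s+/)) {
    const word = raw.replace(/[^\p{L}\p{N}]/gu, "").toLocaleLowerCase();
    if (word.length > 3) freq.set(word, (freq.get(word) ?? 0) + 1);
  }
  return [...freq.entries()].sort((a, b) => b[1] - a[1]).slice(0, limit);
}
```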
In `@src/hooks/useAudioPlayer.ts`:
- Around line 35-75: The handler handleKeyDown is only skipping text inputs but
still captures keys when focus is on other interactive elements (buttons, links,
selects, sliders, etc.); change the isInputField check to a broader
isInteractiveElement that returns true for target elements matching input,
textarea, select, button, a[href], [contenteditable="true"] or common ARIA roles
(e.g., role="button", "link", "slider", "combobox", "option"), or when the
element has a non-negative tabindex/disabled state, and use that to
early-return; update uses in handleKeyDown (and the same pattern in the rest of
the hook) so keyboard shortcuts (Space, arrows, etc.) are ignored whenever focus
is on an interactive control instead of only on text inputs.
- Around line 56-66: The current key handler in useAudioPlayer.ts intercepts
Ctrl/Cmd+C even when the user has a text selection, causing the entire
transcription (transcription) to be copied instead of the selected text; update
the handler in the useAudioPlayer keydown logic (the block that checks
(e.ctrlKey || e.metaKey) && e.key === "c" && !isInputField) to first check the
current window selection (window.getSelection() or similar) and only run
navigator.clipboard.writeText(transcription) when there is no non-empty
selection (selection.toString() is empty); otherwise, allow the event to
propagate or do nothing so the native copy of the selection is preserved.
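The two handler changes above can be sketched as pure helpers (the names and the ElementLike stand-in are hypothetical; only the tag/role/tabindex list and the selection check reflect the comments):

```typescript
// ElementLike is a minimal structural stand-in for DOM Element so the
// guard logic can be tested without a browser.
interface ElementLike {
  tagName: string
  getAttribute(name: string): string | null
  hasAttribute(name: string): boolean
}

const INTERACTIVE_TAGS = new Set(["INPUT", "TEXTAREA", "SELECT", "BUTTON"])
const INTERACTIVE_ROLES = new Set(["button", "link", "slider", "combobox", "option"])

export function isInteractiveElement(el: ElementLike | null): boolean {
  if (!el) return false
  if (INTERACTIVE_TAGS.has(el.tagName)) return true
  // Anchors only count when they are actually navigable
  if (el.tagName === "A" && el.hasAttribute("href")) return true
  if (el.getAttribute("contenteditable") === "true") return true
  const role = el.getAttribute("role")
  if (role !== null && INTERACTIVE_ROLES.has(role)) return true
  const tabindex = el.getAttribute("tabindex")
  return tabindex !== null && Number(tabindex) >= 0
}

// Hypothetical helper for the Ctrl/Cmd+C branch: only copy the full
// transcription when the user has no text selection of their own.
export function shouldCopyFullTranscript(selectionText: string | null | undefined): boolean {
  return !selectionText || selectionText.length === 0
}

// In the keydown handler this would read roughly:
// if (isInteractiveElement(e.target as Element)) return
// if ((e.ctrlKey || e.metaKey) && e.key === "c") {
//   if (!shouldCopyFullTranscript(window.getSelection()?.toString())) return
//   navigator.clipboard.writeText(transcription)
// }
```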
In `@src/hooks/useTranscriptionPolling.ts`:
- Around line 171-175: The initial delayed poll created with setTimeout is not
tracked or cleared, so it may fire after stopPolling() or unmount; store the
timeout ID (e.g. in a new ref like initialPollTimeoutRef) when calling
setTimeout(() => poll(), 500) and clear it with
clearTimeout(initialPollTimeoutRef.current) inside stopPolling() and the hook's
cleanup (alongside clearing pollIntervalRef.current via clearInterval) to
prevent a stale poll from running; make sure to initialize and null-check the
ref when clearing.
In `@src/index.css`:
- Around line 154-190: Enable Tailwind directives in the Biome CSS parser so the
new .legal-doc Tailwind utilities and existing directives stop causing parse
errors: edit your Biome config (biome.json) and under "css.parser" set
"tailwindDirectives" to true (ensure "css.linter.enabled" is true and
"css.parser.cssModules" remains as needed) so the parser accepts Tailwind
directives used in src/index.css (the .legal-doc component and other Tailwind
rules).
In `@src/lib/export-formats.ts`:
- Around line 115-116: The exported Markdown shows "--:--" for zero-second
timestamps because formatDuration(0) returns "--:--"; update the Markdown
assembly in export-formats.ts to handle zeros explicitly (e.g., replace uses of
formatDuration(ch.start) and formatDuration(ch.end) with a small conditional
that prints "0:00" when the value === 0, or call a new helper like
formatDurationOrZero that returns "0:00" for 0 and defers to formatDuration
otherwise) so chapter/segment lines using ch.start and ch.end render correctly.
- Around line 5-19: The current implementations of formatTimeForSRT and
formatTimeForVTT can produce milliseconds value of 1000 due to rounding,
producing invalid timestamps; fix both by converting the input seconds into
totalMilliseconds (use Math.round(seconds * 1000)), compute ms =
totalMilliseconds % 1000, derive totalSeconds = Math.floor(totalMilliseconds /
1000) and then compute hours/minutes/seconds from totalSeconds so any carried
milliseconds roll into the seconds correctly before formatting the final string.
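The millisecond-first computation can be sketched as follows (the exported names match the comment; padding details are assumptions):

```typescript
// Derive every component from total integer milliseconds so any rounding
// carry rolls into seconds and ms stays within 0-999.
function splitTime(seconds: number) {
  const totalMs = Math.round(seconds * 1000)
  const ms = totalMs % 1000
  const totalSeconds = Math.floor(totalMs / 1000)
  return {
    h: Math.floor(totalSeconds / 3600),
    m: Math.floor((totalSeconds % 3600) / 60),
    s: totalSeconds % 60,
    ms,
  }
}

const pad = (n: number, width = 2) => String(n).padStart(width, "0")

// SRT uses a comma before the milliseconds, VTT a period.
export function formatTimeForSRT(seconds: number): string {
  const { h, m, s, ms } = splitTime(seconds)
  return `${pad(h)}:${pad(m)}:${pad(s)},${pad(ms, 3)}`
}

export function formatTimeForVTT(seconds: number): string {
  const { h, m, s, ms } = splitTime(seconds)
  return `${pad(h)}:${pad(m)}:${pad(s)}.${pad(ms, 3)}`
}
```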
In `@src/lib/firebase-utils.ts`:
- Line 25: The call to getStorage() is done outside the try/catch, which can
throw during lazy Firebase init and bypass the catch; move the getStorage()
invocation into the existing try block in src/lib/firebase-utils.ts so that
storageInstance (the variable currently set via const storageInstance =
getStorage()) is created inside the try and any errors are handled by the
existing catch that normalizes errors for the functions where storage is used
— update references to storageInstance within the file accordingly so the
variable is initialized in the try scope and the catch continues to wrap/return
error.code and error.message.
In `@src/lib/format-utils.ts`:
- Around line 1-5: The function formatDuration treats 0 as missing because it
uses a falsy check; update the guard in formatDuration to only return the
placeholder for null or undefined (e.g., seconds === null || seconds ===
undefined) so a real 0 value formats as "0:00", keeping the rest of the logic
(Math.floor for mins/secs and padStart) unchanged.
- Around line 8-12: The function formatFileSize treats 0 as falsy and returns a
placeholder; change the guard to an explicit null/undefined check so zero bytes
are formatted normally. In formatFileSize, replace the `if (!bytes) return "--"`
with a nullish check (e.g., `if (bytes == null) return "--"` or `if (bytes ===
undefined || bytes === null) return "--"`), leaving the rest of the computation
(`const mb = bytes / (1024 * 1024)` and `return `${mb.toFixed(1)} MB``)
unchanged so 0 becomes "0.0 MB".
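Both guard fixes can be sketched together (a minimal sketch; everything outside the nullish guards is reconstructed from the review text):

```typescript
// Only null/undefined get the placeholder; a real 0 formats normally.
export function formatDuration(seconds?: number | null): string {
  if (seconds == null) return "--:--"
  const mins = Math.floor(seconds / 60)
  const secs = Math.floor(seconds % 60)
  return `${mins}:${String(secs).padStart(2, "0")}`
}

export function formatFileSize(bytes?: number | null): string {
  if (bytes == null) return "--"
  const mb = bytes / (1024 * 1024)
  return `${mb.toFixed(1)} MB`
}
```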
In `@src/stores/history-store.ts`:
- Around line 69-75: The promise currently resolves on the individual request
success (request.onsuccess) which can precede transaction commit; change it to
resolve only when the transaction (tx) completes: capture request.result in a
local variable when request.onsuccess fires, then call resolve(result) from
tx.oncomplete, and call reject with the transaction error on tx.onerror and
tx.onabort (include the underlying error/event). Update the Promise logic in the
function using db, HISTORY_STORE_NAME, mode, fn, tx and request so that
transaction lifecycle handlers drive final resolve/reject rather than
request.onsuccess/request.onerror.
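The transaction-driven promise can be sketched as follows (loosely typed on purpose; TxLike and RequestLike are minimal stand-ins for IDBTransaction and IDBRequest so the wiring is testable without a browser):

```typescript
type TxLike = {
  oncomplete: (() => void) | null
  onerror: (() => void) | null
  onabort: (() => void) | null
  error?: unknown
}
type RequestLike<T> = {
  onsuccess: (() => void) | null
  onerror: (() => void) | null
  result?: T
}

// Capture request.result when the request succeeds, but only resolve once
// the transaction actually commits; reject on transaction error or abort.
export function settleOnCommit<T>(tx: TxLike, request: RequestLike<T>): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    let result: T | undefined
    request.onsuccess = () => {
      result = request.result
    }
    tx.oncomplete = () => resolve(result as T)
    tx.onerror = () => reject(tx.error ?? new Error("IndexedDB transaction failed"))
    tx.onabort = () => reject(tx.error ?? new Error("IndexedDB transaction aborted"))
  })
}
```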
In `@tailwind.config.js`:
- Around line 24-31: The tailwind color keys success and warning in
tailwind.config.js currently reference non-existent CSS variables
var(--color-success) and var(--color-warning); update those references to match
the actual variables defined in src/index.css (use var(--success) and
var(--warning)), or alternatively rename the CSS variables in src/index.css to
--color-success and --color-warning so the keys in tailwind.config.js (success,
warning) resolve correctly; adjust whichever file is simpler to change so
utilities like text-success/bg-success and text-warning/bg-warning work.
---
Outside diff comments:
In `@package.json`:
- Line 12: Update the package.json "preview" npm script to use Next.js instead
of Vite: replace the stale "preview": "vite preview" entry with a
Next-compatible command such as "preview": "next start" so npm run preview works
on a clean checkout of the migrated Next.js app; locate the "preview" script key
in package.json and modify its value accordingly.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro Plus
Run ID: 5265c688-70c5-4fe8-b27f-37ec88dae122
⛔ Files ignored due to path filters (1)
bun.lock is excluded by !**/*.lock
📒 Files selected for processing (78)
package.json, src/App.tsx, src/app/about/page.tsx, src/app/api/prediction/[id]/route.ts, src/app/changelog/page.tsx, src/app/documentation/page.tsx, src/app/error.tsx, src/app/errors/[code]/page.tsx, src/app/global-error.tsx, src/app/history/page.tsx, src/app/layout.tsx, src/app/not-found.tsx, src/app/page.tsx, src/app/privacy/page.tsx, src/app/studio/[id]/page.tsx, src/app/studio/page.tsx, src/app/terms/page.tsx, src/app/transcribe/[id]/page.tsx, src/components/Changelog.tsx, src/components/ChangelogModal.tsx, src/components/CookieConsent.tsx, src/components/MobileChangelog.tsx, src/components/V3AnnouncementModal.tsx, src/components/analytics/AnalyticsOptOut.tsx, src/components/analytics/ConsentManager.tsx, src/components/analytics/VercelAnalytics.tsx, src/components/errors/ErrorState.tsx, src/components/feedback/FeedbackForm.tsx, src/components/feedback/FeedbackModals.tsx, src/components/feedback/MobileFeedbackForm.tsx, src/components/feedback/MobileFeedbackModals.tsx, src/components/layout/Footer.tsx, src/components/layout/Header.tsx, src/components/layout/MainLayout.tsx, src/components/layout/MobileFooter.tsx, src/components/layout/MobileHeader.tsx, src/components/studio/AudioPlayer.tsx, src/components/studio/EnhancedTranscript.tsx, src/components/studio/ExportControls.tsx, src/components/studio/FileDetails.tsx, src/components/studio/KeyboardShortcutsModal.tsx, src/components/studio/TranscriptStatistics.tsx, src/components/transcription/MobileTranscriptionResult.tsx, src/components/transcription/SessionRecoveryPrompt.tsx, src/components/transcription/TranscriptionError.tsx, src/components/transcription/TranscriptionForm.tsx, src/components/transcription/TranscriptionHistory.tsx, src/components/transcription/TranscriptionProcessing.tsx, src/components/transcription/TranscriptionResult-new.tsx, src/components/transcription/TranscriptionResult.tsx, src/components/transcription/TranscriptionStudio.tsx, src/components/ui/animated-backdrop.tsx, src/components/ui/mobile-navigation.tsx, src/components/ui/sequential-reveal-list.tsx, src/data/changelog.ts, src/hooks/useAudioPlayer.ts, src/hooks/useDebounce.ts, src/hooks/useSessionPersistence.ts, src/hooks/useTranscriptionPolling.ts, src/hooks/useV2Announcement.ts, src/index.css, src/lib/analytics.ts, src/lib/export-formats.ts, src/lib/firebase-utils.ts, src/lib/firebase.ts, src/lib/format-utils.ts, src/lib/speaker-colors.ts, src/lib/storage-service.ts, src/lib/v2-debug.ts, src/stores/history-store.ts, src/stores/options-store.ts, src/styles/mobile-changelog.css, src/styles/mobile-feedback.css, src/styles/mobile.css, src/types/index.ts, src/types/transcription.ts, tailwind.config.js, tsconfig.tsbuildinfo
💤 Files with no reviewable changes (26)
- src/App.tsx
- src/components/V3AnnouncementModal.tsx
- src/components/feedback/MobileFeedbackModals.tsx
- src/styles/mobile.css
- src/components/ui/sequential-reveal-list.tsx
- src/hooks/useV2Announcement.ts
- src/components/layout/MobileFooter.tsx
- src/components/analytics/ConsentManager.tsx
- src/components/MobileChangelog.tsx
- src/components/transcription/SessionRecoveryPrompt.tsx
- src/components/layout/MainLayout.tsx
- src/components/transcription/TranscriptionResult-new.tsx
- src/components/feedback/MobileFeedbackForm.tsx
- src/styles/mobile-feedback.css
- src/components/transcription/MobileTranscriptionResult.tsx
- src/components/transcription/TranscriptionError.tsx
- src/components/layout/MobileHeader.tsx
- src/components/ChangelogModal.tsx
- src/hooks/useSessionPersistence.ts
- src/components/transcription/TranscriptionHistory.tsx
- src/lib/analytics.ts
- src/components/transcription/TranscriptionResult.tsx
- src/components/CookieConsent.tsx
- src/lib/v2-debug.ts
- src/components/transcription/TranscriptionProcessing.tsx
- src/components/transcription/TranscriptionForm.tsx
const handleUpload = useCallback(
  async (
    data: FormData | { audioUrl: string },
    options: { language: string; diarize: boolean; aiFeatures: AIFeatures },
  ) => {
    setIsSubmitting(true)
...
    try {
Prevent duplicate uploads/transcriptions in handleUpload.
This callback kicks off a file upload and a POST, but it has no reentrancy guard. A fast double-submit can run both requests before the isSubmitting rerender lands, producing duplicate transcriptions and duplicate history entries.
🛡️ Add a synchronous guard
-import { useState, useCallback } from "react"
+import { useState, useCallback, useRef } from "react"
...
export default function UploadPage() {
const router = useRouter()
const [isSubmitting, setIsSubmitting] = useState(false)
+ const submittingRef = useRef(false)
const addToHistory = useHistoryStore((s) => s.add)
...
async (
data: FormData | { audioUrl: string },
options: { language: string; diarize: boolean; aiFeatures: AIFeatures },
) => {
+ if (submittingRef.current) return
+ submittingRef.current = true
setIsSubmitting(true)
...
} catch (err) {
+ submittingRef.current = false
console.error("Upload failed:", err)
const errorInfo = getUserFriendlyErrorMessage(err)
toast.error(errorInfo.userMessage)
setIsSubmitting(false)
}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/app/page.tsx` around lines 20 - 27, handleUpload lacks a synchronous
reentrancy guard so rapid double-submits can run before setIsSubmitting
re-renders; add an immediate guard (e.g., a module/local ref like
isSubmittingRef or a boolean flag checked at top of handleUpload) that returns
early if already submitting, set the guard true before any async work, and clear
it in a finally block alongside calling setIsSubmitting(false) to ensure the
flag is always reset; update references in handleUpload, and ensure any
upload/transcription POST logic is skipped when the guard is set.
Use globalThis.window guards for browser-only APIs to avoid SSR errors.
Rename Error component to RouteError to avoid reserved identifier conflict.
Apply Readonly<{}> to props types, fix Tailwind class ordering, and tighten
minor type issues flagged by tsc across studio, feedback, history, and layout components.
Add a patch() method to useHistoryStore that reads the existing IndexedDB record and merges only the provided fields, preserving the original options, language, diarize settings, and metadata written at upload time. Update the transcribe page to call patch() on completion so only status, result, and audioUrl are updated.
During onupgradeneeded, when the old transcription-sessions store exists but the new transcription-history store does not, iterate all old sessions and write any with a predictionId into the new store — mapping the status, audioSource, options, and result fields to the HistoryEntry shape. Prevents users upgrading from the previous version from losing their history.
Replace the global studioAudioUrl key with audioUrl_${id} so concurrent
tabs for different transcriptions never overwrite each other.
- page.tsx: remove pre-ID writes; write once after POST returns the ID
- transcribe/[id] and studio/[id]: read and write with the scoped key
- TranscriptionStudio: drop global fallback — pages now always supply url
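The scoped-key change above can be sketched as follows (the commit does not name the storage API, so StorageLike is a minimal stand-in compatible with localStorage/sessionStorage):

```typescript
// One key per transcription id, so concurrent tabs never collide.
export const audioUrlKey = (id: string) => `audioUrl_${id}`

interface StorageLike {
  getItem(key: string): string | null
  setItem(key: string, value: string): void
}

export const saveAudioUrl = (storage: StorageLike, id: string, url: string) =>
  storage.setItem(audioUrlKey(id), url)

export const loadAudioUrl = (storage: StorageLike, id: string) =>
  storage.getItem(audioUrlKey(id))
```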
Derive all time components from total integer milliseconds so carry-over into seconds/minutes/hours is automatic and ms stays within 0-999.
Remove eslint, eslint-plugin-react-hooks, eslint-plugin-react-refresh, and typescript-eslint dependencies and configuration in favor of Biome.
Enable Tailwind directive support in the Biome CSS parser, rename linting scripts in package.json, and remove legacy npm configuration. Co-authored-by: Copilot <copilot@github.com>
- Refactor `useTranscriptionPolling` to use `useCallback` and fix dependency arrays to prevent unnecessary re-renders and remove lint suppressions. - Improve type safety in `transcribe/[id]/page.tsx` by introducing `RawSegment` type. - Clean up `useEffect` dependency arrays in `UploadAudio` and `FeedbackForm`. - Remove unused `baseUrl` from `tsconfig.json`. - Minor CSS variable adjustment for `--chart-2`.
- Reformat large portions of the codebase to use consistent semicolon usage and spacing. - Standardize import styles and line breaks across components and hooks. - Update `biome.json` and `tsconfig.json` to reflect project-wide formatting rules. - Refactor various UI components and utility functions for better readability and adherence to the new style guide.
- Standardize import ordering and grouping across all components, hooks, and library files. - Refactor `useTranscriptionPolling` to improve stability by implementing explicit `stopPolling` and `startPolling` mechanisms using `useCallback`. - Clean up unused imports and reorganize type exports in `src/types/index.ts`. - Improve type safety and consistency in API routes and utility functions.
- Implement cryptographically secure random string generation using `crypto.getRandomValues` for filenames and session IDs. - Add `Suspense` boundary to `StudioRedirectPage` to handle client-side navigation requirements. - Improve server-side security by validating and sanitizing Firebase Storage URLs and Printerz template IDs. - Refactor `AudioPlayer` playback logic for better guard clauses and error handling. - Update `persistence-service` to use `globalThis.indexedDB` for improved environment compatibility. - Optimize data processing in `firebase-utils` using `codePointAt` for safer character handling.
Add package-lock.json to .gitignore to prevent it from being tracked by version control.
Actionable comments posted: 32
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (12)
src/components/ui/animated-list.tsx (1)
17-38: 🧹 Nitpick | 🔵 Trivial
Avoid recreating Framer Motion variants objects on every render.
The container and item variants are plain objects allocated each render. For small lists this is fine, but it’s an easy win to hoist them outside the component or memoize them so their reference identity stays stable (can reduce avoidable work inside Framer Motion / downstream memoization).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/ui/animated-list.tsx` around lines 17 - 38, The container and item variant objects are being recreated on every render; hoist the variant definitions (container and item) outside the component scope or memoize them with useMemo(() => ({...}), []) so their reference identity is stable; update references in the component to use the hoisted/memoized container and item variables to avoid re-allocating the Framer Motion variants each render.
postcss.config.js (1)
2-5: 🧹 Nitpick | 🔵 Trivial
Optional: consider dropping autoprefixer if Tailwind v4 already handles prefixes.
Tailwind v4’s processing pipeline often removes the need for a separate autoprefixer step. Not required for correctness, but if build time matters you may want to confirm whether your setup still benefits from keeping it.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@postcss.config.js` around lines 2 - 5, The config currently includes both "@tailwindcss/postcss" and "autoprefixer" in the plugins block; if you want to drop unnecessary build steps, evaluate whether Tailwind v4 already covers vendor prefixing and, if confirmed, remove the "autoprefixer" plugin entry from the plugins object in postcss.config.js (leaving "@tailwindcss/postcss": {} in place) to reduce build time.
scripts/setup-firebase-cors.js (1)
16-30: ⚠️ Potential issue | 🟡 Minor
Security/config note: origin: ["*"] is very permissive.
If the storage objects are intended to be restricted, this should be tightened (e.g., to your app’s origins). If you intentionally want public access, add a comment documenting that intent.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/setup-firebase-cors.js` around lines 16 - 30, The CORS config uses origin: ["*"] in the corsConfig object which is overly permissive; update corsConfig.origin to list the specific allowed origins (e.g., your web app URLs) instead of "*" or, if public access is intentional, replace the wildcard with a clear inline comment explaining that public access is deliberate and why; locate and modify the corsConfig object (the origin property inside the cors array) and ensure any environment-specific origins are loaded from configuration/ENV rather than hard-coding.
src/components/ui/progress.tsx (1)
14-27: 🧹 Nitpick | 🔵 Trivial
Optional hardening: clamp value to [0, 100].
If any caller can pass values outside the expected range, the translateX calculation can produce unintended progress indicator positions. Consider clamping value ?? 0 before using it in the transform.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/ui/progress.tsx` around lines 14 - 27, Clamp the progress value before using it in the transform to ensure it stays within [0,100]; in the ProgressPrimitive.Indicator transform replace direct use of (value || 0) with a bounded value (e.g., compute safeValue = Math.max(0, Math.min(100, value ?? 0))) and use safeValue in the translateX calculation so the indicator cannot be moved outside its intended range.
src/lib/firebase-proxy.ts (2)
93-113: ⚠️ Potential issue | 🟡 Minor
Remove unused function or add runtime guard.
createDownloadableDataUrl() is exported but never invoked anywhere in the codebase. If this is intentional dead code, remove it; if it's meant as a public API, add a guard for server contexts to prevent runtime errors: if (typeof window === "undefined") throw new Error("This function requires a DOM environment").
Verify each finding against the current code and only fix it if needed. In `@src/lib/firebase-proxy.ts` around lines 93 - 113, The exported function createDownloadableDataUrl is unused and will error in non-DOM/server environments; either remove the function if it's dead code or add a runtime guard at the top of createDownloadableDataUrl that checks if typeof window === "undefined" and throws a clear Error indicating a DOM environment is required, so callers on the server fail fast; ensure the change is applied inside the createDownloadableDataUrl function (or delete its export) and update any public API docs/tests accordingly.
75-88: ⚠️ Potential issue | 🟡 Minor
Add a "use client" directive or SSR guards to prevent future footguns with unguarded window/document access.
The functions in this file (determineServerUrl, proxyFirebaseDownload, generatePdf, createDownloadableDataUrl) are currently unused throughout the codebase. However, they are exported from a module lacking the "use client" directive, creating a latent risk: if imported into a server context (route handler, server action, etc.), window.location.hostname and DOM APIs will crash at runtime. Either:
- Add "use client" to the top of firebase-proxy.ts to explicitly mark it as client-only, or
- Add an if (typeof window === "undefined") return "" guard to determineServerUrl() as proposed in the original review
Recommend the first approach (adding "use client") since all four functions depend on browser APIs and have no server-side purpose.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/lib/firebase-proxy.ts` around lines 75 - 88, This module exports browser-only functions (determineServerUrl, proxyFirebaseDownload, generatePdf, createDownloadableDataUrl) but lacks a client-only marker; add the "use client" directive as the very first line of the file to ensure the module is never imported in a server context, preventing runtime errors from window/document access. If you choose not to add "use client", alternatively add a runtime guard at the top of determineServerUrl (e.g., if (typeof window === "undefined") return "") and similar guards in the other exported functions to safely no-op on the server, but prefer the "use client" approach since all four functions are browser-only.
src/app/api/printerz-proxy/route.ts (2)
32-42: ⚠️ Potential issue | 🟠 Major
Protect upstream call with a timeout
The Printerz fetch call has no timeout. Add AbortSignal.timeout(...) so slow upstreams don’t hold request handlers indefinitely.
⏱️ Proposed fix
   const response = await fetch(
     `https://api.printerz.dev/templates/${templateId}/render`,
     {
       method: "POST",
       headers: {
         "x-api-key": apiKey,
         "Content-Type": "application/json",
       },
       body: JSON.stringify(printerzData),
+      signal: AbortSignal.timeout(15_000),
     },
   )
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/app/api/printerz-proxy/route.ts` around lines 32 - 42, The fetch to Printerz in the route handler (the call that assigns response) lacks a timeout and can hang; fix by creating an AbortSignal via AbortSignal.timeout(<ms>) and pass it as the signal option in the fetch options object (alongside method, headers, body), choosing a sensible timeout (e.g. 3–10s). Update the fetch invocation that uses templateId, apiKey, and printerzData to include signal: AbortSignal.timeout(timeoutMs) (or create an AbortController if you need to clear it elsewhere) so slow upstreams are aborted instead of blocking the request handler.
33-34: ⚠️ Potential issue | 🟠 Major
Encode templateId before URL interpolation
At Line 33, direct interpolation allows path-shaping via special characters (/, ?, #). Encode the path segment before building the URL.
🔧 Proposed fix
-    `https://api.printerz.dev/templates/${templateId}/render`,
+    `https://api.printerz.dev/templates/${encodeURIComponent(templateId)}/render`,
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/app/api/printerz-proxy/route.ts` around lines 33 - 34, The URL is built by directly interpolating templateId into the path (`https://api.printerz.dev/templates/${templateId}/render`) which allows special characters to change the path; fix it by encoding the path segment before interpolation—replace usage of templateId in the URL construction with encodeURIComponent(templateId) so the templateId variable is safely escaped when building the fetch/HTTP request in this route handler.
src/lib/pdf-generation.ts (1)
201-207: ⚠️ Potential issue | 🟠 Major
Escape user content in HTML fallback.
The fallback document injects raw title/contentText into HTML. If transcript text contains tags/scripts, the generated file can execute unintended markup.
🔧 Proposed fix
- const title = (data.title as string) || "Transcription"
- const contentText = (data.content as string) || ""
+ const title = (data.title as string) || "Transcription"
+ const contentText = (data.content as string) || ""
+ const escapeHtml = (value: string) =>
+   value
+     .replaceAll("&", "&amp;")
+     .replaceAll("<", "&lt;")
+     .replaceAll(">", "&gt;")
+     .replaceAll('"', "&quot;")
+     .replaceAll("'", "&#39;")
+ const safeTitle = escapeHtml(title)
+ const safeContent = escapeHtml(contentText)
@@
- <title>${title}</title>
+ <title>${safeTitle}</title>
@@
- <div class="title">${title}</div>
+ <div class="title">${safeTitle}</div>
@@
- <div class="content">${contentText}</div>
+ <div class="content">${safeContent}</div>
Also applies to: 247-251
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/lib/pdf-generation.ts` around lines 201 - 207, The HTML fallback injects raw title and contentText into the template (the htmlContent string), allowing script/HTML injection; fix it by HTML-escaping those values before interpolation: add or reuse a helper like escapeHtml/escapeHtmlEntities and call it for title and contentText (and any other user-derived fields used in the template), then interpolate the escaped strings into htmlContent (and the other similar fallback block around contentText). Ensure all user content rendered into the fallback document uses this escape function.
src/server/index.ts (1)
64-71: ⚠️ Potential issue | 🟠 Major
Add a timeout to every outbound fetch.
These handlers proxy third-party services on the request path. Right now a hung upstream can hold the Node request open indefinitely and tie up server capacity. Please wrap these calls in a shared timeout/abort helper and translate aborts into a 504.
Suggested hardening
+const fetchWithTimeout = async (
+  input: RequestInfo | URL,
+  init: RequestInit = {},
+  timeoutMs = 30000,
+) => {
+  const controller = new AbortController()
+  const timeout = setTimeout(() => controller.abort(), timeoutMs)
+
+  try {
+    return await fetch(input, { ...init, signal: controller.signal })
+  } finally {
+    clearTimeout(timeout)
+  }
+}
+
-const response = await fetch("https://api.assemblyai.com/v2/transcript", {
+const response = await fetchWithTimeout("https://api.assemblyai.com/v2/transcript", {
   method: "POST",
   headers: {
     Authorization: process.env.ASSEMBLYAI_API_KEY || "",
     "Content-Type": "application/json",
   },
   body: JSON.stringify(params),
 })
Also applies to: 102-109, 161-171, 224-225
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/server/index.ts` around lines 64 - 71, Wrap all outbound fetch calls (e.g., the call that assigns response = await fetch(...) in src/server/index.ts and the other fetches at the referenced locations) with a shared timeout/abort helper (e.g., fetchWithTimeout or abortableFetch) that uses AbortController and a configurable timeout (e.g., 5–10s), clears the timeout on completion, and rejects with a distinct AbortError when timed out; then update the handlers that call fetch (the functions/blocks that await response) to catch abort/timeouts and translate them into an HTTP 504 response. Ensure the helper is reusable across the other fetch sites (the calls at the other ranges noted) and that only aborts/timeouts (not other network errors) map to 504 while preserving normal error handling for other failures.
src/lib/error-utils.ts (1)
15-33: ⚠️ Potential issue | 🟡 Minor
Normalize error messages before substring matching.
These checks are still case-sensitive, so variants like Network request failed, Rate Limit, or Payload Too Large can miss the intended branch and fall back to the generic message. Preserve the original text for display, but compare against a lowercased copy instead.
♻️ Suggested fix
 if (error instanceof TypeError) {
+  const message = error.message.toLowerCase()
   // fetch() throws TypeError for network failures
   return (
-    error.message.includes("fetch") ||
-    error.message.includes("network") ||
-    error.message.includes("Failed to fetch")
+    message.includes("fetch") ||
+    message.includes("network") ||
+    message.includes("failed to fetch")
   )
 }
 ...
 if (error instanceof Error) {
   const message = error.message
+  const normalizedMessage = message.toLowerCase()
-  if (message.includes("413") || message.includes("too large")) {
+  if (
+    normalizedMessage.includes("413") ||
+    normalizedMessage.includes("too large")
+  ) {
 ...
-  if (message.includes("429") || message.includes("rate limit")) {
+  if (
+    normalizedMessage.includes("429") ||
+    normalizedMessage.includes("rate limit")
+  ) {
 ...
-  if (
-    message.includes("500") ||
-    message.includes("502") ||
-    message.includes("503")
-  ) {
+  if (
+    normalizedMessage.includes("500") ||
+    normalizedMessage.includes("502") ||
+    normalizedMessage.includes("503")
+  ) {
Also applies to: 81-103
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/lib/error-utils.ts` around lines 15 - 33, The runtime checks currently do case-sensitive substring matching (notably the "error instanceof TypeError" branch) and can miss variants; create a lowercased copy of the message (e.g., const lower = error.message?.toLowerCase() || "") and use lower.includes(...) for all substring checks in the TypeError branch and the Error branch (replace direct error.message.includes calls), while leaving error.message intact for any display/logging; apply the same normalization change to the other identical block referenced (lines 81-103).
src/lib/persistence-service.ts (1)
130-155: ⚠️ Potential issue | 🟠 Major
Propagate IndexedDB write failures from saveSession.
Right now a failed store.put() only gets logged, so createSession() and updateSession() still return success even when nothing was persisted. That will silently lose session state and break resume/history flows after reload.
💡 Suggested fix
 export const saveSession = async (
   session: TranscriptionSession,
 ): Promise<void> => {
   try {
     const db = await initDb()
@@
     console.log("Session saved successfully:", session.id)
   } catch (error) {
     console.error("Error in saveSession:", error)
+    throw error instanceof Error
+      ? error
+      : new Error("Failed to save session")
   }
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/lib/persistence-service.ts` around lines 130 - 155, saveSession currently swallows IndexedDB write failures (store.put errors) so callers like createSession/updateSession believe persistence succeeded; change saveSession (the function) to propagate failures by rejecting when store.put fails (use the request.error/event to construct/reject with a meaningful Error) and do not swallow exceptions in the outer catch—log the error but rethrow it so callers receive the rejected Promise and can handle persistence failures. Ensure the Promise used for the put calls reject with request.error (or a descriptive Error) and that the outer catch rethrows the error instead of just console.error.
♻️ Duplicate comments (10)
src/components/ui/animated-backdrop.tsx (1)
22-28: ⚠️ Potential issue | 🟠 Major
Backdrop should not render an inert full-screen button and should be layered behind content
At Line 25, onClick is optional, but a focusable <button> is always rendered. This creates a dead control when undefined and may block child interactions without explicit z-index layering.
♿ Proposed fix
-      <button
-        type="button"
-        className="absolute inset-0 bg-black/50 backdrop-blur-sm"
-        onClick={onClick}
-        aria-label="Close overlay"
-      />
-      {children}
+      {onClick ? (
+        <button
+          type="button"
+          className="absolute inset-0 z-0 bg-black/50 backdrop-blur-sm"
+          onClick={onClick}
+          aria-label="Close overlay"
+        />
+      ) : (
+        <div
+          aria-hidden="true"
+          className="absolute inset-0 z-0 bg-black/50 backdrop-blur-sm"
+        />
+      )}
+      <div className="relative z-10 w-full">{children}</div>
Verify each finding against the current code and only fix it if needed. In `@src/components/ui/animated-backdrop.tsx` around lines 22 - 28, The backdrop currently always renders a focusable <button> inside AnimatedBackdrop even when the onClick prop is undefined; change AnimatedBackdrop to conditionally render an interactive <button> only when onClick is provided (attach onClick, aria-label and focusable behavior), and render a non-interactive element (e.g., a <div> with the same className/bg/backdrop styles and aria-hidden="true" and pointer-events-none) when onClick is absent so it can't receive focus or block child interactions; ensure the interactive branch keeps the same className and semantics and the non-interactive branch is layered correctly behind {children}.
src/app/studio/page.tsx (1)
13-15: ⚠️ Potential issue | 🟠 Major

Don't navigate during render.

Calling `router.replace()` here introduces a render-phase side effect. In App Router with React concurrent rendering, this can trigger state-update warnings and unpredictable behavior because Next.js's router is React-state-driven. Move the redirect into a `useEffect`, or convert this page to a Server Component and use `redirect("/")`.

Client-side fix

```diff
 import { FileAudio } from "lucide-react"
 import { useRouter, useSearchParams } from "next/navigation"
+import { useEffect } from "react"
 import { Button } from "@/components/ui/button"

 export default function StudioRedirectPage() {
   const router = useRouter()
   const searchParams = useSearchParams()
+  const session = searchParams.get("session")

-  if (searchParams.get("session")) {
-    router.replace("/")
-    return null
-  }
+  useEffect(() => {
+    if (session) router.replace("/")
+  }, [router, session])
+
+  if (session) return null
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/app/studio/page.tsx` around lines 13 - 15, The redirect using router.replace() is being performed during render (when checking searchParams.get("session")), which causes render-phase side effects; move that navigation into a useEffect inside the component: import and call useEffect, read searchParams (or derive the session flag) during render but only call router.replace("/") from inside useEffect when session is present, and keep the component rendering null or a loading state while the effect runs; update references to router.replace and searchParams.get("session") in the component (page/default export) accordingly.
src/app/error.tsx (1)
18-20: ⚠️ Potential issue | 🟠 Major

Do not log the raw route error object in the browser.

Line 19 still prints the full exception to the client console, which can leak internal messages or stack details to end users. Send the full error to telemetry instead, and only log a minimal digest in development if you need a browser-side breadcrumb.
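The split the comment asks for can be sketched as a small helper; `report` stands in for whatever telemetry client the app wires up (an assumption, not an existing API in this repo):

```typescript
// Sketch: full error to telemetry, minimal dev-only breadcrumb to the console.
// `report` is a placeholder for a real reporter (e.g. a Sentry-style client).
function logRouteError(
  error: Error,
  report: (e: Error) => void,
  isDev: boolean,
): void {
  report(error) // the complete error (message + stack) goes to telemetry only
  if (isDev) {
    // Browser-side breadcrumb with no message or stack attached.
    console.debug("Route error boundary triggered")
  }
}
```

In `error.tsx` the `useEffect` would call this with `process.env.NODE_ENV === "development"` as the flag.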
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/app/error.tsx` around lines 18 - 20, The useEffect in src/app/error.tsx currently calls console.error("Route error boundary triggered:", error) which leaks the raw error to the browser; replace that raw client-side logging by sending the full error object to your telemetry/reporter (e.g., call your telemetry API such as telemetry.reportError(error) or sendErrorTelemetry(error) inside the useEffect) and only output a minimal, non-sensitive digest to the browser in development (e.g., if (process.env.NODE_ENV === 'development') console.debug('Route error boundary triggered')) so the full stack/exception is never printed to end-user consoles; update the useEffect that references error accordingly.
src/app/page.tsx (1)
101-103: ⚠️ Potential issue | 🟡 Minor

Keep the `localStorage` write non-fatal.

This write is still unguarded. If `localStorage.setItem(...)` throws here, the upload flow reports failure after the transcription has already been created server-side, and the user never gets routed to it.

💡 Suggested fix

```diff
       if (audioUrl) {
-        localStorage.setItem(`audioUrl_${resultData.id}`, audioUrl)
+        try {
+          localStorage.setItem(`audioUrl_${resultData.id}`, audioUrl)
+        } catch (error) {
+          console.warn("Failed to persist audio URL locally:", error)
+        }
       }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/app/page.tsx` around lines 101 - 103, The localStorage write using localStorage.setItem(`audioUrl_${resultData.id}`, audioUrl) must be made non-fatal: wrap that call in a try/catch so any exception is caught and handled (e.g., console.warn or processLogger.warn) and do not rethrow or change the success flow; ensure audioUrl and resultData.id are checked for presence before calling and that failure to write to localStorage does not prevent navigation or mark the upload/transcription as failed.
src/components/studio/EnhancedTranscript.tsx (1)
73-85: ⚠️ Potential issue | 🟠 Major

Search still does nothing in the non-segmented fallback.

This handler only searches `segments`. When the component falls back to rendering the full `transcription`, the search box still renders but never finds or highlights anything, so the feature silently breaks for that path. Either hide search in fallback mode or implement the same search/highlight behavior against `transcription`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/studio/EnhancedTranscript.tsx` around lines 73 - 85, The handleSearch function currently only searches the segments array and ignores the non-segmented transcription fallback, so implement search for both paths: if segments exists keep the existing logic (function handleSearch -> segments), otherwise search the transcription string (use transcription.toLowerCase().includes(term.toLowerCase())) and compute appropriate result indices or ranges to highlight, then call setSearchResults with the corresponding positions; alternatively hide the search UI when transcription is non-segmented by checking segments before rendering the search box and only render it when segments is truthy (update render logic where the search input is displayed and the handleSearch handler is attached).
src/components/studio/ExportControls.tsx (1)
307-323: ⚠️ Potential issue | 🟡 Minor

Associate “Format” with the selector group.

This is still visually labeled only. Screen readers won't announce that these buttons belong to the “Format” control group unless you wrap them in a `fieldset`/`legend` or add a group role with `aria-labelledby`.

Suggested fix

```diff
-      <div>
-        <p className="mb-2 text-xs text-gray-600">Format</p>
-        <div className="grid grid-cols-4 gap-1">
+      <fieldset>
+        <legend className="mb-2 text-xs text-gray-600">Format</legend>
+        <div className="grid grid-cols-4 gap-1">
           {(
             ["txt", "docx", "srt", "vtt", "json", "csv", "md"] as const
           ).map((format) => (
@@
-        </div>
-      </div>
+        </div>
+      </fieldset>
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/studio/ExportControls.tsx` around lines 307 - 323, The "Format" label is only visual and not associated with the button group; update the ExportControls component so screen readers know the buttons are a single control by either wrapping the button list in a semantic fieldset/legend (replace the <p className="...">Format</p> with a <legend> inside a <fieldset>) or by keeping the visible <p> and giving it an id then wrapping the buttons in a container with role="group" and aria-labelledby pointing to that id; ensure this change is applied around the mapped Button elements (where selectedFormat, setSelectedFormat and the format map are used) so the group is correctly announced.
src/components/studio/AudioPlayer.tsx (1)
210-217: ⚠️ Potential issue | 🟠 Major

Reject loop points that make B <= A.

This is still accepting loop markers in any order, but the rewind effect assumes `loopStart < loopEnd`. If the user sets B before A, playback immediately jumps back to A and the loop becomes unusable. Validate the ordering when saving the points.

Suggested fix

```diff
   const setLoopPoint = (type: "start" | "end") => {
     if (type === "start") {
+      if (loopEnd !== null && currentTime >= loopEnd) {
+        toast.error("Loop start must be before loop end")
+        return
+      }
       setLoopStart(currentTime)
       toast.success(`Loop start: ${formatDuration(currentTime)}`)
     } else {
+      if (loopStart !== null && currentTime <= loopStart) {
+        toast.error("Loop end must be after loop start")
+        return
+      }
       setLoopEnd(currentTime)
       toast.success(`Loop end: ${formatDuration(currentTime)}`)
     }
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/studio/AudioPlayer.tsx` around lines 210 - 217, The setLoopPoint handler currently allows creating invalid ranges (loopEnd <= loopStart); update setLoopPoint to validate ordering before calling setLoopStart/setLoopEnd: when type === "start", if currentTime >= loopEnd (or if loopEnd is defined and currentTime >= loopEnd) refuse to set and show a toast.error message; when type === "end", if currentTime <= loopStart (or if loopStart is defined and currentTime <= loopStart) refuse to set and show a toast.error message; otherwise call setLoopStart/setLoopEnd and toast.success as before. Use the existing symbols setLoopPoint, setLoopStart, setLoopEnd, currentTime, loopStart, loopEnd to locate and implement the checks.
src/stores/history-store.ts (2)
64-99: ⚠️ Potential issue | 🟡 Minor

Log which legacy entry breaks the migration.

Any `histStore.put()` failure will abort the upgrade transaction, but this loop still gives no visibility into which migrated row caused it. That makes one-time history migration failures very hard to diagnose in production. Add `putReq.onerror` plus `tx.onerror`/`tx.onabort` logging with `s.id` and `s.predictionId`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/stores/history-store.ts` around lines 64 - 99, The migration loop inside sessStore.getAll().onsuccess calls histStore.put for each legacy session but lacks error handlers; add a variable to capture each put request (e.g., const putReq = histStore.put(...)) and attach putReq.onerror to log the failing legacy row including s.id and s.predictionId, and also attach the upgrade transaction handlers (tx.onerror and tx.onabort) to log the transaction failure with context (include s.id and s.predictionId or last attempted predictionId) so you can identify which migrated entry caused the abort; update the code around sessStore.getAll().onsuccess, histStore.put, and the upgrade tx to set these handlers.
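The handler wiring can be sketched against a reduced request shape; `PutReqLike` and the injected `log` parameter are illustrative stand-ins for the real `IDBRequest` and `console.error`:

```typescript
// Reduced shape of an IDBRequest: only the pieces the logging needs.
interface PutReqLike {
  onerror: (() => void) | null
  error?: unknown
}

// Attach a handler that names the exact legacy row that failed to migrate.
function attachMigrationLogging(
  putReq: PutReqLike,
  row: { id: string; predictionId: string },
  log: (msg: string) => void,
): void {
  putReq.onerror = () => {
    log(
      `history migration failed for id=${row.id} ` +
        `predictionId=${row.predictionId}: ${String(putReq.error)}`,
    )
  }
}
```

In the real store, the same handler would be attached to each `histStore.put(...)` request inside the migration loop, with matching `tx.onerror`/`tx.onabort` logging on the upgrade transaction.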
167-180: ⚠️ Potential issue | 🟠 Major

Keep `patch` as a single read-write transaction.

This still reads in a `readonly` transaction and writes in a second one. Concurrent patches for the same `predictionId` can race and drop one caller's updates.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/stores/history-store.ts` around lines 167 - 180, The patch implementation reads with a "readonly" transaction then writes in a separate "readwrite" transaction which can race; change patch to perform both get and put inside a single dbOperation("readwrite", ...) callback so the read and update happen in one transaction (use dbOperation with "readwrite" to fetch the existing HistoryEntry and immediately put the merged object), keep using set to update the Zustand entries array (entries.map comparing e.predictionId to predictionId) and reference the same symbols: patch, dbOperation, HistoryEntry, set, entries, predictionId.
src/lib/export-formats.ts (1)
38-46: ⚠️ Potential issue | 🟠 Major

Use sequential cue numbers instead of `segment.id`.

SRT cue identifiers must be 1-based and sequential. Backend segment ids can be sparse or opaque, so `segment.id + 1` can emit invalid cue ordering.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/lib/export-formats.ts` around lines 38 - 46, The SRT generator is using segment.id (via segment.id + 1) which can be sparse/opaque; replace that with a 1-based sequential cue number generated from the map index so cues are always sequential; in the mapping where segments are transformed (the return block that calls formatTimeForSRT and builds speakerPrefix), change the code that emits the cue identifier to use the map index (i + 1) instead of segment.id (or otherwise generate sequential numbers) so SRT cue identifiers are strictly 1-based and ordered.
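The index-based numbering can be sketched as follows; the `Segment` shape and the time formatter here are simplified stand-ins, not the real export-formats.ts helpers:

```typescript
// Minimal segment shape for the sketch; the real type has more fields.
interface Segment { id: number; start: number; end: number; text: string }

const pad = (n: number, w: number) => String(Math.trunc(n)).padStart(w, "0")

// Simplified SRT time formatter (HH:MM:SS,mmm), clamped at zero.
const formatTime = (s: number) => {
  const ms = Math.max(0, Math.round(s * 1000))
  return `${pad(ms / 3600000, 2)}:${pad((ms / 60000) % 60, 2)}:${pad((ms / 1000) % 60, 2)},${pad(ms % 1000, 3)}`
}

function generateSRT(segments: Segment[]): string {
  return segments
    .map(
      (seg, i) =>
        // i + 1, not seg.id + 1: cue numbers must be 1-based and sequential.
        `${i + 1}\n${formatTime(seg.start)} --> ${formatTime(seg.end)}\n${seg.text}\n`,
    )
    .join("\n")
}
```

Even with sparse backend ids (7, 42, ...), the emitted cues are numbered 1, 2, ... in order.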
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro Plus
Run ID: 16a459e0-5ed6-4db5-b549-bebcdac9a469
⛔ Files ignored due to path filters (2)
- bun.lock is excluded by `!**/*.lock`
- package-lock.json is excluded by `!**/package-lock.json`
📒 Files selected for processing (97)
biome.json, eslint.config.js, next.config.mjs, package.json, postcss.config.js, scripts/setup-firebase-cors.js, src/app/about/page.tsx, src/app/api/firebase-proxy/route.ts, src/app/api/prediction/[id]/route.ts, src/app/api/printerz-proxy/route.ts, src/app/api/transcribe/route.ts, src/app/changelog/page.tsx, src/app/documentation/page.tsx, src/app/error.tsx, src/app/errors/[code]/page.tsx, src/app/global-error.tsx, src/app/history/page.tsx, src/app/not-found.tsx, src/app/page.tsx, src/app/studio/[id]/page.tsx, src/app/studio/page.tsx, src/app/transcribe/[id]/page.tsx, src/components/Changelog.tsx, src/components/Documentation.tsx, src/components/UploadAudio.tsx, src/components/feedback/Feedback.tsx, src/components/feedback/FeedbackForm.tsx, src/components/feedback/FeedbackModals.tsx, src/components/layout/Header.tsx, src/components/studio/AudioPlayer.tsx, src/components/studio/ChaptersPanel.tsx, src/components/studio/EnhancedTranscript.tsx, src/components/studio/EntitiesPanel.tsx, src/components/studio/ExportControls.tsx, src/components/studio/FileDetails.tsx, src/components/studio/KeyPhrasesPanel.tsx, src/components/studio/KeyboardShortcutsModal.tsx, src/components/studio/SentimentPanel.tsx, src/components/studio/SummaryPanel.tsx, src/components/studio/TranscriptStatistics.tsx, src/components/transcription/FileUploadInput.tsx, src/components/transcription/TranscriptionOptions.tsx, src/components/transcription/TranscriptionStudio.tsx, src/components/transcription/UrlInput.tsx, src/components/ui/LoadingFallback.tsx, src/components/ui/alert-dialog.tsx, src/components/ui/animated-backdrop.tsx, src/components/ui/animated-button.tsx, src/components/ui/animated-card.tsx, src/components/ui/animated-list.tsx, src/components/ui/badge.tsx, src/components/ui/button.tsx, src/components/ui/card.tsx, src/components/ui/dialog.tsx, src/components/ui/dropdown-menu.tsx, src/components/ui/input.tsx, src/components/ui/label.tsx, src/components/ui/mobile-button-variants.ts, src/components/ui/mobile-button.tsx, src/components/ui/mobile-dialog.tsx, src/components/ui/mobile-input.tsx, src/components/ui/mobile-navigation.tsx, src/components/ui/progress.tsx, src/components/ui/scroll-area.tsx, src/components/ui/scroll-reveal-section.tsx, src/components/ui/select.tsx, src/components/ui/separator.tsx, src/components/ui/sonner.tsx, src/components/ui/switch.tsx, src/components/ui/tabs.tsx, src/components/ui/textarea.tsx, src/components/ui/tooltip.tsx, src/hooks/use-file-input.tsx, src/hooks/useAudioPlayer.ts, src/hooks/useDebounce.ts, src/hooks/useScrollAnimation.tsx, src/hooks/useTranscriptionPolling.ts, src/index.css, src/lib/animations.ts, src/lib/error-utils.ts, src/lib/export-formats.ts, src/lib/file-format-utils.ts, src/lib/firebase-proxy.ts, src/lib/firebase-utils.ts, src/lib/firebase.ts, src/lib/pdf-generation.ts, src/lib/persistence-service.ts, src/lib/speaker-colors.ts, src/lib/storage-service.ts, src/lib/utils.ts, src/server/index.ts, src/services/transcription.ts, src/stores/history-store.ts, src/types/index.ts, src/types/transcription.ts, tsconfig.json, tsconfig.tsbuildinfo
💤 Files with no reviewable changes (1)
- eslint.config.js
```ts
const body = await request.json()
const { url } = body

if (!url?.includes("firebasestorage.googleapis.com")) {
  return NextResponse.json(
    { error: "Invalid or missing Firebase Storage URL" },
    { status: 400 },
  )
}
```
Harden Firebase URL validation to prevent SSRF bypass
At Line 9, url?.includes("firebasestorage.googleapis.com") is not a safe origin check and can be bypassed by attacker-controlled URLs containing that substring. Parse the URL and strictly validate protocol + hostname.
🔒 Proposed fix
```diff
-  const body = await request.json()
-  const { url } = body
-
-  if (!url?.includes("firebasestorage.googleapis.com")) {
+  const body = await request.json()
+  const { url } = body ?? {}
+
+  if (typeof url !== "string") {
+    return NextResponse.json(
+      { error: "Invalid or missing Firebase Storage URL" },
+      { status: 400 },
+    )
+  }
+
+  let parsedUrl: URL
+  try {
+    parsedUrl = new URL(url)
+  } catch {
+    return NextResponse.json(
+      { error: "Invalid or missing Firebase Storage URL" },
+      { status: 400 },
+    )
+  }
+
+  if (
+    parsedUrl.protocol !== "https:" ||
+    parsedUrl.hostname !== "firebasestorage.googleapis.com"
+  ) {
     return NextResponse.json(
       { error: "Invalid or missing Firebase Storage URL" },
       { status: 400 },
     )
   }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
const body = await request.json()
const { url } = body ?? {}

if (typeof url !== "string") {
  return NextResponse.json(
    { error: "Invalid or missing Firebase Storage URL" },
    { status: 400 },
  )
}

let parsedUrl: URL
try {
  parsedUrl = new URL(url)
} catch {
  return NextResponse.json(
    { error: "Invalid or missing Firebase Storage URL" },
    { status: 400 },
  )
}

if (
  parsedUrl.protocol !== "https:" ||
  parsedUrl.hostname !== "firebasestorage.googleapis.com"
) {
  return NextResponse.json(
    { error: "Invalid or missing Firebase Storage URL" },
    { status: 400 },
  )
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/app/api/firebase-proxy/route.ts` around lines 6 - 13, The current check
using url?.includes("firebasestorage.googleapis.com") is unsafe; replace it by
parsing the incoming url (the body.url variable) with the URL constructor in a
try/catch, then validate that urlObj.protocol === "https:" and urlObj.hostname
=== "firebasestorage.googleapis.com" (reject anything else) before proceeding;
if parsing fails or the protocol/hostname check fails, return the same
NextResponse.json({ error: "Invalid or missing Firebase Storage URL" }, {
status: 400 }) to block SSRF attempts.
Introduce automated CI pipelines for Biome linting/formatting and TypeScript type-checking to ensure code quality and prevent regressions. Also includes various stability improvements: - Refactor `useHistoryStore` to use more robust IndexedDB transaction handling. - Enhance error boundary logging with environment-aware logic. - Improve `AudioPlayer` loop point validation logic. - Add search highlighting to `EnhancedTranscript`. - Implement safer `localStorage` access in analytics and page components.
Remove the Printerz API proxy route, server-side endpoint, and local PDF generation logic. This includes cleaning up associated environment variables and removing unused dependencies like jspdf and pdfmake. - Delete `src/app/api/printerz-proxy/route.ts` and `src/lib/pdf-generation.ts` - Remove `/api/printerz/render` endpoint from `src/server/index.ts` - Clean up `package.json` and `.env.example` - Simplify `EnhancedTranscript` and `ErrorState` components by removing unnecessary lint suppressions and logic complexity Co-authored-by: Copilot <copilot@github.com>
Actionable comments posted: 19
♻️ Duplicate comments (14)
src/components/ui/animated-backdrop.tsx (1)
22-30: ⚠️ Potential issue | 🟡 Minor

Render a non-interactive backdrop when `onClick` is absent.

At Line 22, the current conditional removes the backdrop entirely if `onClick` is not passed. That makes non-dismissible dialogs lose the dim/blur layer. Keep the visual backdrop in both cases; only switch between clickable and non-clickable elements.

Suggested patch

```diff
-      {onClick && (
-        <button
-          type="button"
-          className="absolute inset-0 bg-black/50 backdrop-blur-sm"
-          onClick={onClick}
-          aria-label="Close overlay"
-        />
-      )}
-      <div className="relative z-10">{children}</div>
+      {onClick ? (
+        <button
+          type="button"
+          className="absolute inset-0 z-0 bg-black/50 backdrop-blur-sm"
+          onClick={onClick}
+          aria-label="Close overlay"
+        />
+      ) : (
+        <div
+          aria-hidden="true"
+          className="absolute inset-0 z-0 bg-black/50 backdrop-blur-sm"
+        />
+      )}
+      <div className="relative z-10">{children}</div>
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/ui/animated-backdrop.tsx` around lines 22 - 30, The backdrop is removed when onClick is missing; always render the visual backdrop but make it non-interactive if no handler is provided. Update the JSX in the AnimatedBackdrop component to render the same element with className "absolute inset-0 bg-black/50 backdrop-blur-sm" in both cases: if onClick exists render a <button> with onClick and aria-label="Close overlay", otherwise render a non-interactive <div> (or <span>) with aria-hidden="true" (or role="presentation") so the dim/blur layer remains for non-dismissible dialogs while preserving accessibility and behavior.
src/components/analytics/VercelAnalytics.tsx (1)
5-11: ⚠️ Potential issue | 🟡 Minor

Handle `localStorage` read failures defensively.

Line 7 and Line 8 can still throw in restricted contexts (storage blocked/private mode), which can break `beforeSend`. Wrap the read in `try/catch` and fail closed for privacy.

🔧 Proposed fix

```diff
 function beforeSend(event: BeforeSendEvent) {
-  if (
-    typeof globalThis.localStorage !== "undefined" &&
-    globalThis.localStorage.getItem("analytics_opt_out") === "true"
-  ) {
+  let optedOut = false
+  try {
+    optedOut = globalThis.localStorage?.getItem("analytics_opt_out") === "true"
+  } catch {
+    return null
+  }
+
+  if (optedOut) {
     return null
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/analytics/VercelAnalytics.tsx` around lines 5 - 11, The localStorage read in beforeSend (globalThis.localStorage.getItem) can throw in restricted/private contexts; wrap the access in a try/catch inside beforeSend and treat any exception as an opt-out (return null) to fail closed for privacy, so on error you log/ignore internally and return null to stop sending the event.
src/lib/export-formats.ts (3)
148-149: ⚠️ Potential issue | 🟡 Minor

Word count calculation inconsistent with `generateJSON`.

Line 149 uses `transcription.split(/\s+/).length`, which returns 1 for empty/whitespace-only strings. `generateJSON` at line 84 uses `.trim().split(/\s+/).filter(Boolean).length`. Use the same normalization for consistency.

🔧 Proposed fix

```diff
-    md += `*Word count: ${transcription.split(/\s+/).length}*\n`
+    md += `*Word count: ${transcription.trim().split(/\s+/).filter(Boolean).length}*\n`
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/lib/export-formats.ts` around lines 148 - 149, The word count appended to the Markdown is inconsistent with generateJSON's counting; update the calculation that builds md (using the transcription variable) to match generateJSON by trimming and splitting then filtering out empty tokens (e.g., use transcription.trim().split(/\s+/).filter(Boolean).length) so whitespace-only or empty transcriptions yield 0 and both outputs stay consistent.
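The off-by-one is easy to demonstrate with the two countings side by side:

```typescript
// split(/\s+/) on an empty string still yields one empty token, so the naive
// count reports 1; trimming and filtering empty tokens gives the true count.
const naiveCount = (s: string) => s.split(/\s+/).length
const wordCount = (s: string) => s.trim().split(/\s+/).filter(Boolean).length
```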
8-18: ⚠️ Potential issue | 🟡 Minor

Clamp negative timestamps before computing time components.

The `splitTimestamp` function does not guard against negative input. If `seconds` is slightly negative (e.g., from floating-point drift), `Math.round(seconds * 1000)` produces a negative `totalMs`, which cascades into negative `hours`/`minutes`/`secs`/`ms` and invalid SRT/VTT timestamps.

🛡️ Proposed fix

```diff
 const splitTimestamp = (seconds: number) => {
-  const totalMs = Math.round(seconds * 1000)
+  const totalMs = Math.max(0, Math.round(seconds * 1000))
   const ms = totalMs % 1000
   const totalSecs = Math.floor(totalMs / 1000)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/lib/export-formats.ts` around lines 8 - 18, splitTimestamp can produce negative components when seconds is slightly negative; clamp the time to zero before computing components. In the splitTimestamp function, ensure you use a non-negative millisecond value (e.g., compute totalMs as Math.max(0, Math.round(seconds * 1000))) so ms, totalSecs, hours, minutes and secs are always >= 0; update any dependent calculations in splitTimestamp to use that clamped totalMs.
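With the clamp applied, slightly negative input collapses to zero. The hours/minutes/seconds derivation below is the conventional continuation and an assumption about the rest of the helper, which is not shown in full here:

```typescript
// Clamped timestamp splitter: negative drift (e.g. -0.0004s) maps to zero
// instead of producing negative components and invalid SRT/VTT timestamps.
const splitTimestamp = (seconds: number) => {
  const totalMs = Math.max(0, Math.round(seconds * 1000))
  const ms = totalMs % 1000
  const totalSecs = Math.floor(totalMs / 1000)
  const hours = Math.floor(totalSecs / 3600)
  const minutes = Math.floor((totalSecs % 3600) / 60)
  const secs = totalSecs % 60
  return { hours, minutes, secs, ms }
}
```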
98-101: ⚠️ Potential issue | 🟡 Minor

CSV fallback schema differs from segmented path.

The no-segments branch emits `id,start,end,text` (4 columns), while the segmented path at lines 103-108 emits `id,start,end,duration,text` (5 columns). Downstream consumers must special-case the export based on whether segments existed.

🔧 Proposed fix to unify schemas

```diff
 if (!segments || segments.length === 0) {
   return (
-    'id,start,end,text\n1,0,0,"' + transcription.replace(/"/g, '""') + '"'
+    'id,start,end,duration,text\n1,0,0,0,"' + transcription.replace(/"/g, '""') + '"'
   )
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/lib/export-formats.ts` around lines 98 - 101, The CSV fallback when segments is empty uses a 4-column schema ('id,start,end,text') while the segmented branch emits 5 columns including duration; change the no-segments return to match the segmented schema by adding a duration column (e.g. header 'id,start,end,duration,text' and a duration value like 0 in the row) so consumers always receive the same columns; update the return that builds the single-row CSV (which uses transcription) to include the extra ",0" duration field.src/components/studio/AudioPlayer.tsx (1)
332-343: ⚠️ Potential issue | 🟠 Major

Add media event handlers to sync UI state from external playback changes.

The `<video>` element only updates React state (`isPlaying`, `volume`, `isMuted`) from local click handlers. When the `useAudioPlayer` hook or keyboard shortcuts call `play()`/`pause()`/`volume` directly on the media element, UI state drifts. Add `onPlay`, `onPause`, and `onVolumeChange` handlers to derive state from actual media events.

🐛 Proposed fix

```diff
       <video
         ref={audioRef}
         src={audioUrl}
+        onPlay={() => setIsPlaying(true)}
+        onPause={() => setIsPlaying(false)}
+        onVolumeChange={() => {
+          const media = audioRef.current
+          if (!media) return
+          setIsMuted(media.muted)
+          setVolume(media.volume)
+        }}
         onTimeUpdate={handleTimeUpdate}
         onLoadedMetadata={handleDurationChange}
         onEnded={() => setIsPlaying(false)}
         preload="metadata"
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/studio/AudioPlayer.tsx` around lines 332 - 343, The UI state can drift because playback/volume may be changed externally (e.g. via useAudioPlayer or keyboard shortcuts); add media event handlers on the <video> element to derive React state from actual media events: add onPlay and onPause to call setIsPlaying(true/false) (use event.target or audioRef.current), and add onVolumeChange to read event.target.volume and event.target.muted and call setVolume(...) and setIsMuted(...). Hook these into the same component that currently uses audioRef, handleTimeUpdate, handleDurationChange and setIsPlaying so all state (isPlaying, volume, isMuted) is kept in sync with the media element.
src/server/index.ts (3)
219-234: ⚠️ Potential issue | 🔴 Critical

Validate the full Firebase URL before fetching it.

The new check only constrains `hostname`. That still accepts non-HTTPS URLs and any bucket/object hosted on `firebasestorage.googleapis.com`, so this proxy can fetch arbitrary Firebase objects. Enforce `https:` and verify the pathname/bucket against your configured storage bucket before calling `fetch`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/server/index.ts` around lines 219 - 234, The code currently only checks parsedUrl.hostname before calling fetch; update the validation to require parsedUrl.protocol === "https:" and validate parsedUrl.pathname to ensure it references your configured Firebase storage bucket (e.g., compare the bucket name parsed from parsedUrl.pathname against your storageBucket or process.env.FIREBASE_STORAGE_BUCKET), returning 400 JSON errors for protocol or bucket mismatches; perform these checks (using parsedUrl.protocol and parsedUrl.pathname) before invoking fetch(parsedUrl.toString()) so only HTTPS requests for the allowed bucket are proxied.
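A self-contained sketch of the combined checks; the `/v0/b/{bucket}/o/` prefix follows the Firebase Storage REST URL layout, and the concrete bucket name would come from configuration (an assumption here):

```typescript
// Strict allow-list check: parse, then require HTTPS, the exact Firebase
// Storage hostname, and an object path under the configured bucket.
function isAllowedFirebaseUrl(raw: string, bucket: string): boolean {
  let parsed: URL
  try {
    parsed = new URL(raw)
  } catch {
    return false // unparsable input is rejected, not guessed at
  }
  return (
    parsed.protocol === "https:" &&
    parsed.hostname === "firebasestorage.googleapis.com" &&
    parsed.pathname.startsWith(`/v0/b/${bucket}/o/`)
  )
}
```

The route handler would call this before `fetch` and return a 400 on `false`, closing the substring-match bypass.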
82-84: ⚠️ Potential issue | 🟠 Major

Stop logging raw AssemblyAI payloads.
Both handlers dump the upstream JSON response directly to server logs. Those payloads can contain transcript text and other user data; keep only safe identifiers, status, and coarse metrics.
Also applies to: 123-125
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/server/index.ts` around lines 82 - 84, The handlers currently call response.json() into data then log the entire payload with console.log("AssemblyAI API response:", data) and return it via res.json(data); remove the raw payload logging and avoid returning full upstream JSON to server logs — instead extract and log only safe fields (e.g., data.id, data.status, coarse metrics like data.audio_duration or transcript length) and ensure res.json only sends necessary safe identifiers/status/metrics to clients; update both places where this pattern occurs (the block using response.json()/console.log/res.json and the similar block around lines 123-125) so no raw transcript or user content is written to logs.
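One way to keep the logs safe is to project the payload down to an allow-list before logging; the field names follow those mentioned in the prompt above, and the exact upstream schema is not assumed beyond them:

```typescript
// Reduce an upstream payload to log-safe fields: identifiers, status, and
// coarse metrics only. Transcript text and other user data are dropped.
function safeLogFields(data: Record<string, unknown>) {
  return {
    id: data.id,
    status: data.status,
    audio_duration: data.audio_duration,
    // Log the length of the transcript, never its content.
    textLength: typeof data.text === "string" ? data.text.length : 0,
  }
}
```

The handler would then call `console.log("AssemblyAI API response:", safeLogFields(data))` while still returning the full `data` to the client.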
243-248: ⚠️ Potential issue | 🟠 Major

This still buffers the entire audio object in memory.

`await response.arrayBuffer()` loads the full upstream file before sending it, so the “stream directly” comment is no longer true. For large recordings this adds latency and memory pressure; pipe `response.body` to `res` instead.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/server/index.ts` around lines 243 - 248, The code currently buffers the entire response via response.arrayBuffer(), which defeats streaming; change the logic around the response and res handling to stream directly from response.body into the Express response instead: set Content-Type using response.headers.get("content-type") (fallback to application/octet-stream), set status using response.status (or 200 fallback), ensure response.body is non-null, and pipe or use stream.pipeline(response.body, res, onError) to forward the upstream stream and propagate errors rather than awaiting arrayBuffer(). Use the existing symbols response, response.arrayBuffer(), response.body, res, and response.headers.get(...) to locate and replace the buffering code.src/lib/firebase-utils.ts (1)
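The piping approach can be sketched with Node's stream utilities; `res` is treated as a plain Writable (an Express response qualifies), and the WHATWG-to-Node conversion via `Readable.fromWeb` assumes Node 18+:

```typescript
import { Readable, Writable } from "node:stream"
import { pipeline } from "node:stream/promises"
import type { ReadableStream as WebReadableStream } from "node:stream/web"

// Forward an upstream fetch Response to a writable sink without buffering.
// pipeline() handles backpressure and propagates errors from either side.
async function proxyBody(upstream: Response, res: Writable): Promise<void> {
  if (!upstream.body) throw new Error("Upstream response has no body")
  const nodeStream = Readable.fromWeb(upstream.body as unknown as WebReadableStream)
  await pipeline(nodeStream, res)
}
```

In the route handler, status and Content-Type would still be set from `response.status` and `response.headers.get("content-type")` before piping.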
49-50: ⚠️ Potential issue | 🟠 Major

Do not log tokenized Firebase download URLs.

This still writes the full `downloadURL` to logs, which exposes the long-lived access token embedded in the URL. Log `snapshot.metadata.fullPath` or `filePath` instead.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/lib/firebase-utils.ts` around lines 49 - 50, The code currently logs the full Firebase download URL (via getDownloadURL and console.log), which exposes the embedded access token; update the logging to avoid printing downloadURL and instead log a non-token identifier such as snapshot.metadata.fullPath or the local filePath variable. Locate the getDownloadURL call and the console.log that prints downloadURL (references: getDownloadURL, snapshot, snapshot.metadata.fullPath, filePath, console.log) and replace the log to output snapshot.metadata.fullPath or filePath while keeping the downloadURL value for internal use only if needed (do not log it).

src/stores/history-store.ts (1)
173-191: ⚠️ Potential issue | 🟠 Major

Don't resolve `patch()` before the write commits.

This promise resolves inside `getReq.onsuccess` right after `store.put(result)`, so a later `put` failure or transaction abort still updates Zustand as if persistence succeeded. Capture the merged entry, wait for `tx.oncomplete`, and wrap `getReq.error` in an `Error` when rejecting.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/stores/history-store.ts` around lines 173 - 191, The promise for merging history resolves too early inside getReq.onsuccess right after store.put(result), so change the logic in the Promise (the block creating merged in patch()) to: compute the merged result in getReq.onsuccess and call store.put(result) but do NOT resolve there; instead attach tx.oncomplete to resolve(result) only after the transaction commits, and keep tx.onerror and tx.onabort rejecting; also change getReq.onerror to reject(new Error(String(getReq.error))) so the rejection wraps the underlying error. Ensure you reference HISTORY_STORE_NAME, HistoryEntry, store.put(result), getReq, tx.oncomplete, tx.onerror, and tx.onabort when making the change.

src/components/feedback/FeedbackModals.tsx (2)
38-40: ⚠️ Potential issue | 🟠 Major

Give the dialog a programmatic name.

`role="dialog"` and `aria-modal="true"` are present, but the modal is still unnamed to screen readers. Wire the existing `<h2>` to the container with `aria-labelledby`.

Suggested fix

```diff
 export function FeedbackModals() {
+  const titleId = React.useId()
   const [activeModal, setActiveModal] = useState<FeedbackType | null>(null)
   ...
         <motion.div
           role="dialog"
           aria-modal="true"
+          aria-labelledby={titleId}
           className="relative mx-auto max-h-[calc(100vh-1.5rem)] w-full max-w-[min(40rem,calc(100vw-1.5rem))] overflow-y-auto rounded-xl bg-white shadow-xl dark:bg-gray-800"
   ...
-          <h2 className="text-xl font-semibold text-gray-900 dark:text-gray-100">
+          <h2
+            id={titleId}
+            className="text-xl font-semibold text-gray-900 dark:text-gray-100"
+          >
```

Also applies to: 52-54
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/feedback/FeedbackModals.tsx` around lines 38 - 40, The dialog container in FeedbackModals.tsx is missing a programmatic name for screen readers; add an aria-labelledby attribute on the container element (the element with role="dialog" and aria-modal="true" in the FeedbackModals component) that points to the id of the existing heading element (<h2>) inside the modal, and give that <h2> a unique id (e.g., feedback-modal-title); apply the same change to the second modal instance referenced around lines 52-54 so both dialog containers use aria-labelledby and their headings have matching ids.
37-40: ⚠️ Potential issue | 🟠 Major

Restore click isolation on the dialog container.

Clicks inside the modal still bubble to `AnimatedBackdrop`, so interacting with the form can close it unexpectedly. Add a `stopPropagation()` handler back on the dialog wrapper.

Suggested fix

```diff
 <AnimatedBackdrop onClick={() => setActiveModal(null)}>
   <motion.div
+    onClick={(event) => event.stopPropagation()}
     role="dialog"
     aria-modal="true"
     className="relative mx-auto max-h-[calc(100vh-1.5rem)] w-full max-w-[min(40rem,calc(100vw-1.5rem))] overflow-y-auto rounded-xl bg-white shadow-xl dark:bg-gray-800"
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/feedback/FeedbackModals.tsx` around lines 37 - 40, The dialog container in FeedbackModals.tsx (the motion.div with role="dialog") is missing a click stopPropagation so clicks inside bubble up to AnimatedBackdrop and can close the modal; add an onClick handler on that motion.div (the dialog wrapper) that calls event.stopPropagation() to prevent events from reaching AnimatedBackdrop and closing the modal when interacting with the form.src/app/transcribe/[id]/page.tsx (1)
150-164: ⚠️ Potential issue | 🟠 Major

Return after terminal poll states.

A `"succeeded"` or explicit `"failed"` response on the 120th attempt still falls through to the timeout block and gets overwritten to `"failed"`.

Suggested fix

```diff
 } else if (data.status === "succeeded") {
   handleTranscriptionSuccess(data.output, audioUrl)
+  return
 } else if (data.status === "failed") {
   stopPolling()
   setStatus("failed")
   setError(data.error || "Transcription failed")
   patchHistory(id, { status: "failed" })
+  return
 }

-if (attemptsRef.current >= 120) {
+if (
+  (data.status === "starting" || data.status === "processing") &&
+  attemptsRef.current >= 120
+) {
   stopPolling()
   setStatus("failed")
   setError("Transcription timed out")
   patchHistory(id, { status: "failed" })
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/app/transcribe/`[id]/page.tsx around lines 150 - 164, The code currently falls through after handling "succeeded" or "failed" and then still executes the timeout check; update the polling branch in page.tsx so terminal states short-circuit: either make the attempts timeout check an else-if (else if (attemptsRef.current >= 120) ...) or add explicit returns immediately after handling data.status === "succeeded" (handleTranscriptionSuccess(...); return) and after the data.status === "failed" block (stopPolling(); setStatus(...); setError(...); patchHistory(...); return) so handleTranscriptionSuccess, stopPolling, setStatus, setError, patchHistory are not overwritten by the timeout logic.
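The short-circuit logic above can be isolated into a small pure decision function, which also makes the terminal-state behavior unit-testable. This is a sketch only; `nextPollAction` and the status union are illustrative names, not identifiers from the codebase:

```typescript
type PollStatus = "starting" | "processing" | "succeeded" | "failed"
type PollAction = "continue" | "succeed" | "fail" | "timeout"

// Terminal statuses short-circuit before the timeout check, so a "succeeded"
// response on the 120th attempt can never be rewritten to a timeout failure.
function nextPollAction(
  status: PollStatus,
  attempts: number,
  maxAttempts = 120,
): PollAction {
  if (status === "succeeded") return "succeed"
  if (status === "failed") return "fail"
  // Only still-pending predictions are subject to the timeout.
  return attempts >= maxAttempts ? "timeout" : "continue"
}
```

The polling effect would then switch on the returned action, keeping `stopPolling`/`patchHistory` calls in one place.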
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/biome.yml:
- Around line 17-19: Replace the nondeterministic "bun-version: latest" used
with the CI action "uses: oven-sh/setup-bun@v2" by pinning to a specific Bun
release or a semver range (e.g., a concrete version like "1.x.y" or a range like
"^1.9.0") to ensure reproducible builds; update the bun-version field in the
workflow accordingly and document the chosen version strategy so future
maintainers know when to bump it.
- Around line 21-22: Add a dependency cache step for Bun before the "Install
dependencies" step to speed CI: use actions/cache with the cache-path(s) for Bun
(e.g., ~/.bun and the project's .bun directory or node_modules if applicable)
and set the key to include runner.os and the hash of bun.lockb (or package.json
+ bun.lockb) so the cache is restored before running the "Install dependencies"
step and saved after install; reference the existing "Install dependencies" step
name so the cache step is placed immediately before it.
In @.github/workflows/typecheck.yml:
- Around line 17-19: Replace the non-deterministic setting `bun-version: latest`
used with the `oven-sh/setup-bun@v2` action by pinning to a specific Bun version
or a constrained range (for example a full semver like "1.4.9" or a safe range
like ">=1.4.0 <2.0.0") to ensure CI stability and reproducible builds; update
the `bun-version` value accordingly in the workflow.
- Around line 24-25: Update the "Type check" GitHub Actions step so the bun
TypeScript invocation matches the PR verification command: add the
--ignoreDeprecations flag to the existing run command (the step labelled "Type
check" that currently runs `bun x tsc --noEmit`) so it becomes `bun x tsc
--noEmit --ignoreDeprecations 6.0` to suppress TypeScript 6.0 deprecation
warnings during CI.
- Around line 21-22: Add a caching step before the "Install dependencies" step
to persist Bun's cache and speed up subsequent runs: create an actions/cache@v3
step that caches Bun's cache directories (e.g., ~/.bun and ~/.cache/bun) with a
key like runner.os-bun-${{ hashFiles('**/bun.lockb') }} and a restore-keys
fallback (e.g., runner.os-bun-), placed after the Bun setup step and before the
step that runs the existing bun install --frozen-lockfile command; keep the
"Install dependencies" step unchanged so it restores the cache if available and
then runs bun install.
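The pinning and caching suggestions for both workflows can be combined into one sketch. The pinned version is a placeholder to adapt, and the cache path assumes Bun's default global install cache (`~/.bun/install/cache`) with `bun.lockb` at the repo root:

```yaml
- uses: oven-sh/setup-bun@v2
  with:
    bun-version: "1.2.x" # placeholder; pin to the team's chosen version/range

- uses: actions/cache@v4
  with:
    path: ~/.bun/install/cache
    key: ${{ runner.os }}-bun-${{ hashFiles('**/bun.lockb') }}
    restore-keys: |
      ${{ runner.os }}-bun-

- name: Install dependencies
  run: bun install --frozen-lockfile
```

Documenting the version strategy next to `bun-version` (as the first comment suggests) tells future maintainers when to bump it.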
In `@src/app/studio/page.tsx`:
- Around line 14-16: The render is calling router.replace("/") directly which
triggers React state-update-during-render warnings; instead, keep the early
return (return null) when searchParams.get("session") is truthy but move the
redirect into a useEffect: import useEffect, then inside the component add
useEffect(() => { if (searchParams.get("session")) router.replace("/") },
[searchParams.get("session")]) so the navigation runs as a side effect after
render; ensure you still return null synchronously when session exists.
In `@src/app/transcribe/`[id]/page.tsx:
- Around line 176-179: The current polling uses setInterval to call poll() (see
poll, pollRef, stopPolling) which allows overlapping async fetches; change to a
self-scheduling approach or add an in-flight guard so only one poll runs at a
time: modify the effect that sets pollRef/current interval (and the analogous
usage around lines 217-218) to start by invoking poll(), then have poll schedule
the next run with setTimeout only after its async work completes (or add a
boolean like isPolling/inFlight checked at poll start and cleared on completion)
and ensure stopPolling clears any pending timeout and flips the guard.
- Around line 120-123: When calling patchHistory(id, avoid replacing the entire
nested audioSource object; instead preserve existing audioSource fields (name,
size, real type) and only update/add the url (and optionally type) so the
shallow-merge in history-store doesn't drop metadata. Locate the patchHistory
call and change the audioSource payload to merge the current/audioSource fields
(e.g., spread the existing audioSource object into the new object and then set
url) rather than assigning a new { type: "file", url: audioUrl } object;
reference the patchHistory call and the audioSource property in your change.
- Around line 137-142: The localStorage reads/writes around the poll loop (the
audioUrl assignment using localStorage.getItem(`audioUrl_${id}`) and the
conditional localStorage.setItem(`audioUrl_${id}`, data.audioUrl)) must be
best-effort: wrap both the getItem and setItem calls in try/catch so storage
errors don't trigger polling failure; on error, silently ignore or log at debug
level and continue using only data.audioUrl (i.e., fall back to undefined when
getItem fails) so the polling logic (where audioUrl is used) never throws due to
storage unavailability.
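The self-scheduling loop suggested for the poller can be sketched as a small helper: the next tick is queued only after the async work finishes, so polls can never overlap, and `stop()` both clears the pending timer and flips the guard. Names here (`createPoller`, `tick`) are illustrative stand-ins for the real `poll`/`pollRef`/`stopPolling`:

```typescript
function createPoller(fn: () => Promise<void>, intervalMs: number) {
  let timer: ReturnType<typeof setTimeout> | null = null
  let stopped = false

  async function tick(): Promise<void> {
    if (stopped) return
    await fn() // no new tick is queued while this is in flight
    if (!stopped) timer = setTimeout(tick, intervalMs)
  }

  void tick() // run immediately, then self-schedule
  return {
    stop() {
      stopped = true
      if (timer) clearTimeout(timer)
    },
  }
}
```

Compared with `setInterval`, a slow fetch simply stretches the effective interval instead of stacking concurrent requests.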
In `@src/components/errors/ErrorState.tsx`:
- Around line 133-136: The inline Biome suppression on the hints map iteration
is dead noise—remove the "// biome-ignore lint/suspicious/noArrayIndexKey: ..."
comment and keep the existing key={index} if you intentionally want to use the
index as the React key; alternatively, if you prefer to fix the lint concern,
replace key={index} with a stable unique identifier (e.g., use the hint string
itself or an id) in the hints.map inside the ErrorState component so the
suppression is no longer needed.
In `@src/components/studio/AudioPlayer.tsx`:
- Around line 150-156: The AudioPlayer component's cognitive complexity is high;
extract complex handler logic into small focused helpers or hooks to reduce
complexity. Move loop point management into a custom hook (e.g., useLoopPoints)
that accepts the audio element and loop state and exposes start/stop/check
functions; extract segment navigation logic into a helper or hook (e.g.,
useSegmentNavigation) that provides nextSegment, prevSegment, and jumpToSegment
handlers using the segments prop and currentSegment state; refactor onTimeUpdate
and related inline branching in AudioPlayer to call these new helpers/hooks so
the component body becomes a thin coordinator calling useLoopPoints,
useSegmentNavigation, and simple event bindings.
- Around line 202-208: Replace the explicit "=== false" boolean comparisons with
the idiomatic logical not operator: in toggleMute() change the nextMuted
assignment from "isMuted === false" to "!isMuted" and apply the same
simplification wherever the pattern appears (other boolean toggles that
reference isMuted or similar state), keeping the rest of the logic that sets
audioRef.current.muted and calls setIsMuted(nextMuted) intact.
In `@src/components/studio/EnhancedTranscript.tsx`:
- Around line 5-8: The helper function escapeRegExp is declared before the
module imports, breaking convention; move the escapeRegExp function declaration
so it appears after all import statements (e.g., after the imported symbols like
Copy and Search) in src/components/studio/EnhancedTranscript.tsx, keeping the
function name intact and ensuring any references to escapeRegExp in the file
continue to resolve.
- Around line 295-296: Remove the now-ineffective biome-ignore suppression
comment above the JSX mark element in EnhancedTranscript.tsx; specifically
delete the line "// biome-ignore lint/suspicious/noArrayIndexKey: split parts
have no stable identity" that appears immediately before the <mark key={i}
className="rounded bg-yellow-200 px-1"> so the code has no misleading/no-op lint
suppression comment left in the file.
In `@src/components/studio/ExportControls.tsx`:
- Around line 38-48: The SRT cue numbers are generated from segment.id which can
be non-sequential; change the map callback in generateSRT (in
ExportControls.tsx) to accept the index (e.g., (segment, index) => ...) and use
index + 1 for the cue number instead of segment.id + 1, matching the approach in
export-formats.ts to ensure sequential 1-based SRT numbering.
- Around line 307-324: Replace the div-based group with a semantic
fieldset/legend: change the container that currently uses role="group" and
aria-labelledby="format-label" to a <fieldset> containing a <legend> (use the
existing "format-label" text or id for the legend) while keeping the grid
container and the mapped Button elements (the map over formats and
setSelectedFormat, selectedFormat logic) intact and preserving classes/sizing;
ensure the legend provides the same visible label text and remove the
role/aria-labelledby attributes once the native fieldset/legend is used.
- Around line 34-136: The file duplicates export generators (generateSRT,
generateVTT, generateJSON, generateCSV, generateMarkdown); remove these local
definitions and import the canonical implementations from
src/lib/export-formats.ts, then update any local usages to call the imported
functions (passing the same inputs used here such as transcription, segments,
intelligence, and any formatting helpers required). Specifically: delete the
local functions named generateSRT, generateVTT, generateJSON, generateCSV,
generateMarkdown and add imports for those same function names from
export-formats.ts, then replace calls that previously referenced the local
functions to call the imported versions with the appropriate arguments
(transcription, segments, intelligence, etc.). Ensure there are no leftover
helper duplicates and run tests to confirm behavior (e.g., SRT cue numbering).
In `@src/lib/persistence-service.ts`:
- Around line 138-155: The current save flow resolves the write when store.put's
onsuccess fires but doesn't wait for the readwrite transaction to commit, so
change the Promise in the save path to resolve only on transaction.oncomplete
and reject on transaction.onerror and transaction.onabort; keep using
store.put(session) but remove resolving in request.onsuccess (you may still
handle request.onerror to attach context), and apply the same transaction-level
completion handling to the other write paths (deleteSession and
cleanupExpiredSessions) that use initDb, transaction and STORE_NAME so callers
only proceed when the transaction truly committed and failures abort as
expected.
In `@src/server/index.ts`:
- Around line 18-24: The current Express middleware (the app.use anonymous CORS
handler in src/server/index.ts) sets Access-Control-Allow-Origin to "*" which
exposes proxy routes; change it to enforce an explicit origin allowlist: define
an array of allowed origins (your app's frontend URLs), read req.headers.origin,
and if origin is in the allowlist set Access-Control-Allow-Origin to that origin
(otherwise omit it or return 403); ensure Access-Control-Allow-Methods and
Access-Control-Allow-Headers remain set and handle OPTIONS preflight by
short-circuiting with res.sendStatus(204) when appropriate; update the same
anonymous middleware (or replace it with a named function like corsHandler) so
only allowed origins can call server-side API-proxy endpoints.
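The allowlist check at the heart of that middleware can be kept as a pure function, which makes the policy easy to test. The origins listed are placeholders; the real list would hold the app's actual frontend URLs:

```typescript
// Placeholder allowlist; replace with the app's real frontend origins.
const ALLOWED_ORIGINS = new Set([
  "http://localhost:3000",
  "https://app.example.com",
])

// Returns the value to echo in Access-Control-Allow-Origin, or null when the
// header should be omitted (unknown origin, or no Origin header at all).
function resolveAllowedOrigin(requestOrigin: string | undefined): string | null {
  if (!requestOrigin) return null // same-origin / non-browser requests
  return ALLOWED_ORIGINS.has(requestOrigin) ? requestOrigin : null
}
```

In the middleware itself, the header is set only when the function returns non-null, `Access-Control-Allow-Methods`/`-Headers` stay as they are, and `OPTIONS` preflights short-circuit with `res.sendStatus(204)`.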
---
Duplicate comments:
In `@src/app/transcribe/`[id]/page.tsx:
- Around line 150-164: The code currently falls through after handling
"succeeded" or "failed" and then still executes the timeout check; update the
polling branch in page.tsx so terminal states short-circuit: either make the
attempts timeout check an else-if (else if (attemptsRef.current >= 120) ...) or
add explicit returns immediately after handling data.status === "succeeded"
(handleTranscriptionSuccess(...); return) and after the data.status === "failed"
block (stopPolling(); setStatus(...); setError(...); patchHistory(...); return)
so handleTranscriptionSuccess, stopPolling, setStatus, setError, patchHistory
are not overwritten by the timeout logic.
In `@src/components/analytics/VercelAnalytics.tsx`:
- Around line 5-11: The localStorage read in beforeSend
(globalThis.localStorage.getItem) can throw in restricted/private contexts; wrap
the access in a try/catch inside beforeSend and treat any exception as an
opt-out (return null) to fail closed for privacy, so on error you log/ignore
internally and return null to stop sending the event.
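The fail-closed `beforeSend` pattern described above can be sketched like this. The storage object is injected so the sketch runs outside a browser, and the flag key is illustrative, not the real one:

```typescript
type StorageLike = { getItem(key: string): string | null }

// Fail closed: an explicit opt-out flag OR any storage error drops the event.
function beforeSendGuard<E>(event: E, storage: StorageLike | undefined): E | null {
  try {
    if (storage?.getItem("va-disable") === "1") return null // user opted out
    return event
  } catch {
    // Private mode / disabled storage: suppress rather than risk sending.
    return null
  }
}
```

The real component would pass `globalThis.localStorage` and return the guard's result from `beforeSend`.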
In `@src/components/feedback/FeedbackModals.tsx`:
- Around line 38-40: The dialog container in FeedbackModals.tsx is missing a
programmatic name for screen readers; add an aria-labelledby attribute on the
container element (the element with role="dialog" and aria-modal="true" in the
FeedbackModals component) that points to the id of the existing heading element
(<h2>) inside the modal, and give that <h2> a unique id (e.g.,
feedback-modal-title); apply the same change to the second modal instance
referenced around lines 52-54 so both dialog containers use aria-labelledby and
their headings have matching ids.
- Around line 37-40: The dialog container in FeedbackModals.tsx (the motion.div
with role="dialog") is missing a click stopPropagation so clicks inside bubble
up to AnimatedBackdrop and can close the modal; add an onClick handler on that
motion.div (the dialog wrapper) that calls event.stopPropagation() to prevent
events from reaching AnimatedBackdrop and closing the modal when interacting
with the form.
In `@src/components/studio/AudioPlayer.tsx`:
- Around line 332-343: The UI state can drift because playback/volume may be
changed externally (e.g. via useAudioPlayer or keyboard shortcuts); add media
event handlers on the <video> element to derive React state from actual media
events: add onPlay and onPause to call setIsPlaying(true/false) (use
event.target or audioRef.current), and add onVolumeChange to read
event.target.volume and event.target.muted and call setVolume(...) and
setIsMuted(...). Hook these into the same component that currently uses
audioRef, handleTimeUpdate, handleDurationChange and setIsPlaying so all state
(isPlaying, volume, isMuted) is kept in sync with the media element.
In `@src/components/ui/animated-backdrop.tsx`:
- Around line 22-30: The backdrop is removed when onClick is missing; always
render the visual backdrop but make it non-interactive if no handler is
provided. Update the JSX in the AnimatedBackdrop component to render the same
element with className "absolute inset-0 bg-black/50 backdrop-blur-sm" in both
cases: if onClick exists render a <button> with onClick and aria-label="Close
overlay", otherwise render a non-interactive <div> (or <span>) with
aria-hidden="true" (or role="presentation") so the dim/blur layer remains for
non-dismissible dialogs while preserving accessibility and behavior.
In `@src/lib/export-formats.ts`:
- Around line 148-149: The word count appended to the Markdown is inconsistent
with generateJSON's counting; update the calculation that builds md (using the
transcription variable) to match generateJSON by trimming and splitting then
filtering out empty tokens (e.g., use
transcription.trim().split(/\s+/).filter(Boolean).length) so whitespace-only or
empty transcriptions yield 0 and both outputs stay consistent.
- Around line 8-18: splitTimestamp can produce negative components when seconds
is slightly negative; clamp the time to zero before computing components. In the
splitTimestamp function, ensure you use a non-negative millisecond value (e.g.,
compute totalMs as Math.max(0, Math.round(seconds * 1000))) so ms, totalSecs,
hours, minutes and secs are always >= 0; update any dependent calculations in
splitTimestamp to use that clamped totalMs.
- Around line 98-101: The CSV fallback when segments is empty uses a 4-column
schema ('id,start,end,text') while the segmented branch emits 5 columns
including duration; change the no-segments return to match the segmented schema
by adding a duration column (e.g. header 'id,start,end,duration,text' and a
duration value like 0 in the row) so consumers always receive the same columns;
update the return that builds the single-row CSV (which uses transcription) to
include the extra ",0" duration field.
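The three `export-formats.ts` fixes above (consistent word counting, clamped timestamps, matching CSV schema) are small enough to sketch as pure helpers. The function names mirror the prompts, but the exact signatures in the real file may differ:

```typescript
// Word count shared by JSON and Markdown: whitespace-only input yields 0.
function countWords(transcription: string): number {
  return transcription.trim().split(/\s+/).filter(Boolean).length
}

// Timestamp components clamped at zero, so slightly negative inputs
// (e.g. float rounding) never produce negative fields.
function splitTimestamp(seconds: number) {
  const totalMs = Math.max(0, Math.round(seconds * 1000))
  const totalSecs = Math.floor(totalMs / 1000)
  return {
    hours: Math.floor(totalSecs / 3600),
    minutes: Math.floor((totalSecs % 3600) / 60),
    secs: totalSecs % 60,
    ms: totalMs % 1000,
  }
}

// No-segments CSV fallback emitting the same five columns as the segmented
// branch; the inline quoting is a stand-in for whatever helper generateCSV uses.
function csvFallback(transcription: string): string {
  const text = `"${transcription.replace(/"/g, '""')}"`
  return `id,start,end,duration,text\n1,0,0,0,${text}`
}
```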
In `@src/lib/firebase-utils.ts`:
- Around line 49-50: The code currently logs the full Firebase download URL (via
getDownloadURL and console.log), which exposes the embedded access token; update
the logging to avoid printing downloadURL and instead log a non-token identifier
such as snapshot.metadata.fullPath or the local filePath variable. Locate the
getDownloadURL call and the console.log that prints downloadURL (references:
getDownloadURL, snapshot, snapshot.metadata.fullPath, filePath, console.log) and
replace the log to output snapshot.metadata.fullPath or filePath while keeping
the downloadURL value for internal use only if needed (do not log it).
In `@src/server/index.ts`:
- Around line 219-234: The code currently only checks parsedUrl.hostname before
calling fetch; update the validation to require parsedUrl.protocol === "https:"
and validate parsedUrl.pathname to ensure it references your configured Firebase
storage bucket (e.g., compare the bucket name parsed from parsedUrl.pathname
against your storageBucket or process.env.FIREBASE_STORAGE_BUCKET), returning
400 JSON errors for protocol or bucket mismatches; perform these checks (using
parsedUrl.protocol and parsedUrl.pathname) before invoking
fetch(parsedUrl.toString()) so only HTTPS requests for the allowed bucket are
proxied.
- Around line 82-84: The handlers currently call response.json() into data then
log the entire payload with console.log("AssemblyAI API response:", data) and
return it via res.json(data); remove the raw payload logging and avoid returning
full upstream JSON to server logs — instead extract and log only safe fields
(e.g., data.id, data.status, coarse metrics like data.audio_duration or
transcript length) and ensure res.json only sends necessary safe
identifiers/status/metrics to clients; update both places where this pattern
occurs (the block using response.json()/console.log/res.json and the similar
block around lines 123-125) so no raw transcript or user content is written to
logs.
- Around line 243-248: The code currently buffers the entire response via
response.arrayBuffer(), which defeats streaming; change the logic around the
response and res handling to stream directly from response.body into the Express
response instead: set Content-Type using response.headers.get("content-type")
(fallback to application/octet-stream), set status using response.status (or 200
fallback), ensure response.body is non-null, and pipe or use
stream.pipeline(response.body, res, onError) to forward the upstream stream and
propagate errors rather than awaiting arrayBuffer(). Use the existing symbols
response, response.arrayBuffer(), response.body, res, and
response.headers.get(...) to locate and replace the buffering code.
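The streaming approach that prompt describes can be sketched as follows. The shapes are assumptions (the real handler deals with an Express `Response`); the point is bridging `fetch()`'s WHATWG body to a Node stream and piping it with backpressure instead of buffering:

```typescript
import { Readable, Writable } from "node:stream"
import { pipeline } from "node:stream/promises"
import type { ReadableStream as WebReadableStream } from "node:stream/web"

// Assumed minimal shapes for the sketch.
interface UpstreamLike {
  status: number
  headers: { get(name: string): string | null }
  body: WebReadableStream<Uint8Array> | null
}
interface ResLike extends Writable {
  setHeader?(name: string, value: string): void
  statusCode?: number
}

async function streamUpstream(response: UpstreamLike, res: ResLike): Promise<void> {
  if (!response.body) throw new Error("Upstream response has no body")
  res.setHeader?.(
    "Content-Type",
    response.headers.get("content-type") ?? "application/octet-stream",
  )
  res.statusCode = response.status || 200
  // Forward chunks as they arrive; pipeline() propagates errors both ways.
  await pipeline(Readable.fromWeb(response.body), res)
}
```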
In `@src/stores/history-store.ts`:
- Around line 173-191: The promise for merging history resolves too early inside
getReq.onsuccess right after store.put(result), so change the logic in the
Promise (the block creating merged in patch()) to: compute the merged result in
getReq.onsuccess and call store.put(result) but do NOT resolve there; instead
attach tx.oncomplete to resolve(result) only after the transaction commits, and
keep tx.onerror and tx.onabort rejecting; also change getReq.onerror to
reject(new Error(String(getReq.error))) so the rejection wraps the underlying
error. Ensure you reference HISTORY_STORE_NAME, HistoryEntry, store.put(result),
getReq, tx.oncomplete, tx.onerror, and tx.onabort when making the change.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro Plus
Run ID: f7a012e8-ae9a-4fc1-9bea-5c9851001289
📒 Files selected for processing (23)
- .github/workflows/biome.yml
- .github/workflows/typecheck.yml
- .gitignore
- src/app/error.tsx
- src/app/global-error.tsx
- src/app/page.tsx
- src/app/studio/page.tsx
- src/app/transcribe/[id]/page.tsx
- src/components/analytics/VercelAnalytics.tsx
- src/components/errors/ErrorState.tsx
- src/components/feedback/FeedbackModals.tsx
- src/components/layout/Header.tsx
- src/components/studio/AudioPlayer.tsx
- src/components/studio/EnhancedTranscript.tsx
- src/components/studio/ExportControls.tsx
- src/components/ui/animated-backdrop.tsx
- src/lib/export-formats.ts
- src/lib/firebase-utils.ts
- src/lib/persistence-service.ts
- src/lib/storage-service.ts
- src/server/index.ts
- src/stores/history-store.ts
- tsconfig.tsbuildinfo
```tsx
<div>
  <p id="format-label" className="mb-2 text-xs text-gray-600">Format</p>
  <div role="group" aria-labelledby="format-label" className="grid grid-cols-4 gap-1">
    {(
      ["txt", "docx", "srt", "vtt", "json", "csv", "md"] as const
    ).map((format) => (
      <Button
        key={format}
        variant={selectedFormat === format ? "default" : "outline"}
        size="sm"
        onClick={() => setSelectedFormat(format)}
        className="px-2 text-xs"
      >
        {format.toUpperCase()}
      </Button>
    ))}
  </div>
</div>
```
🧹 Nitpick | 🔵 Trivial
Accessibility grouping is adequate but consider native <fieldset>.
The role="group" with aria-labelledby="format-label" pattern works for screen readers. However, SonarCloud suggests using native semantic elements like <fieldset> with <legend> for better cross-device accessibility. This is a minor improvement opportunity.
♻️ Optional: Use native fieldset/legend
```diff
- <div>
-   <p id="format-label" className="mb-2 text-xs text-gray-600">Format</p>
-   <div role="group" aria-labelledby="format-label" className="grid grid-cols-4 gap-1">
+ <fieldset>
+   <legend className="mb-2 text-xs text-gray-600">Format</legend>
+   <div className="grid grid-cols-4 gap-1">
     {(
       ["txt", "docx", "srt", "vtt", "json", "csv", "md"] as const
     ).map((format) => (
     ...
     ))}
   </div>
- </div>
+ </fieldset>
```

🧰 Tools
🪛 GitHub Check: SonarCloud Code Analysis
[warning] 309-309: Use `<fieldset>`, `<optgroup>`, `<address>`, or `<details>` instead of the "group" role to ensure accessibility across all devices.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/studio/ExportControls.tsx` around lines 307 - 324, Replace the
div-based group with a semantic fieldset/legend: change the container that
currently uses role="group" and aria-labelledby="format-label" to a <fieldset>
containing a <legend> (use the existing "format-label" text or id for the
legend) while keeping the grid container and the mapped Button elements (the map
over formats and setSelectedFormat, selectedFormat logic) intact and preserving
classes/sizing; ensure the legend provides the same visible label text and
remove the role/aria-labelledby attributes once the native fieldset/legend is
used.
```diff
   const db = await initDb()
   const transaction = db.transaction([STORE_NAME], "readwrite")
   const store = transaction.objectStore(STORE_NAME)

   // Update timestamp
-  session.lastUpdatedAt = Date.now();
+  session.lastUpdatedAt = Date.now()

   // Save to store
   await new Promise<void>((resolve, reject) => {
-    const request = store.put(session);
-    request.onsuccess = () => resolve();
+    const request = store.put(session)
+    request.onsuccess = () => resolve()
     request.onerror = (event) => {
-      console.error("Error saving session:", event);
-      reject(new Error("Failed to save session"));
-    };
-  });
+      console.error("Error saving session:", event)
+      reject(new Error("Failed to save session"))
+    }
+  })

-  console.log("Session saved successfully:", session.id);
+  console.log("Session saved successfully:", session.id)
```
Resolve IndexedDB writes on transaction completion.
store.put(session) succeeding does not mean the readwrite transaction committed. If the transaction aborts afterward, this function still logs success and callers proceed as if persistence worked. Resolve from transaction.oncomplete and reject on onerror/onabort; the same pattern should be applied to the write paths in deleteSession() and cleanupExpiredSessions().
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/lib/persistence-service.ts` around lines 138 - 155, The current save flow
resolves the write when store.put's onsuccess fires but doesn't wait for the
readwrite transaction to commit, so change the Promise in the save path to
resolve only on transaction.oncomplete and reject on transaction.onerror and
transaction.onabort; keep using store.put(session) but remove resolving in
request.onsuccess (you may still handle request.onerror to attach context), and
apply the same transaction-level completion handling to the other write paths
(deleteSession and cleanupExpiredSessions) that use initDb, transaction and
STORE_NAME so callers only proceed when the transaction truly committed and
failures abort as expected.
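The transaction-level completion pattern can be factored into a promise wrapper. This is an event-ordering sketch using a stub transaction shape (so it runs outside a browser); the real code would pass the `IDBTransaction` from `db.transaction([STORE_NAME], "readwrite")`:

```typescript
// Minimal stand-in for the IDBTransaction event surface.
type TxLike = {
  oncomplete: (() => void) | null
  onerror: (() => void) | null
  onabort: (() => void) | null
  error?: unknown
}

// Settles on the *transaction* events, not on the put request's onsuccess,
// so callers only proceed once the write has actually committed.
function awaitTransaction(tx: TxLike): Promise<void> {
  return new Promise((resolve, reject) => {
    tx.oncomplete = () => resolve()
    tx.onerror = () => reject(new Error(`Transaction failed: ${String(tx.error)}`))
    tx.onabort = () => reject(new Error("Transaction aborted"))
  })
}
```

In the save path this becomes `store.put(session); await awaitTransaction(transaction)`, with `request.onerror` kept only for contextual logging; `deleteSession()` and `cleanupExpiredSessions()` can reuse the same helper.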
Improve code readability and maintain consistent indentation in the `ExportControls` component by breaking down long JSX lines for the format label and group container.
Refactor the branch selection syntax in Biome and Typecheck workflows from a single-line array to a multi-line list format for improved readability and consistency with standard YAML practices.

Co-authored-by: Copilot <copilot@github.com>
Actionable comments posted: 7
♻️ Duplicate comments (10)
src/components/studio/ExportControls.tsx (2)
39-46: ⚠️ Potential issue | 🟠 Major

Use sequential SRT cue numbers, not `segment.id`.

Line 45 uses `segment.id + 1`, which can generate non-sequential cue numbers when IDs are sparse. SRT cue indexes should be contiguous 1-based.

Proposed fix

```diff
 return segments
-  .map((segment) => {
+  .map((segment, index) => {
     const startTime = formatTimeForSRT(segment.start)
     const endTime = formatTimeForSRT(segment.end)
     const speakerPrefix = segment.speaker ? `[Speaker ${segment.speaker}] ` : ""
-    return `${segment.id + 1}\n${startTime} --> ${endTime}\n${speakerPrefix}${segment.text.trim()}\n`
+    return `${index + 1}\n${startTime} --> ${endTime}\n${speakerPrefix}${segment.text.trim()}\n`
   })
   .join("\n")
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/studio/ExportControls.tsx` around lines 39 - 46, The SRT export is using segment.id for cue numbers which can be non-sequential; update the map in ExportControls.tsx so the callback takes the array index (e.g., (segment, index)) and generate the cue number using index + 1 instead of segment.id + 1; keep the rest unchanged (still use formatTimeForSRT for start/end and speakerPrefix) so SRT cue indexes are contiguous 1-based values.
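As a standalone illustration of the contiguous-numbering fix, here is a self-contained sketch. The `Segment` type and the `formatTimeForSRT` body are reconstructions under assumption, not the project's exact code; only the `index + 1` cue numbering is the point.

```typescript
// Illustrative segment shape; the real project type may differ.
type Segment = { id: number; start: number; end: number; speaker?: string; text: string }

// Assumed reconstruction of an HH:MM:SS,mmm formatter.
function formatTimeForSRT(seconds: number): string {
  const h = Math.floor(seconds / 3600)
  const m = Math.floor((seconds % 3600) / 60)
  const s = Math.floor(seconds % 60)
  const ms = Math.round((seconds - Math.floor(seconds)) * 1000)
  const pad = (n: number, w = 2) => String(n).padStart(w, "0")
  return `${pad(h)}:${pad(m)}:${pad(s)},${pad(ms, 3)}`
}

// Cue numbers come from the array index, so sparse segment.id values
// (e.g. 0 then 7) still yield the contiguous 1, 2, 3, ... SRT requires.
function generateSRT(segments: Segment[]): string {
  return segments
    .map((segment, index) => {
      const speakerPrefix = segment.speaker ? `[Speaker ${segment.speaker}] ` : ""
      return `${index + 1}\n${formatTimeForSRT(segment.start)} --> ${formatTimeForSRT(segment.end)}\n${speakerPrefix}${segment.text.trim()}\n`
    })
    .join("\n")
}
```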
307-329: ⚠️ Potential issue | 🟡 Minor

Prefer native grouping semantics for the format selector.

Line 312 uses `role="group"`, but this control behaves like a single-choice selector. A `<fieldset><legend>` plus pressed/checked semantics improves SR announcement and state clarity.

Proposed fix

```diff
-  <div>
-    <p id="format-label" className="mb-2 text-xs text-gray-600">
-      Format
-    </p>
-    <div
-      role="group"
-      aria-labelledby="format-label"
-      className="grid grid-cols-4 gap-1"
-    >
+  <fieldset>
+    <legend className="mb-2 text-xs text-gray-600">Format</legend>
+    <div className="grid grid-cols-4 gap-1">
       {(
         ["txt", "docx", "srt", "vtt", "json", "csv", "md"] as const
       ).map((format) => (
         <Button
           key={format}
           variant={selectedFormat === format ? "default" : "outline"}
           size="sm"
+          aria-pressed={selectedFormat === format}
           onClick={() => setSelectedFormat(format)}
           className="px-2 text-xs"
         >
           {format.toUpperCase()}
         </Button>
       ))}
     </div>
-  </div>
+  </fieldset>
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/studio/ExportControls.tsx` around lines 307 - 329, Replace the generic div+role="group" with a native fieldset/legend structure and expose per-button pressed/checked semantics: wrap the options in <fieldset aria-labelledby="format-label"> and use <legend id="format-label">Format</legend> (instead of the p with id), keep the grid classes on the inner wrapper, and on each mapped Button (the mapping over ["txt","docx","srt","vtt","json","csv","md"] that uses selectedFormat and setSelectedFormat) add accessibility attributes role="radio" and aria-checked={selectedFormat === format} (or aria-pressed if your Button expects that), keeping the existing onClick handler and visual variant logic so screen readers get native group/selection semantics; ensure you remove role="group" from the previous container.

.github/workflows/typecheck.yml (2)
26-27: ⚠️ Potential issue | 🟠 Major

Match the TypeScript command used for PR verification.

Line 27 omits `--ignoreDeprecations 6.0`, even though the PR notes verification was done with that flag. Leaving CI on a different command can make the workflow fail for deprecations that local verification intentionally suppressed.

🔧 Proposed fix

```diff
       - name: Type check
-        run: bun x tsc --noEmit
+        run: bun x tsc --noEmit --ignoreDeprecations 6.0
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/typecheck.yml around lines 26 - 27, Update the "Type check" step so the TypeScript command matches local PR verification by adding the missing flag; change the run invocation for the Type check job that currently runs "bun x tsc --noEmit" to include "--ignoreDeprecations 6.0" (i.e., run "bun x tsc --noEmit --ignoreDeprecations 6.0") so CI uses the same tsc flags as the PR verification.
19-21: 🧹 Nitpick | 🔵 Trivial

Pin Bun instead of using latest.

Line 21 still uses `bun-version: latest`, which makes this workflow non-deterministic for the same commit.

📌 Proposed fix

```diff
       - uses: oven-sh/setup-bun@v2
         with:
-          bun-version: latest
+          bun-version: "1.2"
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/typecheck.yml around lines 19 - 21, The workflow currently uses oven-sh/setup-bun@v2 with bun-version: latest which is non-deterministic; replace the bun-version: latest value with a pinned semantic version (e.g., "0.7.10" or the exact tested Bun release for this project) so the same commit always uses the same runtime; update the bun-version field in the job that invokes uses: oven-sh/setup-bun@v2 and document/update that pinned version when you intentionally upgrade Bun.

.github/workflows/biome.yml (1)
19-21: 🧹 Nitpick | 🔵 Trivial

Pin Bun instead of using latest.

Line 21 still uses `bun-version: latest`, so this workflow can change behavior without any repo change. Use a fixed version or a constrained 1.2 range to keep CI reproducible.

📌 Proposed fix

```diff
       - uses: oven-sh/setup-bun@v2
         with:
-          bun-version: latest
+          bun-version: "1.2"
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/biome.yml around lines 19 - 21, The workflow is using a floating bun-version which makes CI non-reproducible; update the GitHub Action step that uses "uses: oven-sh/setup-bun@v2" to replace the bun-version: latest with a pinned version or constrained range (e.g., a specific semver like 1.2.0 or a range like >=1.2 <1.3) so the setup-bun step always installs a deterministic Bun release.

package.json (1)
13-14: ⚠️ Potential issue | 🟡 Minor

`format:check` is not actually a check command.

Line 14 runs `biome format .`, which does not enforce formatting drift the way a CI check should. Use `--check` so the script exits non-zero when files are unformatted.

♻️ Proposed fix

```diff
-    "format:check": "biome format .",
+    "format:check": "biome format --check .",
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@package.json` around lines 13 - 14, The "format:check" npm script currently runs "biome format ." which rewrites files instead of verifying formatting; update the "format:check" script entry (the package.json script named format:check) to run the CLI with the check flag (e.g. use "biome format --check ." so the command exits non-zero on unformatted files), commit the change so CI will fail when formatting drift exists.

src/server/index.ts (4)
147-157: ⚠️ Potential issue | 🔴 Critical

Require HTTPS for proxied Firebase URLs.

Lines 147-157 only validate the hostname, so `http://firebasestorage.googleapis.com/...` still passes and gets fetched server-side. Enforce `parsedUrl.protocol === "https:"` before the fetch.

🔐 Minimal fix

```diff
-  if (parsedUrl.hostname !== "firebasestorage.googleapis.com") {
+  if (
+    parsedUrl.protocol !== "https:" ||
+    parsedUrl.hostname !== "firebasestorage.googleapis.com"
+  ) {
     return res
       .status(400)
       .json({ error: "Invalid or missing Firebase Storage URL" })
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/server/index.ts` around lines 147 - 157, The URL validation currently only checks parsedUrl.hostname; update the validation around parsedUrl (the URL construction and subsequent checks) to also require parsedUrl.protocol === "https:" and return a 400 JSON error (e.g., same "Invalid or missing Firebase Storage URL" or a distinct "Firebase Storage URL must use https") when the protocol is not HTTPS so that non-HTTPS firebase storage URLs (like http://firebasestorage.googleapis.com/...) are rejected before any server-side fetch.
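The combined protocol-and-hostname check can be factored into a pure helper, sketched here under an assumed name (`isAllowedFirebaseUrl` is not in the PR); the route handler would return 400 whenever it yields false.

```typescript
// Returns true only for well-formed HTTPS URLs on the Firebase Storage host.
// Malformed input and non-HTTPS schemes are rejected before any server-side fetch.
function isAllowedFirebaseUrl(raw: string | undefined): boolean {
  if (!raw) return false
  let parsed: URL
  try {
    parsed = new URL(raw)
  } catch {
    return false // not a parseable URL at all
  }
  return (
    parsed.protocol === "https:" &&
    parsed.hostname === "firebasestorage.googleapis.com"
  )
}
```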
18-32: ⚠️ Potential issue | 🔴 Critical

Lock down CORS on the proxy server.

Lines 18-32 still send `Access-Control-Allow-Origin: *`, which exposes `/api/transcribe` to any website even though it uses your server-side AssemblyAI key. Restrict this to an explicit frontend allowlist and vary by `Origin`.

🔒 Safer CORS shape

```diff
+const allowedOrigins = new Set(
+  (process.env.ALLOWED_ORIGINS ?? "")
+    .split(",")
+    .map((origin) => origin.trim())
+    .filter(Boolean),
+)
+
 app.use(((req, res, next) => {
-  res.header("Access-Control-Allow-Origin", "*")
+  const origin = req.header("Origin")
+  if (origin && allowedOrigins.has(origin)) {
+    res.header("Access-Control-Allow-Origin", origin)
+    res.header("Vary", "Origin")
+  }
   res.header("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS")
   res.header(
     "Access-Control-Allow-Headers",
     "Origin, X-Requested-With, Content-Type, Accept, Authorization",
   )
   // Handle preflight requests
   if (req.method === "OPTIONS") {
-    return res.status(204).send()
+    return origin && allowedOrigins.has(origin)
+      ? res.sendStatus(204)
+      : res.sendStatus(403)
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/server/index.ts` around lines 18 - 32, The current CORS middleware registered with app.use sends Access-Control-Allow-Origin: * which is unsafe for the proxy; change it to consult an explicit frontend allowlist and echo back the incoming Origin only if it exists in the allowlist: in the anonymous middleware passed to app.use (the RequestHandler) read req.headers.origin, check it against a configured array/set of allowed origins, and if allowed set res.header("Access-Control-Allow-Origin", origin) otherwise do not set that header (or set a safe default like blocking); also set res.header("Vary", "Origin") so caches vary by origin and keep the existing Access-Control-Allow-Methods/Headers and OPTIONS preflight handling in the same middleware.
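The allowlist decision itself can be isolated as a pure function so it is testable apart from Express. This is a minimal sketch with assumed names (`corsHeadersFor` and the env-driven set are illustrations, not the middleware):

```typescript
// Decide which CORS headers to emit for a given request Origin.
// Unknown or absent origins get no Access-Control-Allow-Origin header at all.
function corsHeadersFor(
  origin: string | undefined,
  allowlist: Set<string>,
): Record<string, string> {
  if (!origin || !allowlist.has(origin)) return {}
  return {
    "Access-Control-Allow-Origin": origin, // echo the specific origin, never "*"
    Vary: "Origin", // keep caches from reusing the response across origins
  }
}
```

The middleware then just spreads the returned headers onto the response, and the OPTIONS branch can answer 403 when the map comes back empty.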
82-84: ⚠️ Potential issue | 🟠 Major

Stop logging full AssemblyAI payloads.

Lines 82-84 and 123-125 still print the entire upstream response. Those payloads can contain transcript text, speaker metadata, and URLs, so logs should be limited to IDs, status, and coarse metrics.

🧹 Safer logging pattern

```diff
   const data = await response.json()
-  console.log("AssemblyAI API response:", data)
+  console.log("AssemblyAI API response:", {
+    id: data.id,
+    status: data.status,
+    audio_duration: data.audio_duration,
+    utterance_count: Array.isArray(data.utterances) ? data.utterances.length : undefined,
+  })
   res.json(data)
```

```diff
   const data = await response.json()
-  console.log("Transcription status data:", data)
+  console.log("Transcription status data:", {
+    id: data.id,
+    status: data.status,
+    audio_duration: data.audio_duration,
+    utterance_count: Array.isArray(data.utterances) ? data.utterances.length : undefined,
+  })
   res.json(data)
```

Also applies to: 123-125
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/server/index.ts` around lines 82 - 84, The code currently logs full AssemblyAI responses via console.log("AssemblyAI API response:", data) and does the same later; replace those full-payload logs with a sanitized log that only extracts and logs coarse identifiers and metrics (e.g., data.id, data.status, data.audio_duration or data.duration, and any top-level error/code fields) and avoid including transcript text, speaker metadata, or URLs, then call res.json(data) as before; update both occurrences (the console.log that prints "AssemblyAI API response:" and the later duplicate) to build and log a small object like { id, status, duration, error } instead of the full data.
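Since the same sanitized shape is needed in two places, it may be worth a tiny extractor. A sketch under assumed names (`summarizeTranscript` is not in the PR; field names follow the comment's id/status/metrics suggestion):

```typescript
type SafeSummary = {
  id?: string
  status?: string
  audio_duration?: number
  utterance_count?: number
}

// Keep only coarse identifiers and metrics; transcript text, speaker
// metadata, and URLs never reach the logs.
function summarizeTranscript(data: Record<string, unknown>): SafeSummary {
  return {
    id: typeof data.id === "string" ? data.id : undefined,
    status: typeof data.status === "string" ? data.status : undefined,
    audio_duration:
      typeof data.audio_duration === "number" ? data.audio_duration : undefined,
    utterance_count: Array.isArray(data.utterances)
      ? data.utterances.length
      : undefined,
  }
}
```

Both `console.log` call sites would then pass `summarizeTranscript(data)` instead of `data`.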
171-176: ⚠️ Potential issue | 🟠 Major

Actually stream the Firebase response.

Lines 171-176 say "Stream the response directly," but `await response.arrayBuffer()` buffers the whole object first. Large recordings will amplify memory usage and latency here.

🚰 Forward the upstream body instead

Add this import near the top of src/server/index.ts:

```ts
import { Readable } from "node:stream"
```

Then replace the buffering block:

```diff
   const contentType = response.headers.get("content-type")
   res.setHeader("Content-Type", contentType || "application/octet-stream")
-
-  const arrayBuffer = await response.arrayBuffer()
-  res.status(200).send(Buffer.from(arrayBuffer))
+
+  if (!response.body) {
+    return res.status(502).json({ error: "Firebase Storage returned an empty body" })
+  }
+
+  res.status(200)
+  Readable.fromWeb(response.body as ReadableStream).pipe(res)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/server/index.ts` around lines 171 - 176, The code is buffering the entire upstream response via response.arrayBuffer() instead of streaming; replace that block so you set res.setHeader("Content-Type", contentType || "application/octet-stream") and then stream the upstream body to the client by creating a Node Readable from response.body (use import { Readable } from "node:stream" and either Readable.fromWeb(response.body) or Readable.from(response.body) depending on environment) and pipe it into res (preferably using stream.pipeline or pipeline from "node:stream/promises" to propagate errors), removing the response.arrayBuffer()/Buffer.from usage and ensuring res.status(200) is set before piping.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/components/errors/ErrorState.tsx`:
- Around line 7-35: The toneStyles object duplicates the same class strings for
both badge and code across every ErrorTone; refactor it so each tone only
defines a single shared token (e.g., "base" or "classes") and a "glow" value,
then update any usage that reads toneStyles[...].badge or toneStyles[...].code
to read the single shared token (or continue to support badge/code by assigning
them to that token within the definition) so duplication is removed; modify the
toneStyles declaration and ensure components that reference badge/code (in
ErrorState.tsx) use the unified property name.
In `@src/components/studio/EnhancedTranscript.tsx`:
- Around line 187-193: The search Input currently only uses placeholder text
which is lost once typing begins; update the Input (the component using props
ref={searchInputRef}, placeholder="Search transcript... (Ctrl+F)",
value={searchTerm}, onChange={(e) => handleSearch(e.target.value)}) to include
an accessible name—either add an aria-label (e.g. aria-label="Search
transcript") on this Input or give it an id and add a corresponding visible
<label> tied to that id; ensure the label text is clear and concise and that the
id/aria-label matches the Input so assistive tech can always identify the field.
- Around line 73-90: The handler handleSearch treats any defined segments array
as segmented mode, causing inconsistency when segments === []; change the
branching to check for a non-empty segments array (use segments?.length > 0)
before running per-segment matching, and treat the empty-array case the same as
the fallback full-transcript branch (use transcription matching) so
setSearchResults and result counts match the render fallback; update the if/else
conditions around handleSearch, segments, transcription and setSearchResults
accordingly.
In `@src/components/studio/ExportControls.tsx`:
- Around line 76-77: The wordCount calculation in ExportControls.tsx incorrectly
returns 1 for empty or whitespace-only transcriptions because it uses
transcription.split(/\s+/).length; update the logic to trim the transcription
and return 0 when the trimmed string is empty, otherwise split the trimmed
string on whitespace to count words (apply the same fix where wordCount is
computed at both occurrences, and keep characterCount as transcription.length).
Use the variable names transcription and wordCount (and update the second
instance referenced at the other occurrence) to locate and fix the code.
- Around line 86-97: The CSV generator generateCSV currently only doubles quotes
but doesn't neutralize spreadsheet formula injection; update generateCSV to
sanitize text cells (transcription and seg.text) by detecting if the first
character is one of = + - @ and prefixing the cell with a safe character (e.g.,
a single quote or a space) before escaping quotes, so both the no-segments
branch (transcription) and the segments branch (seg.text) apply the same
sanitization logic; keep existing quote-escaping (replace(/"/g,'""')) and apply
the prefix step before assembling the CSV rows.
In `@src/lib/firebase-proxy.ts`:
- Around line 57-72: The download routine currently uses FileReader to
base64-encode the Blob (reader, reader.onloadend) which duplicates memory;
replace that with URL.createObjectURL(blob) to set the anchor href, call
a.click(), then call URL.revokeObjectURL(objectUrl) after removing the anchor
and resolving the promise (keep filename for a.download and retain the same
cleanup of the appended <a> element). Update the code path that uses
reader/readAsDataURL to instead create and revoke an object URL for the blob to
avoid base64 overhead.
In `@src/server/index.ts`:
- Around line 36-62: Validate req.body.audioUrl is a non-empty string and
req.body.options is an object before using them: update the handler around the
code that constructs TranscriptionParams (the const { audioUrl, options } =
req.body and subsequent use) to return a 400 response if audioUrl is missing/not
a string, and coerce/guard options (e.g., const opts = typeof options ===
"object" && options !== null ? options : {}; then use opts.diarize and
opts.language) so checks like options.diarize and options.language do not throw;
ensure speaker_labels/language_code/language_detection are only set when
validated values exist.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro Plus
Run ID: 12f4c286-2612-4400-9f60-ec85fff53bb9
⛔ Files ignored due to path filters (1)
bun.lock is excluded by !**/*.lock
📒 Files selected for processing (12)
- .env.example
- .github/workflows/biome.yml
- .github/workflows/typecheck.yml
- package.json
- src/app/api/printerz-proxy/route.ts
- src/components/errors/ErrorState.tsx
- src/components/studio/EnhancedTranscript.tsx
- src/components/studio/ExportControls.tsx
- src/lib/firebase-proxy.ts
- src/lib/pdf-generation.ts
- src/server/index.ts
- tsconfig.tsbuildinfo
💤 Files with no reviewable changes (3)
- .env.example
- src/app/api/printerz-proxy/route.ts
- src/lib/pdf-generation.ts
```tsx
const toneStyles: Record<
  ErrorTone,
  {
    badge: string
    code: string
    glow: string
  }
> = {
  info: {
    badge: "border-sky-500/20 bg-sky-500/10 text-sky-700 dark:text-sky-300",
    code: "border-sky-500/20 bg-sky-500/10 text-sky-700 dark:text-sky-300",
    glow: "from-sky-500/12 via-transparent to-transparent",
  },
  warning: {
    badge:
      "border-amber-500/20 bg-amber-500/10 text-amber-700 dark:text-amber-300",
    code: "border-amber-500/20 bg-amber-500/10 text-amber-700 dark:text-amber-300",
    glow: "from-amber-500/12 via-transparent to-transparent",
  },
  danger: {
    badge: "border-red-500/20 bg-red-500/10 text-red-700 dark:text-red-300",
    code: "border-red-500/20 bg-red-500/10 text-red-700 dark:text-red-300",
    glow: "from-red-500/12 via-transparent to-transparent",
  },
  neutral: {
    badge: "border-border bg-muted/60 text-foreground",
    code: "border-border bg-muted/60 text-foreground",
    glow: "from-foreground/6 via-transparent to-transparent",
  },
```
🧹 Nitpick | 🔵 Trivial
Reduce duplicated tone classes to avoid drift.
badge and code are identical for every tone. Consolidating to one token improves maintainability.
♻️ Suggested refactor

```diff
-const toneStyles: Record<
-  ErrorTone,
-  {
-    badge: string
-    code: string
-    glow: string
-  }
-> = {
+const toneStyles: Record<
+  ErrorTone,
+  {
+    accent: string
+    glow: string
+  }
+> = {
   info: {
-    badge: "border-sky-500/20 bg-sky-500/10 text-sky-700 dark:text-sky-300",
-    code: "border-sky-500/20 bg-sky-500/10 text-sky-700 dark:text-sky-300",
+    accent: "border-sky-500/20 bg-sky-500/10 text-sky-700 dark:text-sky-300",
     glow: "from-sky-500/12 via-transparent to-transparent",
   },
   warning: {
-    badge:
+    accent:
       "border-amber-500/20 bg-amber-500/10 text-amber-700 dark:text-amber-300",
-    code: "border-amber-500/20 bg-amber-500/10 text-amber-700 dark:text-amber-300",
     glow: "from-amber-500/12 via-transparent to-transparent",
   },
   danger: {
-    badge: "border-red-500/20 bg-red-500/10 text-red-700 dark:text-red-300",
-    code: "border-red-500/20 bg-red-500/10 text-red-700 dark:text-red-300",
+    accent: "border-red-500/20 bg-red-500/10 text-red-700 dark:text-red-300",
     glow: "from-red-500/12 via-transparent to-transparent",
   },
   neutral: {
-    badge: "border-border bg-muted/60 text-foreground",
-    code: "border-border bg-muted/60 text-foreground",
+    accent: "border-border bg-muted/60 text-foreground",
     glow: "from-foreground/6 via-transparent to-transparent",
   },
 }
@@
-          styles.badge,
+          styles.accent,
@@
-          styles.code,
+          styles.accent,
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/errors/ErrorState.tsx` around lines 7 - 35, The toneStyles
object duplicates the same class strings for both badge and code across every
ErrorTone; refactor it so each tone only defines a single shared token (e.g.,
"base" or "classes") and a "glow" value, then update any usage that reads
toneStyles[...].badge or toneStyles[...].code to read the single shared token
(or continue to support badge/code by assigning them to that token within the
definition) so duplication is removed; modify the toneStyles declaration and
ensure components that reference badge/code (in ErrorState.tsx) use the unified
property name.
```tsx
const handleSearch = (term: string) => {
  setSearchTerm(term)
  if (term.trim() && segments) {
    const results: number[] = []
    segments.forEach((segment, index) => {
      if (segment.text.toLowerCase().includes(term.toLowerCase())) {
        results.push(index)
      }
    })
    setSearchResults(results)
  } else if (term.trim() && !segments && transcription) {
    const matches = transcription.match(new RegExp(escapeRegExp(term), "gi"))
    setSearchResults(
      matches ? Array.from({ length: matches.length }, (_, i) => i) : [],
    )
  } else {
    setSearchResults([])
  }
```
Keep the search branch consistent with the render fallback.
Line 224 falls back to the full-transcript view when segments.length === 0, but this handler treats any defined segments prop as segmented mode. With segments={[]}, matches still highlight in the fallback view, yet searchResults stays empty and the result count is wrong.
Suggested fix

```diff
 const handleSearch = (term: string) => {
   setSearchTerm(term)
-  if (term.trim() && segments) {
+  if (term.trim() && segments && segments.length > 0) {
     const results: number[] = []
     segments.forEach((segment, index) => {
       if (segment.text.toLowerCase().includes(term.toLowerCase())) {
         results.push(index)
       }
     })
     setSearchResults(results)
-  } else if (term.trim() && !segments && transcription) {
+  } else if (term.trim() && transcription) {
     const matches = transcription.match(new RegExp(escapeRegExp(term), "gi"))
     setSearchResults(
       matches ? Array.from({ length: matches.length }, (_, i) => i) : [],
     )
```
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```tsx
const handleSearch = (term: string) => {
  setSearchTerm(term)
  if (term.trim() && segments && segments.length > 0) {
    const results: number[] = []
    segments.forEach((segment, index) => {
      if (segment.text.toLowerCase().includes(term.toLowerCase())) {
        results.push(index)
      }
    })
    setSearchResults(results)
  } else if (term.trim() && transcription) {
    const matches = transcription.match(new RegExp(escapeRegExp(term), "gi"))
    setSearchResults(
      matches ? Array.from({ length: matches.length }, (_, i) => i) : [],
    )
  } else {
    setSearchResults([])
  }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/studio/EnhancedTranscript.tsx` around lines 73 - 90, The
handler handleSearch treats any defined segments array as segmented mode,
causing inconsistency when segments === []; change the branching to check for a
non-empty segments array (use segments?.length > 0) before running per-segment
matching, and treat the empty-array case the same as the fallback
full-transcript branch (use transcription matching) so setSearchResults and
result counts match the render fallback; update the if/else conditions around
handleSearch, segments, transcription and setSearchResults accordingly.
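The consistent branching can also be expressed as a pure function for unit testing, separate from component state. This sketch uses assumed names and a hand-rolled regex escape; it is not the component's actual code:

```typescript
// Compute match indexes the same way the render fallback does:
// a defined-but-empty segments array falls through to full-text search.
function searchMatches(
  term: string,
  segments: { text: string }[] | undefined,
  transcription: string,
): number[] {
  const query = term.trim().toLowerCase()
  if (!query) return []
  if (segments && segments.length > 0) {
    return segments.flatMap((seg, i) =>
      seg.text.toLowerCase().includes(query) ? [i] : [],
    )
  }
  // Fallback: count occurrences in the full transcript.
  const escaped = query.replace(/[.*+?^${}()|[\]\\]/g, "\\$&")
  const matches = transcription.match(new RegExp(escaped, "gi"))
  return matches ? matches.map((_, i) => i) : []
}
```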
```tsx
<Input
  ref={searchInputRef}
  placeholder="Search transcript... (Ctrl+F)"
  value={searchTerm}
  onChange={(e) => handleSearch(e.target.value)}
  className="pl-10"
/>
```
Add an accessible name to the search input.
This control currently relies on placeholder text alone. Once the user starts typing, assistive tech no longer has a reliable label for the field. Add a visible <label> or at least an aria-label.
Suggested fix

```diff
 <Input
   ref={searchInputRef}
+  aria-label="Search transcript"
   placeholder="Search transcript... (Ctrl+F)"
   value={searchTerm}
   onChange={(e) => handleSearch(e.target.value)}
   className="pl-10"
 />
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/studio/EnhancedTranscript.tsx` around lines 187-193: the search `Input` relies on placeholder text alone, which is no longer exposed once the user starts typing. Give it an accessible name — either add `aria-label="Search transcript"` to this `Input`, or give it an `id` and a visible `<label>` tied to that `id` — so assistive technology can always identify the field.
```typescript
wordCount: transcription.split(/\s+/).length,
characterCount: transcription.length,
```
Fix word count for empty/whitespace-only transcripts.
Line 76 and Line 134 use split(/\s+/).length, which reports 1 for empty content. This skews export metadata.
Proposed fix

```diff
+const wordCount =
+  transcription.trim().length === 0
+    ? 0
+    : transcription.trim().split(/\s+/).length
+
 const generateJSON = (): string => {
   return JSON.stringify(
     {
       exportedAt: new Date().toISOString(),
       transcription,
       segments: segments || [],
       intelligence: intelligence || undefined,
       metadata: {
-        wordCount: transcription.split(/\s+/).length,
+        wordCount,
         characterCount: transcription.length,
         segmentCount: segments?.length || 0,
       },
     },
@@
-  md += `*Word count: ${transcription.split(/\s+/).length}*\n`
+  md += `*Word count: ${wordCount}*\n`
   return md
 }
```

Also applies to: 134-134
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/studio/ExportControls.tsx` around lines 76-77: the `wordCount` calculation returns 1 for empty or whitespace-only transcriptions because it uses `transcription.split(/\s+/).length`. Trim the transcription and return 0 when the trimmed string is empty; otherwise split the trimmed string on whitespace to count words. Apply the same fix at both occurrences of the calculation, and keep `characterCount` as `transcription.length`.
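The trimmed-split rule from the fix is small enough to isolate and verify on its own (a sketch — `countWords` is an illustrative name, not from the codebase):

```typescript
// Word count that reports 0 for empty or whitespace-only input, instead of
// the 1 that "".split(/\s+/).length would produce.
function countWords(text: string): number {
  const trimmed = text.trim()
  return trimmed.length === 0 ? 0 : trimmed.split(/\s+/).length
}
```

Trimming first also avoids off-by-one counts from leading or trailing whitespace, since `" a b ".split(/\s+/)` would otherwise include an empty leading element.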
```typescript
const generateCSV = (): string => {
  if (!segments || segments.length === 0) {
    return (
      'id,start,end,text\n1,0,0,"' + transcription.replace(/"/g, '""') + '"'
    )
  }
  const header = "id,start,end,duration,text"
  const rows = segments.map(
    (seg) =>
      `${seg.id},${seg.start.toFixed(3)},${seg.end.toFixed(3)},${(seg.end - seg.start).toFixed(3)},"${seg.text.replace(/"/g, '""')}"`,
  )
  return [header, ...rows].join("\n")
```
Sanitize CSV cells to prevent spreadsheet formula injection.
Line 89 and Line 95 only escape quotes. If transcript content starts with =, +, -, or @, opening the CSV in spreadsheet tools can execute formulas.
Proposed fix

```diff
 const generateCSV = (): string => {
+  const sanitizeCsvCell = (value: string): string => {
+    const escaped = value.replace(/"/g, '""')
+    return /^[\t\r ]*[=+\-@]/.test(escaped) ? `'${escaped}` : escaped
+  }
+
   if (!segments || segments.length === 0) {
     return (
-      'id,start,end,text\n1,0,0,"' + transcription.replace(/"/g, '""') + '"'
+      'id,start,end,text\n1,0,0,"' + sanitizeCsvCell(transcription) + '"'
     )
   }
   const header = "id,start,end,duration,text"
   const rows = segments.map(
     (seg) =>
-      `${seg.id},${seg.start.toFixed(3)},${seg.end.toFixed(3)},${(seg.end - seg.start).toFixed(3)},"${seg.text.replace(/"/g, '""')}"`,
+      `${seg.id},${seg.start.toFixed(3)},${seg.end.toFixed(3)},${(seg.end - seg.start).toFixed(3)},"${sanitizeCsvCell(seg.text)}"`,
   )
   return [header, ...rows].join("\n")
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/studio/ExportControls.tsx` around lines 86-97: `generateCSV` only doubles quotes and does not neutralize spreadsheet formula injection. Sanitize text cells (`transcription` and `seg.text`) by detecting whether the cell starts with one of `=`, `+`, `-`, `@` (possibly after leading whitespace) and prefixing it with a safe character (e.g. a single quote) before assembling the rows. Apply the same sanitization in both the no-segments branch and the segments branch, and keep the existing quote escaping (`replace(/"/g, '""')`).
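The sanitizer from the proposed fix can be exercised standalone (a sketch following the regex in the diff; the helper name matches the suggestion but is otherwise illustrative):

```typescript
// Escapes embedded quotes, then neutralizes cells whose first
// non-whitespace character is =, +, -, or @ by prefixing a single quote,
// so spreadsheet apps treat the cell as text rather than a formula.
const sanitizeCsvCell = (value: string): string => {
  const escaped = value.replace(/"/g, '""')
  return /^[\t\r ]*[=+\-@]/.test(escaped) ? `'${escaped}` : escaped
}
```

Note the check runs after quote escaping, and the leading-whitespace class covers tab/CR padding that some spreadsheet tools strip before evaluating a formula.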
```typescript
return new Promise((resolve) => {
  const reader = new FileReader()
  reader.onloadend = function () {
    const a = document.createElement("a")
    a.href = reader.result as string
    a.download = filename
    document.body.appendChild(a)
    a.click()

    setTimeout(() => {
      document.body.removeChild(a)
      resolve()
    }, 100)
  }
  reader.readAsDataURL(blob)
})
```
🧹 Nitpick | 🔵 Trivial
Use an object URL instead of base64-encoding the whole blob.
Lines 57-72 convert the entire Blob into a data URL before download. That duplicates the payload in memory and adds base64 overhead for larger files; URL.createObjectURL is the cheaper path here.
♻️ Proposed refactor

```diff
 export async function createDownloadableDataUrl(
   blob: Blob,
   filename: string,
 ): Promise<void> {
   return new Promise((resolve) => {
-    const reader = new FileReader()
-    reader.onloadend = function () {
-      const a = document.createElement("a")
-      a.href = reader.result as string
-      a.download = filename
-      document.body.appendChild(a)
-      a.click()
-
-      setTimeout(() => {
-        document.body.removeChild(a)
-        resolve()
-      }, 100)
-    }
-    reader.readAsDataURL(blob)
+    const objectUrl = URL.createObjectURL(blob)
+    const a = document.createElement("a")
+    a.href = objectUrl
+    a.download = filename
+    document.body.appendChild(a)
+    a.click()
+
+    setTimeout(() => {
+      a.remove()
+      URL.revokeObjectURL(objectUrl)
+      resolve()
+    }, 100)
   })
 }
```

🧰 Tools
🪛 GitHub Check: SonarCloud Code Analysis
[warning] 67-67: Prefer childNode.remove() over parentNode.removeChild(childNode).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/lib/firebase-proxy.ts` around lines 57-72: the download routine uses `FileReader` to base64-encode the Blob, which duplicates the payload in memory. Replace it with `URL.createObjectURL(blob)` to set the anchor `href`, call `a.click()`, then remove the anchor, call `URL.revokeObjectURL(objectUrl)`, and resolve the promise. Keep `filename` for `a.download` and retain the same cleanup of the appended `<a>` element.
Refactor core logic and infrastructure to enhance system robustness and security.
- Implement origin-based CORS allowlist in the server to replace wildcard access.
- Strengthen Firebase proxy security with strict URL parsing and hostname validation.
- Add request timeouts to external fetch calls to prevent hanging processes.
- Centralize export logic by moving generation functions from components to lib.
- Improve error handling by using generic error messages in API responses.
- Optimize CI workflows with Bun caching and explicit versioning.
- Refactor persistence service to use transaction completion events for reliability.
Co-authored-by: Copilot <copilot@github.com>
- security: enforce HTTPS and validate input types in the transcription API
- security: remove wildcard CORS from firebase-proxy and implement CSV cell sanitization to prevent formula injection
- perf: stream audio files in firebase-proxy instead of buffering in memory
- fix: prevent concurrent polling requests in useTranscriptionPolling using a flight guard
- refactor: clean up unused console logs and improve CSV export formatting
Co-authored-by: Copilot <copilot@github.com>
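The flight guard mentioned for `useTranscriptionPolling` typically looks something like the following (a generic sketch, not the actual hook — `createPoller` and `fetchStatus` are illustrative names):

```typescript
// Generic flight-guard sketch: an interval can call poll() as often as it
// likes, but a new request only starts once the previous one has settled.
function createPoller(fetchStatus: () => Promise<void>) {
  let inFlight = false
  let skipped = 0
  const poll = (): void => {
    if (inFlight) {
      skipped++ // previous request still pending; don't stack another
      return
    }
    inFlight = true
    fetchStatus().finally(() => {
      inFlight = false
    })
  }
  return {
    poll,
    get skipped() {
      return skipped
    },
  }
}
```

Without the guard, a slow status endpoint combined with a fixed `setInterval` lets requests pile up and resolve out of order; with it, each tick either starts one request or is a no-op.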
…ct uses bun.lock)
```typescript
const { id } = req.params

console.log(`Checking transcription status for ID: ${id}`)
```
Summary
A ground-up structural refactor of the Transcriptr app across 7 phases, followed by incremental feature and polish work. Net result: ~3,400 lines removed, clear route-based architecture, no more mobile/desktop component duplication, and a significantly smaller `TranscriptionStudio`.
What changed
Architecture (Phases 1–3)
- New routes: `/` (upload), `/transcribe/[id]` (processing + results), `/studio/[id]` (full studio), `/history`, `/about`, `/errors/[code]`
- Unified `Header` with responsive nav + mobile drawer replaces separate `MobileHeader`/`MobileFooter`/`MainLayout` components
Component extraction (Phase 4)
- Split the `TranscriptionStudio` big component into focused modules: `AudioPlayer`, `FileDetails`, `TranscriptStatistics`, `ExportControls`, `EnhancedTranscript`, `KeyboardShortcutsModal`
- Extracted `useAudioPlayer` hook; `TranscriptionStudio` is now ~240 lines
Mobile/desktop deduplication (Phase 5)
- Deleted `MobileChangelog`, `MobileFeedbackForm`, `MobileFeedbackModals`, and all mobile-only CSS files
Design tokens (Phase 6)
- `V3AnnouncementModal` and `ChangelogModal`
Cleanup (Phase 7)
- Removed `console.log` calls from client code
- Deleted `TranscriptionHistory.tsx`, `sequential-reveal-list.tsx`, and other dead files
Post-refactor features and fixes
Stats
Verified
- `bun x tsc --noEmit --ignoreDeprecations 6.0` passes
- `bun run build` passes
Summary by CodeRabbit
New Features
Bug Fixes
Chores