* feat(content): add create-content API flow and run video hydration
* fix: address code review feedback on content-creation endpoints
- fix: z.coerce.boolean() treated 'false' as true in estimate query params
- fix: remove account_id from create content body (derive from auth only)
- fix: missing field now returns only required issues, not all issues
- fix: make video hydration best-effort so GET /tasks/runs doesn't crash
- refactor: derive DEFAULT_CONTENT_TEMPLATE from CONTENT_TEMPLATES array (DRY)
- refactor: export getContentValidateQuerySchema for reuse
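The `z.coerce.boolean()` bug above is a well-known Zod pitfall: coercion goes through JavaScript's `Boolean()`, and any non-empty string, including `"false"`, is truthy. A minimal sketch of a string-aware parser in the spirit of the `booleanFromString` helper mentioned later in this PR (its exact signature is an assumption):

```typescript
// Boolean("false") === true: any non-empty string is truthy, which is why
// z.coerce.boolean() silently turned ?lipsync=false into `true`.
// Hypothetical string-aware parser; the repo's booleanFromString may differ.
export function booleanFromString(value: string | undefined): boolean | undefined {
  if (value === undefined) return undefined;
  const normalized = value.trim().toLowerCase();
  if (normalized === "true" || normalized === "1") return true;
  if (normalized === "false" || normalized === "0") return false;
  return undefined; // let the schema reject anything else
}
```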
* refactor: make config.json optional with sensible pipeline defaults
- config/content-creation/config.json is now 'recommended' not 'required'
- artists only need face-guide.png + at least one .mp3 to be ready
- pipeline will use default model/resolution settings if no config exists
- add test for config-optional readiness check
* feat: pass githubRepo to create-content task payload
- extend TriggerCreateContentPayload with githubRepo field
- getArtistContentReadiness now returns githubRepo
- createContentHandler passes githubRepo when triggering task
* feat: add caption_length param to content creation endpoint
- add caption_length (short/medium/long) to request schema, defaults to short
- pass through to task payload
- restore proper validation flow in handler
- update tests
* feat: add upscale param to content creation endpoint
* fix: validation now searches org submodule repos for artist files
- getArtistContentReadiness checks main repo first, falls back to org repos
- reads .gitmodules to discover org submodule URLs
- gatsby-grace now validates as ready=true
* feat: add batch mode — trigger N videos in parallel
- add batch param (1-30, default 1)
- batch mode returns { runIds: [...] } instead of { runId }
- each task runs independently with its own random selections
* refactor: extract getOrgRepoUrls to shared lib/github/ (DRY)
- move inline getOrgRepoUrls from getArtistContentReadiness into lib/github/
- reuse existing getRepoGitModules + parseGitModules instead of duplicating
- getArtistContentReadiness now imports the shared utility
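Discovering org submodule URLs from `.gitmodules` boils down to parsing its INI-like sections. The repo's `parseGitModules` isn't shown in this PR, so the following is only a rough sketch of what such a parser might look like:

```typescript
export interface GitSubmodule {
  name: string;
  path?: string;
  url?: string;
}

// Parse INI-style .gitmodules content into submodule entries.
export function parseGitModules(content: string): GitSubmodule[] {
  const submodules: GitSubmodule[] = [];
  let current: GitSubmodule | null = null;
  for (const rawLine of content.split("\n")) {
    const line = rawLine.trim();
    const section = line.match(/^\[submodule "(.+)"\]$/);
    if (section) {
      current = { name: section[1] };
      submodules.push(current);
      continue;
    }
    const kv = line.match(/^(path|url)\s*=\s*(.+)$/);
    if (kv && current) {
      if (kv[1] === "path") current.path = kv[2];
      else current.url = kv[2];
    }
  }
  return submodules;
}
```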
* fix: address CodeRabbit review feedback (round 2)
- rename misleading 'missing_fields' to 'field' in validation error response
- don't expose internal error messages in 500 responses (security)
- log errors server-side, return generic message to clients
* fix: address all remaining CodeRabbit review comments
- distinguish 'no repo configured' vs 'query failed' in readiness check
- sanitize 500 error in validate handler (don't expose internals)
- make persistence idempotent (re-select on insert race condition)
- fix Zod .optional().default() ordering for correct behavior
- batch mode uses Promise.allSettled (partial failures return successful runIds)
- getOrgRepoUrls accepts configurable branch param
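The batch semantics above (partial failures still return the successful runIds) can be sketched with `Promise.allSettled`; the `trigger` parameter here is a hypothetical stand-in for the real `triggerCreateContent`:

```typescript
// Hypothetical stand-in for the real Trigger.dev call.
type TriggerFn = (index: number) => Promise<{ id: string }>;

// Fire N independent runs; collect successful run IDs, count failures.
export async function triggerBatch(trigger: TriggerFn, batch: number) {
  const results = await Promise.allSettled(
    Array.from({ length: batch }, (_, i) => trigger(i)),
  );
  const runIds = results
    .filter((r): r is PromiseFulfilledResult<{ id: string }> => r.status === "fulfilled")
    .map(r => r.value.id);
  const failedCount = results.length - runIds.length;
  return { runIds, failedCount };
}
```

Unlike `Promise.all`, `allSettled` never rejects, so one failed trigger cannot discard the runs that did start.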
* fix: resolve TypeScript build error in batch mode filter
* refactor: always return runIds array (KISS — one response shape)
Per reviewer feedback: single mode and batch mode now both return
runIds array. Removes the runId/runIds split. Clients always handle
one shape.
* refactor: extract helpers into separate files for SRP
- getArtistRootPrefix → lib/content/getArtistRootPrefix.ts
- getArtistFileTree → lib/content/getArtistFileTree.ts
- isCompletedRun + TriggerRunLike → lib/content/isCompletedRun.ts
- booleanFromString → lib/content/booleanFromString.ts
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* feat: add artist_account_id support to content endpoints
- POST /api/content/create accepts artist_account_id OR artist_slug
- GET /api/content/validate accepts artist_account_id OR artist_slug
- resolveArtistSlug looks up artist name from accounts table, converts to slug
- matches docs contract (either identifier works)
- 404 if artist_account_id not found, 400 if neither provided
* refactor: use artist_account_id only, remove artist_slug (KISS)
Per reviewer feedback: one ID system, not two.
- artist_account_id is the only accepted identifier
- API resolves account_id → name → slug internally
- removes artist_slug from create and validate schemas
- matches how the rest of the Recoup platform works
* fix: use GitHub Contents API for .gitmodules (works for private repos)
raw.githubusercontent.com returns 404 for private repos.
Switched to api.github.com/repos/.../contents/ which works with Bearer token.
* fix: make validation best-effort so submodule artists aren't blocked
Validation can't reliably see files in org submodule repos (private repo
GitHub API limitations). The task has its own submodule-aware file discovery.
Validation now warns but doesn't block — task fails with clear error if
files are actually missing.
* refactor: remove account_id, use artist_account_id in content responses
- Add artistAccountId to validated types and pass through handlers
- Replace artist_slug with artist_account_id in API responses
- artistSlug remains as internal impl detail for file system paths
- Update all content tests to match
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* fix: top-level import for selectAccountSnapshots, install @chat-adapter/whatsapp
- Move dynamic import to static top-level import in createContentHandler
- Add proper mock in test instead of relying on dynamic import workaround
- Install missing @chat-adapter/whatsapp package (fixes pre-existing test failure)
All 1460 tests passing.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* fix: handle empty repo (409) gracefully in content validation
- Return ready=false instead of throwing when repo tree is null
- Add repo/branch context to GitHub API error logs
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* fix: strip internal githubRepo from content validate response
Prevents leaking the internal GitHub repository URL in the public API
response by destructuring it out before serializing.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* revert: remove video hydration from getTaskRunHandler
Decouples run status checks from video storage uploads. Video
persistence will be handled separately in a future iteration.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
---------
Co-authored-by: Sweets Sweetman <sweetmantech@gmail.com>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
📝 Walkthrough

This pull request introduces a comprehensive content creation API with four new Next.js route handlers (create, estimate, templates, validate) and supporting infrastructure for GitHub integration, readiness checks, file persistence, and asynchronous task triggering via Trigger.dev.
Sequence Diagram(s)

```mermaid
sequenceDiagram
participant Client
participant API as /api/content/create
participant Validator as validateCreateContentBody
participant Readiness as getArtistContentReadiness
participant GitHub as GitHub API
participant Trigger as Trigger.dev
Client->>API: POST /api/content/create
API->>Validator: Validate body & auth
Validator->>Validator: Parse JSON, schema check
Validator->>Validator: Validate template support
Validator->>Validator: Resolve artist slug
Validator-->>API: Validated payload
API->>Readiness: Check content readiness
Readiness->>GitHub: Get account snapshots
Readiness->>GitHub: Fetch artist file tree
Readiness->>GitHub: Check required files
GitHub-->>Readiness: File structure data
Readiness-->>API: Readiness status & issues
alt Ready
API->>Trigger: Trigger CREATE_CONTENT tasks
Trigger-->>Trigger: Batch process (Promise.allSettled)
Trigger-->>API: Run IDs & results
API-->>Client: 202 with runIds & status
else Not Ready
API-->>Client: 400 with missing issues
end
```

```mermaid
sequenceDiagram
participant Client
participant API as /api/content/validate
participant Validator as validateGetContentValidateQuery
participant Readiness as getArtistContentReadiness
participant GitHub as GitHub API
Client->>API: GET /api/content/validate?artist_account_id=...
API->>Validator: Validate query & auth
Validator->>Validator: Check artist_account_id present
Validator->>Validator: Resolve artist slug
Validator-->>API: Query data (accountId, artistAccountId, artistSlug)
API->>Readiness: Get content readiness
Readiness->>GitHub: Fetch artist snapshots & file tree
GitHub-->>Readiness: Repository structure
Readiness->>Readiness: Categorize missing (required/recommended)
Readiness-->>API: Readiness with missing issues
API->>API: Strip githubRepo field
API-->>Client: 200 with readiness & issues (no repo URL)
```

```mermaid
sequenceDiagram
participant Client
participant API as /api/content/estimate
participant Validator as validateGetContentEstimateQuery
Client->>API: GET /api/content/estimate?lipsync=true&batch=5&compare=true
API->>Validator: Validate query & auth
Validator->>Validator: Parse lipsync, batch, compare
Validator->>Validator: Apply schema (booleans, int 1-100)
Validator-->>API: Validated query
API->>API: Calculate per_video_estimate<br/>(lipsync ? image_cost : audio_cost)
API->>API: Calculate total = per_video × batch
alt compare=true
API->>API: Add compare profiles<br/>(budget, mid, current)
end
API-->>Client: 200 with estimate_usd & breakdown
```

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ❌ 1 failed check (1 warning)
Actionable comments posted: 11
🧹 Nitpick comments (5)
lib/supabase/files/selectFileByStorageKey.ts (1)
2-2: Avoid cross-module type coupling for read queries. Importing `FileRecord` from `createFileRecord` ties read-path typing to the write-path implementation. Prefer returning `Tables<"files"> | null` directly in this module. As per coding guidelines: "Return typed results using `Tables<"table_name">`."

Also applies to: 17-17
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@lib/supabase/files/selectFileByStorageKey.ts` at line 2: the module imports `FileRecord` from `createFileRecord`, which couples the read path to the write path. Update `selectFileByStorageKey` to return `Tables<"files"> | null` instead of `FileRecord` and remove the cross-module type import (replace the import line and any type annotations referencing `FileRecord` in `selectFileByStorageKey` and its helpers). Ensure you import the `Tables` type from your Supabase types module (or wherever `Tables` is defined) and update the function signature and any local variable types to use `Tables<"files"> | null`.

lib/content/validateCreateContentBody.ts (1)
15-37: Export the schema-derived body type alongside the normalized result type.
`ValidatedCreateContentBody` is handwritten and includes post-validation fields, so it can drift from `createContentBodySchema`. Keep the normalized type, but also export the schema's inferred input type so this validator follows the repo contract.

♻️ Small type-safety add

```diff
 export const createContentBodySchema = z.object({
   artist_account_id: z
     .string({ message: "artist_account_id is required" })
     .uuid("artist_account_id must be a valid UUID"),
   template: z
@@
   batch: z.number().int().min(1).max(30).optional().default(1),
 });
+export type CreateContentBody = z.infer<typeof createContentBodySchema>;
+
 export type ValidatedCreateContentBody = {
   accountId: string;
   artistAccountId: string;
   artistSlug: string;
```

As per coding guidelines: "Create validate functions in `validate<EndpointName>Body.ts` or `validate<EndpointName>Query.ts` files that export both the schema and inferred TypeScript type."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@lib/content/validateCreateContentBody.ts` around lines 15-37: export the schema-inferred input type alongside the normalized result. Add an exported type alias (e.g. `CreateContentBodySchema` or `CreateContentBodyInput`) that uses `z.infer<typeof createContentBodySchema>`, keep the existing `ValidatedCreateContentBody` for the normalized, post-validation shape, and ensure both types are exported from this module so callers can use the raw schema type as well as the normalized type; reference `createContentBodySchema` and `ValidatedCreateContentBody` when making the change.

lib/content/getArtistFileTree.ts (1)
14-28: Extract duplicated artist-path detection into a shared helper. The same blob-path predicate appears twice (lines 15-17 and 26-28). Please centralize it to reduce drift risk.

♻️ Proposed refactor

```diff
 export async function getArtistFileTree(
   githubRepo: string,
   artistSlug: string,
 ): Promise<FileTreeEntry[] | null> {
+  const hasArtistInTree = (tree: FileTreeEntry[]): boolean => {
+    const blobPaths = tree.filter(e => e.type === "blob").map(e => e.path);
+    return blobPaths.some(
+      p => p.startsWith(`artists/${artistSlug}/`) || p.startsWith(`${artistSlug}/`),
+    );
+  };
+
   const mainTree = await getRepoFileTree(githubRepo);
-  if (mainTree) {
-    const blobPaths = mainTree.filter(e => e.type === "blob").map(e => e.path);
-    const hasArtist = blobPaths.some(
-      p => p.startsWith(`artists/${artistSlug}/`) || p.startsWith(`${artistSlug}/`),
-    );
-    if (hasArtist) return mainTree;
-  }
+  if (mainTree && hasArtistInTree(mainTree)) return mainTree;

   const orgUrls = await getOrgRepoUrls(githubRepo);
   for (const orgUrl of orgUrls) {
     const orgTree = await getRepoFileTree(orgUrl);
-    if (orgTree) {
-      const blobPaths = orgTree.filter(e => e.type === "blob").map(e => e.path);
-      const hasArtist = blobPaths.some(
-        p => p.startsWith(`artists/${artistSlug}/`) || p.startsWith(`${artistSlug}/`),
-      );
-      if (hasArtist) return orgTree;
-    }
+    if (orgTree && hasArtistInTree(orgTree)) return orgTree;
   }
```

As per coding guidelines: "`**/*.{ts,tsx}`: Extract shared logic into reusable utilities following Don't Repeat Yourself (DRY) principle".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@lib/content/getArtistFileTree.ts` around lines 14-28: extract the duplicated blob-path predicate into a small shared helper (e.g., `isArtistBlobPath(path, artistSlug)` or `hasArtistInTree(tree, artistSlug)`) in `getArtistFileTree.ts` or a nearby util; replace both inline uses that build `blobPaths` and call `.some(...)` with the new helper (update the checks inside the initial `mainTree` block and the `orgTree` loop to call the helper) so the path checks (p.startsWith(`artists/${artistSlug}/`) || p.startsWith(`${artistSlug}/`)) are centralized and reused.

lib/content/createContentHandler.ts (1)
13-85: Split this handler into smaller units to keep domain functions concise. This function currently mixes validation handoff, repo resolution, orchestration, and response shaping in one block. Extracting helpers would improve readability and align with project limits.

As per coding guidelines: "`lib/**/*.ts`: For domain functions, ensure: ... Keep functions under 50 lines" and "`lib/**/*.ts`: DRY: Consolidate similar logic into shared utilities".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@lib/content/createContentHandler.ts` around lines 13-85: the handler does validation, repo resolution, orchestration, and response shaping in one large block. Refactor by extracting three small helpers: (1) `resolveGithubRepo(validated)`, encapsulating the try/catch that calls `getArtistContentReadiness`, falls back to `selectAccountSnapshots`, and throws a clear error if no repo is found; (2) `triggerBatchCreates(payload, count)`, performing the `Promise.allSettled` logic and returning `{ runIds, failedCount }` using `triggerCreateContent`; and (3) `buildCreateResponse(result, validated)`, composing the `NextResponse` body and 202 status from `runIds`/`failedCount` and validated fields. Keep each helper and the resulting `createContentHandler` under ~50 lines, reuse shared utilities for CORS (`getCorsHeaders`) and validation (`validateCreateContentBody`), and have `createContentHandler` call the new helpers and only handle the high-level try/catch and response return.

lib/content/getArtistContentReadiness.ts (1)
73-127: Consider extracting file validation logic to reduce function length. The function exceeds the 50-line guideline at ~110 lines. The file validation block (lines 73-127) is a self-contained unit that could be extracted into a helper function like `checkRequiredFiles(blobPaths, artistRootPrefix)`, improving readability and testability.

This isn't urgent but would align better with the SRP guideline and make unit testing the validation rules simpler. As per coding guidelines: "Keep functions under 50 lines."
♻️ Potential extraction pattern
```ts
// Could be extracted to a separate file or kept as a local helper
function checkArtistFiles(
  blobPaths: string[],
  artistRootPrefix: string,
): ContentReadinessIssue[] {
  const issues: ContentReadinessIssue[] = [];

  const hasFile = (relativePath: string): boolean =>
    blobPaths.some(path => path === `${artistRootPrefix}${relativePath}`);

  const hasAnyMp3 = blobPaths.some(
    path => path.startsWith(artistRootPrefix) && path.toLowerCase().endsWith(".mp3"),
  );

  if (!hasFile("context/images/face-guide.png")) {
    issues.push({
      file: "context/images/face-guide.png",
      severity: "required",
      fix: "...",
    });
  }
  // ... remaining checks
  return issues;
}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@lib/content/getArtistContentReadiness.ts` around lines 73 - 127, Extract the file-validation block in getArtistContentReadiness.ts into a new helper (e.g., checkArtistFiles or checkRequiredFiles) that returns ContentReadinessIssue[]; move the hasFile and hasAnyMp3 predicates into that helper, keep the same checks that push issues (face-guide, config.json, songs/*.mp3, context/*.md, context/era.json), and have getArtistContentReadiness call this helper and then derive requiredMissing and warnings from its returned array so the main function length is reduced and the validation is testable.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@app/api/content/create/route.ts`:
- Around line 20-24: Update the JSDoc in app/api/content/create/route.ts so it
documents that the endpoint returns multiple run IDs (an array) rather than a
singular run ID: change the description and `@returns` to reference runIds
(string[]) and plural wording for the response from the route/handler (the POST
handler in this file) so the comment matches the actual delegated handler
behavior.
In `@lib/content/createContentHandler.ts`:
- Around line 64-74: The current response always returns a 202 with possibly
empty runIds; if all triggers failed you should instead return an error
response: check runIds.length === 0 after the batch logic in
createContentHandler (the block that constructs the NextResponse.json using
runIds, failedCount, artist_account_id, template, lipsync and getCorsHeaders)
and if true return a failure NextResponse.json containing an error message (and
include failedCount and any relevant context) with an appropriate non-2xx status
(e.g., 500 or 422) rather than 202; otherwise keep the existing 202 response
path.
In `@lib/content/getArtistContentReadiness.ts`:
- Around line 35-42: The current truthiness check on snapshots is incorrect
because selectAccountSnapshots returns an array; replace the dead if
(!snapshots) branch in getArtistContentReadiness with a check for an empty array
(if (snapshots.length === 0)) and update the thrown error to reflect "No account
snapshots found" (or remove the branch entirely and rely on the subsequent
github_repo null check); locate the logic using selectAccountSnapshots,
snapshots, and githubRepo to make this change so the error message accurately
represents an empty result rather than a failed query.
In `@lib/content/persistCreateContentRunVideo.ts`:
- Around line 31-125: The persistCreateContentRunVideo function currently does
an un-timed fetch(output.videoSourceUrl) and is too large/complex; add an
explicit timeout to the download using AbortController (e.g., create an
AbortController, setTimeout to abort after a sensible deadline, and pass
controller.signal to fetch) and extract the download/upload responsibilities
into a new helper (e.g., downloadVideoWithTimeout or downloadAndUploadVideo)
that returns the uploaded file blob/metadata (mimeType and size or storageKey)
so persistCreateContentRunVideo only handles validation, DB lookups
(selectFileByStorageKey), record creation (createFileRecord), race handling, and
signed URL generation (createSignedFileUrlByKey); call the helper from
persistCreateContentRunVideo in place of the direct fetch/upload steps and
ensure the helper cleans up the timeout and throws clear errors on non-ok
responses.
In `@lib/content/resolveArtistSlug.ts`:
- Line 19: Replace the weak whitespace-only slug logic in resolveArtistSlug.ts
(the return in the exported function that currently does
name.toLowerCase().replace(/\s+/g, "-")) with a call to a shared canonical slug
utility (e.g., toArtistSlug); implement and export toArtistSlug as a reusable
function that: normalizes Unicode (NFKD), strips non-word/separator characters,
trims, lowercases, collapses whitespace to single hyphens and collapses multiple
hyphens (per the suggested implementation), then import and use toArtistSlug in
resolveArtistSlug and in the sandbox folder generation code so both use the same
robust slug algorithm.
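The slug algorithm the reviewer describes could look roughly like this. It is a sketch of the listed steps (NFKD normalize, strip marks and non-word characters, lowercase, collapse separators), not the merged implementation:

```typescript
// Sketch of the suggested canonical slug algorithm; the merged toArtistSlug
// may differ in details.
export function toArtistSlug(name: string): string {
  return name
    .normalize("NFKD")               // split accented chars into base + combining mark
    .replace(/[\u0300-\u036f]/g, "") // strip the combining marks
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9\s-]/g, "")    // drop anything that isn't word-ish
    .replace(/\s+/g, "-")            // whitespace runs -> single hyphen
    .replace(/-+/g, "-")             // collapse runs of hyphens
    .replace(/^-+|-+$/g, "");        // trim stray leading/trailing hyphens
}
```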
In `@lib/content/validateGetContentEstimateQuery.ts`:
- Line 10: Update the batch constraint in the schema inside
validateGetContentEstimateQuery.ts to match the create endpoint by changing the
z.coerce.number() chain for the "batch" field from .max(100) to .max(30); locate
the validation where batch is defined (the line containing batch:
z.coerce.number().int().min(1).max(100).default(1)) and modify the max value to
30 so estimate requests reflect the create endpoint's limit.
In `@lib/content/validateGetContentValidateQuery.ts`:
- Around line 1-39: Add a Zod schema and exported inferred type, validate the
query against it, and reject invalid values before calling resolveArtistSlug:
define a schema (e.g. getContentValidateQuerySchema = z.object({
artist_account_id: z.string().uuid() })) and export its inferred type (e.g.
export type GetContentValidateQuery = z.infer<typeof
getContentValidateQuerySchema>), then in validateGetContentValidateQuery() build
a plain object from request.nextUrl.searchParams, use schema.safeParse()/parse
to validate and return a NextResponse 400 with getCorsHeaders() on failure; when
valid, use the parsed.artist_account_id value (not the raw string) as
artistAccountId and continue to resolveArtistSlug, returning the existing
ValidatedGetContentValidateQuery shape on success.
In `@lib/github/getOrgRepoUrls.ts`:
- Line 29: getOrgRepoUrls currently returns raw submodule.url values (see the
return from submodules.map(s => s.url)), which can include SSH or relative URLs
that break downstream GitHub parsing; update getOrgRepoUrls to normalize and
filter these: for each submodule.url, ignore relative paths and non-GitHub
hosts, convert SSH-style git@github.com:org/repo.git to
https://github.com/org/repo (strip the .git suffix), accept already-formed
https/http URLs but normalize to https and remove trailing slashes, and only
return cleaned URLs (or skip those that cannot be normalized) so downstream
org/repo discovery receives consistent, parseable GitHub URLs.
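The URL normalization asked for above might be sketched as follows (`normalizeGitHubUrl` is a hypothetical helper name; the merged `getOrgRepoUrls` change may differ):

```typescript
// Map a .gitmodules URL to a canonical GitHub HTTPS URL, or null when it
// can't be normalized (relative path, non-GitHub host).
export function normalizeGitHubUrl(url: string): string | null {
  const ssh = url.match(/^git@github\.com:(.+?)(?:\.git)?$/);
  if (ssh) return `https://github.com/${ssh[1]}`;
  const http = url.match(/^https?:\/\/github\.com\/(.+?)(?:\.git)?\/?$/);
  if (http) return `https://github.com/${http[1]}`;
  return null; // relative path or non-GitHub host: skip it
}
```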
In `@lib/github/getRepoGitModules.ts`:
- Around line 23-31: The URL interpolation for the GitHub contents API uses the
raw branch value, which can break for branch names with reserved characters;
update the request URL in getRepoGitModules (the fetch call building
`https://api.github.com/repos/${owner}/${repo}/contents/.gitmodules?ref=${branch}`)
to URL-encode the branch value (use encodeURIComponent on `branch`) so the ref
query parameter is safe; keep the rest of the headers/logic unchanged.
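The encoding fix is a one-liner; a sketch of the URL construction with a branch name containing a slash (`gitModulesUrl` is a hypothetical name for illustration):

```typescript
// Build the Contents API URL with a safely encoded ref so branch names like
// "feature/x" don't break the query string.
export function gitModulesUrl(owner: string, repo: string, branch: string): string {
  return `https://api.github.com/repos/${owner}/${repo}/contents/.gitmodules?ref=${encodeURIComponent(branch)}`;
}
```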
In `@lib/supabase/files/selectFileByStorageKey.ts`:
- Around line 18-24: Add a DB uniqueness constraint on the files table for the
(owner_account_id, storage_key) pair to prevent duplicates and remove reliance
on .maybeSingle() in selectFileByStorageKey; update the schema/migration to add
UNIQUE(owner_account_id, storage_key). Also replace the FileRecord type
usage/import with the generated supabase type Tables<"files"> (from your db
types) wherever selectFileByStorageKey and related code reference FileRecord so
queries use Tables<"files"> for correct typing and to decouple from the create
module.
In `@lib/trigger/triggerCreateContent.ts`:
- Around line 4-15: The TriggerCreateContentPayload is missing artistAccountId
which causes downstream code to substitute output.accountId (see
validateCreateContentBody() and persistCreateContentRunVideo.ts); add
artistAccountId: string to the TriggerCreateContentPayload interface, ensure
validateCreateContentBody() includes and sets artistAccountId on the produced
payload, and update persistCreateContentRunVideo (and any callers) to use
payload.artistAccountId when persisting the created content instead of relying
on output.accountId; also update any associated type imports/usage to compile
with the new field.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: f0a22aa9-edbd-41d6-9803-9254c2ac5bbf
⛔ Files ignored due to path filters (10)
- `lib/content/__tests__/createContentHandler.test.ts` is excluded by `!**/*.test.*`, `!**/__tests__/**` and included by `lib/**`
- `lib/content/__tests__/getArtistContentReadiness.test.ts` is excluded by `!**/*.test.*`, `!**/__tests__/**` and included by `lib/**`
- `lib/content/__tests__/getContentEstimateHandler.test.ts` is excluded by `!**/*.test.*`, `!**/__tests__/**` and included by `lib/**`
- `lib/content/__tests__/getContentTemplatesHandler.test.ts` is excluded by `!**/*.test.*`, `!**/__tests__/**` and included by `lib/**`
- `lib/content/__tests__/getContentValidateHandler.test.ts` is excluded by `!**/*.test.*`, `!**/__tests__/**` and included by `lib/**`
- `lib/content/__tests__/persistCreateContentRunVideo.test.ts` is excluded by `!**/*.test.*`, `!**/__tests__/**` and included by `lib/**`
- `lib/content/__tests__/validateCreateContentBody.test.ts` is excluded by `!**/*.test.*`, `!**/__tests__/**` and included by `lib/**`
- `lib/github/__tests__/getRepoGitModules.test.ts` is excluded by `!**/*.test.*`, `!**/__tests__/**` and included by `lib/**`
- `lib/trigger/__tests__/triggerCreateContent.test.ts` is excluded by `!**/*.test.*`, `!**/__tests__/**` and included by `lib/**`
- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml` and included by none
📒 Files selected for processing (26)
- `app/api/content/create/route.ts`
- `app/api/content/estimate/route.ts`
- `app/api/content/templates/route.ts`
- `app/api/content/validate/route.ts`
- `lib/const.ts`
- `lib/content/booleanFromString.ts`
- `lib/content/contentTemplates.ts`
- `lib/content/createContentHandler.ts`
- `lib/content/getArtistContentReadiness.ts`
- `lib/content/getArtistFileTree.ts`
- `lib/content/getArtistRootPrefix.ts`
- `lib/content/getContentEstimateHandler.ts`
- `lib/content/getContentTemplatesHandler.ts`
- `lib/content/getContentValidateHandler.ts`
- `lib/content/isCompletedRun.ts`
- `lib/content/persistCreateContentRunVideo.ts`
- `lib/content/resolveArtistSlug.ts`
- `lib/content/validateCreateContentBody.ts`
- `lib/content/validateGetContentEstimateQuery.ts`
- `lib/content/validateGetContentValidateQuery.ts`
- `lib/github/getOrgRepoUrls.ts`
- `lib/github/getRepoFileTree.ts`
- `lib/github/getRepoGitModules.ts`
- `lib/supabase/files/selectFileByStorageKey.ts`
- `lib/supabase/storage/createSignedFileUrlByKey.ts`
- `lib/trigger/triggerCreateContent.ts`
```ts
 * Triggers the background content-creation pipeline and returns a run ID.
 *
 * @param request - Incoming API request.
 * @returns Trigger response for the created task run.
 */
```
Fix JSDoc to reflect runIds array response.
Line 20 and Line 23 describe a singular run ID, but this endpoint returns multiple IDs through the delegated handler. Please align the JSDoc wording to avoid contract confusion.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@app/api/content/create/route.ts` around lines 20 - 24, Update the JSDoc in
app/api/content/create/route.ts so it documents that the endpoint returns
multiple run IDs (an array) rather than a singular run ID: change the
description and `@returns` to reference runIds (string[]) and plural wording for
the response from the route/handler (the POST handler in this file) so the
comment matches the actual delegated handler behavior.
```ts
return NextResponse.json(
  {
    runIds,
    status: "triggered",
    artist_account_id: validated.artistAccountId,
    template: validated.template,
    lipsync: validated.lipsync,
    ...(failedCount > 0 && { failed: failedCount }),
  },
  { status: 202, headers: getCorsHeaders() },
);
```
Return an error when no content runs were successfully triggered.
If all batch triggers fail, this currently responds with 202 and empty runIds, which misreports outcome. Please return a failure status when runIds.length === 0.
🐛 Proposed fix
 const runIds = results
   .filter(r => r.status === "fulfilled")
   .map(r => (r as PromiseFulfilledResult<{ id: string }>).value.id);
 const failedCount = results.filter(r => r.status === "rejected").length;
+
+if (runIds.length === 0) {
+  return NextResponse.json(
+    {
+      status: "error",
+      error: "Failed to trigger content creation",
+      failed: failedCount,
+    },
+    { status: 502, headers: getCorsHeaders() },
+  );
+}
 return NextResponse.json(
   {
     runIds,
     status: "triggered",
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@lib/content/createContentHandler.ts` around lines 64 - 74, The current
response always returns a 202 with possibly empty runIds; if all triggers failed
you should instead return an error response: check runIds.length === 0 after the
batch logic in createContentHandler (the block that constructs the
NextResponse.json using runIds, failedCount, artist_account_id, template,
lipsync and getCorsHeaders) and if true return a failure NextResponse.json
containing an error message (and include failedCount and any relevant context)
with an appropriate non-2xx status (e.g., 500 or 422) rather than 202; otherwise
keep the existing 202 response path.
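The partitioning pattern the handler relies on can be demonstrated in isolation. The promise values and names below are hypothetical stand-ins for the real trigger calls, not repo code:

```typescript
// Partition Promise.allSettled results the same way the handler does:
// fulfilled results yield run IDs, rejected results are counted as failures.
async function partition(promises: Promise<{ id: string }>[]) {
  const results = await Promise.allSettled(promises);
  const runIds = results
    .filter((r): r is PromiseFulfilledResult<{ id: string }> => r.status === "fulfilled")
    .map(r => r.value.id);
  const failedCount = results.filter(r => r.status === "rejected").length;
  return { runIds, failedCount };
}

// Hypothetical batch: two triggers succeed, one rejects.
partition([
  Promise.resolve({ id: "run_1" }),
  Promise.reject(new Error("trigger failed")),
  Promise.resolve({ id: "run_2" }),
]).then(({ runIds, failedCount }) => {
  console.log(runIds, failedCount); // ["run_1", "run_2"] 1
});
```

Because Promise.allSettled never rejects, an all-failed batch silently yields an empty runIds array, which is exactly why the explicit runIds.length === 0 guard above is needed.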
const snapshots = await selectAccountSnapshots(accountId);
if (!snapshots) {
  throw new Error("Failed to query account snapshots");
}
const githubRepo = snapshots[0]?.github_repo ?? null;
if (!githubRepo) {
  throw new Error("No GitHub repository configured for this account");
}
Incorrect check: arrays are always truthy.
The selectAccountSnapshots function returns Tables<"account_snapshots">[] — an array that's always truthy even when empty. The check if (!snapshots) on line 36 will never trigger; instead, an empty array would pass through, and line 39 would set githubRepo to null (via optional chaining), correctly caught at line 40.
The error message "Failed to query account snapshots" is misleading — an empty array means no snapshots exist for this account, not a query failure (query failures are already handled internally by selectAccountSnapshots which returns []).
Consider removing the dead code path or adjusting the logic:
🛠️ Proposed fix
const snapshots = await selectAccountSnapshots(accountId);
- if (!snapshots) {
- throw new Error("Failed to query account snapshots");
- }
const githubRepo = snapshots[0]?.github_repo ?? null;
if (!githubRepo) {
- throw new Error("No GitHub repository configured for this account");
+ throw new Error(
+ snapshots.length === 0
+ ? "No account snapshot found for this account"
+ : "No GitHub repository configured for this account"
+ );
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@lib/content/getArtistContentReadiness.ts` around lines 35 - 42, The current
truthiness check on snapshots is incorrect because selectAccountSnapshots
returns an array; replace the dead if (!snapshots) branch in
getArtistContentReadiness with a check for an empty array (if (snapshots.length
=== 0)) and update the thrown error to reflect "No account snapshots found" (or
remove the branch entirely and rely on the subsequent github_repo null check);
locate the logic using selectAccountSnapshots, snapshots, and githubRepo to make
this change so the error message accurately represents an empty result rather
than a failed query.
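The truthiness behavior behind this finding is easy to verify in isolation:

```typescript
// An empty array is truthy, so a guard like `if (!snapshots)` can never
// catch the empty case; the optional chain is what actually handles it.
const snapshots: { github_repo: string | null }[] = [];

console.log(Boolean(snapshots));                // true — the guard never fires
console.log(snapshots[0]?.github_repo);         // undefined — optional chaining on an empty array
console.log(snapshots[0]?.github_repo ?? null); // null — what the downstream null check sees
```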
export async function persistCreateContentRunVideo<T extends TriggerRunLike>(run: T): Promise<T> {
  if (run.taskIdentifier !== CREATE_CONTENT_TASK_ID || !isCompletedRun(run)) {
    return run;
  }

  const output = (run.output ?? {}) as CreateContentOutput;
  if (!output.accountId || !output.artistSlug || !output.videoSourceUrl) {
    return run;
  }

  if (output.video?.storageKey) {
    return run;
  }

  const fileName = `${output.artistSlug}-${run.id}.mp4`;
  const storageKey = `content/${output.accountId}/${output.artistSlug}/${fileName}`;

  const existingFile = await selectFileByStorageKey({
    ownerAccountId: output.accountId,
    storageKey,
  });

  if (existingFile) {
    const signedUrl = await createSignedFileUrlByKey({
      key: existingFile.storage_key,
    });

    return {
      ...run,
      output: {
        ...output,
        video: {
          fileId: existingFile.id,
          storageKey: existingFile.storage_key,
          fileName: existingFile.file_name,
          mimeType: existingFile.mime_type,
          sizeBytes: existingFile.size_bytes,
          signedUrl,
        },
      },
    };
  }

  const response = await fetch(output.videoSourceUrl);
  if (!response.ok) {
    throw new Error(`Failed to download rendered video: ${response.status} ${response.statusText}`);
  }

  const videoBlob = await response.blob();
  const mimeType = response.headers.get("content-type") || "video/mp4";

  await uploadFileByKey(storageKey, videoBlob, {
    contentType: mimeType,
    upsert: true,
  });

  let createdFile;
  try {
    createdFile = await createFileRecord({
      ownerAccountId: output.accountId,
      // Phase 1: artist account mapping is not wired yet, so we scope to owner account.
      artistAccountId: output.accountId,
      storageKey,
      fileName,
      mimeType,
      sizeBytes: videoBlob.size,
      description: `Content pipeline output for ${output.artistSlug}`,
      tags: ["content", "video", output.template ?? "unknown-template"],
    });
  } catch {
    // Race condition: another request may have created the record. Re-select.
    const raceFile = await selectFileByStorageKey({ ownerAccountId: output.accountId, storageKey });
    if (!raceFile) throw new Error("Failed to create or find file record");
    createdFile = raceFile;
  }

  const signedUrl = await createSignedFileUrlByKey({
    key: createdFile.storage_key,
  });

  return {
    ...run,
    output: {
      ...output,
      video: {
        fileId: createdFile.id,
        storageKey: createdFile.storage_key,
        fileName: createdFile.file_name,
        mimeType: createdFile.mime_type,
        sizeBytes: createdFile.size_bytes,
        signedUrl,
      },
    },
  };
}
Add timeout to fetch() call and refactor to reduce function complexity.
The fetch(output.videoSourceUrl) call on line 74 has no timeout—a stalled upstream response will hold the API request open indefinitely. Additionally, this function exceeds the 50-line guideline (currently 95 lines) and combines too many responsibilities: validation, DB lookup, remote download, storage upload, DB insert with race recovery, and URL generation. This violates SRP and makes it difficult to test and maintain.
⏱️ Add timeout to fetch
- const response = await fetch(output.videoSourceUrl);
+ const controller = new AbortController();
+ const timeout = setTimeout(() => controller.abort(), 30_000);
+ let response: Response;
+ try {
+   response = await fetch(output.videoSourceUrl, { signal: controller.signal });
+ } finally {
+   clearTimeout(timeout);
+ }
  if (!response.ok) {
    throw new Error(`Failed to download rendered video: ${response.status} ${response.statusText}`);
  }

Consider extracting the download and upload logic (lines 74–85) into a separate helper function. This will improve testability, error isolation, and deadline management.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@lib/content/persistCreateContentRunVideo.ts` around lines 31 - 125, The
persistCreateContentRunVideo function currently does an un-timed
fetch(output.videoSourceUrl) and is too large/complex; add an explicit timeout
to the download using AbortController (e.g., create an AbortController,
setTimeout to abort after a sensible deadline, and pass controller.signal to
fetch) and extract the download/upload responsibilities into a new helper (e.g.,
downloadVideoWithTimeout or downloadAndUploadVideo) that returns the uploaded
file blob/metadata (mimeType and size or storageKey) so
persistCreateContentRunVideo only handles validation, DB lookups
(selectFileByStorageKey), record creation (createFileRecord), race handling, and
signed URL generation (createSignedFileUrlByKey); call the helper from
persistCreateContentRunVideo in place of the direct fetch/upload steps and
ensure the helper cleans up the timeout and throws clear errors on non-ok
responses.
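The same deadline idea can also be expressed as a small generic helper; the sketch below is illustrative (the helper name and 30s default are assumptions, not repo code). On modern Node, passing `AbortSignal.timeout(ms)` to `fetch` achieves the same effect with less ceremony.

```typescript
// Sketch: reject any promise that does not settle within `ms` milliseconds.
function withDeadline<T>(promise: Promise<T>, ms: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(() => reject(new Error(`deadline of ${ms}ms exceeded`)), ms);
    promise.then(
      value => { clearTimeout(timer); resolve(value); },
      err => { clearTimeout(timer); reject(err); },
    );
  });
}

// Hypothetical usage against the download step:
// const response = await withDeadline(fetch(output.videoSourceUrl), 30_000);
```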
const name = accounts[0]?.name;
if (!name) return null;

return name.toLowerCase().replace(/\s+/g, "-");
Slug normalization is too weak for path-based lookups.
On Line 19, whitespace-only normalization can generate slugs that won’t match repo folder conventions for names with punctuation/diacritics (or separators), causing downstream false negatives in artist tree/readiness checks. Use a single canonical slug utility shared with sandbox folder generation.
Proposed fix
- return name.toLowerCase().replace(/\s+/g, "-");
+ return toArtistSlug(name);

// Add in a shared utility used by both sandbox generation and content resolution.
export function toArtistSlug(input: string): string {
  return input
    .normalize("NFKD")
    .replace(/[^\w\s-]/g, "")
    .trim()
    .toLowerCase()
    .replace(/\s+/g, "-")
    .replace(/-+/g, "-");
}

As per coding guidelines: "Extract shared logic into reusable utilities following Don't Repeat Yourself (DRY) principle."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@lib/content/resolveArtistSlug.ts` at line 19, Replace the weak
whitespace-only slug logic in resolveArtistSlug.ts (the return in the exported
function that currently does name.toLowerCase().replace(/\s+/g, "-")) with a
call to a shared canonical slug utility (e.g., toArtistSlug); implement and
export toArtistSlug as a reusable function that: normalizes Unicode (NFKD),
strips non-word/separator characters, trims, lowercases, collapses whitespace to
single hyphens and collapses multiple hyphens (per the suggested
implementation), then import and use toArtistSlug in resolveArtistSlug and in
the sandbox folder generation code so both use the same robust slug algorithm.
import type { NextRequest } from "next/server";
import { NextResponse } from "next/server";
import { getCorsHeaders } from "@/lib/networking/getCorsHeaders";
import { validateAuthContext } from "@/lib/auth/validateAuthContext";
import { resolveArtistSlug } from "@/lib/content/resolveArtistSlug";

export type ValidatedGetContentValidateQuery = {
  accountId: string;
  artistAccountId: string;
  artistSlug: string;
};

/**
 * Validates auth and query params for GET /api/content/validate.
 * Requires artist_account_id query parameter.
 */
export async function validateGetContentValidateQuery(
  request: NextRequest,
): Promise<NextResponse | ValidatedGetContentValidateQuery> {
  const authResult = await validateAuthContext(request);
  if (authResult instanceof NextResponse) {
    return authResult;
  }

  const artistAccountId = request.nextUrl.searchParams.get("artist_account_id");
  if (!artistAccountId) {
    return NextResponse.json(
      { status: "error", error: "artist_account_id query parameter is required" },
      { status: 400, headers: getCorsHeaders() },
    );
  }

  const artistSlug = await resolveArtistSlug(artistAccountId);
  if (!artistSlug) {
    return NextResponse.json(
      { status: "error", error: "Artist not found for the provided artist_account_id" },
      { status: 404, headers: getCorsHeaders() },
    );
  }
Validate artist_account_id with Zod schema and export inferred type.
This validator doesn't follow the repository standard for validate*.ts files. It's missing:
- A Zod schema for the query parameter (currently accepts any non-empty string)
- An exported inferred type from that schema
Right now, malformed artist_account_id values reach resolveArtistSlug() and produce misleading 404 errors instead of rejecting invalid UUIDs with a 400 error, inconsistent with the pattern established in other validators like validateCreateWorkspaceBody.ts.
🛠️ Suggested implementation
+import { z } from "zod";
 import type { NextRequest } from "next/server";
 import { NextResponse } from "next/server";
 import { getCorsHeaders } from "@/lib/networking/getCorsHeaders";
 import { validateAuthContext } from "@/lib/auth/validateAuthContext";
 import { resolveArtistSlug } from "@/lib/content/resolveArtistSlug";

+export const getContentValidateQuerySchema = z.object({
+  artist_account_id: z
+    .string({ message: "artist_account_id is required" })
+    .uuid("artist_account_id must be a valid UUID"),
+});
+
+export type GetContentValidateQuery = z.infer<typeof getContentValidateQuerySchema>;
+
 export type ValidatedGetContentValidateQuery = {
   accountId: string;
   artistAccountId: string;
   artistSlug: string;
 };
@@
-  const artistAccountId = request.nextUrl.searchParams.get("artist_account_id");
-  if (!artistAccountId) {
+  const query = Object.fromEntries(request.nextUrl.searchParams.entries());
+  const result = getContentValidateQuerySchema.safeParse(query);
+  if (!result.success) {
     return NextResponse.json(
-      { status: "error", error: "artist_account_id query parameter is required" },
+      { status: "error", error: result.error.issues[0]?.message ?? "Invalid query" },
       { status: 400, headers: getCorsHeaders() },
     );
   }
+  const artistAccountId = result.data.artist_account_id;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@lib/content/validateGetContentValidateQuery.ts` around lines 1 - 39, Add a
Zod schema and exported inferred type, validate the query against it, and reject
invalid values before calling resolveArtistSlug: define a schema (e.g.
getContentValidateQuerySchema = z.object({ artist_account_id: z.string().uuid()
})) and export its inferred type (e.g. export type GetContentValidateQuery =
z.infer<typeof getContentValidateQuerySchema>), then in
validateGetContentValidateQuery() build a plain object from
request.nextUrl.searchParams, use schema.safeParse()/parse to validate and
return a NextResponse 400 with getCorsHeaders() on failure; when valid, use the
parsed.artist_account_id value (not the raw string) as artistAccountId and
continue to resolveArtistSlug, returning the existing
ValidatedGetContentValidateQuery shape on success.
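One detail of the suggested approach worth keeping in mind: Object.fromEntries(searchParams.entries()) flattens the query string into a plain object before schema parsing, and when a key is repeated the last occurrence wins. A standalone sketch (the URL below is illustrative):

```typescript
// How the query object handed to safeParse is built from the request URL.
const url = new URL(
  "https://example.test/api/content/validate?artist_account_id=123e4567-e89b-12d3-a456-426614174000",
);
const query = Object.fromEntries(url.searchParams.entries());
console.log(query); // { artist_account_id: "123e4567-e89b-12d3-a456-426614174000" }

// With a repeated key, the last occurrence wins:
const dup = new URL("https://example.test/x?a=1&a=2");
console.log(Object.fromEntries(dup.searchParams.entries())); // { a: "2" }
```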
if (!submodules) return [];

return submodules.map(s => s.url);
Normalize submodule URLs before returning them.
Line 29 forwards raw .gitmodules URLs, but downstream GitHub URL parsing is strict; SSH/relative URLs can be ignored and break org-repo fallback discovery.
Proposed fix
+function normalizeGitHubUrl(url: string): string | null {
+  const ssh = url.match(/^git@github\.com:(.+?)\/(.+?)(?:\.git)?$/);
+  if (ssh) return `https://github.com/${ssh[1]}/${ssh[2]}`;
+  if (url.startsWith("https://github.com/")) return url.replace(/\.git$/, "");
+  return null;
+}
 ...
-  return submodules.map(s => s.url);
+  return submodules
+    .map(s => normalizeGitHubUrl(s.url))
+    .filter((u): u is string => Boolean(u));

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@lib/github/getOrgRepoUrls.ts` at line 29, getOrgRepoUrls currently returns
raw submodule.url values (see the return from submodules.map(s => s.url)), which
can include SSH or relative URLs that break downstream GitHub parsing; update
getOrgRepoUrls to normalize and filter these: for each submodule.url, ignore
relative paths and non-GitHub hosts, convert SSH-style
git@github.com:org/repo.git to https://github.com/org/repo (strip the .git
suffix), accept already-formed https/http URLs but normalize to https and remove
trailing slashes, and only return cleaned URLs (or skip those that cannot be
normalized) so downstream org/repo discovery receives consistent, parseable
GitHub URLs.
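Assuming the normalizeGitHubUrl helper sketched in the proposed fix, the behavior on typical .gitmodules URL shapes would be as follows (the helper is reproduced verbatim so the examples stand alone):

```typescript
// The proposed normalizer, reproduced for a self-contained demonstration.
function normalizeGitHubUrl(url: string): string | null {
  const ssh = url.match(/^git@github\.com:(.+?)\/(.+?)(?:\.git)?$/);
  if (ssh) return `https://github.com/${ssh[1]}/${ssh[2]}`;
  if (url.startsWith("https://github.com/")) return url.replace(/\.git$/, "");
  return null;
}

console.log(normalizeGitHubUrl("git@github.com:recoupable/api.git"));     // "https://github.com/recoupable/api"
console.log(normalizeGitHubUrl("https://github.com/recoupable/api.git")); // "https://github.com/recoupable/api"
console.log(normalizeGitHubUrl("../sibling-repo"));                       // null — relative submodule URLs are skipped
```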
 const response = await fetch(
-  `https://raw.githubusercontent.com/${owner}/${repo}/${branch}/.gitmodules`,
-  { ...(token && { headers: { Authorization: `Bearer ${token}` } }) },
+  `https://api.github.com/repos/${owner}/${repo}/contents/.gitmodules?ref=${branch}`,
+  {
+    headers: {
+      ...(token && { Authorization: `Bearer ${token}` }),
+      Accept: "application/vnd.github.v3.raw",
+      "User-Agent": "Recoup-API",
+    },
+  },
Encode the ref query parameter to handle special characters in branch names.
The branch parameter is interpolated directly into the URL without encoding. If a branch name contains reserved characters like #, &, or /, this can cause the URL to be malformed. For example, a branch named feature/fix would need the / encoded as %2F to be treated as part of the parameter value rather than a path separator.
🔧 Safe URL encoding
-  `https://api.github.com/repos/${owner}/${repo}/contents/.gitmodules?ref=${branch}`,
+  `https://api.github.com/repos/${owner}/${repo}/contents/.gitmodules?ref=${encodeURIComponent(branch)}`,
+ `https://api.github.com/repos/${owner}/${repo}/contents/.gitmodules?ref=${encodeURIComponent(branch)}`,📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| const response = await fetch( | |
| `https://raw.githubusercontent.com/${owner}/${repo}/${branch}/.gitmodules`, | |
| { ...(token && { headers: { Authorization: `Bearer ${token}` } }) }, | |
| `https://api.github.com/repos/${owner}/${repo}/contents/.gitmodules?ref=${branch}`, | |
| { | |
| headers: { | |
| ...(token && { Authorization: `Bearer ${token}` }), | |
| Accept: "application/vnd.github.v3.raw", | |
| "User-Agent": "Recoup-API", | |
| }, | |
| }, | |
| const response = await fetch( | |
| `https://api.github.com/repos/${owner}/${repo}/contents/.gitmodules?ref=${encodeURIComponent(branch)}`, | |
| { | |
| headers: { | |
| ...(token && { Authorization: `Bearer ${token}` }), | |
| Accept: "application/vnd.github.v3.raw", | |
| "User-Agent": "Recoup-API", | |
| }, | |
| }, |
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@lib/github/getRepoGitModules.ts` around lines 23 - 31, The URL interpolation
for the GitHub contents API uses the raw branch value, which can break for
branch names with reserved characters; update the request URL in
getRepoGitModules (the fetch call building
`https://api.github.com/repos/${owner}/${repo}/contents/.gitmodules?ref=${branch}`)
to URL-encode the branch value (use encodeURIComponent on `branch`) so the ref
query parameter is safe; keep the rest of the headers/logic unchanged.
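The effect of encoding the ref value can be seen directly (branch names below are illustrative):

```typescript
// encodeURIComponent makes reserved characters safe inside a query value.
console.log(encodeURIComponent("feature/fix"));    // "feature%2Ffix"
console.log(encodeURIComponent("bug#123"));        // "bug%23123"
console.log(encodeURIComponent("release&hotfix")); // "release%26hotfix"

// The ref parameter then round-trips intact through URL parsing:
const branch = "feature/fix";
const url = `https://api.github.com/repos/owner/repo/contents/.gitmodules?ref=${encodeURIComponent(branch)}`;
console.log(new URL(url).searchParams.get("ref")); // "feature/fix"
```

Without encoding, "bug#123" would be truncated at the fragment marker and "feature/fix" could be misread depending on how intermediaries parse the URL.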
const { data, error } = await supabase
  .from("files")
  .select("*")
  .eq("owner_account_id", ownerAccountId)
  .eq("storage_key", storageKey)
  .maybeSingle();
Add database-enforced uniqueness constraint and fix type coupling.
The `.maybeSingle()` call is fragile without a unique constraint on `(owner_account_id, storage_key)`: if duplicate rows exist, the query errors instead of gracefully handling the lookup. The proposed workaround (adding order/limit) masks the issue but doesn't prevent duplicates at the source.
Additionally, `FileRecord` should be replaced with `Tables<"files">` to follow Supabase query patterns and avoid type coupling to the create module.
Suggested fixes:

- Add a unique constraint to the database on `(owner_account_id, storage_key)`
- Replace the `FileRecord` import with `Tables<"files">` from the generated database types
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@lib/supabase/files/selectFileByStorageKey.ts` around lines 18-24: add a database uniqueness constraint on the files table for the (owner_account_id, storage_key) pair to prevent duplicates and remove reliance on `.maybeSingle()` in selectFileByStorageKey; update the schema/migration to add UNIQUE(owner_account_id, storage_key). Also replace the FileRecord type usage/import with the generated Supabase type `Tables<"files">` (from your db types) wherever selectFileByStorageKey and related code reference FileRecord, so queries get correct typing and are decoupled from the create module.
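The suggested `Tables<"files">` swap can be sketched as below. The `Database` shape here is a minimal hypothetical stand-in for the generated `types/database.types.ts`, and the `Tables<>` helper mirrors how supabase-js derives row types from it:

```typescript
// Minimal stand-in for the generated database types (illustrative only;
// the real shape lives in types/database.types.ts).
type Database = {
  public: {
    Tables: {
      files: {
        Row: { id: string; owner_account_id: string; storage_key: string };
      };
    };
  };
};

// Resolve a table name to its Row type, as the generated Tables<> helper does.
type Tables<T extends keyof Database["public"]["Tables"]> =
  Database["public"]["Tables"][T]["Row"];

// selectFileByStorageKey can then return Tables<"files"> instead of importing
// FileRecord from the create module, decoupling read paths from write paths.
const row: Tables<"files"> = {
  id: "file-1",
  owner_account_id: "acct-1",
  storage_key: "uploads/demo.mp3",
};

console.log(row.storage_key);
```

Because `Tables<"files">` is derived from the generated schema, regenerating the types after a migration keeps the read path in sync automatically.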
```typescript
export interface TriggerCreateContentPayload {
  accountId: string;
  artistSlug: string;
  template: string;
  lipsync: boolean;
  /** Controls caption length: "short", "medium", or "long". */
  captionLength: "short" | "medium" | "long";
  /** Whether to upscale image and video for higher quality. */
  upscale: boolean;
  /** GitHub repo URL so the task can fetch artist files. */
  githubRepo: string;
}
```
Carry artistAccountId through the task payload.
`validateCreateContentBody()` already resolves `artistAccountId` separately from `accountId`, but this payload drops it. The downstream persistence path is then forced to write `artistAccountId: output.accountId` in `lib/content/persistCreateContentRunVideo.ts` (lines 89-93), so any org/workspace flow where the owner account and artist account differ will attach the generated file to the wrong artist.
💡 Minimal payload fix

```diff
 export interface TriggerCreateContentPayload {
   accountId: string;
+  artistAccountId: string;
   artistSlug: string;
   template: string;
   lipsync: boolean;
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@lib/trigger/triggerCreateContent.ts` around lines 4-15: TriggerCreateContentPayload is missing artistAccountId, which forces downstream code to substitute output.accountId (see validateCreateContentBody() and persistCreateContentRunVideo.ts). Add `artistAccountId: string` to the TriggerCreateContentPayload interface, ensure validateCreateContentBody() sets artistAccountId on the produced payload, and update persistCreateContentRunVideo (and any callers) to use payload.artistAccountId when persisting the created content instead of relying on output.accountId; also update any associated type imports/usage to compile with the new field.
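The fix described in the prompt above can be sketched end to end; the `fileOwnerFor` helper below is illustrative, not the real `persistCreateContentRunVideo` logic:

```typescript
// Payload with artistAccountId carried through, per the review suggestion.
interface TriggerCreateContentPayload {
  accountId: string;
  artistAccountId: string; // resolved by validateCreateContentBody()
  artistSlug: string;
  template: string;
  lipsync: boolean;
  captionLength: "short" | "medium" | "long";
  upscale: boolean;
  githubRepo: string;
}

// Illustrative persistence helper: files attach to the artist account,
// not the (possibly different) owner account that triggered the run.
function fileOwnerFor(payload: TriggerCreateContentPayload): string {
  return payload.artistAccountId; // not payload.accountId
}

// Hypothetical org/workspace flow where owner and artist accounts differ:
const payload: TriggerCreateContentPayload = {
  accountId: "org-owner",
  artistAccountId: "artist-123",
  artistSlug: "gatsby-grace",
  template: "default",
  lipsync: false,
  captionLength: "short",
  upscale: false,
  githubRepo: "https://github.com/recoupable/artists",
};

console.log(fileOwnerFor(payload));
```

With the field on the payload, the persistence path no longer has to guess that the triggering account is also the artist account.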
Summary

- `artist_account_id` replaces `artist_slug` in public API responses
- `account_id` removed from content endpoints

🤖 Generated with Claude Code