feat: --source flag added #45
Conversation
Deploying with
| Status | Name | Latest Commit | Preview URL | Updated (UTC) |
|---|---|---|---|---|
| ✅ Deployment successful! View logs | claim-db-worker | f44687c | Commit Preview URL / Branch Preview URL | Aug 26 2025, 04:34 PM |
✅ Preview CLIs & Workers are live! Test the CLIs locally under the pr45 tag:
npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45
Worker URLs
Walkthrough
Reads PRISMA_ACTOR_NAME and PRISMA_ACTOR_PROJECT from a project .env to derive an optional userAgent and propagates it through region selection, database creation requests (utm_source), analytics events, the claim URL, and JSON output; PostHog analytics now require an explicit host/key and are disabled if either is missing. A sketch of the derivation follows the sequence diagram below.
Changes
Sequence Diagram(s)
sequenceDiagram
autonumber
actor U as User
participant C as CLI (create-db/index.js)
participant E as EnvReader (.env)
participant A as Analytics (create-db/analytics.js)
participant S as API Service
participant B as Browser (Claim URL)
U->>C: run create-db [--json] [region/flags...]
C->>C: parse args
C->>E: readUserEnvFile()
alt .env contains both keys
E-->>C: {PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT}
C->>C: userAgent = "NAME/PROJECT"
else no userAgent
E-->>C: {}
C->>C: userAgent = undefined
end
C->>A: cli_command_ran {has-source-from-env?, userAgent?}
alt interactive
C->>U: promptForRegion(defaultRegion, userAgent)
U-->>C: region
C->>A: region_selected {region, userAgent?}
else region provided via flag/JSON
C->>A: region_selected {region, userAgent?}
end
C->>S: createDatabase {name, region, utm_source: userAgent || CLI_NAME}
alt success
S-->>C: db info + claimUrl(utm_source)
C-->>U: output (JSON includes userAgent when present)
C->>B: open claimUrl (utm_source)
else failure
S-->>C: error
C->>A: database_creation_failed {reason, userAgent?}
C-->>U: error
end
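A minimal sketch of the env-derived attribution the walkthrough describes, assuming dotenv has already populated process.env; the claim base URL and the buildClaimUrl helper are illustrative names, not the CLI's actual identifiers:

```js
// Sketch only: derive the optional userAgent and fall back to the CLI name
// for utm_source when the env vars are absent.
import "dotenv/config";

const CLI_NAME = "create-db";

function deriveUserAgent(env = process.env) {
  const { PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT } = env;
  return PRISMA_ACTOR_NAME && PRISMA_ACTOR_PROJECT
    ? `${PRISMA_ACTOR_NAME}/${PRISMA_ACTOR_PROJECT}`
    : undefined;
}

// Hypothetical helper: builds the claim URL with an encoded utm_source.
function buildClaimUrl(baseUrl, projectId, userAgent) {
  const utmSource = encodeURIComponent(userAgent || CLI_NAME);
  return `${baseUrl}?projectID=${projectId}&utm_source=${utmSource}&utm_medium=cli`;
}

// Example: with no PRISMA_ACTOR_* vars set, attribution falls back to "create-db".
const userAgent = deriveUserAgent();
console.log(buildClaimUrl("https://claim.example.com", "abc123", userAgent));
```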
✅ Preview CLIs & Workers are live! Test the CLIs locally under the pr45 tag:
npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45
Worker URLs
Actionable comments posted: 6
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
create-db/index.js (3)
96-116: Help output doesn't document the new --source flag. Users won't discover the feature or its behavior from --help. Add the flag to Options and Examples.
  ${chalk.yellow("--json, -j")} Output machine-readable JSON and exit
  ${chalk.yellow("--list-regions")} List available regions and exit
  ${chalk.yellow("--help, -h")} Show this help message
+ ${chalk.yellow("--source, -s")} Derive utm_source from CTA_* in your project .env
...
  ${chalk.gray(`npx ${CLI_NAME} --json --region us-east-1`)}
+ ${chalk.gray(`npx ${CLI_NAME} --source --region us-east-1`)}
318-329: utm_source: avoid undefined, keep backward-compat, and URL-encode.
- If source is unset, the request body omits utm_source and the claim URL renders utm_source=undefined. That breaks existing attribution and pollutes links.
- Fall back to CLI_NAME and URL-encode the value in the claim URL.
  async function createDatabase(name, region, source, returnJson = false ) {
    let s;
    if (!returnJson) {
      s = spinner();
      s.start("Creating your database...");
    }
+   const utmSource = source || CLI_NAME;
+
    const resp = await fetch(`${CREATE_DB_WORKER_URL}/create`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
-     body: JSON.stringify({ region, name, utm_source: source }),
+     body: JSON.stringify({ region, name, utm_source: utmSource }),
    });
...
-   const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${source}&utm_medium=cli`;
+   const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${encodeURIComponent(utmSource)}&utm_medium=cli`;
Also applies to: 422-422
543-557: Analytics: compute has-source-flag from parsed flags, not argv text.
rawArgs.includes("-s") misses combined short forms (e.g., -rs). Use Boolean(flags.source).
- "has-source-flag": rawArgs.includes("--source") || rawArgs.includes("-s"),
+ "has-source-flag": Boolean(flags.source),
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (1)
create-db/index.js
(17 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
analytics
(48-48)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (1)
create-db/index.js (1)
631-636: LGTM: source plumbed through interactive/JSON/non-interactive flows. Passing source into promptForRegion and createDatabase across code paths looks consistent.
Also applies to: 660-666
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
create-db/index.js (2)
102-115: Update help text to document the new --source flag (and example).
Users won't discover the flag via --help. Add an option line and an example.
  ${chalk.yellow("--json, -j")} Output machine-readable JSON and exit
+ ${chalk.yellow("--source, -s")} Derive a UTM source from CTA_* vars in your .env and include in analytics and API calls
  ${chalk.yellow("--list-regions")} List available regions and exit
@@
  ${chalk.gray(`npx ${CLI_NAME} --json --region us-east-1`)}
+ ${chalk.gray(`npx ${CLI_NAME} --interactive --source`)}
318-329: Preserve utm_source when --source is absent.
Currently, utm_source is omitted from the JSON body when source is undefined. To keep existing attribution, fall back to CLI_NAME.
- body: JSON.stringify({ region, name, utm_source: source }),
+ body: JSON.stringify({ region, name, utm_source: source ?? CLI_NAME }),
♻️ Duplicate comments (6)
create-db/index.js (6)
170-173: Await and return on the -h single-short-flag path.
Without await + return, downstream code may still run in some harnesses/mocks where process.exit is stubbed.
- if (mappedFlag === "help") showHelp();
+ if (mappedFlag === "help") { await showHelp(); return; }
544-565: Analytics: use parsed flags for "has-source-flag" and reuse the helper.
Scanning
rawArgs
misses combos like-rs
. UseBoolean(flags.source)
and the helper to attach source.- const analyticsProps = { + const analyticsProps = { command: CLI_NAME, "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(), "has-region-flag": rawArgs.includes("--region") || rawArgs.includes("-r"), "has-interactive-flag": rawArgs.includes("--interactive") || rawArgs.includes("-i"), "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"), "has-list-regions-flag": rawArgs.includes("--list-regions"), - "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"), - "has-source-flag": rawArgs.includes("--source") || rawArgs.includes("-s"), + "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"), + "has-source-flag": Boolean(flags.source), "node-version": process.version, platform: process.platform, arch: process.arch, }; - - if (source) { - analyticsProps.source = source; - } - - await analytics.capture("create_db:cli_command_ran", analyticsProps); + await captureWithSource("create_db:cli_command_ran", analyticsProps, source);
4-5: Use node: protocol for built-ins.
Prefer node: specifiers for core modules to avoid resolution ambiguity and align with Node guidance.
-import fs from "fs";
-import path from "path";
+import fs from "node:fs";
+import path from "node:path";
62-85: Don't hand-roll a .env parser; rely on process.env (dotenv.config) instead.
This parser will mis-handle comments, whitespace, CRLF, export prefixes, and quoted/multiline values. You already call dotenv.config() at startup, so just derive the source from process.env.
Replace this helper with a simple derivation helper:
-function readUserEnvFile() { - const userCwd = process.cwd(); - const envPath = path.join(userCwd, '.env'); - - if (!fs.existsSync(envPath)) { - return {}; - } - - const envContent = fs.readFileSync(envPath, 'utf8'); - const envVars = {}; - - envContent.split('\n').forEach(line => { - const trimmed = line.trim(); - if (trimmed && !trimmed.startsWith('#')) { - const [key, ...valueParts] = trimmed.split('='); - if (key && valueParts.length > 0) { - const value = valueParts.join('=').replace(/^["']|["']$/g, ''); - envVars[key.trim()] = value.trim(); - } - } - }); - - return envVars; -} +function deriveSourceFromProcessEnv() { + const { CTA_VERSION, CTA_FRAMEWORK, CTA_FRAMEWORK_VERSION } = process.env; + const parts = []; + if (CTA_VERSION) parts.push(`v${CTA_VERSION}`); + if (CTA_FRAMEWORK) parts.push(CTA_FRAMEWORK); + if (CTA_FRAMEWORK_VERSION) parts.push(`fv${CTA_FRAMEWORK_VERSION}`); + return parts.length ? parts.join("-") : undefined; +}
302-313: DRY analytics: centralize "attach source if defined".
The same "conditionally add source" payload logic repeats across 5 blocks. Extract a helper and use it here to reduce duplication and missed cases.
Add near the top (after the analytics import):
+function captureWithSource(event, props, maybeSource) {
+  const payload = maybeSource ? { ...props, source: maybeSource } : props;
+  return analytics.capture(event, payload);
+}
Then refactor these blocks:
Interactive region selection:
- const analyticsProps = { - command: CLI_NAME, - region: region, - "selection-method": "interactive", - }; - - if (source) { - analyticsProps.source = source; - } - - await analytics.capture("create_db:region_selected", analyticsProps); + await captureWithSource( + "create_db:region_selected", + { command: CLI_NAME, region, "selection-method": "interactive" }, + source + );Rate-limit error:
- const analyticsProps = { - command: CLI_NAME, - region: region, - "error-type": "rate_limit", - "status-code": 429, - }; - - if (source) { - analyticsProps.source = source; - } - - await analytics.capture("create_db:database_creation_failed", analyticsProps); + await captureWithSource( + "create_db:database_creation_failed", + { command: CLI_NAME, region, "error-type": "rate_limit", "status-code": 429 }, + source + );Invalid JSON error:
- const analyticsProps = { - command: CLI_NAME, - region, - "error-type": "invalid_json", - "status-code": resp.status, - }; - - if (source) { - analyticsProps.source = source; - } - - await analytics.capture("create_db:database_creation_failed", analyticsProps); + await captureWithSource( + "create_db:database_creation_failed", + { command: CLI_NAME, region, "error-type": "invalid_json", "status-code": resp.status }, + source + );API error:
- const analyticsProps = { - command: CLI_NAME, - region: region, - "error-type": "api_error", - "error-message": result.error.message, - }; - - if (source) { - analyticsProps.source = source; - } - - await analytics.capture("create_db:database_creation_failed", analyticsProps); + await captureWithSource( + "create_db:database_creation_failed", + { command: CLI_NAME, region, "error-type": "api_error", "error-message": result.error.message }, + source + );Region selected via flag:
- const analyticsProps = { - command: CLI_NAME, - region: region, - "selection-method": "flag", - }; - - if (source) { - analyticsProps.source = source; - } - - await analytics.capture("create_db:region_selected", analyticsProps); + await captureWithSource( + "create_db:region_selected", + { command: CLI_NAME, region, "selection-method": "flag" }, + source + );Also applies to: 348-360, 383-395, 453-465, 590-601
523-541: Duplicate source derivation/validation; compute once via process.env and validate once.
You compute source from .env twice and perform two validations. Collapse into a single derivation from process.env (already populated by dotenv.config()), then validate once.
Derivation:
- let source; - if (flags.source) { - const userEnvVars = readUserEnvFile(); - const userCwd = process.cwd(); - const envPath = path.join(userCwd, '.env'); - - if (fs.existsSync(envPath)) { - const ctaVars = []; - if (userEnvVars.CTA_VERSION) ctaVars.push(`v${userEnvVars.CTA_VERSION}`); - if (userEnvVars.CTA_FRAMEWORK) ctaVars.push(userEnvVars.CTA_FRAMEWORK); - if (userEnvVars.CTA_FRAMEWORK_VERSION) ctaVars.push(`fv${userEnvVars.CTA_FRAMEWORK_VERSION}`); - - if (ctaVars.length > 0) { - source = ctaVars.join('-'); - } - } - } + let source = undefined; + if (flags.source) { + source = deriveSourceFromProcessEnv(); + }Validation:
- if (flags.source) { - const userCwd = process.cwd(); - const envPath = path.join(userCwd, '.env'); - - if (!fs.existsSync(envPath)) { - console.error(chalk.red("Error: Source not configured correctly.")); - process.exit(1); - } - - const userEnvVars = readUserEnvFile(); - const ctaVars = []; - if (userEnvVars.CTA_VERSION) ctaVars.push(`v${userEnvVars.CTA_VERSION}`); - if (userEnvVars.CTA_FRAMEWORK) ctaVars.push(userEnvVars.CTA_FRAMEWORK); - if (userEnvVars.CTA_FRAMEWORK_VERSION) ctaVars.push(`fv${userEnvVars.CTA_FRAMEWORK_VERSION}`); - - if (ctaVars.length === 0) { - console.error(chalk.red("Error: Source not configured correctly.")); - process.exit(1); - } - } + if (flags.source && !source) { + console.error( + chalk.red( + "Error: Source not configured correctly. Expected CTA_VERSION, CTA_FRAMEWORK and/or CTA_FRAMEWORK_VERSION in .env." + ) + ); + process.exit(1); + }Also applies to: 604-623
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (1)
create-db/index.js
(18 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
analytics
(48-48)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (4)
create-db/index.js (4)
129-137: Flag plumbing for --source looks good. Allowed flags and shorthand mapping are correctly extended.
277-277: Signature change to accept source is appropriate. Passing source into the region prompt enables consistent analytics enrichment across flows.
435-435: JSON output includes source context — good. Returning source (or null) in the JSON response is helpful for end-to-end validation in CI and downstream tooling.
632-636: Propagation of source through interactive/JSON and create calls looks consistent. Passing source to promptForRegion and createDatabase in all paths ensures consistent analytics and request attribution.
Also applies to: 660-662, 666-666
✅ Preview CLIs & Workers are live! Test the CLIs locally under the pr45 tag:
npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45
Worker URLs
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
create-db/index.js (2)
310-321: Preserve UTM segmentation when source is absent.
Requests previously used CLI_NAME for utm_source; with this change, the key disappears when source is undefined. If the backend expects or benefits from a default, fall back to CLI_NAME.
- body: JSON.stringify({ region, name, utm_source: source }),
+ body: JSON.stringify({ region, name, utm_source: source || CLI_NAME }),
524-547: DRY analytics: reuse captureWithSource; add "has-source-flag" if you ship the flag.
Replace manual payload building and optional source mutation with the helper; also consider tracking has-source-flag separately from has-source-from-env.
- const analyticsProps = { - command: CLI_NAME, - "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(), - "has-region-flag": - rawArgs.includes("--region") || rawArgs.includes("-r"), - "has-interactive-flag": - rawArgs.includes("--interactive") || rawArgs.includes("-i"), - "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"), - "has-list-regions-flag": rawArgs.includes("--list-regions"), - "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"), - "has-source-from-env": !!source, - "node-version": process.version, - platform: process.platform, - arch: process.arch, - }; - - if (source) { - analyticsProps.source = source; - } - - await analytics.capture("create_db:cli_command_ran", analyticsProps); + await captureWithSource( + "create_db:cli_command_ran", + { + command: CLI_NAME, + "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(), + "has-region-flag": rawArgs.includes("--region") || rawArgs.includes("-r"), + "has-interactive-flag": rawArgs.includes("--interactive") || rawArgs.includes("-i"), + "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"), + "has-list-regions-flag": rawArgs.includes("--list-regions"), + "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"), + "has-source-from-env": !!source && !rawArgs.includes("--source") && !rawArgs.includes("-s"), + "has-source-flag": rawArgs.includes("--source") || rawArgs.includes("-s"), + "node-version": process.version, + platform: process.platform, + arch: process.arch, + }, + source + );
♻️ Duplicate comments (8)
create-db/index.js (8)
166-177: Bug: help short-flag path doesn't await showHelp() or return.
The single short-flag branch should mirror the combined short-flags branch to avoid falling through.
if (shorthandMap[short]) { const mappedFlag = shorthandMap[short]; - if (mappedFlag === "help") showHelp(); + if (mappedFlag === "help") { await showHelp(); return; } if (mappedFlag === "region") { const region = args[i + 1]; if (!region || region.startsWith("-")) exitWithError("Missing value for -r flag."); flags.region = region; i++; } else { flags[mappedFlag] = true; } }
269-305: DRY analytics capture; introduce a captureWithSource helper.
You repeat the "attach source if defined" pattern. Centralize it to avoid drift.
try { - const analyticsProps = { - command: CLI_NAME, - region: region, - "selection-method": "interactive", - }; - - if (source) { - analyticsProps.source = source; - } - - await analytics.capture("create_db:region_selected", analyticsProps); + await captureWithSource( + "create_db:region_selected", + { command: CLI_NAME, region, "selection-method": "interactive" }, + source + ); } catch (error) {}Add once near the top (after the analytics import):
function captureWithSource(event, props, maybeSource) {
  const payload = maybeSource ? { ...props, source: maybeSource } : props;
  return analytics.capture(event, payload);
}
4-5: Use node: protocol for built-ins.
Prefer node:fs and node:path to match Node's guidance and avoid resolution ambiguity.
-import fs from "fs";
-import path from "path";
+import fs from "node:fs";
+import path from "node:path";
62-85: Don't hand-roll .env parsing; use dotenv.parse or process.env.
The custom parser mishandles quotes, CRLF, inline comments, export prefixes, and values containing '='. Since dotenv.config() is already called, prefer process.env; minimally, swap in dotenv.parse.
Minimal, safer fix for this helper:
function readUserEnvFile() { - const userCwd = process.cwd(); - const envPath = path.join(userCwd, '.env'); - - if (!fs.existsSync(envPath)) { - return {}; - } - - const envContent = fs.readFileSync(envPath, 'utf8'); - const envVars = {}; - - envContent.split('\n').forEach(line => { - const trimmed = line.trim(); - if (trimmed && !trimmed.startsWith('#')) { - const [key, ...valueParts] = trimmed.split('='); - if (key && valueParts.length > 0) { - const value = valueParts.join('=').replace(/^["']|["']$/g, ''); - envVars[key.trim()] = value.trim(); - } - } - }); - - return envVars; + const envPath = path.join(process.cwd(), ".env"); + if (!fs.existsSync(envPath)) return {}; + return dotenv.parse(fs.readFileSync(envPath, "utf8")); }Better: delete this helper and read from process.env everywhere (populated by dotenv.config()).
340-352: Repeat pattern: use the captureWithSource helper (rate-limit path).
Apply the helper to keep payload construction consistent.
- const analyticsProps = { - command: CLI_NAME, - region: region, - "error-type": "rate_limit", - "status-code": 429, - }; - - if (source) { - analyticsProps.source = source; - } - - await analytics.capture("create_db:database_creation_failed", analyticsProps); + await captureWithSource( + "create_db:database_creation_failed", + { command: CLI_NAME, region, "error-type": "rate_limit", "status-code": 429 }, + source + );
375-387
: Repeat pattern: use captureWithSource helper (invalid JSON path).- const analyticsProps = { - command: CLI_NAME, - region, - "error-type": "invalid_json", - "status-code": resp.status, - }; - - if (source) { - analyticsProps.source = source; - } - - await analytics.capture("create_db:database_creation_failed", analyticsProps); + await captureWithSource( + "create_db:database_creation_failed", + { command: CLI_NAME, region, "error-type": "invalid_json", "status-code": resp.status }, + source + );
570-581
: Repeat pattern: use captureWithSource helper (region flag path).- const analyticsProps = { - command: CLI_NAME, - region: region, - "selection-method": "flag", - }; - - if (source) { - analyticsProps.source = source; - } - - await analytics.capture("create_db:region_selected", analyticsProps); + await captureWithSource( + "create_db:region_selected", + { command: CLI_NAME, region, "selection-method": "flag" }, + source + );
414-414: Bug: claim URL produces "utm_source=undefined" when source is missing.
This pollutes analytics and produces a literal "undefined" in links. Fall back to CLI_NAME and ensure proper encoding.
- const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${source}&utm_medium=cli`;
+ const utmSource = encodeURIComponent(source || CLI_NAME);
+ const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${utmSource}&utm_medium=cli`;
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (1)
create-db/index.js
(14 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
analytics
(48-48)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: create-db-worker
- GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (2)
create-db/index.js (2)
426-427: LGTM: include source in JSON output. Returning source (or null) in --json aligns with observability/traceability goals.
594-599: LGTM: threaded source through interactive/JSON and default flows. Passing source into promptForRegion and createDatabase maintains consistent tracing across modes.
Also applies to: 622-627
✅ Preview CLIs & Workers are live! Test the CLIs locally under the pr45 tag:
npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45
Worker URLs
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
create-db/index.js (1)
310-321: Preserve default attribution when the env-derived source is absent.
When userAgent is undefined, utm_source drops from the payload entirely. If you previously attributed to the CLI, you can keep continuity with a fallback.
- body: JSON.stringify({ region, name, utm_source: userAgent }),
+ body: JSON.stringify({ region, name, utm_source: userAgent || CLI_NAME }),
♻️ Duplicate comments (9)
create-db/index.js (9)
269-305: DRY analytics: factor "attach source if present" into a helper; also standardize the field name.
This pattern appears multiple times. Extract a helper to avoid drift and use a consistent key (source) instead of user-agent
(which can be confused with the HTTP header).- try { - const analyticsProps = { - command: CLI_NAME, - region: region, - "selection-method": "interactive", - }; - - if (userAgent) { - analyticsProps["user-agent"] = userAgent; - } - - await analytics.capture("create_db:region_selected", analyticsProps); - } catch (error) {} + try { + await captureWithSource( + "create_db:region_selected", + { command: CLI_NAME, region, "selection-method": "interactive" }, + userAgent + ); + } catch {}Add near the top-level (after the analytics import):
function captureWithSource(event, props, maybeSource) {
  const payload = maybeSource ? { ...props, source: maybeSource } : props;
  return analytics.capture(event, payload);
}
538-561: Analytics props: compute "has-*" from parsed flags; keep analytics failures silent for users.
- Use flags instead of scanning rawArgs to avoid missing combined short flags and quoting edge cases.
- Avoid logging analytics errors to stdout/stderr except in development, to not confuse CLI users.
- const analyticsProps = { + const analyticsProps = { command: CLI_NAME, "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(), - "has-region-flag": - rawArgs.includes("--region") || rawArgs.includes("-r"), - "has-interactive-flag": - rawArgs.includes("--interactive") || rawArgs.includes("-i"), - "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"), - "has-list-regions-flag": rawArgs.includes("--list-regions"), - "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"), + "has-region-flag": Boolean(flags.region), + "has-interactive-flag": Boolean(flags.interactive), + "has-help-flag": Boolean(flags.help), + "has-list-regions-flag": Boolean(flags["list-regions"]), + "has-json-flag": Boolean(flags.json), "has-source-from-env": !!userAgent, "node-version": process.version, platform: process.platform, arch: process.arch, }; @@ - await analytics.capture("create_db:cli_command_ran", analyticsProps); - } catch (error) { - console.error("Error:", error.message); - } + await analytics.capture("create_db:cli_command_ran", analyticsProps); + } catch (error) { + if (process.env.NODE_ENV === "development") { + console.error("Analytics error:", error.message); + } + }
4-5: Use node: protocol for built-ins (or drop imports if the custom .env parser is removed).
Follow Node guidance and avoid resolution ambiguity by using node: specifiers.
-import fs from "fs";
-import path from "path";
+import fs from "node:fs";
+import path from "node:path";
62-85: Don't hand-roll a .env parser; rely on dotenv/process.env and centralize source derivation.
This parser mishandles edge cases (quotes, CRLF, inline comments, export, multiline). You already call dotenv.config(); read from process.env and delete this function. Also, add a tiny helper to compute the source once.
Replace this block with:
-function readUserEnvFile() { - const userCwd = process.cwd(); - const envPath = path.join(userCwd, ".env"); - - if (!fs.existsSync(envPath)) { - return {}; - } - - const envContent = fs.readFileSync(envPath, "utf8"); - const envVars = {}; - - envContent.split("\n").forEach((line) => { - const trimmed = line.trim(); - if (trimmed && !trimmed.startsWith("#")) { - const [key, ...valueParts] = trimmed.split("="); - if (key && valueParts.length > 0) { - const value = valueParts.join("=").replace(/^["']|["']$/g, ""); - envVars[key.trim()] = value.trim(); - } - } - }); - - return envVars; -} +function getSourceFromEnv() { + const { PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT } = process.env; + return PRISMA_ACTOR_NAME && PRISMA_ACTOR_PROJECT + ? `${PRISMA_ACTOR_NAME}/${PRISMA_ACTOR_PROJECT}` + : undefined; +}
339-355: DRY analytics for the rate-limit error path.
Use the same captureWithSource
helper here.- try { - const analyticsProps = { - command: CLI_NAME, - region: region, - "error-type": "rate_limit", - "status-code": 429, - }; - - if (userAgent) { - analyticsProps["user-agent"] = userAgent; - } - - await analytics.capture( - "create_db:database_creation_failed", - analyticsProps - ); - } catch (error) {} + try { + await captureWithSource( + "create_db:database_creation_failed", + { command: CLI_NAME, region, "error-type": "rate_limit", "status-code": 429 }, + userAgent + ); + } catch {}
378-393: DRY analytics for the invalid-JSON error path.
Same duplication; use the helper.
- try { - const analyticsProps = { - command: CLI_NAME, - region, - "error-type": "invalid_json", - "status-code": resp.status, - }; - - if (userAgent) { - analyticsProps["user-agent"] = userAgent; - } - - await analytics.capture( - "create_db:database_creation_failed", - analyticsProps - ); - } catch {} + try { + await captureWithSource( + "create_db:database_creation_failed", + { command: CLI_NAME, region, "error-type": "invalid_json", "status-code": resp.status }, + userAgent + ); + } catch {}
456-471: DRY analytics for the API error path.
Replace manual prop building with the shared helper.
- try { - const analyticsProps = { - command: CLI_NAME, - region: region, - "error-type": "api_error", - "error-message": result.error.message, - }; - - if (userAgent) { - analyticsProps["user-agent"] = userAgent; - } - - await analytics.capture( - "create_db:database_creation_failed", - analyticsProps - ); - } catch (error) {} + try { + await captureWithSource( + "create_db:database_creation_failed", + { command: CLI_NAME, region, "error-type": "api_error", "error-message": result.error.message }, + userAgent + ); + } catch {}
529-536: Derive userAgent from process.env (or flags), not by re-reading .env. Also: PR title vs code mismatch.
- Replace readUserEnvFile() usage with a one-liner reading process.env
- let userAgent; - const userEnvVars = readUserEnvFile(); - if (userEnvVars.PRISMA_ACTOR_NAME && userEnvVars.PRISMA_ACTOR_PROJECT) { - userAgent = `${userEnvVars.PRISMA_ACTOR_NAME}/${userEnvVars.PRISMA_ACTOR_PROJECT}`; - } + // Flag (if implemented) should override ENV; otherwise pull from ENV. + let userAgent = /* flags.source ?? */ getSourceFromEnv();Would you like a follow-up patch to add
--source/-s
(help text, allowed flags, and parsing) so it overrides the env-derived value?
584-595: DRY analytics for the region-flag path.
Use the shared helper for consistency with the interactive path.
- try { - const analyticsProps = { - command: CLI_NAME, - region: region, - "selection-method": "flag", - }; - - if (userAgent) { - analyticsProps["user-agent"] = userAgent; - } - - await analytics.capture("create_db:region_selected", analyticsProps); - } catch (error) {} + try { + await captureWithSource( + "create_db:region_selected", + { command: CLI_NAME, region, "selection-method": "flag" }, + userAgent + ); + } catch {}
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (2)
create-db/analytics.js (1 hunks)
create-db/index.js (15 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
analytics
(50-50)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (3)
create-db/index.js (3)
424-439: JSON output: field name is consistent and optional — LGTM. Including source only when provided avoids changing the default contract for existing consumers.
605-610
: PropagatinguserAgent
through the JSON path — LGTM.Passing the source into
promptForRegion
andcreateDatabase
keeps attribution consistent in non-interactive mode.
634-639
: PropagatinguserAgent
through the interactive path — LGTM.Attribution remains consistent across flows.
✅ Preview CLIs & Workers are live! Test the CLIs locally under the pr45 tag:
npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45
Worker URLs
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (4)
create-db/analytics.js (4)
24-31: Bound the network call with a short timeout and make it fire-and-forget.
A hung analytics POST can stall CLI UX. Add an AbortController timeout (configurable) and mark the request keepalive.
Apply this diff:
- try { - const response = await fetch(POSTHOG_CAPTURE_URL, { + try { + const controller = new AbortController(); + const timeoutMs = Number(process.env.ANALYTICS_TIMEOUT_MS ?? 2000); + const timeout = setTimeout(() => controller.abort(), timeoutMs); + const response = await fetch(POSTHOG_CAPTURE_URL, { method: "POST", headers: { "Content-Type": "application/json", }, - body: JSON.stringify(payload), + body: JSON.stringify(payload), + keepalive: true, + signal: controller.signal, }); + clearTimeout(timeout);
33-35: Include the HTTP status code for easier debugging.
Status text can be empty; include the numeric code for clarity.
Apply this diff:
- if (!response.ok) { - throw new EventCaptureError(eventName, response.statusText); - } + if (!response.ok) { + throw new EventCaptureError( + eventName, + `${response.status} ${response.statusText || "Unknown"}` + ); + }
17-18: Avoid a per-event random distinct_id; prefer a stable, privacy-safe identifier.
A new randomUUID per event prevents session/user-level aggregation. Consider a stable distinct_id (e.g., a persisted machine/session ID, or an env-provided userAgent/source when privacy permits). Keep $process_person_profile: false to avoid PII.
I can draft a tiny helper that persists a UUID at ~/.config/prisma/create-db/machine-id (or respects XDG) and falls back to random when unwritable—say the word. A rough sketch of what such a helper could look like follows.
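A minimal sketch of that stable-ID helper, assuming an XDG-style config path; the file location and fallback behavior are illustrative, not something the project currently ships:

```js
// Sketch only: persist one UUID per machine so analytics events correlate,
// falling back to a per-run random ID when the config dir is unwritable.
import { randomUUID } from "node:crypto";
import fs from "node:fs";
import os from "node:os";
import path from "node:path";

function getStableDistinctId() {
  const configBase =
    process.env.XDG_CONFIG_HOME || path.join(os.homedir(), ".config");
  const idFile = path.join(configBase, "prisma", "create-db", "machine-id");
  try {
    // Reuse the previously persisted ID when it exists.
    return fs.readFileSync(idFile, "utf8").trim();
  } catch {
    const id = randomUUID();
    try {
      fs.mkdirSync(path.dirname(idFile), { recursive: true });
      fs.writeFileSync(idFile, id, "utf8");
    } catch {
      // Unwritable config dir: keep the random ID for this run only.
    }
    return id;
  }
}
```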
37-41: Optional: add a targeted debug toggle.
Logging only in NODE_ENV=development is fine. If you want opt-in visibility on CI without flipping NODE_ENV, consider also honoring DEBUG=create-db:analytics.
Proposed tweak inside catch:
if (
  process.env.NODE_ENV === "development" ||
  process.env.DEBUG === "create-db:analytics"
) {
  console.error("Analytics error:", error.message);
}
♻️ Duplicate comments (1)
create-db/analytics.js (1)
12-12: Nice: hardcoded PostHog key removed. This addresses the prior review about shipping a baked-in API key. Good change.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (1)
create-db/analytics.js
(1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
✅ Preview CLIs & Workers are live! Test the CLIs locally under the pr45 tag:
npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45
Worker URLs
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
create-db/analytics.js (3)
24-31: Allow callers to provide a stable distinct_id to improve event correlation.
Right now every event gets a fresh UUID, which fragments user sessions. Prefer a caller-provided distinct_id when available (e.g., a persisted CLI install ID), falling back to a UUID.
Apply this diff:
- distinct_id: randomUUID(), + // Prefer caller-provided distinct_id for stable correlation across events + distinct_id: properties?.distinct_id ?? randomUUID(),
33-50: Add a short timeout via AbortController to avoid hanging on network stalls.
Without a timeout, fetch can hang indefinitely and delay CLI exit. Abort after a few seconds; still fail silently outside development.
Apply this diff:
- try { - const response = await fetch(POSTHOG_CAPTURE_URL, { + const controller = new AbortController(); + const timeoutId = setTimeout(() => controller.abort(), 5000); + try { + const response = await fetch(POSTHOG_CAPTURE_URL, { method: "POST", headers: { "Content-Type": "application/json", }, - body: JSON.stringify(payload), + body: JSON.stringify(payload), + signal: controller.signal, }); if (!response.ok) { throw new EventCaptureError(eventName, response.statusText); } - } catch (error) { + } catch (error) { // Silently fail analytics to not disrupt user experience if (process.env.NODE_ENV === "development") { console.error("Analytics error:", error.message); } - } + } finally { + clearTimeout(timeoutId); + }
42-44: Include the numeric status code; statusText can be empty in Node fetch.
This improves diagnostics while keeping behavior unchanged.
Apply this diff:
- throw new EventCaptureError(eventName, response.statusText); + const text = response.statusText || "Unknown Status"; + throw new EventCaptureError( + eventName, + `${response.status} ${text}`.trim() + );
♻️ Duplicate comments (1)
create-db/analytics.js (1)
11-21: Resolved: analytics now fail closed without a baked-in key; thanks for addressing the prior concerns. You now gate on both POSTHOG_API_HOST and POSTHOG_API_KEY and early-return, with a dev-only warning. This removes the committed default key and avoids posting to undefined/capture. Looks good.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (1)
create-db/analytics.js
(1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: create-db-worker
- GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (1)
create-db/analytics.js (1)
34-34: Confirm the Node runtime guarantees global fetch; otherwise polyfill or guard.
If the CLI runs on Node < 18, global fetch is undefined. Either enforce engines >= 18 or add a lazy import/polyfill.
Would you verify the repo's engines.node and runtime target? If engines < 18 or unspecified, I can add a tiny guard like:
- const response = await fetch(POSTHOG_CAPTURE_URL, { + const _fetch = globalThis.fetch ?? (await import("node-fetch")).default; + const response = await _fetch(POSTHOG_CAPTURE_URL, {
✅ Preview CLIs & Workers are live! Test the CLIs locally under the pr45 tag:
npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45
Worker URLs
Actionable comments posted: 4
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (5)
create-db/analytics.js (2)
3-6: Include the HTTP status code in EventCaptureError for easier debugging.
statusText can be empty; include the numeric status as well.
-class EventCaptureError extends Error { - constructor(event, status) { - super(`Failed to submit PostHog event '${event}': ${status}`); - } -} +class EventCaptureError extends Error { + constructor(event, statusCode, statusText) { + const detail = [statusCode, statusText].filter(Boolean).join(" "); + super(`Failed to submit PostHog event '${event}': ${detail}`); + } +} @@ - if (!response.ok) { - throw new EventCaptureError(eventName, response.statusText); - } + if (!response.ok) { + throw new EventCaptureError(eventName, response.status, response.statusText); + }Also applies to: 49-51
1-1: Use the node: protocol import for built-in crypto (consistency with Node guidance).
-import { randomUUID } from "crypto";
+import { randomUUID } from "node:crypto";
create-db/index.js (3)
416-431: JSON output: expose a stable 'source' field alongside userAgent (non-breaking).
Downstream tools likely look for "utm_source"/"source". Keep userAgent for compatibility; add a duplicate source field.
- if (userAgent) {
-   jsonResponse.userAgent = userAgent;
- }
+ if (userAgent) {
+   jsonResponse.userAgent = userAgent; // existing
+   jsonResponse.source = userAgent;    // new, explicit
+ }
526-547
: Analytics flags: use parsed flags instead of scanning rawArgs; add has-source-flag.- const analyticsProps = { + const analyticsProps = { command: CLI_NAME, - "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(), - "has-region-flag": - rawArgs.includes("--region") || rawArgs.includes("-r"), - "has-interactive-flag": - rawArgs.includes("--interactive") || rawArgs.includes("-i"), - "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"), - "has-list-regions-flag": rawArgs.includes("--list-regions"), - "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"), + "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(), + "has-region-flag": Boolean(flags.region), + "has-interactive-flag": Boolean(flags.interactive), + "has-help-flag": Boolean(flags.help), + "has-list-regions-flag": Boolean(flags["list-regions"]), + "has-json-flag": Boolean(flags.json), + "has-source-flag": Boolean(flags.source), "has-user-agent-from-env": !!userAgent, "node-version": process.version, platform: process.platform, arch: process.arch, "user-agent": userAgent, };
167-176: Help short-flag branch should await showHelp() and return.
The long-flag and combined-short branches await and return; keep behavior consistent here to avoid falling through.
- if (mappedFlag === "help") showHelp();
+ if (mappedFlag === "help") { await showHelp(); return; }
♻️ Duplicate comments (3)
create-db/analytics.js (1)
11-12
: Normalize and validate POSTHOG_API_HOST (trim + ensure scheme) to prevent malformed URLs.If POSTHOG_API_HOST lacks a scheme or has trailing/leading whitespace, fetch will throw (e.g., "app.posthog.com/capture"). Normalize before use.
Apply:
- const POSTHOG_API_HOST = process.env.POSTHOG_API_HOST; - const POSTHOG_KEY = process.env.POSTHOG_API_KEY; + const POSTHOG_API_HOST_RAW = process.env.POSTHOG_API_HOST; + const POSTHOG_KEY_RAW = process.env.POSTHOG_API_KEY; + const POSTHOG_API_HOST = POSTHOG_API_HOST_RAW?.trim(); + const POSTHOG_KEY = POSTHOG_KEY_RAW?.trim(); @@ - const POSTHOG_CAPTURE_URL = `${POSTHOG_API_HOST.replace(/\/+$/, "")}/capture`; + const hostWithScheme = POSTHOG_API_HOST.startsWith("http") + ? POSTHOG_API_HOST + : `https://${POSTHOG_API_HOST}`; + const POSTHOG_CAPTURE_URL = `${hostWithScheme.replace(/\/+$/, "")}/capture`;Also applies to: 28-28
create-db/index.js (2)
4-6: Prefer the node: protocol for built-ins, or drop these imports if the custom .env reader is removed.
-import fs from "fs";
-import path from "path";
+import fs from "node:fs";
+import path from "node:path";
Note: If you remove readUserEnvFile (see below), these imports can be deleted entirely.
412-412: Fix bug: claim URL uses the raw userAgent and can emit "utm_source=undefined"; also URL-encode.
- const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${userAgent}&utm_medium=cli`;
+ const utmSource = encodeURIComponent(userAgent || CLI_NAME);
+ const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${utmSource}&utm_medium=cli`;
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (2)
create-db/analytics.js (1 hunks)
create-db/index.js (15 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (2)
create-db/analytics.js (1)
14-26: Fail-closed gating looks good; removes accidental analytics when the env is missing. Early-returning when POSTHOG_API_HOST/POSTHOG_API_KEY are absent (and warning only in development) is the right call and aligns with least-privilege defaults.
create-db/index.js (1)
588-596: Flow check: promptForRegion + createDatabase with the userAgent parameter work end-to-end. Looks correct after the signature change; JSON and interactive paths pass the value through.
Also applies to: 620-626
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (8)
.github/workflows/preview.yml (4)
86-93: Avoid referencing outputs from commented-out steps; add robust fallbacks.
steps.deploy-db/steps.deploy-claim don't exist (those steps are commented out), so the expressions will resolve to null. Make the fallback explicit in bash so this step works whether or not the preview deploys run.
- CREATE_DB_WORKER_URL="${{ steps.deploy-db.outputs.deployment-url || secrets.CREATE_DB_WORKER_URL }}" - CLAIM_DB_WORKER_URL="${{ steps.deploy-claim.outputs.deployment-url || secrets.CLAIM_DB_WORKER_URL }}" + CREATE_DB_WORKER_URL="${{ steps.deploy-db.outputs.deployment-url }}" + CLAIM_DB_WORKER_URL="${{ steps.deploy-claim.outputs.deployment-url }}" + # Explicit fallbacks when outputs are empty or undefined + : "${CREATE_DB_WORKER_URL:=${{ secrets.CREATE_DB_WORKER_URL }}}" + : "${CLAIM_DB_WORKER_URL:=${{ secrets.CLAIM_DB_WORKER_URL }}}"
117-119: Preview comment may show empty Worker URLs; add the same fallback used above.
If the deploy steps are skipped, these envs will be empty and your comment renders "undefined". Use the resolved envs or apply an expression fallback.
- CREATE_DB_WORKER_URL: ${{ steps.deploy-db.outputs.deployment-url }} - CLAIM_DB_WORKER_URL: ${{ steps.deploy-claim.outputs.deployment-url }} + CREATE_DB_WORKER_URL: ${{ steps.deploy-db.outputs.deployment-url || env.CREATE_DB_WORKER_URL || secrets.CREATE_DB_WORKER_URL }} + CLAIM_DB_WORKER_URL: ${{ steps.deploy-claim.outputs.deployment-url || env.CLAIM_DB_WORKER_URL || secrets.CLAIM_DB_WORKER_URL }}
135-139: Typo in the npx instruction: $pr should be pr. This breaks copy/paste for create-postgres.
- npx create-postgres@$pr${{ github.event.number }}
+ npx create-postgres@pr${{ github.event.number }}
1-12: Harden workflow permissions (principle of least privilege).
Default GITHUB_TOKEN permissions are broad. Explicitly scope them for this workflow.
name: Preview deploy all Workers and CLIs on: pull_request: @@ env: # each folder under the repo root that contains one of your CLIs WORKSPACES: create-db create-pg create-postgres +permissions: + contents: read + pull-requests: write # needed for comment + packages: write # needed for npm publish with GITHUB_TOKEN if used anywhere.github/workflows/release.yml (3)
1-9: Restrict GITHUB_TOKEN permissions for releases.
Add the explicit permissions needed for publishing and PR creation.
name: Release CLIs @@ on: workflow_dispatch: push: branches: - main +permissions: + contents: write # commit/version and create PR + pull-requests: write # changesets/action + packages: write # npm publish if using GITHUB_TOKEN with GitHub Packages
76-82
: Avoid generating fallback changesets in your CI release workflow. Injecting a default, empty changeset during the
release.yml
run can lead to:
- Polluted commit history with synthetic “auto-generated changeset” files.
- An infinite loop:
- Workflow sees no
.changeset
files → creates a default onechangesets/action@v1
runspnpm changeset version
, which removes all.changeset
files- The next push to
main
(from the version‐bump commit or merged PR) retriggers the workflow → back to step 1To prevent this and align with best practices, author changesets in feature PRs before merging to
main
; the release workflow should only consume them.Suggested refactor (optional):
• Remove or comment out the fallback block in
.github/workflows/release.yml
(around lines 76–82):- - name: 📝 Ensure Changeset Exists - run: | - if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then - echo "No changeset found. Creating a default one..." - pnpm changeset add --empty --message "chore(release): auto-generated changeset" - fi• As an alternative, if you still need a one-time “first release” fallback, guard it so it only fires when no prior tags/releases exist (e.g. detect the first version bump).
Please verify:
- That you haven’t already observed “auto-generated changeset” commits or churn in your Git history.
- The exact behavior of
changesets/action@v1
in your pipeline (does it open a PR or push directly tomain
?).- That removing this block won’t break any existing release flows you rely on.
56-63
: Remove manual version bumps from the release workflowThe CI currently runs
npm version patch --no-git-tag-version
on publish failures, which can desynchronize the actual published package versions from what Changesets tracks. Since this repo already uses Changesets to drive all versioning ("version": "changeset version"
in package.json, and thechangesets/action@v1
step in the release workflow), it’s best to let Changesets be the single source of truth for version bumps.Points of attention:
- File:
.github/workflows/release.yml
- Lines: ~56–63 (the
if ! pnpm publish
block)Suggested refactor options:
Option A – fail fast on publish errors
- if ! pnpm publish --access public; then - echo "Publish failed, trying to bump version and retry..." - npm version patch --no-git-tag-version - pnpm publish --access public || echo "Publish failed again for $pkg" - fi + pnpm publish --access public || exit 1Option B – delegate all versioning to Changesets
• Remove the entire fallback block and rely on the existing Changesets steps (pnpm changeset version
+changesets/action@v1
) to handle version bumps and tagging.create-db/package.json (1)
29-35
: Document missing PRISMA_ACTOR env vars & review dotenv side effects. The POSTHOG_API_KEY/POSTHOG_API_HOST variables are already documented in README.md, but PRISMA_ACTOR_NAME and PRISMA_ACTOR_PROJECT (used in index.js) are not. Also, while top-level
dotenv.config()
is fine for a CLI-only module, you may want to defer it to avoid import-time side effects if this package is ever consumed programmatically.• In README.md, under the “Environment variables” (or add a new section), document:
PRISMA_ACTOR_NAME
: name to use in the User-Agent header when persisting databasesPRISMA_ACTOR_PROJECT
: project identifier to use in the User-Agent header
• No changes needed for POSTHOG_API_KEY / POSTHOG_API_HOST docs—they’re present at lines ~139–140 of README.md.
• (Optional) Refactorimport dotenv from "dotenv"; dotenv.config();
so thatdotenv.config()
is called inside the CLI entrypoint (e.g. within amain()
), preventing side effects on import.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (5)
.github/workflows/preview.yml (1 hunks)
.github/workflows/release.yml (1 hunks)
create-db/package.json (1 hunks)
create-pg/package.json (1 hunks)
create-postgres/package.json (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: create-db-worker
- GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (3)
create-postgres/package.json (1)
3-3
: Version bump to 1.0.3: LGTM. No other changes; workspace dependency remains pinned to create-db via workspace:*.
create-db/package.json (1)
3-3
: Version bump to 1.0.3 aligns with PR changes: LGTM. Files array includes analytics.js, which matches the analytics gating updates noted in this PR.
create-pg/package.json (1)
3-3
: Version bump to 1.0.3: LGTM. No functional changes; continues to depend on workspace create-db.
✅ Preview CLIs & Workers are live! Test the CLIs locally under the pr45 tag:
npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45
Worker URLs
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
create-db/index.js (1)
419-435: JSON schema: prefer neutral naming ("source" or "utmSource") over "userAgent"; confirm the output contract.
This field isn't a browser UA and can confuse integrators. Consider renaming for clarity. Also ensure no "undefined" values leak.
- if (userAgent) {
-   jsonResponse.userAgent = userAgent;
- }
+ if (userAgent) {
+   jsonResponse.source = userAgent; // or 'utmSource'
+ }
Would you like me to prepare a follow-up patch that renames analytics keys from "user-agent" to "source" for consistency?
♻️ Duplicate comments (9)
create-db/index.js (9)
167-176: Bug: help short-flag branch doesn't await showHelp() or return — CLI flow continues unintentionally.
This can lead to unexpected execution after printing help.
- if (mappedFlag === "help") showHelp();
+ if (mappedFlag === "help") { await showHelp(); return; }
`270-306`: DRY analytics: use a small helper to attach source/userAgent only when present.

Replace the inline payload construction with a helper to ensure consistency across all capture sites.

```diff
   try {
-    const analyticsProps = {
-      command: CLI_NAME,
-      region: region,
-      "selection-method": "interactive",
-      "user-agent": userAgent,
-    };
-
-    await analytics.capture("create_db:region_selected", analyticsProps);
+    await captureWithSource(
+      "create_db:region_selected",
+      { command: CLI_NAME, region, "selection-method": "interactive" },
+      userAgent
+    );
   } catch (error) {}
```

Add this helper near the top of the file (after importing analytics):

```js
function captureWithSource(event, props, maybeSource) {
  const payload = maybeSource ? { ...props, "user-agent": maybeSource } : props;
  return analytics.capture(event, payload);
}
```
`530-549`: Analytics flags: rely on parsed flags instead of scanning rawArgs.

rawArgs misses combined short flags and inflates maintenance cost. You already have parsed flags here.

```diff
-  const analyticsProps = {
+  const analyticsProps = {
     command: CLI_NAME,
     "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
-    "has-region-flag":
-      rawArgs.includes("--region") || rawArgs.includes("-r"),
-    "has-interactive-flag":
-      rawArgs.includes("--interactive") || rawArgs.includes("-i"),
-    "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"),
-    "has-list-regions-flag": rawArgs.includes("--list-regions"),
-    "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
+    "has-region-flag": Boolean(flags.region),
+    "has-interactive-flag": Boolean(flags.interactive),
+    "has-help-flag": Boolean(flags.help),
+    "has-list-regions-flag": Boolean(flags["list-regions"]),
+    "has-json-flag": Boolean(flags.json),
     "has-user-agent-from-env": !!userAgent,
     "node-version": process.version,
     platform: process.platform,
     arch: process.arch,
     "user-agent": userAgent,
   };
```
`570-582`: DRY: region flag analytics should also use the captureWithSource helper.

Mirror the interactive path refactor.

```diff
-  const analyticsProps = {
-    command: CLI_NAME,
-    region: region,
-    "selection-method": "flag",
-    "user-agent": userAgent,
-  };
-
-  await analytics.capture("create_db:region_selected", analyticsProps);
+  await captureWithSource(
+    "create_db:region_selected",
+    { command: CLI_NAME, region, "selection-method": "flag" },
+    userAgent
+  );
```
`4-5`: Remove unused fs/path imports (or switch to node: specifiers); you shouldn't read .env manually anymore.

Given dotenv.config() is used and we should rely on process.env (see the comment below removing readUserEnvFile), these imports become dead code. If you decide to keep file access for some reason, prefer node: specifiers.

```diff
-import fs from "fs";
-import path from "path";
+// (removed; no longer needed once readUserEnvFile is deleted)
```
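If the file access does need to stay, the node: specifier form would be a small sketch like this (not required by the diff above):

```js
// Explicit node: specifiers make the built-in imports unambiguous.
import fs from "node:fs";
import path from "node:path";
```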
`63-86`: Don't hand-roll a .env parser; delete readUserEnvFile and rely on dotenv.config()/process.env.

The custom parser mishandles edge cases (quotes, whitespace, export, CRLF, comments). dotenv has already hydrated process.env. Remove this function.

```diff
-function readUserEnvFile() {
-  const userCwd = process.cwd();
-  const envPath = path.join(userCwd, ".env");
-
-  if (!fs.existsSync(envPath)) {
-    return {};
-  }
-
-  const envContent = fs.readFileSync(envPath, "utf8");
-  const envVars = {};
-
-  envContent.split("\n").forEach((line) => {
-    const trimmed = line.trim();
-    if (trimmed && !trimmed.startsWith("#")) {
-      const [key, ...valueParts] = trimmed.split("=");
-      if (key && valueParts.length > 0) {
-        const value = valueParts.join("=").replace(/^["']|["']$/g, "");
-        envVars[key.trim()] = value.trim();
-      }
-    }
-  });
-
-  return envVars;
-}
```
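A sketch of the replacement, assuming only the two PRISMA_ACTOR_* values are needed; dotenv handles the quoting, comments, and CRLF line endings that the hand-rolled parser missed:

```js
// readUserEnvFile() is gone: dotenv.config() has already merged the user's
// .env into process.env, so the derivation reads straight from there.
const { PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT } = process.env;
const userAgent =
  PRISMA_ACTOR_NAME && PRISMA_ACTOR_PROJECT
    ? `${PRISMA_ACTOR_NAME}/${PRISMA_ACTOR_PROJECT}`
    : undefined;
```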
`416-416`: Fix: claim URL emits utm_source=undefined and isn't URL-encoded.

Fall back to a default and percent-encode UTM values to avoid polluted analytics and broken links.

```diff
-  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${userAgent}&utm_medium=cli`;
+  const utmSource = encodeURIComponent(userAgent || CLI_NAME);
+  const utmMedium = encodeURIComponent("cli");
+  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${utmSource}&utm_medium=${utmMedium}`;
```
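An alternative sketch using URLSearchParams, which percent-encodes values automatically; CLAIM_DB_WORKER_URL, projectId, userAgent, and CLI_NAME come from the surrounding code, the rest is illustrative:

```js
// URLSearchParams encodes each value, and the || fallback guarantees
// utm_source never serializes as the string "undefined".
const params = new URLSearchParams({
  projectID: projectId,
  utm_source: userAgent || CLI_NAME,
  utm_medium: "cli",
});
const claimUrl = `${CLAIM_DB_WORKER_URL}?${params.toString()}`;
```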
`592-592`: Naming consistency: the variable represents a UTM "source", not a browser "user agent".

Consider renaming the param and variable from userAgent to source for clarity across promptForRegion/createDatabase and call sites. Analytics payload keys can remain "utm_source" (body) and a neutral "source" (event property) to avoid confusion.

```diff
-  region = await promptForRegion(region, userAgent);
+  region = await promptForRegion(region, source);
...
-  const result = await createDatabase(name, region, userAgent, true);
+  const result = await createDatabase(name, region, source, true);
...
-  region = await promptForRegion(region, userAgent);
+  region = await promptForRegion(region, source);
...
-  await createDatabase(name, region, userAgent);
+  await createDatabase(name, region, source);
```

Follow-up: rename function parameters and local variables accordingly.
Also applies to: 596-596, 621-621, 626-626
`518-528`: Single derivation for source/userAgent; don't re-read .env; the PR title claims a "--source flag" but it isn't implemented.

Compute the value once from flags (if implemented), then fall back to process.env (hydrated by dotenv). Implement --source/-s, or update the PR title.

```diff
-  let userAgent;
-  const userEnvVars = readUserEnvFile();
-  if (userEnvVars.PRISMA_ACTOR_NAME && userEnvVars.PRISMA_ACTOR_PROJECT) {
-    userAgent = `${userEnvVars.PRISMA_ACTOR_NAME}/${userEnvVars.PRISMA_ACTOR_PROJECT}`;
-  }
+  // Derive UTM source once: flag overrides env
+  let userAgent = flags.source;
+  if (!userAgent) {
+    const { PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT } = process.env;
+    if (PRISMA_ACTOR_NAME && PRISMA_ACTOR_PROJECT) {
+      userAgent = `${PRISMA_ACTOR_NAME}/${PRISMA_ACTOR_PROJECT}`;
+    }
+  }
```

Outside this hunk, please implement the --source/-s flag and help text:

```diff
 @@ async function parseArgs() {
-  const allowedFlags = [
+  const allowedFlags = [
     "region",
     "help",
     "list-regions",
     "interactive",
     "json",
+    "source",
   ];
   const shorthandMap = {
     r: "region",
     i: "interactive",
     h: "help",
     j: "json",
+    s: "source",
   };
 @@ long-flag branch
+  } else if (flag === "source") {
+    const src = args[i + 1];
+    if (!src || src.startsWith("-"))
+      exitWithError("Missing value for --source flag.");
+    flags.source = src;
+    i++;
 @@ single short flag (-s)
+  } else if (mappedFlag === "source") {
+    const src = args[i + 1];
+    if (!src || src.startsWith("-"))
+      exitWithError("Missing value for -s flag.");
+    flags.source = src;
+    i++;
 @@ combined short flags
+  } else if (mappedFlag === "source") {
+    const src = args[i + 1];
+    if (!src || src.startsWith("-"))
+      exitWithError("Missing value for -s flag.");
+    flags.source = src;
+    i++;
```

And document in help():

```diff
-  ${chalk.yellow("--json, -j")}          Output machine-readable JSON and exit
+  ${chalk.yellow("--source <id>, -s <id>")}  UTM source (overrides env PRISMA_ACTOR_NAME/PRISMA_ACTOR_PROJECT)
+  ${chalk.yellow("--json, -j")}          Output machine-readable JSON and exit
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (1)
- create-db/index.js (15 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
- `analytics` (61-61)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (2)
create-db/index.js (2)
`308-323`: LGTM: create request includes a safe utm_source fallback.

Falling back to CLI_NAME prevents sending undefined to the service. Looks correct.
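For context, a hedged sketch of the pattern being approved; the exact request shape is an assumption since the hunk itself isn't shown here:

```js
// The body always carries a concrete utm_source: the derived source when
// the env vars were present, otherwise the CLI's own name.
const body = {
  name,
  region,
  utm_source: userAgent || CLI_NAME,
};
```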
`413-414`: LGTM: append sslmode=require to the direct connection string.

Good default for secure connections; matches typical managed Postgres recommendations.
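A rough sketch of that kind of append, assuming the service may return a connection string with or without existing query parameters:

```js
// Force TLS on the direct connection string, preserving any existing params.
const directUrl = connectionString.includes("?")
  ? `${connectionString}&sslmode=require`
  : `${connectionString}?sslmode=require`;
```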
Summary by CodeRabbit
New Features
UX Improvements
Chores