Conversation

aidankmcalister
Member

@aidankmcalister aidankmcalister commented Aug 21, 2025

Summary by CodeRabbit

  • New Features

    • Database creation now derives a user-agent/source from project env and includes it as utm_source in requests, claim URLs, and returned JSON.
  • UX Improvements

    • Derived user-agent is propagated across interactive, flag, and JSON flows for consistent attribution.
    • Analytics events include the derived user-agent; analytics failures are surfaced in development for troubleshooting.
  • Chores

    • Analytics now require explicit host and API key (no fallbacks).
    • CI updated to use pnpm action v4; package versions bumped to 1.0.3.


cloudflare-workers-and-pages bot commented Aug 21, 2025

Deploying with Cloudflare Workers

The latest updates on your project. Learn more about integrating Git with Workers.

| Status | Name | Latest Commit | Preview URL | Updated (UTC) |
| --- | --- | --- | --- | --- |
| ✅ Deployment successful! (View logs) | claim-db-worker | f44687c | Commit Preview URL · Branch Preview URL | Aug 26 2025, 04:34 PM |


Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17134940199:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17134940199.


coderabbitai bot commented Aug 21, 2025

Walkthrough

Reads PRISMA_ACTOR_NAME and PRISMA_ACTOR_PROJECT from a project .env to derive an optional userAgent and propagate it through region selection, database creation requests (utm_source), analytics events, claim URL, and JSON output; PostHog analytics now require explicit host/key and are disabled if missing.
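The derivation described above can be sketched as follows; the `PRISMA_ACTOR_*` key names come from the walkthrough, while the helper name is illustrative:

```javascript
// Hypothetical sketch of the userAgent derivation described in the walkthrough.
// Only the PRISMA_ACTOR_* key names come from the PR; deriveUserAgent is illustrative.
function deriveUserAgent(env) {
  const name = env.PRISMA_ACTOR_NAME;
  const project = env.PRISMA_ACTOR_PROJECT;
  // Derive "NAME/PROJECT" only when both keys are present; otherwise stay undefined.
  return name && project ? `${name}/${project}` : undefined;
}
```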

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| Env reading & source derivation (create-db/index.js) | Added readUserEnvFile() to parse a project .env and derive userAgent as PRISMA_ACTOR_NAME/PRISMA_ACTOR_PROJECT when both exist. |
| Source propagation through control flow (create-db/index.js) | Compute userAgent in main() and thread it through promptForRegion(defaultRegion, userAgent) and createDatabase(name, region, userAgent, returnJson) across interactive, flag, and JSON flows; call sites updated accordingly. |
| API request & output shaping (create-db/index.js) | POST bodies and the claim URL use utm_source: userAgent (falling back to the existing CLI name when absent). Returned JSON includes a userAgent field when present. |
| Analytics integration (create-db/index.js) | Analytics events (cli_command_ran, region_selected, database_creation_failed, invalid_json, api_error, etc.) now include userAgent when present; more paths log analytics failures to the console. |
| PostHog config and error handling (create-db/analytics.js) | Require both POSTHOG_API_HOST and POSTHOG_API_KEY to enable analytics; build POSTHOG_CAPTURE_URL from the trimmed POSTHOG_API_HOST plus /capture; disable analytics (with a dev-only warning) if either is missing; restrict error logs to development. |
| Workflows (.github/workflows/preview.yml, .github/workflows/release.yml) | Update the pnpm setup action to pnpm/action-setup@v4 (input version: 8 unchanged). |
| Package version bumps (create-db/package.json, create-pg/package.json, create-postgres/package.json) | Bumped package versions from 1.0.2 to 1.0.3. |
| Public API / exports (create-db/*.js) | Exported signatures updated to promptForRegion(defaultRegion, userAgent) and createDatabase(name, region, userAgent, returnJson); getRegions and validateRegion unchanged. |
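The PostHog gating described for create-db/analytics.js can be sketched roughly as follows; the env var names come from the PR, while `createAnalytics` and the returned shape are illustrative:

```javascript
// Rough sketch of the PostHog gating summarized above. The env var names
// (POSTHOG_API_HOST, POSTHOG_API_KEY) come from the PR; everything else is illustrative.
function createAnalytics(env) {
  const host = (env.POSTHOG_API_HOST || "").trim().replace(/\/+$/, "");
  const key = env.POSTHOG_API_KEY;
  if (!host || !key) {
    // Disable analytics entirely when either value is missing; warn only in development.
    if (env.NODE_ENV === "development") {
      console.warn("Analytics disabled: POSTHOG_API_HOST and POSTHOG_API_KEY are both required.");
    }
    return { enabled: false, captureUrl: null, capture: async () => {} };
  }
  // Build the capture URL from the trimmed host plus /capture.
  return { enabled: true, captureUrl: `${host}/capture`, capture: async () => {} };
}
```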

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor U as User
  participant C as CLI (create-db/index.js)
  participant E as EnvReader (.env)
  participant A as Analytics (create-db/analytics.js)
  participant S as API Service
  participant B as Browser (Claim URL)

  U->>C: run create-db [--json] [region/flags...]
  C->>C: parse args
  C->>E: readUserEnvFile()
  alt .env contains both keys
    E-->>C: {PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT}
    C->>C: userAgent = "NAME/PROJECT"
  else no userAgent
    E-->>C: {}
    C->>C: userAgent = undefined
  end

  C->>A: cli_command_ran {has-source-from-env?, userAgent?}

  alt interactive
    C->>U: promptForRegion(defaultRegion, userAgent)
    U-->>C: region
    C->>A: region_selected {region, userAgent?}
  else region provided via flag/JSON
    C->>A: region_selected {region, userAgent?}
  end

  C->>S: createDatabase {name, region, utm_source: userAgent || CLI_NAME}
  alt success
    S-->>C: db info + claimUrl(utm_source)
    C-->>U: output (JSON includes userAgent when present)
    C->>B: open claimUrl (utm_source)
  else failure
    S-->>C: error
    C->>A: database_creation_failed {reason, userAgent?}
    C-->>U: error
  end

Possibly related PRs

  • DC-4828 --json flag #43 — Modifies create-db/index.js and changes the createDatabase function signature/behavior (overlaps with userAgent/signature changes).
  • fix: version bump #44 — Contains package version bumps for create-db, create-pg, and create-postgres (matches the version updates here).

Suggested reviewers

  • ankur-arch
  • mhessdev
  • nurul3101



Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17135144702:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17135144702.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 6

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (3)
create-db/index.js (3)

96-116: Help output doesn’t document the new --source flag.

Users won’t discover the feature or its behavior from --help. Add the flag to Options and Examples.

   ${chalk.yellow("--json, -j")}                      Output machine-readable JSON and exit
   ${chalk.yellow("--list-regions")}                  List available regions and exit
   ${chalk.yellow("--help, -h")}                      Show this help message
+  ${chalk.yellow("--source, -s")}                    Derive utm_source from CTA_* in your project .env

 ...
   ${chalk.gray(`npx ${CLI_NAME} --json --region us-east-1`)}
+  ${chalk.gray(`npx ${CLI_NAME} --source --region us-east-1`)}

318-329: utm_source: avoid undefined, keep backward-compat, and URL-encode.

  • If source is unset, the request body omits utm_source and the claim URL renders utm_source=undefined. That breaks existing attribution and pollutes links.
  • Fallback to CLI_NAME and URL-encode the value in the claim URL.
 async function createDatabase(name, region, source, returnJson = false ) {
   let s;
   if (!returnJson) {
     s = spinner();
     s.start("Creating your database...");
   }
 
+  const utmSource = source || CLI_NAME;
+
   const resp = await fetch(`${CREATE_DB_WORKER_URL}/create`, {
     method: "POST",
     headers: { "Content-Type": "application/json" },
-    body: JSON.stringify({ region, name, utm_source: source }),
+    body: JSON.stringify({ region, name, utm_source: utmSource }),
   });
 ...
-  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${source}&utm_medium=cli`;
+  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${encodeURIComponent(utmSource)}&utm_medium=cli`;

Also applies to: 422-422


543-557: Analytics: compute has-source-flag from parsed flags, not argv text.

rawArgs.includes("-s") misses combined short forms (e.g., -rs). Use Boolean(flags.source).

-        "has-source-flag": rawArgs.includes("--source") || rawArgs.includes("-s"),
+        "has-source-flag": Boolean(flags.source),
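The failure mode is easy to reproduce; the toy parser below is a generic illustration of expanding bundled short flags, not the CLI's actual parser:

```javascript
// Illustration of why scanning raw argv misses combined short flags.
// This parser is hypothetical; the CLI's real flag handling differs.
const rawArgs = ["-rs", "us-east-1"]; // user bundled -r and -s into one token
const naiveHasSource = rawArgs.includes("--source") || rawArgs.includes("-s"); // false: "-rs" !== "-s"

function parseShortFlags(args) {
  const flags = {};
  for (const arg of args) {
    if (/^-[a-z]+$/.test(arg)) {
      // Expand each bundled letter into its own boolean flag.
      for (const ch of arg.slice(1)) flags[ch] = true;
    }
  }
  return flags;
}

const hasSource = Boolean(parseShortFlags(rawArgs).s); // true: bundled -s detected
```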
📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 1caa532 and a3733fe.

📒 Files selected for processing (1)
  • create-db/index.js (17 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
  • analytics (48-48)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (1)
create-db/index.js (1)

631-636: LGTM: source plumbed through interactive/JSON/non-interactive flows.

Passing source into promptForRegion and createDatabase across code paths looks consistent.

Also applies to: 660-666


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
create-db/index.js (2)

102-115: Update help text to document the new --source flag (and example).

Users won’t discover the flag via --help. Add an option line and an example.

   ${chalk.yellow("--json, -j")}                      Output machine-readable JSON and exit
+  ${chalk.yellow("--source, -s")}                    Derive a UTM source from CTA_* vars in your .env and include in analytics and API calls
   ${chalk.yellow("--list-regions")}                  List available regions and exit
@@
   ${chalk.gray(`npx ${CLI_NAME} --json --region us-east-1`)}
+  ${chalk.gray(`npx ${CLI_NAME} --interactive --source`)}

318-329: Preserve utm_source when --source is absent.

Currently, utm_source is omitted from the JSON body when source is undefined. To keep existing attribution, fall back to CLI_NAME.

-    body: JSON.stringify({ region, name, utm_source: source }),
+    body: JSON.stringify({ region, name, utm_source: source ?? CLI_NAME }),
♻️ Duplicate comments (6)
create-db/index.js (6)

170-173: Await and return on -h single-short-flag path.

Without await + return, downstream code may still run in some harnesses/mocks where process.exit is stubbed.

-        if (mappedFlag === "help") showHelp();
+        if (mappedFlag === "help") { await showHelp(); return; }

544-565: Analytics: use parsed flags for “has-source-flag” and reuse the helper.

Scanning rawArgs misses combos like -rs. Use Boolean(flags.source) and the helper to attach source.

-      const analyticsProps = {
+      const analyticsProps = {
         command: CLI_NAME,
         "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
         "has-region-flag":
           rawArgs.includes("--region") || rawArgs.includes("-r"),
         "has-interactive-flag":
           rawArgs.includes("--interactive") || rawArgs.includes("-i"),
         "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"),
         "has-list-regions-flag": rawArgs.includes("--list-regions"),
-        "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
-        "has-source-flag": rawArgs.includes("--source") || rawArgs.includes("-s"),
+        "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
+        "has-source-flag": Boolean(flags.source),
         "node-version": process.version,
         platform: process.platform,
         arch: process.arch,
       };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:cli_command_ran", analyticsProps);
+      await captureWithSource("create_db:cli_command_ran", analyticsProps, source);

4-5: Use node: protocol for built-ins.

Prefer node: specifiers for core modules to avoid resolution ambiguity and align with Node guidance.

-import fs from "fs";
-import path from "path";
+import fs from "node:fs";
+import path from "node:path";

62-85: Don’t hand-roll a .env parser; rely on process.env (dotenv.config) instead.

This parser will mis-handle comments, whitespace, CRLF, export prefixes, and quoted/multiline values. You already call dotenv.config() at startup, so just derive the source from process.env.

Replace this helper with a simple derivation helper:

-function readUserEnvFile() {
-  const userCwd = process.cwd();
-  const envPath = path.join(userCwd, '.env');
-  
-  if (!fs.existsSync(envPath)) {
-    return {};
-  }
-  
-  const envContent = fs.readFileSync(envPath, 'utf8');
-  const envVars = {};
-  
-  envContent.split('\n').forEach(line => {
-    const trimmed = line.trim();
-    if (trimmed && !trimmed.startsWith('#')) {
-      const [key, ...valueParts] = trimmed.split('=');
-      if (key && valueParts.length > 0) {
-        const value = valueParts.join('=').replace(/^["']|["']$/g, '');
-        envVars[key.trim()] = value.trim();
-      }
-    }
-  });
-  
-  return envVars;
-}
+function deriveSourceFromProcessEnv() {
+  const { CTA_VERSION, CTA_FRAMEWORK, CTA_FRAMEWORK_VERSION } = process.env;
+  const parts = [];
+  if (CTA_VERSION) parts.push(`v${CTA_VERSION}`);
+  if (CTA_FRAMEWORK) parts.push(CTA_FRAMEWORK);
+  if (CTA_FRAMEWORK_VERSION) parts.push(`fv${CTA_FRAMEWORK_VERSION}`);
+  return parts.length ? parts.join("-") : undefined;
+}

302-313: DRY analytics: centralize “attach source if defined”.

The same “conditionally add source” payload logic repeats across 5 blocks. Extract a helper and use it here to reduce duplication and missed cases.

Add near the top (after the analytics import):

+function captureWithSource(event, props, maybeSource) {
+  const payload = maybeSource ? { ...props, source: maybeSource } : props;
+  return analytics.capture(event, payload);
+}

Then refactor these blocks:

Interactive region selection:

-    const analyticsProps = {
-      command: CLI_NAME,
-      region: region,
-      "selection-method": "interactive",
-    };
-    
-    if (source) {
-      analyticsProps.source = source;
-    }
-    
-    await analytics.capture("create_db:region_selected", analyticsProps);
+    await captureWithSource(
+      "create_db:region_selected",
+      { command: CLI_NAME, region, "selection-method": "interactive" },
+      source
+    );

Rate-limit error:

-      const analyticsProps = {
-        command: CLI_NAME,
-        region: region,
-        "error-type": "rate_limit",
-        "status-code": 429,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:database_creation_failed", analyticsProps);
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "rate_limit", "status-code": 429 },
+        source
+      );

Invalid JSON error:

-      const analyticsProps = {
-        command: CLI_NAME,
-        region,
-        "error-type": "invalid_json",
-        "status-code": resp.status,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:database_creation_failed", analyticsProps);
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "invalid_json", "status-code": resp.status },
+        source
+      );

API error:

-      const analyticsProps = {
-        command: CLI_NAME,
-        region: region,
-        "error-type": "api_error",
-        "error-message": result.error.message,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:database_creation_failed", analyticsProps);
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "api_error", "error-message": result.error.message },
+        source
+      );

Region selected via flag:

-        const analyticsProps = {
-          command: CLI_NAME,
-          region: region,
-          "selection-method": "flag",
-        };
-        
-        if (source) {
-          analyticsProps.source = source;
-        }
-        
-        await analytics.capture("create_db:region_selected", analyticsProps);
+        await captureWithSource(
+          "create_db:region_selected",
+          { command: CLI_NAME, region, "selection-method": "flag" },
+          source
+        );

Also applies to: 348-360, 383-395, 453-465, 590-601


523-541: Duplicate source derivation/validation; compute once via process.env and validate once.

You compute source from .env twice and perform two validations. Collapse into a single derivation from process.env (already populated by dotenv.config()), then validate once.

Derivation:

-          let source;
-    if (flags.source) {
-      const userEnvVars = readUserEnvFile();
-      const userCwd = process.cwd();
-      const envPath = path.join(userCwd, '.env');
-      
-      if (fs.existsSync(envPath)) {
-        const ctaVars = [];
-        if (userEnvVars.CTA_VERSION) ctaVars.push(`v${userEnvVars.CTA_VERSION}`);
-        if (userEnvVars.CTA_FRAMEWORK) ctaVars.push(userEnvVars.CTA_FRAMEWORK);
-        if (userEnvVars.CTA_FRAMEWORK_VERSION) ctaVars.push(`fv${userEnvVars.CTA_FRAMEWORK_VERSION}`);
-        
-        if (ctaVars.length > 0) {
-          source = ctaVars.join('-');
-        }
-      }
-    }
+    let source = undefined;
+    if (flags.source) {
+      source = deriveSourceFromProcessEnv();
+    }

Validation:

-    if (flags.source) {
-      const userCwd = process.cwd();
-      const envPath = path.join(userCwd, '.env');
-      
-      if (!fs.existsSync(envPath)) {
-        console.error(chalk.red("Error: Source not configured correctly."));
-        process.exit(1);
-      }
-      
-      const userEnvVars = readUserEnvFile();
-      const ctaVars = [];
-      if (userEnvVars.CTA_VERSION) ctaVars.push(`v${userEnvVars.CTA_VERSION}`);
-      if (userEnvVars.CTA_FRAMEWORK) ctaVars.push(userEnvVars.CTA_FRAMEWORK);
-      if (userEnvVars.CTA_FRAMEWORK_VERSION) ctaVars.push(`fv${userEnvVars.CTA_FRAMEWORK_VERSION}`);
-      
-      if (ctaVars.length === 0) {
-        console.error(chalk.red("Error: Source not configured correctly."));
-        process.exit(1);
-      }
-    }
+    if (flags.source && !source) {
+      console.error(
+        chalk.red(
+          "Error: Source not configured correctly. Expected CTA_VERSION, CTA_FRAMEWORK and/or CTA_FRAMEWORK_VERSION in .env."
+        )
+      );
+      process.exit(1);
+    }

Also applies to: 604-623

📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between a3733fe and 4993671.

📒 Files selected for processing (1)
  • create-db/index.js (18 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
  • analytics (48-48)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (4)
create-db/index.js (4)

129-137: Flag plumbing for --source looks good.

Allowed flags and shorthand mapping are correctly extended.


277-277: Signature change to accept source is appropriate.

Passing source into the region prompt enables consistent analytics enrichment across flows.


435-435: JSON output includes source context — good.

Returning source (or null) in JSON response is helpful for end-to-end validation in CI and downstream tooling.


632-636: Propagation of source through interactive/JSON and create calls looks consistent.

Passing source to promptForRegion and createDatabase in all paths ensures consistent analytics and request attribution.

Also applies to: 660-662, 666-666


Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17136895926:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17136895926.


Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17136985429:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17136985429.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
create-db/index.js (2)

310-321: Preserve UTM segmentation when source is absent.

Requests previously used CLI_NAME for utm_source; with this change, the key disappears when source is undefined. If the backend expects or benefits from a default, fall back to CLI_NAME.

-    body: JSON.stringify({ region, name, utm_source: source }),
+    body: JSON.stringify({ region, name, utm_source: source || CLI_NAME }),

524-547: DRY analytics: reuse captureWithSource; add “has-source-flag” if you ship the flag.

Replace manual payload building and optional source-mutation with the helper; also consider tracking has-source-flag separately from has-source-from-env.

-      const analyticsProps = {
-        command: CLI_NAME,
-        "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
-        "has-region-flag":
-          rawArgs.includes("--region") || rawArgs.includes("-r"),
-        "has-interactive-flag":
-          rawArgs.includes("--interactive") || rawArgs.includes("-i"),
-        "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"),
-        "has-list-regions-flag": rawArgs.includes("--list-regions"),
-        "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
-        "has-source-from-env": !!source,
-        "node-version": process.version,
-        platform: process.platform,
-        arch: process.arch,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:cli_command_ran", analyticsProps);
+      await captureWithSource(
+        "create_db:cli_command_ran",
+        {
+          command: CLI_NAME,
+          "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
+          "has-region-flag": rawArgs.includes("--region") || rawArgs.includes("-r"),
+          "has-interactive-flag": rawArgs.includes("--interactive") || rawArgs.includes("-i"),
+          "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"),
+          "has-list-regions-flag": rawArgs.includes("--list-regions"),
+          "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
+          "has-source-from-env": !!source && !rawArgs.includes("--source") && !rawArgs.includes("-s"),
+          "has-source-flag": rawArgs.includes("--source") || rawArgs.includes("-s"),
+          "node-version": process.version,
+          platform: process.platform,
+          arch: process.arch,
+        },
+        source
+      );
♻️ Duplicate comments (8)
create-db/index.js (8)

166-177: Bug: help short-flag path doesn’t await showHelp() or return.

Single short-flag branch should mirror the combined short-flags branch to avoid falling through.

       if (shorthandMap[short]) {
         const mappedFlag = shorthandMap[short];
-        if (mappedFlag === "help") showHelp();
+        if (mappedFlag === "help") { await showHelp(); return; }
         if (mappedFlag === "region") {
           const region = args[i + 1];
           if (!region || region.startsWith("-"))
             exitWithError("Missing value for -r flag.");
           flags.region = region;
           i++;
         } else {
           flags[mappedFlag] = true;
         }
       }

269-305: DRY analytics capture; introduce captureWithSource helper.

You repeat “attach source if defined” pattern. Centralize to avoid drift.

   try {
-    const analyticsProps = {
-      command: CLI_NAME,
-      region: region,
-      "selection-method": "interactive",
-    };
-    
-    if (source) {
-      analyticsProps.source = source;
-    }
-    
-    await analytics.capture("create_db:region_selected", analyticsProps);
+    await captureWithSource(
+      "create_db:region_selected",
+      { command: CLI_NAME, region, "selection-method": "interactive" },
+      source
+    );
   } catch (error) {}

Add once near the top (after the analytics import):

function captureWithSource(event, props, maybeSource) {
  const payload = maybeSource ? { ...props, source: maybeSource } : props;
  return analytics.capture(event, payload);
}

4-5: Use node: protocol for built-ins.

Prefer node:fs and node:path to match Node’s guidance and avoid resolution ambiguity.

-import fs from "fs";
-import path from "path";
+import fs from "node:fs";
+import path from "node:path";

62-85: Don’t hand-roll .env parsing; use dotenv.parse or process.env.

The custom parser mishandles quotes, CRLF, inline comments, export prefixes, and values containing '='. Since dotenv.config() is already called, prefer process.env; minimally, swap in dotenv.parse.

Minimal, safer fix for this helper:

 function readUserEnvFile() {
-  const userCwd = process.cwd();
-  const envPath = path.join(userCwd, '.env');
-  
-  if (!fs.existsSync(envPath)) {
-    return {};
-  }
-  
-  const envContent = fs.readFileSync(envPath, 'utf8');
-  const envVars = {};
-  
-  envContent.split('\n').forEach(line => {
-    const trimmed = line.trim();
-    if (trimmed && !trimmed.startsWith('#')) {
-      const [key, ...valueParts] = trimmed.split('=');
-      if (key && valueParts.length > 0) {
-        const value = valueParts.join('=').replace(/^["']|["']$/g, '');
-        envVars[key.trim()] = value.trim();
-      }
-    }
-  });
-  
-  return envVars;
+  const envPath = path.join(process.cwd(), ".env");
+  if (!fs.existsSync(envPath)) return {};
+  return dotenv.parse(fs.readFileSync(envPath, "utf8"));
 }

Better: delete this helper and read from process.env everywhere (populated by dotenv.config()).
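To make the cited failure modes concrete, here is a simplified copy of the hand-rolled parser run against hypothetical inputs (the input strings are made up for illustration):

```javascript
// Simplified copy of the hand-rolled parser above, demonstrating two of the
// failure modes mentioned: export prefixes and inline comments.
function naiveParse(content) {
  const envVars = {};
  content.split("\n").forEach((line) => {
    const trimmed = line.trim();
    if (trimmed && !trimmed.startsWith("#")) {
      const [key, ...valueParts] = trimmed.split("=");
      if (key && valueParts.length > 0) {
        envVars[key.trim()] = valueParts.join("=").replace(/^["']|["']$/g, "").trim();
      }
    }
  });
  return envVars;
}

const parsed = naiveParse("export FOO=bar\nKEY=val # not part of the value");
// parsed.FOO is undefined (the key is literally "export FOO"),
// and parsed.KEY keeps the inline comment text in the value.
```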


340-352: Repeat pattern: use captureWithSource helper (rate-limit path).

Apply the helper to keep payload construction consistent.

-      const analyticsProps = {
-        command: CLI_NAME,
-        region: region,
-        "error-type": "rate_limit",
-        "status-code": 429,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:database_creation_failed", analyticsProps);
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "rate_limit", "status-code": 429 },
+        source
+      );

375-387: Repeat pattern: use captureWithSource helper (invalid JSON path).

-      const analyticsProps = {
-        command: CLI_NAME,
-        region,
-        "error-type": "invalid_json",
-        "status-code": resp.status,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:database_creation_failed", analyticsProps);
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "invalid_json", "status-code": resp.status },
+        source
+      );

570-581: Repeat pattern: use captureWithSource helper (region flag path).

-        const analyticsProps = {
-          command: CLI_NAME,
-          region: region,
-          "selection-method": "flag",
-        };
-        
-        if (source) {
-          analyticsProps.source = source;
-        }
-        
-        await analytics.capture("create_db:region_selected", analyticsProps);
+        await captureWithSource(
+          "create_db:region_selected",
+          { command: CLI_NAME, region, "selection-method": "flag" },
+          source
+        );

414-414: Bug: claim URL produces “utm_source=undefined” when source is missing.

This pollutes analytics and produces a literal “undefined” in links. Fall back to CLI_NAME and ensure proper encoding.

-  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${source}&utm_medium=cli`;
+  const utmSource = encodeURIComponent(source || CLI_NAME);
+  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${utmSource}&utm_medium=cli`;
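A sketch of the same fix using URLSearchParams, which percent-encodes every query value uniformly; the base URL and function name here are illustrative, not the project's actual code:

```javascript
// Sketch: build the claim URL via URLSearchParams so all values are encoded.
function buildClaimUrl(baseUrl, projectId, source, cliName) {
  const params = new URLSearchParams({
    projectID: projectId,
    utm_source: source || cliName, // fall back instead of emitting "undefined"
    utm_medium: "cli",
  });
  return `${baseUrl}?${params.toString()}`;
}

// A source like "acme/app" serializes as "utm_source=acme%2Fapp".
```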
📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 4993671 and f628681.

📒 Files selected for processing (1)
  • create-db/index.js (14 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
  • analytics (48-48)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: create-db-worker
  • GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (2)
create-db/index.js (2)

426-427: LGTM: include source in JSON output.

Returning source (or null) in --json aligns with observability/traceability goals.


594-599: LGTM: threaded source through interactive/JSON and default flows.

Passing source into promptForRegion and createDatabase maintains consistent tracing across modes.

Also applies to: 622-627



Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17164565333:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17164565333.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
create-db/index.js (1)

310-321: Preserve default attribution when env-derived source is absent.

When userAgent is undefined, utm_source drops from the payload entirely. If you previously attributed to the CLI, you can keep continuity with a fallback.

-    body: JSON.stringify({ region, name, utm_source: userAgent }),
+    body: JSON.stringify({ region, name, utm_source: userAgent || CLI_NAME }),
♻️ Duplicate comments (9)
create-db/index.js (9)

269-305: DRY analytics: factor “attach source if present” into a helper; also standardize field name.

This pattern appears multiple times. Extract a helper to avoid drift and use a consistent key (source) instead of user-agent (which can be confused with the HTTP header).

-  try {
-    const analyticsProps = {
-      command: CLI_NAME,
-      region: region,
-      "selection-method": "interactive",
-    };
-
-    if (userAgent) {
-      analyticsProps["user-agent"] = userAgent;
-    }
-
-    await analytics.capture("create_db:region_selected", analyticsProps);
-  } catch (error) {}
+  try {
+    await captureWithSource(
+      "create_db:region_selected",
+      { command: CLI_NAME, region, "selection-method": "interactive" },
+      userAgent
+    );
+  } catch {}

Add near the top-level (after the analytics import):

function captureWithSource(event, props, maybeSource) {
  const payload = maybeSource ? { ...props, source: maybeSource } : props;
  return analytics.capture(event, payload);
}

538-561: Analytics props: compute “has-*” from parsed flags; keep analytics failures silent for users.

  • Use flags instead of scanning rawArgs to avoid missing combined short flags and quoting edge cases.
  • Avoid logging analytics errors to stdout/stderr except in development, to not confuse CLI users.
-      const analyticsProps = {
+      const analyticsProps = {
         command: CLI_NAME,
         "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
-        "has-region-flag":
-          rawArgs.includes("--region") || rawArgs.includes("-r"),
-        "has-interactive-flag":
-          rawArgs.includes("--interactive") || rawArgs.includes("-i"),
-        "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"),
-        "has-list-regions-flag": rawArgs.includes("--list-regions"),
-        "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
+        "has-region-flag": Boolean(flags.region),
+        "has-interactive-flag": Boolean(flags.interactive),
+        "has-help-flag": Boolean(flags.help),
+        "has-list-regions-flag": Boolean(flags["list-regions"]),
+        "has-json-flag": Boolean(flags.json),
         "has-source-from-env": !!userAgent,
         "node-version": process.version,
         platform: process.platform,
         arch: process.arch,
       };
@@
-      await analytics.capture("create_db:cli_command_ran", analyticsProps);
-    } catch (error) {
-      console.error("Error:", error.message);
-    }
+      await analytics.capture("create_db:cli_command_ran", analyticsProps);
+    } catch (error) {
+      if (process.env.NODE_ENV === "development") {
+        console.error("Analytics error:", error.message);
+      }
+    }

4-5: Use node: protocol for built-ins (or drop imports if the custom .env parser is removed).

Follow Node guidance and avoid resolution ambiguity by using node: specifiers.

-import fs from "fs";
-import path from "path";
+import fs from "node:fs";
+import path from "node:path";

62-85: Don’t hand-roll a .env parser; rely on dotenv/process.env and centralize source derivation.

This parser mishandles edge cases (quotes, CRLF, inline comments, export, multiline). You already call dotenv.config(); read from process.env and delete this function. Also, add a tiny helper to compute the source once.

Replace this block with:

-function readUserEnvFile() {
-  const userCwd = process.cwd();
-  const envPath = path.join(userCwd, ".env");
-
-  if (!fs.existsSync(envPath)) {
-    return {};
-  }
-
-  const envContent = fs.readFileSync(envPath, "utf8");
-  const envVars = {};
-
-  envContent.split("\n").forEach((line) => {
-    const trimmed = line.trim();
-    if (trimmed && !trimmed.startsWith("#")) {
-      const [key, ...valueParts] = trimmed.split("=");
-      if (key && valueParts.length > 0) {
-        const value = valueParts.join("=").replace(/^["']|["']$/g, "");
-        envVars[key.trim()] = value.trim();
-      }
-    }
-  });
-
-  return envVars;
-}
+function getSourceFromEnv() {
+  const { PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT } = process.env;
+  return PRISMA_ACTOR_NAME && PRISMA_ACTOR_PROJECT
+    ? `${PRISMA_ACTOR_NAME}/${PRISMA_ACTOR_PROJECT}`
+    : undefined;
+}
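For illustration, the proposed helper's behavior on a standalone copy; the injectable `env` parameter is an assumption added here for testability, the suggested code reads process.env directly:

```javascript
// Standalone copy of the proposed helper, parameterized for illustration.
function getSourceFromEnv(env = process.env) {
  const { PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT } = env;
  return PRISMA_ACTOR_NAME && PRISMA_ACTOR_PROJECT
    ? `${PRISMA_ACTOR_NAME}/${PRISMA_ACTOR_PROJECT}`
    : undefined;
}

// Both vars set -> "name/project"; either one missing -> undefined.
```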

339-355: DRY analytics for rate-limit error path.

Use the same captureWithSource helper here.

-    try {
-      const analyticsProps = {
-        command: CLI_NAME,
-        region: region,
-        "error-type": "rate_limit",
-        "status-code": 429,
-      };
-
-      if (userAgent) {
-        analyticsProps["user-agent"] = userAgent;
-      }
-
-      await analytics.capture(
-        "create_db:database_creation_failed",
-        analyticsProps
-      );
-    } catch (error) {}
+    try {
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "rate_limit", "status-code": 429 },
+        userAgent
+      );
+    } catch {}

378-393: DRY analytics for invalid JSON error path.

Same duplication; use the helper.

-    try {
-      const analyticsProps = {
-        command: CLI_NAME,
-        region,
-        "error-type": "invalid_json",
-        "status-code": resp.status,
-      };
-
-      if (userAgent) {
-        analyticsProps["user-agent"] = userAgent;
-      }
-
-      await analytics.capture(
-        "create_db:database_creation_failed",
-        analyticsProps
-      );
-    } catch {}
+    try {
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "invalid_json", "status-code": resp.status },
+        userAgent
+      );
+    } catch {}

456-471: DRY analytics for API error path.

Replace manual prop building with the shared helper.

-    try {
-      const analyticsProps = {
-        command: CLI_NAME,
-        region: region,
-        "error-type": "api_error",
-        "error-message": result.error.message,
-      };
-
-      if (userAgent) {
-        analyticsProps["user-agent"] = userAgent;
-      }
-
-      await analytics.capture(
-        "create_db:database_creation_failed",
-        analyticsProps
-      );
-    } catch (error) {}
+    try {
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "api_error", "error-message": result.error.message },
+        userAgent
+      );
+    } catch {}

529-536: Derive userAgent from process.env (or flags), not by re-reading .env. Also PR title vs code mismatch.

  • Replace readUserEnvFile() usage with a one-liner reading process.env (dotenv already ran).
  • The PR title says “--source flag added”, but no such flag is parsed or documented in --help here. Either implement it or update the title.
-    let userAgent;
-    const userEnvVars = readUserEnvFile();
-    if (userEnvVars.PRISMA_ACTOR_NAME && userEnvVars.PRISMA_ACTOR_PROJECT) {
-      userAgent = `${userEnvVars.PRISMA_ACTOR_NAME}/${userEnvVars.PRISMA_ACTOR_PROJECT}`;
-    }
+    // Flag (if implemented) should override ENV; otherwise pull from ENV.
+    let userAgent = /* flags.source ?? */ getSourceFromEnv();

Would you like a follow-up patch to add --source/-s (help text, allowed flags, and parsing) so it overrides the env-derived value?


584-595: DRY analytics for region flag path.

Use the shared helper for consistency with the interactive path.

-      try {
-        const analyticsProps = {
-          command: CLI_NAME,
-          region: region,
-          "selection-method": "flag",
-        };
-
-        if (userAgent) {
-          analyticsProps["user-agent"] = userAgent;
-        }
-
-        await analytics.capture("create_db:region_selected", analyticsProps);
-      } catch (error) {}
+      try {
+        await captureWithSource(
+          "create_db:region_selected",
+          { command: CLI_NAME, region, "selection-method": "flag" },
+          userAgent
+        );
+      } catch {}
📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between 8bfa0ae and 4666b46.

📒 Files selected for processing (2)
  • create-db/analytics.js (1 hunks)
  • create-db/index.js (15 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
  • analytics (50-50)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (3)
create-db/index.js (3)

424-439: JSON output: field name is consistent and optional — LGTM.

Including source only when provided avoids changing the default contract for existing consumers.


605-610: Propagating userAgent through the JSON path — LGTM.

Passing the source into promptForRegion and createDatabase keeps attribution consistent in non-interactive mode.


634-639: Propagating userAgent through the interactive path — LGTM.

Attribution remains consistent across flows.


Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17164722487:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17164722487.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (4)
create-db/analytics.js (4)

24-31: Bound the network call with a short timeout and make it fire-and-forget.

A hung analytics POST can stall CLI UX. Add an AbortController timeout (configurable) and mark the request keepalive.

Apply this diff:

-    try {
-      const response = await fetch(POSTHOG_CAPTURE_URL, {
+    try {
+      const controller = new AbortController();
+      const timeoutMs = Number(process.env.ANALYTICS_TIMEOUT_MS ?? 2000);
+      const timeout = setTimeout(() => controller.abort(), timeoutMs);
+      const response = await fetch(POSTHOG_CAPTURE_URL, {
         method: "POST",
         headers: {
           "Content-Type": "application/json",
         },
-        body: JSON.stringify(payload),
+        body: JSON.stringify(payload),
+        keepalive: true,
+        signal: controller.signal,
       });
+      clearTimeout(timeout);

33-35: Include HTTP status code for easier debugging.

Status text can be empty; include the numeric code for clarity.

Apply this diff:

-      if (!response.ok) {
-        throw new EventCaptureError(eventName, response.statusText);
-      }
+      if (!response.ok) {
+        throw new EventCaptureError(
+          eventName,
+          `${response.status} ${response.statusText || "Unknown"}`
+        );
+      }

17-18: Avoid per-event random distinct_id; prefer a stable, privacy-safe identifier.

A new randomUUID per event prevents session/user-level aggregation. Consider a stable distinct_id (e.g., persisted machine/session ID, or an env-provided userAgent/source when privacy permits). Keep $process_person_profile: false to avoid PII.

I can draft a tiny helper that persists a UUID at ~/.config/prisma/create-db/machine-id (or respects XDG) and falls back to random when unwritable—say the word.
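A minimal sketch of such a helper, assuming an XDG-style config path; the directory layout and fallback behavior are assumptions, not the project's current implementation:

```javascript
import { randomUUID } from "node:crypto";
import fs from "node:fs";
import path from "node:path";
import os from "node:os";

// Sketch: return a machine-stable distinct_id, falling back to a
// per-process random UUID when the config directory is unwritable.
function getStableDistinctId() {
  const base =
    process.env.XDG_CONFIG_HOME || path.join(os.homedir(), ".config");
  const idFile = path.join(base, "prisma", "create-db", "machine-id");
  try {
    return fs.readFileSync(idFile, "utf8").trim();
  } catch {
    const id = randomUUID();
    try {
      fs.mkdirSync(path.dirname(idFile), { recursive: true });
      fs.writeFileSync(idFile, id, "utf8");
    } catch {
      // Unwritable config dir: a random ID per process is the best we can do.
    }
    return id;
  }
}
```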


37-41: Optional: add a targeted debug toggle.

Logging only in NODE_ENV=development is fine. If you want opt-in visibility on CI without flipping NODE_ENV, consider also honoring DEBUG=create-db:analytics.

Proposed tweak inside catch:

if (
  process.env.NODE_ENV === "development" ||
  process.env.DEBUG === "create-db:analytics"
) {
  console.error("Analytics error:", error.message);
}
♻️ Duplicate comments (1)
create-db/analytics.js (1)

12-12: Nice: hardcoded PostHog key removed.

This addresses the prior review about shipping a baked-in API key. Good change.

📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between 4666b46 and be9f976.

📒 Files selected for processing (1)
  • create-db/analytics.js (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker


Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17164819065:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17164819065.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (3)
create-db/analytics.js (3)

24-31: Allow callers to provide a stable distinct_id to improve event correlation.

Right now every event gets a fresh UUID, which fragments user sessions. Prefer a caller-provided distinct_id when available (e.g., a persisted CLI install ID), falling back to a UUID.

Apply this diff:

-      distinct_id: randomUUID(),
+      // Prefer caller-provided distinct_id for stable correlation across events
+      distinct_id: properties?.distinct_id ?? randomUUID(),

33-50: Add a short timeout via AbortController to avoid hanging on network stalls.

Without a timeout, fetch can hang indefinitely and delay CLI exit. Abort after a few seconds; still silent-fail outside development.

Apply this diff:

-    try {
-      const response = await fetch(POSTHOG_CAPTURE_URL, {
+    const controller = new AbortController();
+    const timeoutId = setTimeout(() => controller.abort(), 5000);
+    try {
+      const response = await fetch(POSTHOG_CAPTURE_URL, {
         method: "POST",
         headers: {
           "Content-Type": "application/json",
         },
-        body: JSON.stringify(payload),
+        body: JSON.stringify(payload),
+        signal: controller.signal,
       });
 
       if (!response.ok) {
         throw new EventCaptureError(eventName, response.statusText);
       }
-    } catch (error) {
+    } catch (error) {
       // Silently fail analytics to not disrupt user experience
       if (process.env.NODE_ENV === "development") {
         console.error("Analytics error:", error.message);
       }
-    }
+    } finally {
+      clearTimeout(timeoutId);
+    }

42-44: Include numeric status code; statusText can be empty in Node fetch.

This improves diagnostics while keeping behavior unchanged.

Apply this diff:

-        throw new EventCaptureError(eventName, response.statusText);
+        const text = response.statusText || "Unknown Status";
+        throw new EventCaptureError(
+          eventName,
+          `${response.status} ${text}`.trim()
+        );
♻️ Duplicate comments (1)
create-db/analytics.js (1)

11-21: Resolved: analytics now fail-closed without a baked-in key; thanks for addressing prior concerns.

You now gate on both POSTHOG_API_HOST and POSTHOG_API_KEY and early-return, with a dev-only warning. This removes the committed default key and avoids posting to undefined/capture. Looks good.

📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between be9f976 and 76d7ec7.

📒 Files selected for processing (1)
  • create-db/analytics.js (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: create-db-worker
  • GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (1)
create-db/analytics.js (1)

34-34: Confirm Node runtime guarantees global fetch; otherwise polyfill or guard.

If the CLI runs on Node < 18, global fetch is undefined. Either enforce engines >= 18 or add a lazy import/polyfill.

Would you verify the repo’s engines.node and runtime target? If engines < 18 or unspecified, I can add a tiny guard like:

-      const response = await fetch(POSTHOG_CAPTURE_URL, {
+      const _fetch = globalThis.fetch ?? (await import("node-fetch")).default;
+      const response = await _fetch(POSTHOG_CAPTURE_URL, {

mhessdev
mhessdev previously approved these changes Aug 25, 2025

Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17239115059:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@$pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17239115059.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 4

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (5)
create-db/analytics.js (2)

3-6: Include HTTP status code in EventCaptureError for easier debugging.

statusText can be empty; include numeric status as well.

-class EventCaptureError extends Error {
-  constructor(event, status) {
-    super(`Failed to submit PostHog event '${event}': ${status}`);
-  }
-}
+class EventCaptureError extends Error {
+  constructor(event, statusCode, statusText) {
+    const detail = [statusCode, statusText].filter(Boolean).join(" ");
+    super(`Failed to submit PostHog event '${event}': ${detail}`);
+  }
+}
@@
-      if (!response.ok) {
-        throw new EventCaptureError(eventName, response.statusText);
-      }
+      if (!response.ok) {
+        throw new EventCaptureError(eventName, response.status, response.statusText);
+      }

Also applies to: 49-51


1-1: Use node: protocol import for built-in crypto (consistency with Node guidance).

-import { randomUUID } from "crypto";
+import { randomUUID } from "node:crypto";
create-db/index.js (3)

416-431: JSON output: expose a stable ‘source’ field alongside userAgent (non-breaking).

Downstream tools likely look for “utm_source”/“source”. Keep userAgent for compatibility; add source duplicate.

-    if (userAgent) {
-      jsonResponse.userAgent = userAgent;
-    }
+    if (userAgent) {
+      jsonResponse.userAgent = userAgent; // existing
+      jsonResponse.source = userAgent;    // new, explicit
+    }

526-547: Analytics flags: use parsed flags instead of scanning rawArgs; add has-source-flag.

-      const analyticsProps = {
+      const analyticsProps = {
         command: CLI_NAME,
-        "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
-        "has-region-flag":
-          rawArgs.includes("--region") || rawArgs.includes("-r"),
-        "has-interactive-flag":
-          rawArgs.includes("--interactive") || rawArgs.includes("-i"),
-        "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"),
-        "has-list-regions-flag": rawArgs.includes("--list-regions"),
-        "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
+        "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
+        "has-region-flag": Boolean(flags.region),
+        "has-interactive-flag": Boolean(flags.interactive),
+        "has-help-flag": Boolean(flags.help),
+        "has-list-regions-flag": Boolean(flags["list-regions"]),
+        "has-json-flag": Boolean(flags.json),
+        "has-source-flag": Boolean(flags.source),
         "has-user-agent-from-env": !!userAgent,
         "node-version": process.version,
         platform: process.platform,
         arch: process.arch,
         "user-agent": userAgent,
       };

167-176: Help short-flag branch should await showHelp() and return.

The long-flag and combined-short branches await and return; keep behavior consistent here to avoid falling through.

-        if (mappedFlag === "help") showHelp();
+        if (mappedFlag === "help") { await showHelp(); return; }
♻️ Duplicate comments (3)
create-db/analytics.js (1)

11-12: Normalize and validate POSTHOG_API_HOST (trim + ensure scheme) to prevent malformed URLs.

If POSTHOG_API_HOST lacks a scheme or has trailing/leading whitespace, fetch will throw (e.g., "app.posthog.com/capture"). Normalize before use.

Apply:

-    const POSTHOG_API_HOST = process.env.POSTHOG_API_HOST;
-    const POSTHOG_KEY = process.env.POSTHOG_API_KEY;
+    const POSTHOG_API_HOST_RAW = process.env.POSTHOG_API_HOST;
+    const POSTHOG_KEY_RAW = process.env.POSTHOG_API_KEY;
+    const POSTHOG_API_HOST = POSTHOG_API_HOST_RAW?.trim();
+    const POSTHOG_KEY = POSTHOG_KEY_RAW?.trim();
@@
-    const POSTHOG_CAPTURE_URL = `${POSTHOG_API_HOST.replace(/\/+$/, "")}/capture`;
+    const hostWithScheme = POSTHOG_API_HOST.startsWith("http")
+      ? POSTHOG_API_HOST
+      : `https://${POSTHOG_API_HOST}`;
+    const POSTHOG_CAPTURE_URL = `${hostWithScheme.replace(/\/+$/, "")}/capture`;

Also applies to: 28-28

create-db/index.js (2)

4-6: Prefer node: protocol for built-ins or drop these imports if the custom .env reader is removed.

-import fs from "fs";
-import path from "path";
+import fs from "node:fs";
+import path from "node:path";

Note: If you remove readUserEnvFile (see below), these imports can be deleted entirely.


412-412: Fix bug: claim URL uses raw userAgent and can emit “utm_source=undefined”; also URL-encode.

-  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${userAgent}&utm_medium=cli`;
+  const utmSource = encodeURIComponent(userAgent || CLI_NAME);
+  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${utmSource}&utm_medium=cli`;
📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between 76d7ec7 and 725dc45.

📒 Files selected for processing (2)
  • create-db/analytics.js (1 hunks)
  • create-db/index.js (15 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (2)
create-db/analytics.js (1)

14-26: Fail-closed gating looks good; removes accidental analytics when env is missing.

Early-returning when POSTHOG_API_HOST/POSTHOG_API_KEY are absent (and warning only in development) is the right call and aligns with least-privilege defaults.

create-db/index.js (1)

588-596: Flow check: promptForRegion + createDatabase with userAgent parameter work end-to-end.

Looks correct after the signature change; JSON and interactive paths pass the value through.

Also applies to: 620-626


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (8)
.github/workflows/preview.yml (4)

86-93: Avoid referencing outputs from commented-out steps; add robust fallbacks.

steps.deploy-db/steps.deploy-claim don’t exist (those steps are commented), so the expressions will resolve to null. Make the fallback explicit in bash so this step works whether or not the preview deploys run.

-          CREATE_DB_WORKER_URL="${{ steps.deploy-db.outputs.deployment-url || secrets.CREATE_DB_WORKER_URL }}"
-          CLAIM_DB_WORKER_URL="${{ steps.deploy-claim.outputs.deployment-url || secrets.CLAIM_DB_WORKER_URL }}"
+          CREATE_DB_WORKER_URL="${{ steps.deploy-db.outputs.deployment-url }}"
+          CLAIM_DB_WORKER_URL="${{ steps.deploy-claim.outputs.deployment-url }}"
+          # Explicit fallbacks when outputs are empty or undefined
+          : "${CREATE_DB_WORKER_URL:=${{ secrets.CREATE_DB_WORKER_URL }}}"
+          : "${CLAIM_DB_WORKER_URL:=${{ secrets.CLAIM_DB_WORKER_URL }}}"

117-119: Preview comment may show empty Worker URLs; add the same fallback used above.

If deploy steps are skipped, these envs will be empty and your comment renders “undefined”. Use the resolved envs or apply an expression fallback.

-          CREATE_DB_WORKER_URL: ${{ steps.deploy-db.outputs.deployment-url }}
-          CLAIM_DB_WORKER_URL: ${{ steps.deploy-claim.outputs.deployment-url }}
+          CREATE_DB_WORKER_URL: ${{ steps.deploy-db.outputs.deployment-url || env.CREATE_DB_WORKER_URL || secrets.CREATE_DB_WORKER_URL }}
+          CLAIM_DB_WORKER_URL: ${{ steps.deploy-claim.outputs.deployment-url || env.CLAIM_DB_WORKER_URL || secrets.CLAIM_DB_WORKER_URL }}

135-139: Typo in npx instruction: $pr should be pr.

This breaks copy/paste for create-postgres.

-            npx create-postgres@$pr${{ github.event.number }}
+            npx create-postgres@pr${{ github.event.number }}

1-12: Harden workflow permissions (principle of least privilege).

Default GITHUB_TOKEN permissions are broad. Explicitly scope them for this workflow.

 name: Preview deploy all Workers and CLIs
 
 on:
   pull_request:
@@
 env:
   # each folder under the repo root that contains one of your CLIs
   WORKSPACES: create-db create-pg create-postgres
+permissions:
+  contents: read
+  pull-requests: write   # needed for comment
+  packages: write        # needed for npm publish with GITHUB_TOKEN if used anywhere
.github/workflows/release.yml (3)

1-9: Restrict GITHUB_TOKEN permissions for releases.

Add explicit permissions needed for publishing and PR creation.

 name: Release CLIs
@@
 on:
   workflow_dispatch:
   push:
     branches:
       - main
+permissions:
+  contents: write       # commit/version and create PR
+  pull-requests: write  # changesets/action
+  packages: write       # npm publish if using GITHUB_TOKEN with GitHub Packages

76-82: Avoid generating fallback changesets in your CI release workflow

Injecting a default, empty changeset during the release.yml run can lead to:

  • Polluted commit history with synthetic “auto-generated changeset” files.
  • An infinite loop:
    1. Workflow sees no .changeset files → creates a default one
    2. changesets/action@v1 runs pnpm changeset version, which removes all .changeset files
    3. The next push to main (from the version‐bump commit or merged PR) retriggers the workflow → back to step 1

To prevent this and align with best practices, author changesets in feature PRs before merging to main; the release workflow should only consume them.

Suggested refactor (optional):

• Remove or comment out the fallback block in .github/workflows/release.yml (around lines 76–82):

-      - name: 📝 Ensure Changeset Exists
-        run: |
-          if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
-            echo "No changeset found. Creating a default one..."
-            pnpm changeset add --empty --message "chore(release): auto-generated changeset"
-          fi

• As an alternative, if you still need a one-time “first release” fallback, guard it so it only fires when no prior tags/releases exist (e.g. detect the first version bump).

Please verify:

  • That you haven’t already observed “auto-generated changeset” commits or churn in your Git history.
  • The exact behavior of changesets/action@v1 in your pipeline (does it open a PR or push directly to main?).
  • That removing this block won’t break any existing release flows you rely on.

56-63: Remove manual version bumps from the release workflow

The CI currently runs npm version patch --no-git-tag-version on publish failures, which can desynchronize the actual published package versions from what Changesets tracks. Since this repo already uses Changesets to drive all versioning ("version": "changeset version" in package.json, and the changesets/action@v1 step in the release workflow), it’s best to let Changesets be the single source of truth for version bumps.

Points of attention:

  • File: .github/workflows/release.yml
  • Lines: ~56–63 (the if ! pnpm publish block)

Suggested refactor options:

Option A – fail fast on publish errors

-            if ! pnpm publish --access public; then
-              echo "Publish failed, trying to bump version and retry..."
-              npm version patch --no-git-tag-version
-              pnpm publish --access public || echo "Publish failed again for $pkg"
-            fi
+            pnpm publish --access public || exit 1

Option B – delegate all versioning to Changesets
• Remove the entire fallback block and rely on the existing Changesets steps (pnpm changeset version + changesets/action@v1) to handle version bumps and tagging.

create-db/package.json (1)

29-35: Document missing PRISMA_ACTOR env vars & review dotenv side effects

The POSTHOG_API_KEY/POSTHOG_API_HOST variables are already documented in README.md, but PRISMA_ACTOR_NAME and PRISMA_ACTOR_PROJECT (used in index.js) are not. Also, while top-level dotenv.config() is fine for a CLI-only module, you may want to defer it to avoid import-time side effects if this package is ever consumed programmatically.

• In README.md, under the “Environment variables” section (or add a new one), document:

  • PRISMA_ACTOR_NAME: name to use in the User-Agent header when persisting databases
  • PRISMA_ACTOR_PROJECT: project identifier to use in the User-Agent header
• No changes needed for the POSTHOG_API_KEY / POSTHOG_API_HOST docs; they’re already present at lines ~139–140 of README.md.
• (Optional) Refactor import dotenv from "dotenv"; dotenv.config(); so that dotenv.config() is called inside the CLI entrypoint (e.g. within a main()), preventing side effects on import.
📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 725dc45 and 6d4006f.

📒 Files selected for processing (5)
  • .github/workflows/preview.yml (1 hunks)
  • .github/workflows/release.yml (1 hunks)
  • create-db/package.json (1 hunks)
  • create-pg/package.json (1 hunks)
  • create-postgres/package.json (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: create-db-worker
  • GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (3)
create-postgres/package.json (1)

3-3: Version bump to 1.0.3: LGTM.

No other changes; workspace dependency remains pinned to create-db via workspace:*.

create-db/package.json (1)

3-3: Version bump to 1.0.3 aligns with PR changes: LGTM.

Files array includes analytics.js, which matches the analytics gating updates noted in this PR.

create-pg/package.json (1)

3-3: Version bump to 1.0.3: LGTM.

No functional changes; continues to depend on workspace create-db.

Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17239715470:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17239715470.


@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
create-db/index.js (1)

419-435: JSON schema: prefer neutral naming (“source” or “utmSource”) over “userAgent”; confirm output contract.

This field isn’t a browser UA and can confuse integrators. Consider renaming for clarity. Also ensure no “undefined” values leak.

-    if (userAgent) {
-      jsonResponse.userAgent = userAgent;
-    }
+    if (userAgent) {
+      jsonResponse.source = userAgent; // or 'utmSource'
+    }

Would you like me to prepare a follow-up patch that renames analytics keys from "user-agent" to "source" for consistency?

♻️ Duplicate comments (9)
create-db/index.js (9)

167-176: Bug: help short flag branch doesn’t await showHelp() or return — CLI flow continues unintentionally.

This can lead to unexpected execution after printing help.

-        if (mappedFlag === "help") showHelp();
+        if (mappedFlag === "help") { await showHelp(); return; }

270-306: DRY analytics: use a small helper to attach source/userAgent only when present.

Replace the inline payload construction with a helper to ensure consistency across all capture sites.

   try {
-    const analyticsProps = {
-      command: CLI_NAME,
-      region: region,
-      "selection-method": "interactive",
-      "user-agent": userAgent,
-    };
-
-    await analytics.capture("create_db:region_selected", analyticsProps);
+    await captureWithSource(
+      "create_db:region_selected",
+      { command: CLI_NAME, region, "selection-method": "interactive" },
+      userAgent
+    );
   } catch (error) {}

Add this helper near the top of the file (after importing analytics):

function captureWithSource(event, props, maybeSource) {
  const payload = maybeSource ? { ...props, "user-agent": maybeSource } : props;
  return analytics.capture(event, payload);
}

530-549: Analytics flags: rely on parsed flags instead of scanning rawArgs.

Scanning rawArgs misses combined short flags (e.g. -ij) and inflates maintenance cost. You already have parsed flags here.

-      const analyticsProps = {
+      const analyticsProps = {
         command: CLI_NAME,
         "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
-        "has-region-flag":
-          rawArgs.includes("--region") || rawArgs.includes("-r"),
-        "has-interactive-flag":
-          rawArgs.includes("--interactive") || rawArgs.includes("-i"),
-        "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"),
-        "has-list-regions-flag": rawArgs.includes("--list-regions"),
-        "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
+        "has-region-flag": Boolean(flags.region),
+        "has-interactive-flag": Boolean(flags.interactive),
+        "has-help-flag": Boolean(flags.help),
+        "has-list-regions-flag": Boolean(flags["list-regions"]),
+        "has-json-flag": Boolean(flags.json),
         "has-user-agent-from-env": !!userAgent,
         "node-version": process.version,
         platform: process.platform,
         arch: process.arch,
         "user-agent": userAgent,
       };

570-582: DRY: region flag analytics should also use captureWithSource helper.

Mirror the interactive path refactor.

-      const analyticsProps = {
-        command: CLI_NAME,
-        region: region,
-        "selection-method": "flag",
-        "user-agent": userAgent,
-      };
-
-      await analytics.capture("create_db:region_selected", analyticsProps);
+      await captureWithSource(
+        "create_db:region_selected",
+        { command: CLI_NAME, region, "selection-method": "flag" },
+        userAgent
+      );

4-5: Remove unused fs/path imports (or switch to node: specifiers) — you shouldn’t read .env manually anymore.

Given dotenv.config() is used and we should rely on process.env (see comment below removing readUserEnvFile), these imports become dead code. If you decide to keep file access for some reason, prefer node: specifiers.

-import fs from "fs";
-import path from "path";
+// (removed; no longer needed once readUserEnvFile is deleted)

63-86: Don’t hand-roll a .env parser; delete readUserEnvFile and rely on dotenv.config()/process.env.

The custom parser mishandles edge cases (quotes, whitespace, export, CRLF, comments). dotenv already hydrated process.env. Remove this function.

-function readUserEnvFile() {
-  const userCwd = process.cwd();
-  const envPath = path.join(userCwd, ".env");
-
-  if (!fs.existsSync(envPath)) {
-    return {};
-  }
-
-  const envContent = fs.readFileSync(envPath, "utf8");
-  const envVars = {};
-
-  envContent.split("\n").forEach((line) => {
-    const trimmed = line.trim();
-    if (trimmed && !trimmed.startsWith("#")) {
-      const [key, ...valueParts] = trimmed.split("=");
-      if (key && valueParts.length > 0) {
-        const value = valueParts.join("=").replace(/^["']|["']$/g, "");
-        envVars[key.trim()] = value.trim();
-      }
-    }
-  });
-
-  return envVars;
-}

416-416: Fix: claim URL emits utm_source=undefined and isn’t URL-encoded.

Fallback and percent-encode UTM values to avoid polluted analytics and broken links.

-  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${userAgent}&utm_medium=cli`;
+  const utmSource = encodeURIComponent(userAgent || CLI_NAME);
+  const utmMedium = encodeURIComponent("cli");
+  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${utmSource}&utm_medium=${utmMedium}`;
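As a quick illustration (the actor/project values below are hypothetical), percent-encoding keeps slashes and spaces in the derived source from corrupting the query string:

```javascript
// Hypothetical source derived from PRISMA_ACTOR_NAME/PRISMA_ACTOR_PROJECT.
const userAgent = "acme tools/my project";
const projectId = "proj_123"; // placeholder project ID
const utmSource = encodeURIComponent(userAgent);
const claimUrl = `https://example.com/claim?projectID=${projectId}&utm_source=${utmSource}&utm_medium=cli`;
console.log(claimUrl);
// -> https://example.com/claim?projectID=proj_123&utm_source=acme%20tools%2Fmy%20project&utm_medium=cli
```

Without encoding, the `/` and spaces would be interpreted as path and delimiter characters by downstream analytics tooling.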

592-592: Naming consistency: variable represents a UTM “source”, not a browser “user agent”.

Consider renaming the param and variable from userAgent → source for clarity across promptForRegion/createDatabase and call sites. Analytics payload keys can remain "utm_source" (body) and a neutral "source" (event property) to avoid confusion.

-          region = await promptForRegion(region, userAgent);
+          region = await promptForRegion(region, source);
...
-        const result = await createDatabase(name, region, userAgent, true);
+        const result = await createDatabase(name, region, source, true);
...
-      region = await promptForRegion(region, userAgent);
+      region = await promptForRegion(region, source);
...
-    await createDatabase(name, region, userAgent);
+    await createDatabase(name, region, source);

Follow-up: rename function parameters and local variables accordingly.

Also applies to: 596-596, 621-621, 626-626


518-528: Single derivation for source/userAgent; don’t re-read .env; PR title claims “--source flag” but it isn’t implemented.

Compute once from flags (if implemented), then fall back to process.env (hydrated by dotenv). Implement --source/-s, or update the PR title.

-    let userAgent;
-    const userEnvVars = readUserEnvFile();
-    if (userEnvVars.PRISMA_ACTOR_NAME && userEnvVars.PRISMA_ACTOR_PROJECT) {
-      userAgent = `${userEnvVars.PRISMA_ACTOR_NAME}/${userEnvVars.PRISMA_ACTOR_PROJECT}`;
-    }
+    // Derive UTM source once: flag overrides env
+    let userAgent = flags.source;
+    if (!userAgent) {
+      const { PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT } = process.env;
+      if (PRISMA_ACTOR_NAME && PRISMA_ACTOR_PROJECT) {
+        userAgent = `${PRISMA_ACTOR_NAME}/${PRISMA_ACTOR_PROJECT}`;
+      }
+    }

Outside this hunk, please implement the --source/-s flag and help text:

@@ async function parseArgs() {
-  const allowedFlags = [
+  const allowedFlags = [
     "region",
     "help",
     "list-regions",
     "interactive",
     "json",
+    "source",
   ];
   const shorthandMap = {
     r: "region",
     i: "interactive",
     h: "help",
     j: "json",
+    s: "source",
   };
@@ long-flag branch
+      } else if (flag === "source") {
+        const src = args[i + 1];
+        if (!src || src.startsWith("-"))
+          exitWithError("Missing value for --source flag.");
+        flags.source = src;
+        i++;
@@ single short flag (-s)
+        } else if (mappedFlag === "source") {
+          const src = args[i + 1];
+          if (!src || src.startsWith("-"))
+            exitWithError("Missing value for -s flag.");
+          flags.source = src;
+          i++;
@@ combined short flags
+        } else if (mappedFlag === "source") {
+          const src = args[i + 1];
+          if (!src || src.startsWith("-"))
+            exitWithError("Missing value for -s flag.");
+          flags.source = src;
+          i++;

And document in help():

-  ${chalk.yellow("--json, -j")}                      Output machine-readable JSON and exit
+  ${chalk.yellow("--source <id>, -s <id>")}          UTM source (overrides env PRISMA_ACTOR_NAME/PRISMA_ACTOR_PROJECT)
+  ${chalk.yellow("--json, -j")}                      Output machine-readable JSON and exit
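The intended precedence can be sketched as a small pure function (names assumed from the diffs above; a flag value beats the env-derived pair, and nothing is emitted when neither is set):

```javascript
// Sketch of the proposed derivation order: --source flag first, then the
// PRISMA_ACTOR_NAME/PRISMA_ACTOR_PROJECT pair from process.env.
function deriveSource(flags, env) {
  if (flags.source) return flags.source;
  if (env.PRISMA_ACTOR_NAME && env.PRISMA_ACTOR_PROJECT) {
    return `${env.PRISMA_ACTOR_NAME}/${env.PRISMA_ACTOR_PROJECT}`;
  }
  return undefined; // callers omit utm_source / analytics keys when unset
}

console.log(deriveSource({ source: "ci-bot" }, { PRISMA_ACTOR_NAME: "acme" })); // "ci-bot"
console.log(deriveSource({}, { PRISMA_ACTOR_NAME: "acme", PRISMA_ACTOR_PROJECT: "app" })); // "acme/app"
console.log(deriveSource({}, { PRISMA_ACTOR_NAME: "acme" })); // undefined (pair incomplete)
```

Keeping the derivation in one place avoids the current pattern of re-reading the .env file at each call site.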
📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 97e97de and f44687c.

📒 Files selected for processing (1)
  • create-db/index.js (15 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
  • analytics (61-61)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (2)
create-db/index.js (2)

308-323: LGTM: create request includes safe utm_source fallback.

Falling back to CLI_NAME prevents sending undefined to the service. Looks correct.


413-414: LGTM: append sslmode=require to direct connection string.

Good default for secure connections; matches typical managed Postgres recommendations.
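For reference, a minimal sketch (connection-string shape assumed, not the repo’s actual helper) of appending the parameter without clobbering an existing query string:

```javascript
// Append sslmode=require, respecting any query string already present.
function withSslMode(connectionString) {
  const sep = connectionString.includes("?") ? "&" : "?";
  return `${connectionString}${sep}sslmode=require`;
}

console.log(withSslMode("postgresql://user:pass@host:5432/db"));
// -> postgresql://user:pass@host:5432/db?sslmode=require
console.log(withSslMode("postgresql://user:pass@host:5432/db?schema=public"));
// -> postgresql://user:pass@host:5432/db?schema=public&sslmode=require
```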

@aidankmcalister aidankmcalister merged commit 5c6f2f6 into main Aug 27, 2025
4 checks passed
@aidankmcalister aidankmcalister deleted the DC-4829-source-flag branch August 27, 2025 15:26
This was referenced Sep 3, 2025
@coderabbitai coderabbitai bot mentioned this pull request Oct 9, 2025