Conversation
…ings-and-issues Fix CLI/template issues; harden auth, migrate/generate flows and realtime
…ckage feat(client): scaffold @betterbase/client TypeScript SDK
|
You have reached your Codex usage limits for code reviews. You can see your limits in the Codex usage dashboard. |
📝 WalkthroughWalkthroughThis PR introduces significant enhancements to the CLI scaffolding, adds a complete TypeScript client package, and extends template support with realtime capabilities. Changes include auth/generation command improvements with validation, pagination and filtering support, a new environment module, SQL parsing refinements, and a production build setup. The new client package provides auth, querying, realtime subscriptions, and error handling abstractions. Changes
Sequence Diagram(s)sequenceDiagram
participant Client as Client App
participant AuthC as AuthClient
participant Server as BetterBase Server
participant DB as Database
Client->>AuthC: signUp({email, password, name})
AuthC->>AuthC: Validate with Zod
AuthC->>Server: POST /auth/signup
Server->>DB: Create user & hash password
DB-->>Server: User created
Server-->>AuthC: {token, user}
AuthC->>AuthC: Store token in storage
AuthC->>Client: Trigger onAuthStateChange
Client-->>Client: Update UI with user
sequenceDiagram
participant Client as Client App
participant QB as QueryBuilder
participant Server as BetterBase Server
participant DB as Database
Client->>QB: from('users').eq('role', 'admin').limit(10).offset(0).execute()
QB->>QB: Build query params (filters, pagination)
QB->>Server: GET /api/users?filters=...&limit=10&offset=0
Server->>Server: Validate filters & pagination
Server->>DB: Query with filters + limit+1
DB-->>Server: Results [item1, item2, ...]
Server-->>QB: {data: [...], pagination: {limit, offset, hasMore}}
QB-->>Client: BetterBaseResponse<User[]>
sequenceDiagram
participant Client as Client App
participant Realtime as RealtimeClient
participant WS as WebSocket
participant Server as BetterBase Server
participant Broadcaster as Event Broadcaster
Client->>Realtime: from('users').on('UPDATE', callback).subscribe()
Realtime->>Realtime: Auto-connect WebSocket
Realtime->>WS: Connect to /ws
WS-->>Server: Upgrade connection
Server->>Server: Authenticate with token
Server-->>WS: Connection established
Realtime->>WS: Send subscribe({table: 'users', events: ['UPDATE']})
WS-->>Server: Register subscription
Broadcaster->>DB: User updated
Broadcaster->>Server: Broadcast update event
Server->>WS: Send {type: 'update', table: 'users', event: 'UPDATE', data}
WS-->>Realtime: Receive message
Realtime->>Realtime: Match filter & event
Realtime->>Client: callback({event: 'UPDATE', data, timestamp})
Estimated code review effort🎯 4 (Complex) | ⏱️ ~75 minutes Possibly related PRs
Suggested labels
🚥 Pre-merge checks | ✅ 1 | ❌ 2❌ Failed checks (1 warning, 1 inconclusive)
✅ Passed checks (1 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings. ✨ Finishing Touches
🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out. Comment |
There was a problem hiding this comment.
Actionable comments posted: 16
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
betterbase/packages/cli/src/commands/generate.ts (1)
101-111:⚠️ Potential issue | 🟠 MajorGenerated code: Filter values are untyped strings — type mismatch on non-string columns and sensitive-column exposure
Two concerns with the filter block in the generated code:
Type mismatch: Filter values come from URL query params as raw strings. Passing a string into
eq()for aninteger,boolean, ordatetimecolumn will work on SQLite (via type affinity) but will silently mismatch or error on Postgres. For example,?active=trueon a boolean column will compare the string"true"against the stored boolean, which won't match expected rows in Postgres.Sensitive column exposure: The implicit allowlist (
key in ${tableName}) permits filtering on every column, including sensitive ones likepassword_hash,api_token, orrefresh_token. A caller can probe?api_token=someValueto confirm existence of a specific token value. The existing comment acknowledges this but the generated code ships with this risk by default.🛡️ Suggested mitigation — coerce filter values using the schema type and apply an explicit column allowlist
Add a coerce step that maps each column's type to the appropriate Zod coercer, and optionally emit an
FILTERABLE_COLUMNSset so operators can tighten it:+// Allowlist: extend with column names you explicitly want to expose as filterable. +const FILTERABLE_COLUMNS = new Set<string>([ + // e.g., 'status', 'category' +]); + if (filters.length > 0) { - // Security note: by default all table columns are filterable... const conditions = filters - .filter(([key]) => key in ${tableName}) - .map(([key, value]) => eq(${tableName}[key as keyof typeof ${tableName}] as never, value as never)); + .filter(([key]) => FILTERABLE_COLUMNS.has(key) && key in ${tableName}) + .map(([key, value]) => eq(${tableName}[key as keyof typeof ${tableName}] as never, value as never));For the type coercion concern, the
schemaTypeToZodhelper ingenerate.tsalready has the right mapping — it can be reused to emit a per-column coerce map into the generated file.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/commands/generate.ts` around lines 101 - 111, The generated filter block currently accepts raw string values and allows filtering on every column (variables: filters, ${tableName}, eq, and, query.where), so you must (1) restrict which columns can be filtered by emitting and checking an explicit FILTERABLE_COLUMNS set and only iterating filters whose key is in that set, and (2) coerce incoming string filter values to the column's schema type before calling eq by reusing the existing schemaTypeToZod mapping (or a per-column coerce map emitted alongside the generated table) to parse/coerce values to number/boolean/Date/etc.; build conditions from the coerced values and only call query = query.where(and(...conditions)) when coercion succeeds.betterbase/templates/base/README.md (1)
12-28:⚠️ Potential issue | 🟡 MinorStructure section is missing
realtime.ts.The new Realtime section (line 45) references
src/lib/realtime.ts, but the directory structure listing here doesn't include it. Update to keep them consistent.📝 Suggested update
lib/ env.ts + realtime.ts index.ts🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/templates/base/README.md` around lines 12 - 28, The README's project structure listing is missing the realtime file referenced later; add "realtime.ts" under the lib entry in the tree (i.e., include src/lib/realtime.ts in the listed structure) so the top-level structure block matches the Realtime section that references src/lib/realtime.ts; ensure the filename "realtime.ts" matches the Realtime section wording exactly.
🧹 Nitpick comments (20)
betterbase/templates/base/.gitignore (1)
3-3: Consider covering.env.*variants to prevent accidental secret commits.Only
.envis ignored. Environment files like.env.local,.env.development, and.env.productionwould be tracked if a user creates them. Since this is a template that will be scaffolded into user projects, broader coverage is safer.🛡️ Proposed addition
dist/ node_modules/ .env +.env.* +!.env.exampleThe
!.env.examplenegation ensures an example/template env file can still be committed alongside the project for documentation purposes.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/templates/base/.gitignore` at line 3, The .gitignore currently only ignores ".env"; update the .gitignore to also ignore all env variants by adding a pattern like ".env.*" while keeping the existing ".env" entry, and add a negation rule "!.env.example" so an example env file can still be committed; modify the template's .gitignore accordingly to include these three patterns (".env", ".env.*", "!.env.example").betterbase/packages/client/tsconfig.test.json (1)
5-5:declarationMap: falseis redundant.
declarationMapis only relevant whendeclaration: true. Since Line 4 already setsdeclaration: false, this option has no effect and can be removed.♻️ Proposed cleanup
"compilerOptions": { "declaration": false, - "declarationMap": false, "noEmit": true },🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/client/tsconfig.test.json` at line 5, The tsconfig includes a redundant "declarationMap" option: since "declaration" is set to false in the same config, "declarationMap" has no effect; remove the "declarationMap" entry from betterbase/packages/client/tsconfig.test.json (i.e., delete the "declarationMap" property) to clean up the config while leaving the existing "declaration": false intact.betterbase/tsconfig.base.json (1)
7-7: Remove"jsx": "react-jsx"from the shared base tsconfig—no package in the monorepo uses React.The client package has no React dependency (only
@types/bunandtypescript), and there are no.tsxor.jsxfiles anywhere in the codebase. Since the setting is inherited by all packages but needed by none, it unnecessarily couples the shared baseline to React JSX semantics.While harmless today, the idiomatic monorepo pattern is to keep the base config JSX-agnostic and add
"jsx": "react-jsx"only to the specific package-leveltsconfig.jsonthat actually uses React (if needed in the future).Suggested fix
Remove from
betterbase/tsconfig.base.json:"target": "ES2022", "module": "ESNext", "moduleResolution": "Bundler", "strict": true, - "jsx": "react-jsx", "esModuleInterop": true,If React is needed in a specific package later, add it to that package's
tsconfig.json:{ "extends": "../../tsconfig.base.json", "compilerOptions": { "jsx": "react-jsx" } }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/tsconfig.base.json` at line 7, Remove the shared JSX setting "jsx": "react-jsx" from the base TS config so the monorepo base remains React-agnostic; delete the "jsx": "react-jsx" entry in betterbase/tsconfig.base.json and, if any package later needs React, add "jsx": "react-jsx" into that package's local tsconfig.json under compilerOptions (extending the base) instead of keeping it in the shared base.betterbase/packages/client/package.json (1)
11-13:enginesdeclares onlynodebut every script requires Bun.
build,dev, andtestall invokebundirectly. A developer running these scripts with plain Node will hit an immediatecommand not foundwith no diagnostic hint. Declaring a minimum Bun version makes the requirement explicit and lets package managers surface a clear error.🔧 Proposed fix
"engines": { - "node": ">=18" + "node": ">=18", + "bun": ">=1.0.0" },🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/client/package.json` around lines 11 - 13, The package.json currently lists only the Node engine but all npm scripts ("build", "dev", "test") invoke the Bun CLI; update the "engines" field in package.json to declare Bun as required (e.g., add "bun": ">=1.0.0" alongside "node": ">=18") so package managers surface a clear error when Bun is missing and developers are guided to install the correct Bun version before running the build/dev/test scripts.betterbase/packages/cli/src/commands/auth.ts (1)
72-75: Thetypeofcheck is redundant but harmless.
!createdUseralready guards againstundefinedandnull(since!null === true). Thetypeof createdUser !== 'object'branch is unreachable for values returned by Drizzle's.returning(). Not a bug, just a minor observation.♻️ Simplified check
const createdUser = created[0]; - if (!createdUser || typeof createdUser !== 'object') { + if (!createdUser) { return c.json({ error: 'Failed to create user record' }, 500); }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/commands/auth.ts` around lines 72 - 75, The null/undefined guard on createdUser is redundant with the typeof check; simplify the conditional in the auth creation flow by replacing "if (!createdUser || typeof createdUser !== 'object')" with a single existence check like "if (!createdUser)" used after you assign createdUser = created[0]; update the error-return branch that calls c.json({ error: 'Failed to create user record' }, 500) to remain the same so behavior is unchanged; this touches the created, createdUser variable handling in the function that performs the DB .returning() call and the c.json error-return path.betterbase/packages/cli/src/commands/init.ts (3)
593-614: POST handler validates but does not persist — ensure the TODO is tracked.The
POST /api/usersroute validates the request body but returns a canned response without inserting into the database (line 598). The TODO comment is there, but this is a gap in the generated template's usefulness.Would you like me to generate the
db.insert(users).values(parsed)implementation, or open an issue to track this?🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/commands/init.ts` around lines 593 - 614, The POST /api/users handler currently validates the payload in usersRoute.post but does not persist it; implement persistence by inserting the validated object into your DB (e.g., call db.insert(users).values(parsed) or use a UsersService.saveUser(parsed)) before returning the response, handle insertion errors similarly to existing error branches, and return the persisted record or its id instead of the canned "not persisted" message; update the TODO and ensure you import/instantiate the db or UsersService used by usersRoute.post and keep createUserSchema as the validation step.
538-552:parseNonNegativeIntsilently swallowsNaNfrom non-numeric strings — acceptable per the documented intent, but noteNumber("")is0.
Number("")returns0, which passesNumber.isInteger(0) && 0 >= 0, so an empty-string query param (e.g.,?limit=) would yield0instead of the fallback. This is already guarded on line 542 (if (!value)catchesundefined), but an explicit empty string""fromc.req.query('limit')would pass through toNumber("")→0.In practice this is minor because Hono's
c.req.query()returnsundefinedfor absent params and""for?limit=(empty value), solimit=0triggers the early-return empty-array branch on line 559. No functional bug, just a subtle edge-case worth being aware of.Optional: treat empty strings as missing
function parseNonNegativeInt(value: string | undefined, fallback: number): number { - if (!value) { + if (!value || value.trim() === '') { return fallback; }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/commands/init.ts` around lines 538 - 552, The helper parseNonNegativeInt silently treats an explicit empty string ("") as 0 because Number("") === 0; update parseNonNegativeInt to treat empty strings as missing by returning the fallback when value is "" or value.trim() === "", before attempting Number(value), so that calls to parseNonNegativeInt (e.g., in init.ts where c.req.query('limit') is passed) will yield the fallback for ?limit= instead of 0.
104-130:drizzle.config.tsuses rawprocess.env.DB_PATHwhile app code usesenvmodule — intentional but worth a comment.The generated
drizzle.config.tsreadsprocess.env.DB_PATHdirectly (line 112), while the runtime code (src/db/index.ts,src/db/migrate.ts) goes through the Zod-validatedenvmodule. This divergence is correct because drizzle-kit loads the config independently, but it means theDEFAULT_DB_PATHfallback is duplicated (hardcoded'file:local.db'here vs'local.db'inenv.ts).Consider extracting the default into a shared constant or at minimum leaving a comment in the generated config pointing to
env.tsas the source of truth, to prevent future drift.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/commands/init.ts` around lines 104 - 130, The generated drizzle.config.ts currently hardcodes a DB fallback (in buildDrizzleConfig) that can drift from the app's env.ts DEFAULT_DB_PATH; update buildDrizzleConfig to either (preferred) import/use a shared constant for the default DB path or (simpler) change the generated template to include a clear inline comment in the returned drizzle.config.ts pointing to src/env.ts (and ensure the fallback string matches DEFAULT_DB_PATH, e.g. 'local.db' vs 'file:local.db' as used by the app); modify the databaseUrl mapping in buildDrizzleConfig and the template generation so the fallback value and the added comment are emitted into the generated file (referencing the buildDrizzleConfig function and the produced drizzle.config.ts).betterbase/packages/cli/src/commands/generate.ts (2)
75-82: Generated code: HoistpaginationSchemaand constants out of the request handler
DEFAULT_LIMIT,MAX_LIMIT,DEFAULT_OFFSET, andpaginationSchemaare re-created on every incoming GET request because they live inside theasync (c) => { ... }closure. These are static definitions; they belong at module scope in the generated route file.♻️ Proposed fix — move to module scope in the generated template
Move these lines before the route registration:
+const DEFAULT_LIMIT = 50; +const MAX_LIMIT = 100; +const DEFAULT_OFFSET = 0; + +const paginationSchema = z.object({ + limit: z.coerce.number().int().nonnegative().default(DEFAULT_LIMIT), + offset: z.coerce.number().int().nonnegative().default(DEFAULT_OFFSET), +}); ${tableName}Route.get('/', async (c) => { - const DEFAULT_LIMIT = 50; - const MAX_LIMIT = 100; - const DEFAULT_OFFSET = 0; - - const paginationSchema = z.object({ - limit: z.coerce.number().int().nonnegative().default(DEFAULT_LIMIT), - offset: z.coerce.number().int().nonnegative().default(DEFAULT_OFFSET), - });🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/commands/generate.ts` around lines 75 - 82, The constants DEFAULT_LIMIT, MAX_LIMIT, DEFAULT_OFFSET and the zod schema paginationSchema are mistakenly defined inside the async request handler and thus re-created per request; move their declarations out of the async (c) => { ... } closure to module scope in the generated route file so they are initialized once at load time, ensuring the handler uses the module-scoped DEFAULT_LIMIT, MAX_LIMIT, DEFAULT_OFFSET and paginationSchema symbols instead of re-declaring them on every request.
242-258: Renameprocessto avoid shadowing the globalprocessobject
const process = Bun.spawn(...)shadows the globalprocess(env, argv, cwd, etc.) for the rest of the function. It works correctly today but is a silent hazard for future edits.♻️ Rename to `child` or `proc`
- const process = Bun.spawn(['bun', 'add', '@hono/zod-validator'], { + const child = Bun.spawn(['bun', 'add', '@hono/zod-validator'], { cwd: projectRoot, stdout: 'pipe', stderr: 'pipe', }); const [exitCode, stdout, stderr] = await Promise.all([ - process.exited, - new Response(process.stdout).text(), - new Response(process.stderr).text(), + child.exited, + new Response(child.stdout).text(), + new Response(child.stderr).text(), ]);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/commands/generate.ts` around lines 242 - 258, The local variable named process in the Bun.spawn call shadows the global process object; rename it to something like child or proc (e.g., const child = Bun.spawn(...)) and update all usages (process.exited, process.stdout, process.stderr) to the new identifier (child.exited, child.stdout, child.stderr) so the global process remains accessible and no shadowing occurs.betterbase/packages/client/src/build.ts (1)
17-19: Improve error diagnostic logging to match CLI build conventions.The CLI build script (
packages/cli/src/build.ts) mapsresult.logsto strings and joins them into a readable diagnostic message. Here, the raw logs array is passed toconsole.error, which may produce[object Object]output. Consider formatting the logs for readability.♻️ Suggested improvement
if (!esmResult.success) { - console.error('ESM build failed:', esmResult.logs); + const diagnostics = esmResult.logs.map((log) => (typeof log === 'string' ? log : JSON.stringify(log))).join('\n'); + console.error(`ESM build failed with ${esmResult.logs.length} error(s):\n${diagnostics}`); process.exit(1); }Apply the same pattern to the CJS error block (lines 32-34).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/client/src/build.ts` around lines 17 - 19, The error logging for the ESM build uses raw logs (esmResult.logs) which can render as [object Object]; update the ESM failure branch that checks esmResult.success to map each log entry to a string (like the CLI's result.logs -> strings mapping) and join them into a single diagnostic message passed to console.error, and apply the identical formatting change to the CJS failure branch (the cjsResult.logs handling) so both failure messages produce human-readable joined diagnostics.betterbase/packages/client/test/client.test.ts (1)
4-23: Good smoke tests for the public API surface.These verify that the client factory wires up
auth,realtime, andfrom()correctly. Consider adding behavioral tests in a follow-up — e.g., verifying that query builder methods chain correctly andexecute()constructs the expected fetch request (using a mock server or intercepted fetch).🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/client/test/client.test.ts` around lines 4 - 23, Add behavioral tests that assert query builder chaining and the network request built by execute(): use createClient to get client and client.from('users') to build a query (referencing from, select, eq methods), chain methods (e.g., .select(...).eq(...)) and call execute(), then intercept global fetch (or use msw) to assert the outgoing request URL, method, headers (including key when provided) and body/params; ensure tests cover both chained invocations and a simple execute() call so the fetch request shape and authentication behavior are verified.betterbase/packages/client/tsconfig.json (1)
1-12: Confirm thatstrict: trueis already inherited fromtsconfig.base.json.The base config enables strict mode, so this tsconfig is compliant with the TypeScript requirements. No changes needed here.
Optionally, consider moving
src/build.tsoutsidesrc/(to ascripts/ortools/directory) for cleaner separation between runtime library code and build tooling. This avoids having Bun-specific tooling mixed with the client library source, even though types don't leak into the published package.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/client/tsconfig.json` around lines 1 - 12, Confirm that "strict": true is inherited by verifying tsconfig.json extends ../../tsconfig.base.json (ensure tsconfig.base.json contains "compilerOptions.strict": true) and leave this package tsconfig as-is; optionally, move the Bun-specific build script src/build.ts out of src/ into a scripts/ or tools/ directory (and update any imports or build references) to separate build tooling from the client library code.betterbase/packages/client/src/auth.ts (2)
21-32: Storage is hardcoded tolocalStorage— no injection point for SSR or non-browser environments.The
getStorage()helper only checks forglobalThis.localStorage. In SSR or Node.js environments (common for testing or server-rendered apps), storage silently becomesnull, so auth state won't persist across calls. Consider accepting an optionalstorageadapter in theAuthClientconstructor (or inBetterBaseConfig) to support custom backends (e.g., cookie-based, in-memory, or secure storage).🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/client/src/auth.ts` around lines 21 - 32, getStorage() currently always reads globalThis.localStorage which breaks SSR/testing and offers no injection point; update the auth client to accept an optional storage adapter and fall back to existing behavior. Add a storage parameter to the AuthClient constructor (or a storage field on BetterBaseConfig) and use that adapter inside AuthClient methods instead of calling getStorage() directly; keep getStorage() as a fallback implementation that returns globalThis.localStorage when available. Ensure the adapter implements the same Storage interface (getItem/setItem/removeItem) so existing code using AuthClient (e.g., signIn/signOut/session handling) continues to work seamlessly in browser, SSR, or custom backends.
42-64: No Zod validation oncredentialsbefore sending to the API.Per coding guidelines, Zod validation should be used everywhere for type safety. Validating
password(non-empty, minimum length) client-side provides faster feedback and prevents unnecessary network round-trips. The same applies tosignInon Line 66.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/client/src/auth.ts` around lines 42 - 64, Add Zod validation for AuthCredentials in signUp (and mirror in signIn) by defining a Zod schema for { email: string().email(), password: string().min(8) } and validating the credentials parameter at the start of signUp (and signIn); if validation fails, return early with data: null and an appropriate error (e.g., AuthError or a ValidationError wrapper) without calling this.fetchImpl, otherwise proceed as before, and include the parsed/validated values when building the request body so the API only receives validated data.betterbase/packages/client/src/realtime.ts (1)
107-120: DoublescheduleReconnectfromonerror+oncloseis safe butonerrornullswsprematurely.When
onerrorfires, it setsthis.ws = null(Line 109) and schedules reconnect. The subsequentoncloseevent will also fire, setthis.ws = nullagain (harmless), and callscheduleReconnect()again (guarded byreconnectTimeoutcheck — also harmless). This works correctly, but settingthis.ws = nullinonerrormeans the socket isn't explicitly closed. Consider callingthis.ws?.close()before nulling.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/client/src/realtime.ts` around lines 107 - 120, In the WebSocket onerror handler, ensure the socket is explicitly closed before you null it and schedule reconnects: call this.ws?.close() (if present) at the start of the error handler, then clear any existing reconnectTimeout (this.reconnectTimeout) and set this.ws = null, and finally call this.scheduleReconnect(); update the onerror callback where this.ws is referenced so the socket is closed explicitly rather than just nulled (symbols: this.ws, this.reconnectTimeout, this.scheduleReconnect, onerror).betterbase/templates/base/src/lib/realtime.ts (1)
41-60: CustomdeepEqual— consider using a well-tested utility likefast-deep-equal.The implementation handles common cases but misses
Date,Map,Set,RegExp, etc. For filter matching with simple values this is fine, but a one-liner import offast-deep-equal(2.5KB, no deps) would be more robust and less code to maintain.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/templates/base/src/lib/realtime.ts` around lines 41 - 60, The custom deepEqual implementation in function deepEqual is incomplete for types like Date/Map/Set/RegExp; replace it with a well-tested utility by importing and using fast-deep-equal instead: remove or replace the local deepEqual function and import deepEqual from 'fast-deep-equal' (or default-exported name your codebase expects) and use that exported function wherever deepEqual(...) is called in this module (refer to the current deepEqual function name to locate usages); add the package to dependencies if not present.betterbase/templates/base/src/index.ts (1)
11-28: WebSocket token auth via query param exposes token in server logs and browser history.The fallback to
c.req.query('token')(Line 13) is a common pattern for WebSocket connections (since browsers can't set headers on the WS handshake), but be aware that query-string tokens can leak into access logs, proxy logs, and Referer headers. Consider documenting this trade-off or recommending short-lived tokens for the query param path.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/templates/base/src/index.ts` around lines 11 - 28, WebSocket token fallback currently reads c.req.query('token') into queryToken and uses it as token in upgradeWebSocket (authHeaderToken/queryToken/token) which can leak credentials via access logs and browser history; update the upgradeWebSocket handler to avoid exposing the raw query token: prefer the Authorization header, only accept query param as a documented fallback for short-lived tokens, ensure you never log the token (remove any logging of token value) and add an inline comment near upgradeWebSocket explaining the trade-off and recommending short-lived tokens or alternate transports (cookie/subprotocol) for production; also consider emitting a warning (without the token value) when falling back to the query param so ops are aware.betterbase/packages/client/src/query-builder.ts (1)
4-14:QueryBuilderis stateful and single-use — document or guard against reuse.Filters, options, and
selectFieldsaccumulate across chained calls and persist afterexecute(). A second call toexecute()on the same builder instance will replay all prior filters. This is a common footgun. Either reset state after execution, return a new builder fromfrom()each time, or document that builders are not reusable.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/client/src/query-builder.ts` around lines 4 - 14, QueryBuilder is stateful and reuse is a footgun: add a private flag (e.g. private executed = false) on the QueryBuilder class, set executed = true at the start or end of execute(), and check this flag in mutating methods (any methods that modify filters, options, selectFields and the from() builder factory) to throw a clear error like "QueryBuilder instances are single-use; create a new one via from()". Update execute(), any filter/option/select setters, and from() to reference this.executed so once execute() has run the instance refuses further mutations or execute() calls.betterbase/packages/client/src/client.ts (1)
13-40: Missing Zod validation onBetterBaseConfiginput.The constructor accepts
configwithout runtime validation. As per coding guidelines, Zod validation should be implemented everywhere for type safety. Validate at leasturl(non-empty, valid URL format) andkey(non-empty string if provided) at the public entry point.♻️ Proposed validation
import type { BetterBaseConfig } from './types'; +import { z } from 'zod'; import { QueryBuilder } from './query-builder'; import { AuthClient } from './auth'; import { RealtimeClient } from './realtime'; +const configSchema = z.object({ + url: z.string().url(), + key: z.string().min(1).optional(), + schema: z.string().optional(), +}); + export class BetterBaseClient { ... constructor(config: BetterBaseConfig) { + const validated = configSchema.parse(config); - this.url = config.url.replace(/\/$/, ''); + this.url = validated.url.replace(/\/$/, '');🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/client/src/client.ts` around lines 13 - 40, Add runtime Zod validation for the BetterBaseConfig passed into the constructor: define a zod schema (e.g., BetterBaseConfigSchema) that requires url as a non-empty string matching a URL pattern (or z.string().url()), optional key as a non-empty string if present, and optional fetch as a function; import zod at top, run schema.parse or schema.safeParse at the start of the constructor, throw or surface validation errors, then use the parsed values (trim trailing slash from parsed.url before assigning this.url) when constructing AuthClient and RealtimeClient so invalid configs are rejected early.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@betterbase/packages/cli/package.json`:
- Line 20: Move the "typescript" entry out of the "dependencies" object and
place it under "devDependencies" in package.json; specifically locate the
"typescript": "^5.8.0" key in the "dependencies" section and remove it, then add
the same "typescript": "^5.8.0" key to the "devDependencies" section so tsc is
only installed for build/typecheck and not as a runtime dependency.
In `@betterbase/packages/cli/src/commands/auth.ts`:
- Around line 274-282: The tokenizer in auth.ts incorrectly treats two adjacent
single quotes as an SQL-style escape and skips i forward (the block that checks
if inSingle && next === "'" using variables inSingle, inDouble, inBacktick, ch,
next and i); remove that SQL-style escape branch so single-quote adjacency is
not treated specially for TypeScript parsing and rely on the existing
backslash-escape handling already implemented around lines 268-272 to handle
'\'' sequences—i.e., delete the if (inSingle && next === "'") { i += 2;
continue; } path (or the whole surrounding conditional) and keep the rest of the
single-quote open/close flip logic intact.
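The resulting tokenizer behavior (backslash escapes only, no SQL-style `''` doubling) can be sketched as follows; `topLevelSemicolons` is a hypothetical helper, not the CLI's actual function, but it shows the quote-flip logic the comment wants to keep.

```typescript
// Quote tracking for TypeScript source: only backslash escapes matter
// inside strings. Returns positions of semicolons outside any string.
function topLevelSemicolons(src: string): number[] {
  const out: number[] = [];
  let inSingle = false;
  let inDouble = false;
  let inBacktick = false;
  for (let i = 0; i < src.length; i++) {
    const ch = src[i];
    // A backslash escapes the next character inside any string literal.
    if ((inSingle || inDouble || inBacktick) && ch === '\\') {
      i += 1;
      continue;
    }
    if (ch === "'" && !inDouble && !inBacktick) inSingle = !inSingle;
    else if (ch === '"' && !inSingle && !inBacktick) inDouble = !inDouble;
    else if (ch === '`' && !inSingle && !inDouble) inBacktick = !inBacktick;
    else if (ch === ';' && !inSingle && !inDouble && !inBacktick) out.push(i);
  }
  return out;
}
```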
In `@betterbase/packages/cli/src/commands/generate.ts`:
- Around line 218-239: The function ensureZodValidatorInstalled currently only
checks projectRoot/node_modules and the local package.json, missing hoisted
workspace installations; update it to walk up the directory tree from
projectRoot towards the filesystem/workspace root and at each ancestor check for
node_modules/@hono/zod-validator existence (the modulePath logic) and also read
that ancestor's package.json to look for '@hono/zod-validator' in dependencies,
devDependencies, or peerDependencies; stop the walk as soon as you find the
package (and return) or reach the filesystem root/workspace root so you avoid
unnecessary bun add installs.
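The walk-up order can be isolated as a pure helper; a minimal sketch where `ancestorDirs` is an illustrative name and the real version would check `node_modules/@hono/zod-validator` and the ancestor's `package.json` at each step, returning early on a hit.

```typescript
import * as path from 'node:path';

// List projectRoot and each ancestor up to the filesystem root, in the
// order the hoisted-dependency check should visit them.
function ancestorDirs(projectRoot: string): string[] {
  const dirs: string[] = [];
  let current = path.resolve(projectRoot);
  while (true) {
    dirs.push(current);
    const parent = path.dirname(current);
    if (parent === current) break; // reached the filesystem root
    current = parent;
  }
  return dirs;
}
```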
In `@betterbase/packages/cli/src/commands/migrate.ts`:
- Around line 396-398: The log message currently emitted by logger.info before
calling runDrizzleKit(['push']) is misleading because it reads as if the push
already completed; update the message so it reflects that the push is about to
run or move the confirmation message until after runDrizzleKit resolves
successfully. Locate the logger.info call and either change the text (e.g.,
"drizzle/ files are for preview; running push will apply changes.") or emit a
second logger.info after runDrizzleKit(['push']) that confirms success (e.g.,
"drizzle-kit push completed; changes applied.") and ensure any failure path logs
an appropriate error.
- Around line 266-339: The current SQL-splitting loop in migrate.ts (the
variables/statements/current/inSingle/inDouble/inBacktick logic and the
escapeNext handling) doesn't handle line comments (-- ...) or block comments (/*
... */) and incorrectly treats backslash as a string escape; add two boolean
flags (e.g., inLineComment, inBlockComment) and update the loop to: when not in
any quote, detect "--" and set inLineComment until you hit a newline (skip
characters and don't treat ';' as a delimiter), detect "/*" and set
inBlockComment until you see "*/" (again skipping semicolons inside); when
inLineComment ignore everything until '\n' and then clear inLineComment, when
inBlockComment scan for "*/" and clear it; remove or disable the existing
backslash-escape behavior (escapeNext and the ch === '\\' branch) for SQLite
string handling so backslashes are not treated as escapes inside
inSingle/inDouble/inBacktick, and ensure semicolon handling (ch === ';' &&
!inSingle && !inDouble && !inBacktick && !inLineComment && !inBlockComment) only
splits when not inside any quote or comment.
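The comment-aware splitting described above can be sketched end to end; a simplified version assuming SQLite semantics (quotes are doubled rather than backslash-escaped), with `splitSqlStatements` as an illustrative name. Comments are kept attached to the statement they precede.

```typescript
// Split SQL on top-level semicolons, honoring strings, -- line comments,
// and /* */ block comments. No backslash-escape handling (SQLite rules).
function splitSqlStatements(sql: string): string[] {
  const statements: string[] = [];
  let current = '';
  let inSingle = false, inDouble = false, inBacktick = false;
  let inLineComment = false, inBlockComment = false;
  for (let i = 0; i < sql.length; i++) {
    const ch = sql[i];
    const next = sql[i + 1];
    if (inLineComment) {
      current += ch;
      if (ch === '\n') inLineComment = false;
      continue;
    }
    if (inBlockComment) {
      current += ch;
      if (ch === '*' && next === '/') { current += next; i++; inBlockComment = false; }
      continue;
    }
    if (!inSingle && !inDouble && !inBacktick) {
      if (ch === '-' && next === '-') { inLineComment = true; current += ch; continue; }
      if (ch === '/' && next === '*') { inBlockComment = true; current += ch; continue; }
      if (ch === ';') {
        if (current.trim()) statements.push(current.trim());
        current = '';
        continue;
      }
    }
    if (ch === "'" && !inDouble && !inBacktick) inSingle = !inSingle;
    else if (ch === '"' && !inSingle && !inBacktick) inDouble = !inDouble;
    else if (ch === '`' && !inSingle && !inDouble) inBacktick = !inBacktick;
    current += ch;
  }
  if (current.trim()) statements.push(current.trim());
  return statements;
}
```

Note that SQLite's doubled-quote escape (`'it''s'`) falls out of the plain toggle: the closing and reopening quotes cancel, so a semicolon between them still counts as inside a string.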
In `@betterbase/packages/cli/src/index.ts`:
- Around line 36-58: The onExit handler currently calls cleanup() directly which
can throw and prevent removal of listeners; modify onExit (and any paths that
call it like onSigInt, onSigTerm, onProcessExit) to ensure cleanup errors are
caught and listeners are always removed—either wrap cleanup() in a try/catch and
swallow/log the error, or add a boolean guard (e.g., cleanedUp) so cleanup()
runs only once and listener removal always executes; update references in the
same module (functions onExit, onSigInt, onSigTerm, onProcessExit and the call
site after runDevCommand) to use the new safe pattern.
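The once-only guard pattern can be sketched generically; `makeSafeExit` and its callbacks are placeholders for the CLI's actual `cleanup()` and listener removal, and only the guard-plus-finally shape is the point.

```typescript
// Cleanup runs at most once, errors are swallowed (after logging), and
// listener removal always executes, even when cleanup throws.
function makeSafeExit(cleanup: () => void, removeListeners: () => void) {
  let cleanedUp = false;
  return function onExit(): void {
    if (cleanedUp) return;
    cleanedUp = true;
    try {
      cleanup();
    } catch (error) {
      // A failing cleanup must not leave process listeners behind.
      console.error('cleanup failed:', error);
    } finally {
      removeListeners();
    }
  };
}
```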
In `@betterbase/packages/cli/src/utils/context-generator.ts`:
- Line 47: The success message currently includes a redundant emoji before the
text in the call to logger.success (the call that prints "✅ Generated
${outputPath}"); remove the inline "✅" so the call becomes
logger.success("Generated ${outputPath}") (or the equivalent template string
without the emoji) because logger.success already prepends its own checkmark;
update the single invocation in context-generator.ts where logger.success is
used to avoid double success indicators.
In `@betterbase/packages/cli/src/utils/scanner.ts`:
- Around line 98-100: The fallback from SQL table name to the variable name
doesn't work because tableObj.name is always a string, so the nullish coalescing
in the assignment to tables[tableKey] never triggers; change the operator so
empty strings also fall back: when computing tableKey (where tableObj is
produced by parseTable and declaration.name.text is the variable name), use a
logical OR fallback (e.g., use tableObj.name || declaration.name.text) so an
empty tableObj.name will correctly use declaration.name.text before assigning
into tables.
In `@betterbase/packages/client/package.json`:
- Around line 25-32: The scripts are misconfigured: ensure "typecheck:test"
includes the --noEmit flag (matching "typecheck") to avoid emitting built
artifacts, and change "lint" from running tsc to invoking your linter (e.g.,
ESLint/Biome) or swap the commands if the flags were accidentally reversed;
update the "typecheck", "lint", and "typecheck:test" script entries so
"typecheck" and "typecheck:test" both use --noEmit and "lint" runs the actual
linter command you use in the repo (or add a separate lint script if needed).
- Around line 7-10: Update the repository.url field in package.json to point to
the actual GitHub repository for this PR (replace
"https://github.com/betterbase/betterbase.git" with the correct URL for
weroperking/Betterbase, e.g. "https://github.com/weroperking/Betterbase.git");
edit the "repository" object in betterbase/packages/client/package.json so the
"url" property reflects the real repo, commit the change and ensure package.json
remains valid JSON.
In `@betterbase/packages/client/src/client.ts`:
- Line 34: BetterBaseClient constructs RealtimeClient without passing the auth
token; update RealtimeClient to accept and store the token (via constructor
param and a setToken(token: string) method) and modify its connect() to append
the token as a ?token=... query parameter when building the WebSocket URL; then
change BetterBaseClient to pass the current auth token into new
RealtimeClient(...) and call realtime.setToken(...) whenever auth state changes
so the live connection or subsequent reconnects include the token.
In `@betterbase/packages/client/src/errors.ts`:
- Around line 1-29: The base class sets this.name to the literal
'BetterBaseError', so all subclasses (NetworkError, AuthError, ValidationError)
report the wrong name; update BetterBaseError's constructor to assign this.name
= this.constructor.name (instead of the hardcoded string) so subclass instances
automatically get the correct name (e.g., 'AuthError'), keeping the rest of the
constructor and parameters unchanged.
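The fix can be sketched with a trimmed class hierarchy; the `status` field is an assumption, and only the `this.constructor.name` assignment is the point.

```typescript
// Deriving `name` from the constructor means every subclass reports
// itself correctly without repeating the assignment.
class BetterBaseError extends Error {
  constructor(message: string, public readonly status?: number) {
    super(message);
    this.name = this.constructor.name; // 'AuthError' for subclasses, not 'BetterBaseError'
  }
}

class AuthError extends BetterBaseError {}
class NetworkError extends BetterBaseError {}
```

One caveat: `constructor.name` is reliable on modern targets, but minifiers that mangle class names will change the reported value, so keep class names in the minifier's reserved list if error names are load-bearing.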
In `@betterbase/packages/client/src/query-builder.ts`:
- Around line 108-109: The naive singularization in query-builder.ts (computing
singularKey from this.table) fails for irregular plurals; change the logic in
the QueryBuilder where singularKey is computed to either (a) use a provided
override option (e.g., accept a singularKey/singletonName in the QueryBuilder
constructor/options and use that when present) or (b) use a robust pluralization
library (e.g., call pluralize.singular(this.table)) instead of slicing off a
trailing "s", and keep the existing fallback to result.data; update any places
referencing singularKey and document the new option if you add one.
- Around line 26-29: The in() method currently stores arrays as plain values
under filters with keys like `${column}_in`, which the server does not parse;
update in() in query-builder.ts to serialize arrays using JSON.stringify(values)
and ensure the server route parsing logic is updated to JSON.parse query values
for keys ending with `_in` (or alternately add server-side parsing for the `_in`
convention) so the client/server contract matches; additionally add Zod
validation inside in() (e.g. validate `values` with a schema like
z.array(z.unknown())) and validate other public QueryBuilder inputs to enforce
types before serializing.
In `@betterbase/packages/client/src/realtime.ts`:
- Around line 61-66: The connect() method currently throws when WebSocket is
undefined, which makes from(...).on(...).subscribe() throw synchronously; change
connect() to not throw — instead detect typeof WebSocket === 'undefined', log a
warning, set an internal "disabled" or "connected" flag and return early so
callers still get a valid RealtimeSubscription object; update from(),
subscribe(), and RealtimeSubscription behavior to return an inert/no-op
subscription when that flag is set (no socket ops, no throws, handlers ignored
or queued) so subscribe() is always safe in non-browser/SSR environments.
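The no-throw fallback can be sketched as follows; class and interface names are illustrative, and the real connection/subscription wiring is elided.

```typescript
interface RealtimeSubscription {
  unsubscribe(): void;
}

class RealtimeSketch {
  private disabled = false;

  private connect(): void {
    // Detect missing WebSocket (SSR, some test runtimes) without throwing.
    if (typeof (globalThis as { WebSocket?: unknown }).WebSocket === 'undefined') {
      console.warn('Realtime disabled: WebSocket is not available in this environment');
      this.disabled = true;
      return;
    }
    // ...real connection logic would go here...
  }

  subscribe(_handler: (event: unknown) => void): RealtimeSubscription {
    this.connect();
    if (this.disabled) {
      // Inert subscription: no socket ops, handlers ignored, never throws.
      return { unsubscribe() { /* no-op */ } };
    }
    // ...real subscription wiring would go here...
    return { unsubscribe() { /* close the socket channel */ } };
  }
}
```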
In `@betterbase/templates/base/src/lib/realtime.ts`:
- Around line 76-86: The authenticate(token) placeholder accepts fabricated
tokens and must be disabled in production: modify the authenticate function to
only allow this mock parsing when NODE_ENV === 'development' (or when an
explicit feature flag like ENABLE_DEV_AUTH is true), and in all other
environments return null or require a configured real verifier; additionally add
a startup-time loud logger/error if no real auth verifier is configured (e.g.,
during app init check for a configured auth provider and log.error or throw if
missing) so the dev-only path cannot be accidentally shipped enabled.
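The gating logic can be sketched as a small factory; `makeAuthenticate`, the env flags, and the `user:<id>` mock format are assumptions used only to illustrate the dev-only gate and the loud startup failure.

```typescript
type Verifier = (token: string) => { userId: string } | null;

function makeAuthenticate(
  env: { NODE_ENV?: string; ENABLE_DEV_AUTH?: string },
  realVerifier?: Verifier,
): Verifier {
  const devAllowed = env.NODE_ENV === 'development' || env.ENABLE_DEV_AUTH === 'true';
  if (!realVerifier && !devAllowed) {
    // Fail at startup so the mock path cannot ship enabled by accident.
    throw new Error('No auth verifier configured and dev auth is disabled');
  }
  return (token) => {
    if (realVerifier) return realVerifier(token);
    // Dev-only mock: accept tokens of the hypothetical form "user:<id>".
    return token.startsWith('user:') ? { userId: token.slice(5) } : null;
  };
}
```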
---
Outside diff comments:
In `@betterbase/packages/cli/src/commands/generate.ts`:
- Around line 101-111: The generated filter block currently accepts raw string
values and allows filtering on every column (variables: filters, ${tableName},
eq, and, query.where), so you must (1) restrict which columns can be filtered by
emitting and checking an explicit FILTERABLE_COLUMNS set and only iterating
filters whose key is in that set, and (2) coerce incoming string filter values
to the column's schema type before calling eq by reusing the existing
schemaTypeToZod mapping (or a per-column coerce map emitted alongside the
generated table) to parse/coerce values to number/boolean/Date/etc.; build
conditions from the coerced values and only call query =
query.where(and(...conditions)) when coercion succeeds.
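The allow-list plus coercion idea can be sketched without Drizzle; column names and types here are illustrative, and the real generated route would feed the coerced values into `eq`/`and` conditions.

```typescript
// Only columns in the allow-list are filterable; raw query-string values
// are coerced to the column's declared type before use.
const FILTERABLE_COLUMNS: Record<string, 'string' | 'number' | 'boolean'> = {
  name: 'string',
  age: 'number',
  active: 'boolean',
};

function coerceFilters(raw: Record<string, string>): Record<string, string | number | boolean> {
  const out: Record<string, string | number | boolean> = {};
  for (const [key, value] of Object.entries(raw)) {
    const type = FILTERABLE_COLUMNS[key];
    if (!type) continue; // reject columns outside the allow-list
    if (type === 'number') {
      const n = Number(value);
      if (!Number.isNaN(n)) out[key] = n; // drop values that fail coercion
    } else if (type === 'boolean') {
      if (value === 'true' || value === 'false') out[key] = value === 'true';
    } else {
      out[key] = value;
    }
  }
  return out;
}
```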
In `@betterbase/templates/base/README.md`:
- Around line 12-28: The README's project structure listing is missing the
realtime file referenced later; add "realtime.ts" under the lib entry in the
tree (i.e., include src/lib/realtime.ts in the listed structure) so the
top-level structure block matches the Realtime section that references
src/lib/realtime.ts; ensure the filename "realtime.ts" matches the Realtime
section wording exactly.
---
Nitpick comments:
In `@betterbase/packages/cli/src/commands/auth.ts`:
- Around line 72-75: The null/undefined guard on createdUser is redundant with
the typeof check; simplify the conditional in the auth creation flow by
replacing "if (!createdUser || typeof createdUser !== 'object')" with a single
existence check like "if (!createdUser)" used after you assign createdUser =
created[0]; update the error-return branch that calls c.json({ error: 'Failed to
create user record' }, 500) to remain the same so behavior is unchanged; this
touches the created, createdUser variable handling in the function that performs
the DB .returning() call and the c.json error-return path.
In `@betterbase/packages/cli/src/commands/generate.ts`:
- Around line 75-82: The constants DEFAULT_LIMIT, MAX_LIMIT, DEFAULT_OFFSET and
the zod schema paginationSchema are mistakenly defined inside the async request
handler and thus re-created per request; move their declarations out of the
async (c) => { ... } closure to module scope in the generated route file so they
are initialized once at load time, ensuring the handler uses the module-scoped
DEFAULT_LIMIT, MAX_LIMIT, DEFAULT_OFFSET and paginationSchema symbols instead of
re-declaring them on every request.
- Around line 242-258: The local variable named process in the Bun.spawn call
shadows the global process object; rename it to something like child or proc
(e.g., const child = Bun.spawn(...)) and update all usages (process.exited,
process.stdout, process.stderr) to the new identifier (child.exited,
child.stdout, child.stderr) so the global process remains accessible and no
shadowing occurs.
In `@betterbase/packages/cli/src/commands/init.ts`:
- Around line 593-614: The POST /api/users handler currently validates the
payload in usersRoute.post but does not persist it; implement persistence by
inserting the validated object into your DB (e.g., call
db.insert(users).values(parsed) or use a UsersService.saveUser(parsed)) before
returning the response, handle insertion errors similarly to existing error
branches, and return the persisted record or its id instead of the canned "not
persisted" message; update the TODO and ensure you import/instantiate the db or
UsersService used by usersRoute.post and keep createUserSchema as the validation
step.
- Around line 538-552: The helper parseNonNegativeInt silently treats an
explicit empty string ("") as 0 because Number("") === 0; update
parseNonNegativeInt to treat empty strings as missing by returning the fallback
when value is "" or value.trim() === "", before attempting Number(value), so
that calls to parseNonNegativeInt (e.g., in init.ts where c.req.query('limit')
is passed) will yield the fallback for ?limit= instead of 0.
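The corrected helper can be sketched directly, mirroring the fix above (empty and whitespace-only strings fall back instead of coercing to 0, since `Number('') === 0`):

```typescript
function parseNonNegativeInt(value: string | undefined, fallback: number): number {
  // Treat missing AND explicitly empty values as "use the fallback".
  if (value === undefined || value.trim() === '') {
    return fallback;
  }
  const parsed = Number(value);
  if (!Number.isInteger(parsed) || parsed < 0) {
    return fallback;
  }
  return parsed;
}
```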
- Around line 104-130: The generated drizzle.config.ts currently hardcodes a DB
fallback (in buildDrizzleConfig) that can drift from the app's env.ts
DEFAULT_DB_PATH; update buildDrizzleConfig to either (preferred) import/use a
shared constant for the default DB path or (simpler) change the generated
template to include a clear inline comment in the returned drizzle.config.ts
pointing to src/env.ts (and ensure the fallback string matches DEFAULT_DB_PATH,
e.g. 'local.db' vs 'file:local.db' as used by the app); modify the databaseUrl
mapping in buildDrizzleConfig and the template generation so the fallback value
and the added comment are emitted into the generated file (referencing the
buildDrizzleConfig function and the produced drizzle.config.ts).
In `@betterbase/packages/client/package.json`:
- Around line 11-13: The package.json currently lists only the Node engine but
all npm scripts ("build", "dev", "test") invoke the Bun CLI; update the
"engines" field in package.json to declare Bun as required (e.g., add "bun":
">=1.0.0" alongside "node": ">=18") so package managers surface a clear error
when Bun is missing and developers are guided to install the correct Bun version
before running the build/dev/test scripts.
In `@betterbase/packages/client/src/auth.ts`:
- Around line 21-32: getStorage() currently always reads globalThis.localStorage
which breaks SSR/testing and offers no injection point; update the auth client
to accept an optional storage adapter and fall back to existing behavior. Add a
storage parameter to the AuthClient constructor (or a storage field on
BetterBaseConfig) and use that adapter inside AuthClient methods instead of
calling getStorage() directly; keep getStorage() as a fallback implementation
that returns globalThis.localStorage when available. Ensure the adapter
implements the same Storage interface (getItem/setItem/removeItem) so existing
code using AuthClient (e.g., signIn/signOut/session handling) continues to work
seamlessly in browser, SSR, or custom backends.
- Around line 42-64: Add Zod validation for AuthCredentials in signUp (and
mirror in signIn) by defining a Zod schema for { email: string().email(),
password: string().min(8) } and validating the credentials parameter at the
start of signUp (and signIn); if validation fails, return early with data: null
and an appropriate error (e.g., AuthError or a ValidationError wrapper) without
calling this.fetchImpl, otherwise proceed as before, and include the
parsed/validated values when building the request body so the API only receives
validated data.
In `@betterbase/packages/client/src/build.ts`:
- Around line 17-19: The error logging for the ESM build uses raw logs
(esmResult.logs) which can render as [object Object]; update the ESM failure
branch that checks esmResult.success to map each log entry to a string (like the
CLI's result.logs -> strings mapping) and join them into a single diagnostic
message passed to console.error, and apply the identical formatting change to
the CJS failure branch (the cjsResult.logs handling) so both failure messages
produce human-readable joined diagnostics.
In `@betterbase/packages/client/src/client.ts`:
- Around line 13-40: Add runtime Zod validation for the BetterBaseConfig passed
into the constructor: define a zod schema (e.g., BetterBaseConfigSchema) that
requires url as a non-empty string matching a URL pattern (or z.string().url()),
optional key as a non-empty string if present, and optional fetch as a function;
import zod at top, run schema.parse or schema.safeParse at the start of the
constructor, throw or surface validation errors, then use the parsed values
(trim trailing slash from parsed.url before assigning this.url) when
constructing AuthClient and RealtimeClient so invalid configs are rejected
early.
In `@betterbase/packages/client/src/query-builder.ts`:
- Around line 4-14: QueryBuilder is stateful and reuse is a footgun: add a
private flag (e.g. private executed = false) on the QueryBuilder class, set
executed = true at the start or end of execute(), and check this flag in
mutating methods (any methods that modify filters, options, selectFields and the
from() builder factory) to throw a clear error like "QueryBuilder instances are
single-use; create a new one via from()". Update execute(), any
filter/option/select setters, and from() to reference this.executed so once
execute() has run the instance refuses further mutations or execute() calls.
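The single-use flag can be sketched with a trimmed builder; only `eq` and `execute` are shown, and the real class would apply the same `assertMutable` check to every mutating method.

```typescript
class SingleUseQueryBuilder {
  private executed = false;
  private filters: Record<string, unknown> = {};

  private assertMutable(): void {
    if (this.executed) {
      throw new Error('QueryBuilder instances are single-use; create a new one via from()');
    }
  }

  eq(column: string, value: unknown): this {
    this.assertMutable();
    this.filters[column] = value;
    return this;
  }

  execute(): Record<string, unknown> {
    this.assertMutable();
    this.executed = true; // refuse all further mutation and re-execution
    return { ...this.filters };
  }
}
```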
In `@betterbase/packages/client/src/realtime.ts`:
- Around line 107-120: In the WebSocket onerror handler, ensure the socket is
explicitly closed before you null it and schedule reconnects: call
this.ws?.close() (if present) at the start of the error handler, then clear any
existing reconnectTimeout (this.reconnectTimeout) and set this.ws = null, and
finally call this.scheduleReconnect(); update the onerror callback where this.ws
is referenced so the socket is closed explicitly rather than just nulled
(symbols: this.ws, this.reconnectTimeout, this.scheduleReconnect, onerror).
In `@betterbase/packages/client/test/client.test.ts`:
- Around line 4-23: Add behavioral tests that assert query builder chaining and
the network request built by execute(): use createClient to get client and
client.from('users') to build a query (referencing from, select, eq methods),
chain methods (e.g., .select(...).eq(...)) and call execute(), then intercept
global fetch (or use msw) to assert the outgoing request URL, method, headers
(including key when provided) and body/params; ensure tests cover both chained
invocations and a simple execute() call so the fetch request shape and
authentication behavior are verified.
In `@betterbase/packages/client/tsconfig.json`:
- Around line 1-12: Confirm that "strict": true is inherited by verifying
tsconfig.json extends ../../tsconfig.base.json (ensure tsconfig.base.json
contains "compilerOptions.strict": true) and leave this package tsconfig as-is;
optionally, move the Bun-specific build script src/build.ts out of src/ into a
scripts/ or tools/ directory (and update any imports or build references) to
separate build tooling from the client library code.
In `@betterbase/packages/client/tsconfig.test.json`:
- Line 5: The tsconfig includes a redundant "declarationMap" option: since
"declaration" is set to false in the same config, "declarationMap" has no
effect; remove the "declarationMap" entry from
betterbase/packages/client/tsconfig.test.json (i.e., delete the "declarationMap"
property) to clean up the config while leaving the existing "declaration": false
intact.
In `@betterbase/templates/base/.gitignore`:
- Line 3: The .gitignore currently only ignores ".env"; update the .gitignore to
also ignore all env variants by adding a pattern like ".env.*" while keeping the
existing ".env" entry, and add a negation rule "!.env.example" so an example env
file can still be committed; modify the template's .gitignore accordingly to
include these three patterns (".env", ".env.*", "!.env.example").
In `@betterbase/templates/base/src/index.ts`:
- Around line 11-28: WebSocket token fallback currently reads
c.req.query('token') into queryToken and uses it as token in upgradeWebSocket
(authHeaderToken/queryToken/token) which can leak credentials via access logs
and browser history; update the upgradeWebSocket handler to avoid exposing the
raw query token: prefer the Authorization header, only accept query param as a
documented fallback for short-lived tokens, ensure you never log the token
(remove any logging of token value) and add an inline comment near
upgradeWebSocket explaining the trade-off and recommending short-lived tokens or
alternate transports (cookie/subprotocol) for production; also consider emitting
a warning (without the token value) when falling back to the query param so ops
are aware.
In `@betterbase/templates/base/src/lib/realtime.ts`:
- Around line 41-60: The custom deepEqual implementation in function deepEqual
is incomplete for types like Date/Map/Set/RegExp; replace it with a well-tested
utility by importing and using fast-deep-equal instead: remove or replace the
local deepEqual function and import deepEqual from 'fast-deep-equal' (or
default-exported name your codebase expects) and use that exported function
wherever deepEqual(...) is called in this module (refer to the current deepEqual
function name to locate usages); add the package to dependencies if not present.
In `@betterbase/tsconfig.base.json`:
- Line 7: Remove the shared JSX setting "jsx": "react-jsx" from the base TS
config so the monorepo base remains React-agnostic; delete the "jsx":
"react-jsx" entry in betterbase/tsconfig.base.json and, if any package later
needs React, add "jsx": "react-jsx" into that package's local tsconfig.json
under compilerOptions (extending the base) instead of keeping it in the shared
base.
…-necessary-code-fixes Fix CLI/client parsing, dependency, and runtime safety issues
Actionable comments posted: 12
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
betterbase/packages/cli/src/utils/scanner.ts (1)
113-115: ⚠️ Potential issue | 🟡 Minor

Guard against missing `nameArg` to prevent NPE on malformed table calls.

`callExpression.arguments` is destructured without a length check. If a schema file contains a bare `sqliteTable()` with no arguments, `nameArg` will be `undefined`, and `ts.isStringLiteral(nameArg)` will throw a `TypeError` (cannot read `kind` of `undefined`). A defensive check is cheap here.

Proposed fix

```diff
 private parseTable(callExpression: ts.CallExpression): TableInfo {
   const [nameArg, columnsArg, indexesArg] = callExpression.arguments;
-  const tableName = ts.isStringLiteral(nameArg) ? nameArg.text : getExpressionText(this.sourceFile, nameArg);
+  const tableName = nameArg && ts.isStringLiteral(nameArg) ? nameArg.text : getExpressionText(this.sourceFile, nameArg);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/utils/scanner.ts` around lines 113 - 115, In parseTable, guard against a missing nameArg from callExpression.arguments before calling ts.isStringLiteral to avoid the TypeError; check whether nameArg is defined and if not set a safe default tableName (e.g., empty string or descriptive placeholder) or handle the malformed call (log/throw) and only call getExpressionText(this.sourceFile, nameArg) when nameArg exists; update the logic in parseTable to use this guard so ts.isStringLiteral(nameArg) is never invoked with undefined.

betterbase/packages/cli/src/commands/init.ts (1)
258-275: ⚠️ Potential issue | 🟠 Major

Close the SQLite handle in the generated migration script.

The generated `migrate.ts` file opens a database handle with `new Database(...)` but never closes it, causing a resource leak. Wrap the migration in a `finally` block to ensure the handle is closed:

Corrected generated script

```diff
 try {
   const sqlite = new Database(env.DB_PATH, { create: true });
-  const db = drizzle(sqlite);
-
-  migrate(db, { migrationsFolder: './drizzle' });
-  console.log('Migrations applied successfully.');
+  try {
+    const db = drizzle(sqlite);
+    migrate(db, { migrationsFolder: './drizzle' });
+    console.log('Migrations applied successfully.');
+  } finally {
+    sqlite.close();
+  }
 } catch (error) {
```

Note: `migrate()` is synchronous and should not be awaited; the drizzle-orm bun-sqlite migrator returns `void` directly.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/commands/init.ts` around lines 258 - 275, The generated migration script opens a Database with "new Database(...)" assigned to the sqlite variable but never closes it; update the template in init.ts so the migration run is wrapped in try/finally where sqlite.close() is called in finally (ensure sqlite is in scope for the finally block), keeping the migrate(db, { migrationsFolder: './drizzle' }) call synchronous (do not await) and still logging success/errors as before; reference the Database class, the sqlite variable, drizzle(db) usage, and migrate(db, ...) in your changes.
🧹 Nitpick comments (7)
betterbase/packages/client/tsconfig.json (1)
4-5: Consider adding `declarationMap: true` and an explicit `rootDir`.

Two optional but beneficial additions for a published SDK:

- `declarationMap: true` — generates `.d.ts.map` files so consumers' IDEs resolve "Go to Definition" to the original `.ts` source rather than the emitted `.d.ts`.
- `rootDir: "./src"` — prevents TypeScript from silently widening the inferred root if a file outside `src/` is ever transitively referenced, keeping the output layout stable.

✨ Proposed additions

```diff
 "compilerOptions": {
   "outDir": "./dist",
+  "rootDir": "./src",
   "declaration": true,
+  "declarationMap": true,
   "lib": [
     "ES2022",
     "DOM"
   ]
 },
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/client/tsconfig.json` around lines 4 - 5, The tsconfig.json currently sets "outDir" and "declaration" but is missing declarationMap and an explicit rootDir; update the tsconfig to add "declarationMap": true to generate .d.ts.map files for better IDE "Go to Definition" behavior and add "rootDir": "./src" to lock the source root and keep emitted layout stable (keep existing "outDir" and "declaration" entries intact); locate the tsconfig.json properties around "outDir" and "declaration" to insert these two keys.

betterbase/packages/cli/src/utils/scanner.ts (3)
126-130: Property-name extraction is duplicated across `parseTable` and `parseIndexes`.

The identifier → string-literal → `getText()` pattern for resolving property names appears in both `parseTable` (lines 126–130) and `parseIndexes` (lines 179–183). Consider extracting a small helper (e.g., `getPropertyName(property: ts.PropertyAssignment, sourceFile: ts.SourceFile): string`) to DRY this up.

Also applies to: 179-183
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/utils/scanner.ts` around lines 126 - 130, Extract the duplicated property-name resolution into a small helper like getPropertyName(property: ts.PropertyAssignment, sourceFile: ts.SourceFile): string that implements the current logic (use ts.isIdentifier -> property.name.text, ts.isStringLiteral -> property.name.text, otherwise property.name.getText(sourceFile)); then replace the inlined logic inside parseTable and parseIndexes with calls to this helper (pass this.sourceFile where needed) to remove duplication and keep behavior identical.
109-110: Wrap `TablesRecordSchema.parse()` to surface a friendlier error for CLI users.

If the constructed `tables` object doesn't conform (e.g., a column type outside the enum), `parse()` throws a raw `ZodError` with a technical stack trace. In a CLI context, consider catching the `ZodError` and re-throwing with a descriptive message (e.g., which table/column failed validation), or use `.safeParse()` and handle the error branch.

Suggested approach

```diff
-return TablesRecordSchema.parse(tables);
+const result = TablesRecordSchema.safeParse(tables);
+if (!result.success) {
+  const formatted = result.error.issues
+    .map((i) => `  - ${i.path.join('.')}: ${i.message}`)
+    .join('\n');
+  throw new Error(`Schema validation failed:\n${formatted}`);
+}
+return result.data;
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/utils/scanner.ts` around lines 109 - 110, The call to TablesRecordSchema.parse(tables) can throw a raw ZodError with a stack trace; update the code in scanner.ts to validate more gracefully by using TablesRecordSchema.safeParse(tables) (or wrap parse in a try/catch for ZodError), and when validation fails build and throw a user-friendly error that includes which table and column failed (inspect the safeParse error issues to extract path and message) so CLI users get a clear, contextual message instead of a raw Zod stack trace; keep the returned value the parsed data when validation succeeds.
221-267: `parseColumn` lacks the same `MAX_ITER` guard added to `parseIndexes`.

`parseIndexes` (line 166) has a `MAX_ITER = 50` safety bound for its while-loop chain traversal. `parseColumn` uses the same pattern (lines 231–267) but has no equivalent guard. While column chains are typically short, applying the same safeguard would be consistent and protect against degenerate ASTs.

Proposed fix

```diff
 let current = unwrapExpression(expression);
+const MAX_ITER = 50;
+let iter = 0;
 while (ts.isCallExpression(current)) {
+  iter += 1;
+  if (iter > MAX_ITER) {
+    logger.warn(
+      `SchemaScanner parseColumn reached MAX_ITER=${MAX_ITER} while scanning column chain for "${columnName}"`,
+    );
+    break;
+  }
   const methodName = getCallName(current);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/utils/scanner.ts` around lines 221 - 267, The parseColumn function currently walks call/property-access chains without a safety bound; add the same MAX_ITER guard used in parseIndexes to prevent infinite loops on degenerate ASTs: define const MAX_ITER = 50 (or reuse the existing constant if exported/accessible), initialize a counter before the while (ts.isCallExpression(current)) loop in parseColumn, decrement or increment and break (or throw a clear error) when the counter exceeds MAX_ITER, and keep the existing logic that updates current via unwrapExpression/getCallName and ts.isPropertyAccessExpression; reference parseColumn, parseIndexes, MAX_ITER, current, unwrapExpression, and getCallName when making the change.

betterbase/packages/cli/src/commands/init.ts (2)
539-553: Query param parsing bypasses Zod; consider a Zod-permissive parser to match repo guidance.

The comment explicitly opts out of strict validation, but the repo guideline says “Implement Zod validation everywhere for type safety.” You can keep the permissive behavior and use Zod by leveraging `.catch(fallback)` (or `safeParse`).

One way to keep permissive behavior while using Zod
 function parseNonNegativeInt(value: string | undefined, fallback: number): number {
-  if (!value || value.trim() === '') {
-    return fallback;
-  }
-
-  const parsed = Number(value);
-  if (!Number.isInteger(parsed) || parsed < 0) {
-    return fallback;
-  }
-
-  return parsed;
+  return z.coerce.number().int().nonnegative().catch(fallback).parse(value);
 }

As per coding guidelines, “Implement Zod validation everywhere for type safety”.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/commands/init.ts` around lines 539 - 553, The helper parseNonNegativeInt bypasses Zod; update it to use a Zod schema so parsing stays permissive but still type-checked—e.g., create a Zod schema using z.coerce.number().int().nonnegative() (or z.number().int().nonnegative() with z.coerce for string inputs) and then use .safeParse() or .catch(fallback) to return the fallback on invalid input; replace the manual Number parsing and integer/negative checks inside parseNonNegativeInt with Zod validation calling that schema and returning either the parsed value or the fallback.
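A plain-TypeScript sketch of the permissive semantics discussed above, for comparison with the zod one-liner. One edge case worth noting: `z.coerce.number()` coerces `''` to `0`, so the one-liner silently changes the current empty-string behavior; the sketch below keeps the explicit guard. Names match the review's snippet but this is illustrative, not the repo's code.

```typescript
// Permissive non-negative integer parsing with a fallback.
// Equivalent intent to: z.coerce.number().int().nonnegative().catch(fallback).parse(value)
// except that '' maps to fallback here, while z.coerce maps '' to 0.
function parseNonNegativeInt(value: string | undefined, fallback: number): number {
  if (value === undefined || value.trim() === '') return fallback;
  const parsed = Number(value);
  return Number.isInteger(parsed) && parsed >= 0 ? parsed : fallback;
}
```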
111-129: Avoid “keep in sync” drift: import `DEFAULT_DB_PATH` into the generated `drizzle.config.ts`.

Right now the generated config relies on a comment to keep the fallback aligned with `src/lib/env.ts`. Since this is scaffolding, it’s easy for them to diverge later.

Proposed change (in the generated drizzle.config.ts string)
 return `import { defineConfig } from 'drizzle-kit';
+import { DEFAULT_DB_PATH } from './src/lib/env';

 export default defineConfig({
   schema: './src/db/schema.ts',
   out: './drizzle',
   dialect: '${dialect[databaseMode]}',
   dbCredentials: {
-    // Keep local fallback in sync with src/lib/env.ts DEFAULT_DB_PATH
-    url: ${databaseUrl[databaseMode]},${tursoAuthTokenLine}
+    url: ${
+      databaseMode === 'local'
+        ? "process.env.DB_PATH ? `file:${process.env.DB_PATH}` : `file:${DEFAULT_DB_PATH}`"
+        : databaseUrl[databaseMode]
+    },${tursoAuthTokenLine}
   },
 });
 `;

As per coding guidelines, “Implement Zod validation everywhere for type safety” (this suggestion helps keep generated config consistent and less error-prone rather than relying on manual sync).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/commands/init.ts` around lines 111 - 129, Update the generated drizzle.config.ts output so it imports and uses the canonical DEFAULT_DB_PATH instead of embedding a string comment fallback: add an import for DEFAULT_DB_PATH from 'src/lib/env' at the top of the generated file and change the local URL fallback generation (currently produced via the databaseUrl.local entry and the returned template using databaseUrl[databaseMode]) to reference DEFAULT_DB_PATH for the local case; keep the existing tursoAuthTokenLine logic but ensure the generated dbCredentials.url uses the imported DEFAULT_DB_PATH when databaseMode === 'local' rather than a hard-coded/commented string so the scaffold cannot drift from src/lib/env.ts.

betterbase/packages/cli/src/commands/auth.ts (1)
86-101: The login input handling improvement is solid and consistent with signup.

This reduces “invalid JSON” crashes and enforces payload shape before DB access.

If you want to reduce duplication in the generated route, consider a small helper in the template (e.g., `parseBody(schema)` returning `{ ok, data | error }`) so signup/login stay in sync as you add fields.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@betterbase/packages/cli/src/commands/auth.ts` around lines 86 - 101, Extract the JSON parsing + zod validation logic used in authRoute.post('/login') (and signup) into a small helper function parseBody(schema) that reads c.req.json(), catches JSON errors, runs schema.safeParse, and returns a consistent result object like { ok: true, data } or { ok: false, error, status } so callers can simply do const res = await parseBody(loginSchema); if (!res.ok) return c.json({ error: res.error }, res.status); and then use res.data; implement parseBody near your route helpers and update authRoute.post('/login') and the signup route to call parseBody(loginSchema) / parseBody(signupSchema) instead of duplicating the try/catch + safeParse logic.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@betterbase/packages/cli/src/commands/auth.ts`:
- Around line 376-400: The current ensureRoutesIndexHook code uses fragile
exact-string anchors (importAnchor, routeAnchor) to insert "import { authRoute }
from './auth';" and "app.route('/auth', authRoute);" which will fail on
formatting changes; change ensureRoutesIndexHook to mirror generate.ts's
updateMainRouter approach: use regex to find the last import statement block
(e.g., the last "import ... from '...';") and insert the auth import after it if
missing, and use a regex that finds the last app.route(...) invocation (or the
block of app.route calls) and insert the auth route after the last route if
missing; update logic that currently checks next.includes(...) to use regex.test
and regex.replace so insertion is robust; keep references to routesIndexPath,
authRoute, usersRoute, importAnchor/routeAnchor only as helpers for locating the
same regions.
- Around line 239-327: ensurePasswordHashColumn currently fails silently and
injects a non-null column; change it to return a boolean success flag, make the
injected column nullable (remove .notNull()), and ensure callers abort auth
generation on failure. Specifically, update ensurePasswordHashColumn to return
true on successful injection and false on any parse/inject failure (including
when the block layout or alternate sqliteTable callback form prevents
injection), change the replacement string from "passwordHash:
text('password_hash').notNull()," to "passwordHash: text('password_hash')," so
the column is nullable, and update runAuthSetupCommand (the call site that
proceeds to generate routes/middleware) to check the boolean and stop/skip auth
route generation when ensurePasswordHashColumn returns false to avoid producing
runtime-broken code that references users.passwordHash.
In `@betterbase/packages/cli/src/commands/generate.ts`:
- Around line 114-147: The current filter parsing in the generation logic
silently drops invalid or unknown filters (occurring in the flatMap handling of
rawKey -> FILTERABLE_COLUMNS / FILTER_COERCE and schema.safeParse branches),
which can return unfiltered results; modify the logic to collect failures
(unknown keys, non-array for *_in, parse errors, or schema.safeParse failures)
into an errors list while building the filters (references: FILTERABLE_COLUMNS,
FILTER_COERCE, the inArray/eq return paths and the tableName usage), and if any
errors exist return a 400 response with the collected error details instead of
proceeding—ensure the same locations that currently "return []" push an error
message into the collector and halt with 400 after processing.
- Around line 75-108: The handler reads query.sort without validation; add a zod
schema (e.g. sortSchema) and safeParse like paginationSchema to validate sort
input: require a field that must be one of FILTERABLE_COLUMNS (use
Array.from(FILTERABLE_COLUMNS) or keys used in FILTER_COERCE) and an order that
is 'asc'|'desc' with a default of 'asc'; in {tableName}Route.get, parse the
incoming sort (e.g. split or parse query.sort into {field,order}), run
sortSchema.safeParse, return c.json({ error: 'Invalid sort params', details:
result.error.format() }, 400) on failure, and then use the validated sort values
for ordering instead of trusting the raw query.sort.
In `@betterbase/packages/cli/src/commands/migrate.ts`:
- Around line 265-361: The splitStatements function has a syntax bug and missing
input validation: replace the invalid literal check if (ch === '<raw newline>')
(a raw newline character between the quotes) with the escaped newline check
if (ch === '\n') inside splitStatements to
correctly terminate inLineComment, and add Zod validation for the sql parameter
(use the same pattern as migrateOptionsSchema) so splitStatements validates that
sql is a string before processing; update the function signature/entry to run
that schema parse/coerce (or throw) and then proceed with the existing parsing
logic.
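The splitStatements fix above can be sketched standalone. This is a simplified illustration of the technique, not the repo's function: it handles only single-quoted strings and `--` line comments (no block comments, no dollar quoting), and the key line is the escaped `'\n'` check that terminates comment state.

```typescript
// Split a SQL script into statements on ';', ignoring semicolons that
// appear inside single-quoted strings or after a '--' line comment.
function splitStatements(sql: string): string[] {
  const statements: string[] = [];
  let buf = '';
  let inString = false;
  let inLineComment = false;

  for (let i = 0; i < sql.length; i += 1) {
    const ch = sql[i];

    if (inLineComment) {
      if (ch === '\n') inLineComment = false; // the corrected check: escaped newline
      buf += ch;
      continue;
    }
    if (inString) {
      buf += ch;
      if (ch === "'") inString = false;
      continue;
    }
    if (ch === "'") {
      inString = true;
      buf += ch;
      continue;
    }
    if (ch === '-' && sql[i + 1] === '-') {
      inLineComment = true;
      buf += ch;
      continue;
    }
    if (ch === ';') {
      const trimmed = buf.trim();
      if (trimmed) statements.push(trimmed);
      buf = '';
      continue;
    }
    buf += ch;
  }
  const tail = buf.trim();
  if (tail) statements.push(tail);
  return statements;
}
```

With the raw-newline literal, the source would not even parse; with the check inverted or missing, everything after the first `--` would stay in comment state and later statements would be swallowed.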
In `@betterbase/packages/cli/src/index.ts`:
- Around line 35-69: Validate the projectRoot parameter at the start of the
.action callback using Zod (e.g., z.string().nonempty()), and reject/exit when
validation fails before calling runDevCommand; specifically, in the anonymous
async action handler (the function that calls runDevCommand(projectRoot)) create
a schema and parse/validate projectRoot, and only proceed to call runDevCommand
and register onSigInt/onSigTerm/onProcessExit listeners if validation passes.
Ensure any validation error is logged (via logger.warn or logger.error) and
results in an early return or process.exit(1) to avoid executing cleanup logic
when projectRoot is invalid.
In `@betterbase/packages/client/src/auth.ts`:
- Around line 28-31: credentialsSchema currently only validates email/password
while AuthCredentials includes name, and signUp spreads the raw credentials into
the request body, reintroducing unvalidated fields; update the validation to
only accept and serialize schema-approved fields: either add name to
credentialsSchema (if name is allowed) or create a signUpSchema that includes
the exact fields you expect, then replace usages in signUp (and the similar
block around lines 55-67) to call schema.parse/parseAsync and send only the
parsed result (not the original credentials) in the request body; reference
credentialsSchema, AuthCredentials, signUp (and the analogous function in 55-67)
to locate and fix the spots.
In `@betterbase/packages/client/src/query-builder.ts`:
- Around line 33-68: The client is sending `_in` filters as JSON strings via
QueryBuilder.in(), but the server route handlers are not deserializing them so
those filters are ignored; fix the server handler that processes incoming query
params (the users route/list handler) to detect keys ending with `_in`,
JSON.parse the value, validate with the same valuesSchema (or a Zod array
schema), and replace the string with the parsed array before applying filters;
alternatively you can change the client QueryBuilder.in() to send the array
directly (remove JSON.stringify in QueryBuilder.in and send parsedValues) but if
you choose server-side, use valuesSchema.parse(parsed) and stringSchema.parse
for column names to ensure type-safety when applying the filter.
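A sketch of the server-side option described above: keys ending in `_in` arrive as JSON-encoded arrays and must be deserialized (and errors collected rather than dropped) before filters are applied. `FILTERABLE` mirrors the review's `FILTERABLE_COLUMNS` allow-list; all names here are illustrative, and real code would layer zod validation on top.

```typescript
// Columns the API allows filtering on (stand-in for FILTERABLE_COLUMNS).
const FILTERABLE = new Set(['role', 'status']);

type Filter =
  | { column: string; op: 'eq'; value: string }
  | { column: string; op: 'in'; values: unknown[] };

function parseFilters(query: Record<string, string>): { filters: Filter[]; errors: string[] } {
  const filters: Filter[] = [];
  const errors: string[] = [];

  for (const [rawKey, rawValue] of Object.entries(query)) {
    const isIn = rawKey.endsWith('_in');
    const column = isIn ? rawKey.slice(0, -'_in'.length) : rawKey;

    if (!FILTERABLE.has(column)) {
      // Collect instead of silently dropping, so the route can 400.
      errors.push(`Unknown filter column: ${column}`);
      continue;
    }
    if (!isIn) {
      filters.push({ column, op: 'eq', value: rawValue });
      continue;
    }
    try {
      const parsed: unknown = JSON.parse(rawValue);
      if (!Array.isArray(parsed)) throw new Error('not an array');
      filters.push({ column, op: 'in', values: parsed });
    } catch {
      errors.push(`Filter ${rawKey} must be a JSON array`);
    }
  }
  return { filters, errors };
}
```

The route would check `errors.length` and return a 400 with details before touching the database, which also resolves the silent-drop finding on generate.ts above.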
In `@betterbase/packages/client/src/realtime.ts`:
- Around line 135-159: The public from(table: string) entry needs runtime Zod
validation for the table and filter parameters; add a zod schema (e.g.,
tableSchema = z.string().min(1) and filterSchema = z.record(z.string(),
z.unknown()).optional()) and validate the input at the start of from(table:
string) and inside subscribe(filter) before using this.connect(), creating
subscriber entries, calling this.sendSubscribe(table, filter), or storing in
this.subscriptions; on validation failure throw or return a clear error to
prevent storing invalid keys in the subscriptions Map or sending invalid
subscribe messages.
- Around line 60-69: The bug is that multiple client-side subscribers for the
same table use different filters but the server only keeps the last filter via
client.subscriptions.set(table, { table, filter }), so earlier subscribers get
wrong data; fix by extending the protocol to include a subscription ID: update
client.sendSubscribeAll and sendSubscribe to send a unique subId per subscriber
(from the subscriptions map entries), update server-side storage to use
client.subscriptions.set(table, Map<subId, {table,filter,clientId}>) (or
similar) and update broadcast logic (where it reads
client.subscriptions.get(table)) to iterate all subIds and apply each stored
filter when routing events; alternatively, if you prefer single-subscription
semantics, change the client API (from(...).on(...).subscribe) to merge/replace
callbacks per table so only one filter is ever sent by
sendSubscribe/sendSubscribeAll.
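The subscription-ID protocol suggested above can be sketched as a small registry: the server keys subscriptions per table and per `subId`, so two subscribers on the same table can hold different filters and broadcasts fan out to each one independently. Class and method names are illustrative, not the repo's API.

```typescript
interface Subscription {
  table: string;
  filter?: Record<string, unknown>;
}

class SubscriptionRegistry {
  // table -> (subId -> subscription), instead of table -> single subscription.
  private byTable = new Map<string, Map<string, Subscription>>();

  subscribe(subId: string, sub: Subscription): void {
    const forTable = this.byTable.get(sub.table) ?? new Map<string, Subscription>();
    forTable.set(subId, sub);
    this.byTable.set(sub.table, forTable);
  }

  unsubscribe(table: string, subId: string): void {
    this.byTable.get(table)?.delete(subId);
  }

  // Return the subIds whose filter matches the event row; an absent
  // filter matches everything.
  route(table: string, row: Record<string, unknown>): string[] {
    const forTable = this.byTable.get(table);
    if (!forTable) return [];
    const matched: string[] = [];
    for (const [subId, sub] of forTable) {
      const filter = sub.filter ?? {};
      if (Object.entries(filter).every(([k, v]) => row[k] === v)) matched.push(subId);
    }
    return matched;
  }
}
```

On the wire, `sendSubscribe` would then carry `{ subId, table, filter }` and events would carry the matched `subId`s so the client can dispatch to the right callbacks.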
In `@betterbase/packages/client/tsconfig.json`:
- Around line 6-12: Remove the Bun ambient types from the client package TS
config so Bun globals don't leak into the universal SDK: delete "bun" from the
"types" array in the package's tsconfig.json and instead add Bun types only in
the build/CLI packages (where bunx/ build tooling runs); if any source under
src/ genuinely requires Bun-only APIs (e.g., uses Bun.file(), Bun.serve(), or
bun:sqlite), isolate that logic into a separate entry file or a
conditionally-imported module and document the Bun requirement so the main SDK
remains browser/Node type-safe.
In `@betterbase/templates/base/package.json`:
- Line 15: Replace the deprecated z.string().email() usage with the new
standalone validator z.email() in the user input schema defined for user routes
(look for the user schema variable or createUser/createUserSchema in the users
route handler). Locate every occurrence of z.string().email() in that schema and
change it to z.email(), keeping the rest of the validation chain intact; no
import changes are required since z.email() is exported from Zod itself.
---
Outside diff comments:
In `@betterbase/packages/cli/src/commands/init.ts`:
- Around line 258-275: The generated migration script opens a Database with "new
Database(...)" assigned to the sqlite variable but never closes it; update the
template in init.ts so the migration run is wrapped in try/finally where
sqlite.close() is called in finally (ensure sqlite is in scope for the finally
block), keeping the migrate(db, { migrationsFolder: './drizzle' }) call
synchronous (do not await) and still logging success/errors as before; reference
the Database class, the sqlite variable, drizzle(db) usage, and migrate(db, ...)
in your changes.
In `@betterbase/packages/cli/src/utils/scanner.ts`:
- Around line 113-115: In parseTable, guard against a missing nameArg from
callExpression.arguments before calling ts.isStringLiteral to avoid the
TypeError; check whether nameArg is defined and if not set a safe default
tableName (e.g., empty string or descriptive placeholder) or handle the
malformed call (log/throw) and only call getExpressionText(this.sourceFile,
nameArg) when nameArg exists; update the logic in parseTable to use this guard
so ts.isStringLiteral(nameArg) is never invoked with undefined.
---
Nitpick comments:
In `@betterbase/packages/cli/src/commands/auth.ts`:
- Around line 86-101: Extract the JSON parsing + zod validation logic used in
authRoute.post('/login') (and signup) into a small helper function
parseBody(schema) that reads c.req.json(), catches JSON errors, runs
schema.safeParse, and returns a consistent result object like { ok: true, data }
or { ok: false, error, status } so callers can simply do const res = await
parseBody(loginSchema); if (!res.ok) return c.json({ error: res.error },
res.status); and then use res.data; implement parseBody near your route helpers
and update authRoute.post('/login') and the signup route to call
parseBody(loginSchema) / parseBody(signupSchema) instead of duplicating the
try/catch + safeParse logic.
In `@betterbase/packages/cli/src/commands/init.ts`:
- Around line 539-553: The helper parseNonNegativeInt bypasses Zod; update it to
use a Zod schema so parsing stays permissive but still type-checked—e.g., create
a Zod schema using z.coerce.number().int().nonnegative() (or
z.number().int().nonnegative() with z.coerce for string inputs) and then use
.safeParse() or .catch(fallback) to return the fallback on invalid input;
replace the manual Number parsing and integer/negative checks inside
parseNonNegativeInt with Zod validation calling that schema and returning either
the parsed value or the fallback.
- Around line 111-129: Update the generated drizzle.config.ts output so it
imports and uses the canonical DEFAULT_DB_PATH instead of embedding a string
comment fallback: add an import for DEFAULT_DB_PATH from 'src/lib/env' at the
top of the generated file and change the local URL fallback generation
(currently produced via the databaseUrl.local entry and the returned template
using databaseUrl[databaseMode]) to reference DEFAULT_DB_PATH for the local
case; keep the existing tursoAuthTokenLine logic but ensure the generated
dbCredentials.url uses the imported DEFAULT_DB_PATH when databaseMode ===
'local' rather than a hard-coded/commented string so the scaffold cannot drift
from src/lib/env.ts.
In `@betterbase/packages/cli/src/utils/scanner.ts`:
- Around line 126-130: Extract the duplicated property-name resolution into a
small helper like getPropertyName(property: ts.PropertyAssignment, sourceFile:
ts.SourceFile): string that implements the current logic (use ts.isIdentifier ->
property.name.text, ts.isStringLiteral -> property.name.text, otherwise
property.name.getText(sourceFile)); then replace the inlined logic inside
parseTable and parseIndexes with calls to this helper (pass this.sourceFile
where needed) to remove duplication and keep behavior identical.
- Around line 109-110: The call to TablesRecordSchema.parse(tables) can throw a
raw ZodError with a stack trace; update the code in scanner.ts to validate more
gracefully by using TablesRecordSchema.safeParse(tables) (or wrap parse in a
try/catch for ZodError), and when validation fails build and throw a
user-friendly error that includes which table and column failed (inspect the
safeParse error issues to extract path and message) so CLI users get a clear,
contextual message instead of a raw Zod stack trace; keep the returned value the
parsed data when validation succeeds.
- Around line 221-267: The parseColumn function currently walks
call/property-access chains without a safety bound; add the same MAX_ITER guard
used in parseIndexes to prevent infinite loops on degenerate ASTs: define const
MAX_ITER = 50 (or reuse the existing constant if exported/accessible),
initialize a counter before the while (ts.isCallExpression(current)) loop in
parseColumn, decrement or increment and break (or throw a clear error) when the
counter exceeds MAX_ITER, and keep the existing logic that updates current via
unwrapExpression/getCallName and ts.isPropertyAccessExpression; reference
parseColumn, parseIndexes, MAX_ITER, current, unwrapExpression, and getCallName
when making the change.
In `@betterbase/packages/client/tsconfig.json`:
- Around line 4-5: The tsconfig.json currently sets "outDir" and "declaration"
but is missing declarationMap and an explicit rootDir; update the tsconfig to
add "declarationMap": true to generate .d.ts.map files for better IDE "Go to
Definition" behavior and add "rootDir": "./src" to lock the source root and keep
emitted layout stable (keep existing "outDir" and "declaration" entries intact);
locate the tsconfig.json properties around "outDir" and "declaration" to insert
these two keys.
  const usersExportIdx = current.search(/export\s+const\s+users\s*=\s*sqliteTable\s*\(/);
  if (usersExportIdx === -1) {
    logger.warn('Could not find sqlite users table block; skipping passwordHash injection.');
    return;
  }

- const replacement = usersBlock[0].replace(/\n\}\);$/, "\n  passwordHash: text('password_hash').notNull(),\n});");
- writeFileSync(schemaPath, current.replace(usersBlock[0], replacement));
+ const callStart = current.indexOf('sqliteTable(', usersExportIdx);
+ if (callStart === -1) {
+   logger.warn('Could not locate sqliteTable call for users; skipping passwordHash injection.');
+   return;
+ }
+
+ let i = callStart;
+ let parenDepth = 0;
+ let inSingle = false;
+ let inDouble = false;
+ let inBacktick = false;
+ let escaped = false;
+
+ while (i < current.length) {
+   const ch = current[i];
+
+   if (escaped) {
+     escaped = false;
+     i += 1;
+     continue;
+   }
+
+   if ((inSingle || inDouble || inBacktick) && ch === '\\') {
+     escaped = true;
+     i += 1;
+     continue;
+   }
+
+   if (!inDouble && !inBacktick && ch === "'") {
+     inSingle = !inSingle;
+     i += 1;
+     continue;
+   }
+
+   if (!inSingle && !inBacktick && ch === '"') {
+     inDouble = !inDouble;
+     i += 1;
+     continue;
+   }
+
+   if (!inSingle && !inDouble && ch === '`') {
+     inBacktick = !inBacktick;
+     i += 1;
+     continue;
+   }
+
+   if (inSingle || inDouble || inBacktick) {
+     i += 1;
+     continue;
+   }
+
+   if (ch === '(') {
+     parenDepth += 1;
+   } else if (ch === ')') {
+     parenDepth -= 1;
+     if (parenDepth === 0) {
+       break;
+     }
+   }
+
+   i += 1;
+ }
+
+ if (i >= current.length || parenDepth !== 0) {
+   logger.warn('Could not safely parse users sqliteTable block; skipping passwordHash injection.');
+   return;
+ }
+
+ const statementEnd = current.indexOf(';', i);
+ if (statementEnd === -1) {
+   logger.warn('Could not locate end of users sqliteTable statement; skipping passwordHash injection.');
+   return;
+ }
+
+ const usersBlock = current.slice(usersExportIdx, statementEnd + 1);
+ const replacement = usersBlock.replace(/\n\}\);\s*$/, "\n  passwordHash: text('password_hash').notNull(),\n});");
+ if (replacement === usersBlock) {
+   logger.warn('Could not inject passwordHash into users table; block layout was unexpected.');
+   return;
+ }
+
+ writeFileSync(schemaPath, `${current.slice(0, usersExportIdx)}${replacement}${current.slice(statementEnd + 1)}`);
  }
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Verify how "users" sqliteTable is authored in repo templates/examples and whether a 3rd arg is used.
set -euo pipefail
echo "== Searching for users sqliteTable definitions =="
rg -n --type=ts "export\s+const\s+users\s*=\s*sqliteTable\s*\(" -C 3
echo ""
echo "== Checking for sqliteTable with 3 arguments (callback form) =="
rg -n --type=ts "sqliteTable\s*\(\s*['\"]users['\"]" -C 5

Repository: weroperking/Betterbase
Length of output: 9261
🏁 Script executed:
#!/bin/bash
# Check the actual ensurePasswordHashColumn function implementation
cd betterbase/packages/cli/src/commands || cd packages/cli/src/commands || cd .
echo "== Finding auth.ts file =="
find . -name "auth.ts" -type f | head -5
echo ""
echo "== Examining ensurePasswordHashColumn function (lines around 239-327) =="
if [ -f "auth.ts" ]; then
sed -n '230,330p' auth.ts
elif [ -f "betterbase/packages/cli/src/commands/auth.ts" ]; then
sed -n '230,330p' betterbase/packages/cli/src/commands/auth.ts
fi

Repository: weroperking/Betterbase
Length of output: 2807
🏁 Script executed:
#!/bin/bash
# Look for all template files and examine schema structure
echo "== Searching for schema files in templates =="
find . -path "*/templates/*" -name "schema.ts" -o -path "*/templates/*" -name "*schema*" | head -10
echo ""
echo "== Examining template schema files for users table definition =="
find . -path "*/templates/*" \( -name "schema.ts" -o -name "*schema*" \) -type f | while read f; do
if [ -s "$f" ]; then
echo "=== File: $f ==="
rg "export.*const.*users" "$f" -C 5 || true
fi
done

Repository: weroperking/Betterbase
Length of output: 943
🏁 Script executed:
#!/bin/bash
# Verify where ensurePasswordHashColumn is called and how the result is used
echo "== Finding ensurePasswordHashColumn() calls =="
rg -n "ensurePasswordHashColumn" --type=ts -A 2 -B 2
echo ""
echo "== Check if auth route generation happens after ensurePasswordHashColumn =="
rg -n "ensureAuthRoute\|ensureAuth\|auth\.ts\|routes/auth" betterbase/packages/cli/src/commands/auth.ts | head -20

Repository: weroperking/Betterbase
Length of output: 1063
🏁 Script executed:
#!/bin/bash
# Check the context where ensurePasswordHashColumn is called - look for the full flow
echo "== Looking for function that calls ensurePasswordHashColumn =="
rg -n "function.*\{" betterbase/packages/cli/src/commands/auth.ts | head -10
echo ""
echo "== Get more lines after ensurePasswordHashColumn call to see what happens next =="
sed -n '1,600p' betterbase/packages/cli/src/commands/auth.ts | grep -n "ensurePasswordHashColumn\|ensureAuthRoute\|ensureAuthMiddleware" -A 10

Repository: weroperking/Betterbase
Length of output: 1874
ensurePasswordHashColumn() silently fails but auth routes are still generated, risking runtime breakage.
Two confirmed issues:
- Fragility / partial failure: The function only succeeds if the `users` table ends in exactly `\n});`, but the 3-arg callback form `sqliteTable('users', { ... }, (table) => ({ ... }))` exists in the codebase and won't match. When injection fails, the function returns silently with no feedback, yet `runAuthSetupCommand()` continues to generate auth middleware and routes that reference `users.passwordHash`, causing build/runtime failure.
- Migration safety: Injecting `text('password_hash').notNull()` can break `db:push` on existing databases with rows, especially Postgres, without a default or backfill strategy.

Fix required:

- Make `ensurePasswordHashColumn()` return `boolean` and stop auth setup (or skip route generation) on failure.
- Inject the column as nullable by default (remove `.notNull()`) to be friendlier to existing databases.
Proposed diff
-function ensurePasswordHashColumn(schemaPath: string): void {
+function ensurePasswordHashColumn(schemaPath: string): boolean {
const current = readFileSync(schemaPath, 'utf-8');
if (/passwordHash\s*:\s*text\('password_hash'\)/.test(current)) {
- return;
+ return true;
}
const usersExportIdx = current.search(/export\s+const\s+users\s*=\s*sqliteTable\s*\(/);
if (usersExportIdx === -1) {
logger.warn('Could not find sqlite users table block; skipping passwordHash injection.');
- return;
+ return false;
}
const callStart = current.indexOf('sqliteTable(', usersExportIdx);
if (callStart === -1) {
logger.warn('Could not locate sqliteTable call for users; skipping passwordHash injection.');
- return;
+ return false;
}
// ... parser loop unchanged ...
if (i >= current.length || parenDepth !== 0) {
logger.warn('Could not safely parse users sqliteTable block; skipping passwordHash injection.');
- return;
+ return false;
}
const statementEnd = current.indexOf(';', i);
if (statementEnd === -1) {
logger.warn('Could not locate end of users sqliteTable statement; skipping passwordHash injection.');
- return;
+ return false;
}
const usersBlock = current.slice(usersExportIdx, statementEnd + 1);
- const replacement = usersBlock.replace(/\n\}\);\s*$/, "\n passwordHash: text('password_hash').notNull(),\n});");
+ const replacement = usersBlock.replace(
+ /\n\}\);\s*$/,
+ "\n passwordHash: text('password_hash'),\n});"
+ );
if (replacement === usersBlock) {
logger.warn('Could not inject passwordHash into users table; block layout was unexpected.');
- return;
+ return false;
}
writeFileSync(schemaPath, `${current.slice(0, usersExportIdx)}${replacement}${current.slice(statementEnd + 1)}`);
+ return true;
 }

Update call site (line 419):
- ensurePasswordHashColumn(schemaPath);
+ if (!ensurePasswordHashColumn(schemaPath)) {
+ logger.error('Failed to inject passwordHash column; aborting auth setup.');
+ return;
+ }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@betterbase/packages/cli/src/commands/auth.ts` around lines 239 - 327,
ensurePasswordHashColumn currently fails silently and injects a non-null column;
change it to return a boolean success flag, make the injected column nullable
(remove .notNull()), and ensure callers abort auth generation on failure.
Specifically, update ensurePasswordHashColumn to return true on successful
injection and false on any parse/inject failure (including when the block layout
or alternate sqliteTable callback form prevents injection), change the
replacement string from "passwordHash: text('password_hash').notNull()," to
"passwordHash: text('password_hash')," so the column is nullable, and update
runAuthSetupCommand (the call site that proceeds to generate routes/middleware)
to check the boolean and stop/skip auth route generation when
ensurePasswordHashColumn returns false to avoid producing runtime-broken code
that references users.passwordHash.
  const current = readFileSync(routesIndexPath, 'utf-8');
+ const importAnchor = "import { usersRoute } from './users';";
+ const routeAnchor = "app.route('/api/users', usersRoute);";
+
+ let next = current;

- if (!current.includes("import { authRoute } from './auth';")) {
-   current = current.replace("import { usersRoute } from './users';", "import { usersRoute } from './users';\nimport { authRoute } from './auth';");
+ if (!next.includes("import { authRoute } from './auth';")) {
+   if (next.includes(importAnchor)) {
+     next = next.replace(importAnchor, `${importAnchor}\nimport { authRoute } from './auth';`);
+   } else {
+     logger.warn(`Could not find import anchor in ${routesIndexPath}; skipping auth route import injection.`);
+   }
  }

- if (!current.includes("app.route('/auth', authRoute);")) {
-   current = current.replace("app.route('/api/users', usersRoute);", "app.route('/api/users', usersRoute);\n  app.route('/auth', authRoute);");
+ if (!next.includes("app.route('/auth', authRoute);")) {
+   if (next.includes(routeAnchor)) {
+     next = next.replace(routeAnchor, `${routeAnchor}\n  app.route('/auth', authRoute);`);
+   } else {
+     logger.warn(`Could not find route anchor in ${routesIndexPath}; skipping auth route registration injection.`);
+   }
  }

- writeFileSync(routesIndexPath, current);
+ if (next !== current) {
+   writeFileSync(routesIndexPath, next);
+ }
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Inspect how routes index is shaped across templates/examples to see if current anchors are sufficient.
set -euo pipefail
echo "== Candidate routes index files =="
fd -a -t f 'index.ts' | grep -E "src/routes/index\.ts$|routes/index\.ts$" || echo "No exact matches found"
echo
echo "== Search with broader pattern =="
fd -a -t f -e ts | rg "routes/index" || echo "No routes/index files found"
echo
echo "== Occurrences of usersRoute import/registration anchors =="
rg -n --type=ts "import.*usersRoute.*from.*users|app\.route\(.*/api/users" -C 2 || echo "No exact anchor matches found"
echo
echo "== Check routes directory structure =="
fd -a -t d "routes" | head -20

Repository: weroperking/Betterbase
Length of output: 2353
🏁 Script executed:
#!/bin/bash
echo "== Full routes index.ts template =="
cat -n /home/jailuser/git/betterbase/templates/base/src/routes/index.ts
echo
echo "== Check if there are other routes index variants =="
find /home/jailuser/git/betterbase/templates -name "index.ts" -path "*/routes/*" -exec echo "File: {}" \; -exec cat -n {} \;
echo
echo "== Check init.ts template generation context =="
grep -n "registerRoutes\|usersRoute" /home/jailuser/git/betterbase/packages/cli/src/commands/init.ts | head -30
echo
echo "== Check if there are tests for auth route injection =="
fd -a -t f "\.test\.|\.spec\." | rg -i "auth.*inject|auth.*route|registerRoutes" -l

Repository: weroperking/Betterbase
Length of output: 2912
🏁 Script executed:
#!/bin/bash
echo "== Search for auth command tests =="
fd -a -t f "\.test\.|\.spec\." | xargs grep -l "auth" 2>/dev/null | head -10
echo
echo "== Check auth.ts full context around injection code =="
sed -n '360,410p' /home/jailuser/git/betterbase/packages/cli/src/commands/auth.ts
echo
echo "== Look for documentation about auth setup =="
fd -a -t f "\.md$" | xargs grep -l "auth" 2>/dev/null | head -5
echo
echo "== Check if routes/index.ts is generated or template only =="
grep -r "registerRoutes\|routes/index" /home/jailuser/git/betterbase/packages/cli/src --include="*.ts" | grep -i "template\|scaffold\|generate" | head -10
Length of output: 2807
🏁 Script executed:
#!/bin/bash
echo "== Check if routes/index.ts is created/modified in generate.ts =="
grep -n "routerPath\|routes/index" /home/jailuser/git/betterbase/packages/cli/src/commands/generate.ts -A 5 -B 5 | head -40
echo
echo "== Look for auth injection tests specifically =="
grep -r "ensureRoutesIndexHook\|authRoute\|auth.*inject" /home/jailuser/git/betterbase/packages/cli/test --include="*.ts" -A 3 -B 3
echo
echo "== Check smoke test to understand auth setup flow =="
grep -n "auth\|Auth" /home/jailuser/git/betterbase/packages/cli/test/smoke.test.ts | head -20
echo
echo "== Check if there are examples of modified routes index files =="
fd -a -t f "index.ts" -path "*/routes/*" | xargs wc -l
Length of output: 1774
🏁 Script executed:
#!/bin/bash
echo "== Get full context of updateMainRouter function =="
sed -n '216,246p' /home/jailuser/git/betterbase/packages/cli/src/commands/generate.ts
echo
echo "== Check if ensureRoutesIndexHook is called =="
grep -n "ensureRoutesIndexHook" /home/jailuser/git/betterbase/packages/cli/src/commands/auth.ts
echo
echo "== Check the full runAuthSetupCommand to see what it does =="
sed -n '410,460p' /home/jailuser/git/betterbase/packages/cli/src/commands/auth.ts
Length of output: 3024
Use regex-based insertion matching the pattern in generate.ts's updateMainRouter function
The exact string matching for anchors will break on any formatting change. The codebase already solves this more robustly in generate.ts lines 224–242 by using regex patterns to locate healthRoute import and find all app.route() statements. Apply the same approach to ensureRoutesIndexHook: use regex to find the last import statement and last route statement, then insert authRoute appropriately. This prevents silent failures when the template or user-modified routes index differs in formatting.
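A rough sketch of that regex-driven insertion, assuming the routes index follows the template's `import ... from '...';` and `app.route(...);` conventions; `ensureAuthInRoutesIndex` is a hypothetical name for illustration, not the actual CLI helper:

```typescript
// Hypothetical helper mirroring updateMainRouter's regex approach:
// find the last import and the last app.route(...) call, then insert
// the auth import/registration after them if they are missing.
function ensureAuthInRoutesIndex(source: string): string {
  let next = source;

  // Insert the auth import after the last top-level import statement.
  if (!/import\s*\{\s*authRoute\s*\}/.test(next)) {
    const importRe = /^import .*?;$/gm;
    let lastImport: RegExpExecArray | null = null;
    for (let m = importRe.exec(next); m; m = importRe.exec(next)) lastImport = m;
    if (lastImport) {
      const end = lastImport.index + lastImport[0].length;
      next = `${next.slice(0, end)}\nimport { authRoute } from './auth';${next.slice(end)}`;
    }
  }

  // Insert the route registration after the last app.route(...) call.
  if (!/app\.route\(\s*'\/auth'/.test(next)) {
    const routeRe = /^[ \t]*app\.route\([^)]*\);$/gm;
    let lastRoute: RegExpExecArray | null = null;
    for (let m = routeRe.exec(next); m; m = routeRe.exec(next)) lastRoute = m;
    if (lastRoute) {
      const end = lastRoute.index + lastRoute[0].length;
      next = `${next.slice(0, end)}\napp.route('/auth', authRoute);${next.slice(end)}`;
    }
  }

  return next;
}
```

Because both insertions are guarded by presence checks, the helper stays idempotent when run against an already-patched routes index.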
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@betterbase/packages/cli/src/commands/auth.ts` around lines 376 - 400, The
current ensureRoutesIndexHook code uses fragile exact-string anchors
(importAnchor, routeAnchor) to insert "import { authRoute } from './auth';" and
"app.route('/auth', authRoute);" which will fail on formatting changes; change
ensureRoutesIndexHook to mirror generate.ts's updateMainRouter approach: use
regex to find the last import statement block (e.g., the last "import ... from
'...';") and insert the auth import after it if missing, and use a regex that
finds the last app.route(...) invocation (or the block of app.route calls) and
insert the auth route after the last route if missing; update logic that
currently checks next.includes(...) to use regex.test and regex.replace so
insertion is robust; keep references to routesIndexPath, authRoute, usersRoute,
importAnchor/routeAnchor only as helpers for locating the same regions.
const DEFAULT_LIMIT = 50;
const MAX_LIMIT = 100;
const DEFAULT_OFFSET = 0;

const paginationSchema = z.object({
  limit: z.coerce.number().int().nonnegative().default(DEFAULT_LIMIT),
  offset: z.coerce.number().int().nonnegative().default(DEFAULT_OFFSET),
});

const FILTERABLE_COLUMNS = new Set([
  ${filterableColumns}
]);

const FILTER_COERCE = {
  ${filterCoercers}
} as const;

${tableName}Route.get('/', async (c) => {
  const queryParams = c.req.query();
  const sort = queryParams.sort;
  const paginationResult = paginationSchema.safeParse({
    limit: queryParams.limit,
    offset: queryParams.offset,
  });

  if (!paginationResult.success) {
    return c.json({ error: 'Invalid pagination params', details: paginationResult.error.format() }, 400);
  }

  const filters = Object.entries(queryParams).filter(([key, value]) => key !== 'limit' && key !== 'offset' && key !== 'sort' && value !== undefined);
  const { limit, offset } = paginationResult.data;
  const fetchLimit = Math.min(limit, MAX_LIMIT);
  const sort = queryParams.sort;
  const filters = Object.entries(queryParams).filter(
    ([key, value]) => key !== 'limit' && key !== 'offset' && key !== 'sort' && value !== undefined,
  );
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Examine the sort handling and validation in generate.ts
rg -B5 -A20 "const sort = queryParams\.sort" betterbase/packages/cli/src/commands/generate.ts
Length of output: 1009
🏁 Script executed:
#!/bin/bash
# Check for any existing sort validation
rg -n "sortSchema|sortSchema|sort.*safeParse|sort.*validate" betterbase/packages/cli/src/commands/generate.ts
Length of output: 48
🏁 Script executed:
#!/bin/bash
# Get the file size to determine how to read it safely
wc -l betterbase/packages/cli/src/commands/generate.ts
Length of output: 118
🏁 Script executed:
#!/bin/bash
# Read the context around sort usage to understand how it's applied
sed -n '95,150p' betterbase/packages/cli/src/commands/generate.ts
Length of output: 1994
🏁 Script executed:
#!/bin/bash
# Search for where sort is used in the query building
rg -n "if.*sort|sort.*if" betterbase/packages/cli/src/commands/generate.ts
Length of output: 83
🏁 Script executed:
#!/bin/bash
# Read the sort application logic
sed -n '154,175p' betterbase/packages/cli/src/commands/generate.ts
Length of output: 782
Add Zod validation for the sort parameter to maintain consistency with pagination and filter validation.
The sort parameter is read directly from the query without validation. Invalid field names silently skip ordering, and invalid order values default to asc without error feedback. This violates the coding guideline "Implement Zod validation everywhere for type safety."
Implement a sortSchema with safeParse similar to the existing pagination and filter validation, returning a 400 error for invalid sort parameters instead of silently falling back.
🔧 Suggested implementation
const paginationSchema = z.object({
limit: z.coerce.number().int().nonnegative().default(DEFAULT_LIMIT),
offset: z.coerce.number().int().nonnegative().default(DEFAULT_OFFSET),
});
+const sortSchema = z
+ .string()
+ .optional()
+ .refine((value) => {
+ if (!value) return true;
+ const [field, order] = value.split(':');
+ return Boolean(field)
+ && field in ${tableName}
+ && (!order || order === 'asc' || order === 'desc');
+ }, { message: 'Invalid sort param' });
+
...
- const sort = queryParams.sort;
+ const sortResult = sortSchema.safeParse(queryParams.sort);
+ if (!sortResult.success) {
+ return c.json({ error: 'Invalid sort param', details: sortResult.error.format() }, 400);
+ }
+ const sort = sortResult.data;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@betterbase/packages/cli/src/commands/generate.ts` around lines 75 - 108, The
handler reads query.sort without validation; add a zod schema (e.g. sortSchema)
and safeParse like paginationSchema to validate sort input: require a field that
must be one of FILTERABLE_COLUMNS (use Array.from(FILTERABLE_COLUMNS) or keys
used in FILTER_COERCE) and an order that is 'asc'|'desc' with a default of
'asc'; in {tableName}Route.get, parse the incoming sort (e.g. split or parse
query.sort into {field,order}), run sortSchema.safeParse, return c.json({ error:
'Invalid sort params', details: result.error.format() }, 400) on failure, and
then use the validated sort values for ordering instead of trusting the raw
query.sort.
    .flatMap(([rawKey, value]) => {
      if (rawKey.endsWith('_in')) {
        const key = rawKey.slice(0, -3);
        if (!FILTERABLE_COLUMNS.has(key)) return [];

        const schema = FILTER_COERCE[key as keyof typeof FILTER_COERCE];
        if (!schema) return [];

        try {
          const parsedInValues = JSON.parse(String(value));
          if (!Array.isArray(parsedInValues)) return [];

          const coercedValues = parsedInValues
            .map((item) => schema.safeParse(item))
            .filter((result) => result.success)
            .map((result) => result.data);

          if (coercedValues.length === 0) return [];

          return [inArray(${tableName}[key as keyof typeof ${tableName}] as never, coercedValues as never[])];
        } catch {
          return [];
        }
      }

      if (!FILTERABLE_COLUMNS.has(rawKey)) return [];
      const schema = FILTER_COERCE[rawKey as keyof typeof FILTER_COERCE];
      if (!schema) return [];

      const parsed = schema.safeParse(value);
      if (!parsed.success) return [];

      return [eq(${tableName}[rawKey as keyof typeof ${tableName}] as never, parsed.data as never)];
    });
Invalid filters currently fail open; consider returning 400.
Unknown keys or failed coercions are silently dropped, which can return unfiltered results and hide client bugs. Consider collecting invalid filters and rejecting the request.
🔧 Suggested update
const filters = Object.entries(queryParams).filter(
([key, value]) => key !== 'limit' && key !== 'offset' && key !== 'sort' && value !== undefined,
);
+ const invalidFilters: string[] = [];
let query = db.select().from(${tableName}).$dynamic();
if (filters.length > 0) {
const conditions = filters
.flatMap(([rawKey, value]) => {
if (rawKey.endsWith('_in')) {
const key = rawKey.slice(0, -3);
- if (!FILTERABLE_COLUMNS.has(key)) return [];
+ if (!FILTERABLE_COLUMNS.has(key)) { invalidFilters.push(rawKey); return []; }
const schema = FILTER_COERCE[key as keyof typeof FILTER_COERCE];
- if (!schema) return [];
+ if (!schema) { invalidFilters.push(rawKey); return []; }
try {
const parsedInValues = JSON.parse(String(value));
- if (!Array.isArray(parsedInValues)) return [];
+ if (!Array.isArray(parsedInValues)) { invalidFilters.push(rawKey); return []; }
const coercedValues = parsedInValues
.map((item) => schema.safeParse(item))
.filter((result) => result.success)
.map((result) => result.data);
- if (coercedValues.length === 0) return [];
+ if (coercedValues.length === 0) { invalidFilters.push(rawKey); return []; }
return [inArray(${tableName}[key as keyof typeof ${tableName}] as never, coercedValues as never[])];
} catch {
- return [];
+ invalidFilters.push(rawKey);
+ return [];
}
}
- if (!FILTERABLE_COLUMNS.has(rawKey)) return [];
+ if (!FILTERABLE_COLUMNS.has(rawKey)) { invalidFilters.push(rawKey); return []; }
const schema = FILTER_COERCE[rawKey as keyof typeof FILTER_COERCE];
- if (!schema) return [];
+ if (!schema) { invalidFilters.push(rawKey); return []; }
const parsed = schema.safeParse(value);
- if (!parsed.success) return [];
+ if (!parsed.success) { invalidFilters.push(rawKey); return []; }
return [eq(${tableName}[rawKey as keyof typeof ${tableName}] as never, parsed.data as never)];
});
+ if (invalidFilters.length > 0) {
+ return c.json({ error: 'Invalid filters', details: invalidFilters }, 400);
+ }
if (conditions.length > 0) {
query = query.where(and(...conditions));
}
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@betterbase/packages/cli/src/commands/generate.ts` around lines 114 - 147, The
current filter parsing in the generation logic silently drops invalid or unknown
filters (occurring in the flatMap handling of rawKey -> FILTERABLE_COLUMNS /
FILTER_COERCE and schema.safeParse branches), which can return unfiltered
results; modify the logic to collect failures (unknown keys, non-array for *_in,
parse errors, or schema.safeParse failures) into an errors list while building
the filters (references: FILTERABLE_COLUMNS, FILTER_COERCE, the inArray/eq
return paths and the tableName usage), and if any errors exist return a 400
response with the collected error details instead of proceeding—ensure the same
locations that currently "return []" push an error message into the collector
and halt with 400 after processing.
function splitStatements(sql: string): string[] {
  return sql
    .split(/;\s*/g)
    .map((statement) => statement.trim())
    .filter((statement) => statement.length > 0);
  const statements: string[] = [];
  let current = '';
  let inSingle = false;
  let inDouble = false;
  let inBacktick = false;
  let inLineComment = false;
  let inBlockComment = false;

  for (let i = 0; i < sql.length; i += 1) {
    const ch = sql[i];
    const next = sql[i + 1];

    if (inLineComment) {
      current += ch;
      if (ch === '
') {
        inLineComment = false;
      }
      continue;
    }

    if (inBlockComment) {
      current += ch;
      if (ch === '*' && next === '/') {
        current += next;
        i += 1;
        inBlockComment = false;
      }
      continue;
    }

    if (!inSingle && !inDouble && !inBacktick && ch === '-' && next === '-') {
      current += ch + next;
      i += 1;
      inLineComment = true;
      continue;
    }

    if (!inSingle && !inDouble && !inBacktick && ch === '/' && next === '*') {
      current += ch + next;
      i += 1;
      inBlockComment = true;
      continue;
    }

    if (!inDouble && !inBacktick && ch === "'") {
      current += ch;
      if (inSingle && next === "'") {
        current += next;
        i += 1;
        continue;
      }
      inSingle = !inSingle;
      continue;
    }

    if (!inSingle && !inBacktick && ch === '"') {
      current += ch;
      if (inDouble && next === '"') {
        current += next;
        i += 1;
        continue;
      }
      inDouble = !inDouble;
      continue;
    }

    if (!inSingle && !inDouble && ch === '`') {
      current += ch;
      if (inBacktick && next === '`') {
        current += next;
        i += 1;
        continue;
      }
      inBacktick = !inBacktick;
      continue;
    }

    if (ch === ';' && !inSingle && !inDouble && !inBacktick && !inLineComment && !inBlockComment) {
      const statement = current.trim();
      if (statement.length > 0) {
        statements.push(statement);
      }
      current = '';
      continue;
    }

    current += ch;
  }

  const tail = current.trim();
  if (tail.length > 0) {
    statements.push(tail);
  }

  return statements;
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
set -euo pipefail
# Read lines 260-320 with line numbers to inspect the exact code
nl -ba betterbase/packages/cli/src/commands/migrate.ts | sed -n '260,320p'
Length of output: 108
🏁 Script executed:
#!/bin/bash
set -euo pipefail
# Read lines 260-320 with line numbers using cat -n
cat -n betterbase/packages/cli/src/commands/migrate.ts | sed -n '260,320p'
Length of output: 1910
🏁 Script executed:
#!/bin/bash
set -euo pipefail
# Check for TypeScript configuration
find . -name "tsconfig.json" -o -name "tsconfig*.json" | head -5
Length of output: 274
🏁 Script executed:
#!/bin/bash
set -euo pipefail
# Check TypeScript configuration for strict mode
cat betterbase/packages/cli/tsconfig.json
Length of output: 226
🏁 Script executed:
#!/bin/bash
set -euo pipefail
# Check base TypeScript configuration
cat betterbase/tsconfig.base.json
Length of output: 384
🏁 Script executed:
#!/bin/bash
set -euo pipefail
# Check imports and Zod usage in the file
head -50 betterbase/packages/cli/src/commands/migrate.ts
Length of output: 1511
🏁 Script executed:
#!/bin/bash
set -euo pipefail
# Check if splitStatements is exported and used
grep -n "splitStatements" betterbase/packages/cli/src/commands/migrate.ts | head -20
Length of output: 123
🏁 Script executed:
#!/bin/bash
set -euo pipefail
# Check if splitStatements is exported and used
grep -n "splitStatements" betterbase/packages/cli/src/commands/migrate.ts
Length of output: 173
🏁 Script executed:
#!/bin/bash
set -euo pipefail
# Check context around the usage of splitStatements
cat -n betterbase/packages/cli/src/commands/migrate.ts | sed -n '380,390p'
Length of output: 545
🏁 Script executed:
#!/bin/bash
set -euo pipefail
# Check the exact bytes at lines 280-281 to confirm literal newline vs escaped \n
sed -n '280,281p' betterbase/packages/cli/src/commands/migrate.ts | od -c
Length of output: 108
🏁 Script executed:
#!/bin/bash
set -euo pipefail
# Use Python to inspect the exact bytes at lines 280-281
python <<'EOF'
with open('betterbase/packages/cli/src/commands/migrate.ts', 'r') as f:
lines = f.readlines()
# Get lines 279-280 (0-indexed)
for i in range(279, 281):
if i < len(lines):
line = lines[i]
print(f"Line {i+1}: {repr(line)}")
EOF
Length of output: 117
Fix critical syntax error in line-comment parsing logic.
Lines 280-281 contain a literal newline character inside the string literal instead of an escaped newline. The code has:
if (ch === '
') {
This is invalid TypeScript and will fail compilation. Change to:
if (ch === '\n') {
This breaks the line-comment termination logic, causing the SQL parser to fail on SQL files with line comments (e.g., -- comment).
Additionally, the splitStatements function accepts a sql: string parameter without Zod validation. Add input validation using the Zod schema pattern already established in this file (see migrateOptionsSchema).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@betterbase/packages/cli/src/commands/migrate.ts` around lines 265 - 361, The
splitStatements function has a syntax bug and missing input validation: replace
the invalid literal check if (ch === '
') with the escaped newline check if (ch === '\n') inside splitStatements to
correctly terminate inLineComment, and add Zod validation for the sql parameter
(use the same pattern as migrateOptionsSchema) so splitStatements validates that
sql is a string before processing; update the function signature/entry to run
that schema parse/coerce (or throw) and then proceed with the existing parsing
logic.
  select(fields = '*'): this {
    this.assertMutable();
    this.selectFields = stringSchema.parse(fields);
    return this;
  }

  eq(column: string, value: unknown): this {
    this.assertMutable();
    this.filters[stringSchema.parse(column)] = value;
    return this;
  }

  in(column: string, values: unknown[]): this {
    this.assertMutable();
    const parsedValues = valuesSchema.parse(values);
    this.filters[`${stringSchema.parse(column)}_in`] = JSON.stringify(parsedValues);
    return this;
  }

  limit(count: number): this {
    this.assertMutable();
    this.options.limit = nonNegativeIntSchema.parse(count);
    return this;
  }

  offset(count: number): this {
    this.assertMutable();
    this.options.offset = nonNegativeIntSchema.parse(count);
    return this;
  }

  order(column: string, direction: 'asc' | 'desc' = 'asc'): this {
    this.assertMutable();
    this.options.orderBy = { column: stringSchema.parse(column), direction };
    return this;
  }
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Look for _in query parsing on server routes/templates
rg -n "_in|JSON.parse|filter" betterbase/templates/base/src -g '*.ts' -C 3
Length of output: 5935
🏁 Script executed:
#!/bin/bash
# Search for API routes and handlers that might process _in parameters
# Look in common API route directories
find betterbase/templates/base/src -type f \( -name "*route*" -o -name "*api*" -o -name "*handler*" \) -name "*.ts" | head -20
Length of output: 48
🏁 Script executed:
#!/bin/bash
# List the directory structure
ls -la betterbase/templates/base/src/
tree betterbase/templates/base/src -I node_modules -L 2
Length of output: 727
🏁 Script executed:
#!/bin/bash
# Check routes/index.ts
echo "=== routes/index.ts ==="
cat betterbase/templates/base/src/routes/index.ts
echo ""
echo "=== routes/users.ts ==="
cat betterbase/templates/base/src/routes/users.ts
echo ""
echo "=== db/index.ts ==="
cat betterbase/templates/base/src/db/index.ts
Length of output: 3841
🏁 Script executed:
#!/bin/bash
# Read the full query-builder.ts file
cat -n betterbase/packages/client/src/query-builder.ts
Length of output: 9191
Implement server-side parsing for _in filter parameters.
The in() method serializes values as JSON strings under ${column}_in keys and sends them as URL query parameters (line 48, 79-81). However, the server routes (users.ts) don't parse or deserialize these _in parameters—they're silently ignored. All _in filter conditions will be dropped from queries.
Add Zod validation and JSON deserialization in the server route handlers to process _in parameters, or update the client to transmit filters in a format the server currently handles (e.g., explicit query parameter structure with pre-parsed values).
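As an illustration of the server-side option, the handler could deserialize and validate each `_in` parameter before building conditions. This is a dependency-free sketch: `parseInFilter` is a hypothetical helper, and `coerceValue` stands in for the per-column zod schema's `safeParse` that the generated routes already hold in `FILTER_COERCE`:

```typescript
// Stand-in for a zod SafeParseReturnType so the sketch is self-contained.
type ParseResult = { success: true; data: unknown } | { success: false };

// Deserialize a `${column}_in` query value: JSON-parse it, require an array,
// and coerce every item; return null on any failure instead of silently
// dropping the filter, so the route can respond with a 400.
function parseInFilter(
  rawValue: string,
  coerceValue: (item: unknown) => ParseResult,
): unknown[] | null {
  let parsed: unknown;
  try {
    parsed = JSON.parse(rawValue);
  } catch {
    return null; // not valid JSON
  }
  if (!Array.isArray(parsed)) return null;

  const coerced: unknown[] = [];
  for (const item of parsed) {
    const result = coerceValue(item);
    if (!result.success) return null; // reject rather than drop the item
    coerced.push(result.data);
  }
  return coerced.length > 0 ? coerced : null;
}
```

Returning null (rather than an empty condition list) gives the route a hook to reply with a 400 instead of ignoring the filter.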
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@betterbase/packages/client/src/query-builder.ts` around lines 33 - 68, The
client is sending `_in` filters as JSON strings via QueryBuilder.in(), but the
server route handlers are not deserializing them so those filters are ignored;
fix the server handler that processes incoming query params (the users
route/list handler) to detect keys ending with `_in`, JSON.parse the value,
validate with the same valuesSchema (or a Zod array schema), and replace the
string with the parsed array before applying filters; alternatively you can
change the client QueryBuilder.in() to send the array directly (remove
JSON.stringify in QueryBuilder.in and send parsedValues) but if you choose
server-side, use valuesSchema.parse(parsed) and stringSchema.parse for column
names to ensure type-safety when applying the filter.
  private sendSubscribeAll(table: string): void {
    const tableSubscribers = this.subscriptions.get(table);
    if (!tableSubscribers || tableSubscribers.size === 0) {
      return;
    }

    for (const subscriber of tableSubscribers.values()) {
      this.sendSubscribe(table, subscriber.filter);
    }
  }
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Inspect server subscription storage and protocol fields
echo "=== SERVER REALTIME IMPLEMENTATION ==="
rg -n "subscriptions|subscribe|filter" betterbase/templates/base/src/lib/realtime.ts -C 3
echo ""
echo "=== CLIENT REALTIME IMPLEMENTATION ==="
rg -n "sendSubscribe|subscribe" betterbase/packages/client/src/realtime.ts -C 3
Length of output: 7876
🏁 Script executed:
sed -n '135,189p' betterbase/packages/client/src/realtime.ts | cat -n
Length of output: 2045
Server stores only the last per-table filter; earlier subscribers receive mismatched data.
The client can register multiple subscribers with different filters on the same table via from(table).on(...).subscribe(filter) (lines 135-158), with each call to sendSubscribe() sending a separate filter. However, the server's client.subscriptions.set(table, { table, filter }) (line 209) overwrites the previous subscription for that table. When broadcasting, the server retrieves only this single subscription and applies its filter (lines 168-169), causing earlier subscribers to miss updates that don't match the last registered filter.
Consider either:
- Enforcing one subscription per table on the client (merge or replace callbacks), or
- Extending the protocol with a subscription ID field so the server can track multiple filters per client/table.
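The second option can be sketched as follows. `ClientSubscriptions` and its method names are hypothetical; a real server would reuse its existing deep-equality filter comparison (the template already depends on fast-deep-equal) instead of the strict-equality check used here for brevity:

```typescript
interface SubscriptionEntry {
  table: string;
  filter?: Record<string, unknown>;
}

// Server-side storage keyed by (table, subId) so multiple filters on the
// same table coexist instead of the last subscribe() clobbering the rest.
class ClientSubscriptions {
  private byTable = new Map<string, Map<string, SubscriptionEntry>>();

  subscribe(subId: string, entry: SubscriptionEntry): void {
    const forTable = this.byTable.get(entry.table) ?? new Map<string, SubscriptionEntry>();
    forTable.set(subId, entry); // later filters no longer overwrite earlier ones
    this.byTable.set(entry.table, forTable);
  }

  // Return the subIds whose filter matches a broadcast row; an entry with no
  // filter matches every row.
  matching(table: string, row: Record<string, unknown>): string[] {
    const forTable = this.byTable.get(table);
    if (!forTable) return [];
    const ids: string[] = [];
    forTable.forEach((entry, subId) => {
      const filter = entry.filter ?? {};
      const matches = Object.entries(filter).every(([key, value]) => row[key] === value);
      if (matches) ids.push(subId);
    });
    return ids;
  }
}
```

With per-subId storage, a second subscribe() on the same table adds a sibling entry, and the broadcast path routes each event only to the subscribers whose filter it satisfies.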
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@betterbase/packages/client/src/realtime.ts` around lines 60 - 69, The bug is
that multiple client-side subscribers for the same table use different filters
but the server only keeps the last filter via client.subscriptions.set(table, {
table, filter }), so earlier subscribers get wrong data; fix by extending the
protocol to include a subscription ID: update client.sendSubscribeAll and
sendSubscribe to send a unique subId per subscriber (from the subscriptions map
entries), update server-side storage to use client.subscriptions.set(table,
Map<subId, {table,filter,clientId}>) (or similar) and update broadcast logic
(where it reads client.subscriptions.get(table)) to iterate all subIds and apply
each stored filter when routing events; alternatively, if you prefer
single-subscription semantics, change the client API
(from(...).on(...).subscribe) to merge/replace callbacks per table so only one
filter is ever sent by sendSubscribe/sendSubscribeAll.
  from(table: string): {
    on: <T = unknown>(event: RealtimeEvent, callback: RealtimeCallback<T>) => {
      subscribe: (filter?: Record<string, unknown>) => RealtimeSubscription;
    };
  } {
    return {
      on: (event, callback) => ({
        subscribe: (filter) => {
          if (!this.disabled) {
            this.connect();
          }

          const tableSubscribers = this.subscriptions.get(table) ?? new Map<string, SubscriberEntry>();
          const id = `${table}:${this.subscriberSequence++}`;

          tableSubscribers.set(id, {
            event,
            filter,
            callback: (payload) => callback(payload as Parameters<typeof callback>[0]),
          });

          this.subscriptions.set(table, tableSubscribers);
          if (!this.disabled) {
            this.sendSubscribe(table, filter);
          }
Add Zod validation for public inputs (table, filter).
These are public entry points and currently accept arbitrary values. Add schemas (e.g., z.string().min(1) for table, z.record(z.string(), z.unknown()) for filter) to align with the project requirement for runtime validation.
✅ Suggested direction
+import { z } from 'zod';
+
+const tableSchema = z.string().min(1);
+const filterSchema = z.record(z.string(), z.unknown()).optional();
@@
- from(table: string): {
+ from(table: string): {
@@
- subscribe: (filter) => {
+ subscribe: (filter) => {
+ const parsedTable = tableSchema.parse(table);
+ const parsedFilter = filterSchema.parse(filter);
@@
- const tableSubscribers = this.subscriptions.get(table) ?? new Map<string, SubscriberEntry>();
+ const tableSubscribers = this.subscriptions.get(parsedTable) ?? new Map<string, SubscriberEntry>();
@@
- this.subscriptions.set(table, tableSubscribers);
+ this.subscriptions.set(parsedTable, tableSubscribers);
@@
- if (!this.disabled) {
- this.sendSubscribe(table, filter);
- }
+ if (!this.disabled) {
+ this.sendSubscribe(parsedTable, parsedFilter);
+ }
As per coding guidelines, Implement Zod validation everywhere for type safety.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@betterbase/packages/client/src/realtime.ts` around lines 135 - 159, The
public from(table: string) entry needs runtime Zod validation for the table and
filter parameters; add a zod schema (e.g., tableSchema = z.string().min(1) and
filterSchema = z.record(z.string(), z.unknown()).optional()) and validate the
input at the start of from(table: string) and inside subscribe(filter) before
using this.connect(), creating subscriber entries, calling
this.sendSubscribe(table, filter), or storing in this.subscriptions; on
validation failure throw or return a clear error to prevent storing invalid keys
in the subscriptions Map or sending invalid subscribe messages.
    "lib": [
      "ES2022",
      "DOM"
    ],
    "types": [
      "bun"
    ]
types: ["bun"] creates a cross-environment type safety gap for a universal SDK.
lib: ["ES2022", "DOM"] signals that this client package is intended to run in browser environments (and presumably Node.js). Adding types: ["bun"] injects all Bun-specific globals — Bun, bun:* module augmentations, etc. — into the compilation scope of every source file. This means TypeScript will accept calls to Bun-only APIs (e.g., Bun.file(), Bun.serve(), bun:sqlite) without error, even though those APIs simply don't exist at runtime in browsers or Node.js. The result is a silent, runtime-only failure that type-checking won't catch.
The Bun runtime is only needed to run the build toolchain (bunx tsc, bun build), not to type-check client source code. Bun types belong in the build/CLI packages, not in the distributed SDK sources.
🛠️ Proposed fix
"compilerOptions": {
"outDir": "./dist",
"declaration": true,
"lib": [
"ES2022",
"DOM"
- ],
- "types": [
- "bun"
]
},
If any file under src/ genuinely needs a Bun-only feature, isolate it behind a conditional or a separate entry-point and document the Bun requirement explicitly.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Original:
    "lib": [
      "ES2022",
      "DOM"
    ],
    "types": [
      "bun"
    ]
Suggested:
    "lib": [
      "ES2022",
      "DOM"
    ]
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@betterbase/packages/client/tsconfig.json` around lines 6 - 12, Remove the Bun
ambient types from the client package TS config so Bun globals don't leak into
the universal SDK: delete "bun" from the "types" array in the package's
tsconfig.json and instead add Bun types only in the build/CLI packages (where
bunx/ build tooling runs); if any source under src/ genuinely requires Bun-only
APIs (e.g., uses Bun.file(), Bun.serve(), or bun:sqlite), isolate that logic
into a separate entry file or a conditionally-imported module and document the
Bun requirement so the main SDK remains browser/Node type-safe.
    "hono": "^4.6.10",
-   "zod": "^3.23.8",
+   "zod": "^4.0.0",
    "drizzle-orm": "^0.44.5"
🧩 Analysis chain
🏁 Script executed:
# First, explore the betterbase/templates/base directory structure
cd betterbase/templates/base && find . -type f -name "*.ts" -o -name "*.tsx" -o -name "*.js" -o -name "*.json" | head -30
Length of output: 348
🏁 Script executed:
# List the contents of the template directory
ls -la betterbase/templates/base/
Length of output: 679
🏁 Script executed:
# Check the package.json to understand the project structure
head -40 betterbase/templates/base/package.json
Length of output: 683
🏁 Script executed:
# Check validation.ts for Zod usage
cat -n betterbase/templates/base/src/middleware/validation.ts
Length of output: 737
🏁 Script executed:
# Check routes for Zod patterns
cat -n betterbase/templates/base/src/routes/users.ts
Length of output: 3209
🏁 Script executed:
# Check other routes
cat -n betterbase/templates/base/src/routes/index.tsRepository: weroperking/Betterbase
Length of output: 1189
🏁 Script executed:
# Search for fast-deep-equal usage across the template
rg -n "fast-deep-equal" betterbase/templates/base/srcRepository: weroperking/Betterbase
Length of output: 154
🏁 Script executed:
# Search for any .message parameter usage (Zod v3 pattern)
rg -n "message\s*:" betterbase/templates/base/srcRepository: weroperking/Betterbase
Length of output: 1220
🏁 Script executed:
# Check how fast-deep-equal is used in realtime.ts
cat -n betterbase/templates/base/src/lib/realtime.ts | head -100Repository: weroperking/Betterbase
Length of output: 4071
🏁 Script executed:
# Verify if there are other deprecated Zod patterns
rg -n "z\.(string|number|boolean)\(\).*\.(email|uuid|url|ip|datetime|date|time)" betterbase/templates/base/srcRepository: weroperking/Betterbase
Length of output: 143
Replace deprecated z.string().email() with z.email() in Zod v4.
In src/routes/users.ts:10, the pattern z.string().email() is deprecated in Zod v4 and should be replaced with the standalone validator z.email(). While the current code will continue to work, it will trigger deprecation warnings and may break in future releases.
Change required:

```diff
 export const createUserSchema = z.object({
-  email: z.string().email(),
+  email: z.email(),
   name: z.string().min(1),
 });
```

The fast-deep-equal dependency addition is justified; it's actively used in src/lib/realtime.ts for subscription filter comparison.
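To illustrate the kind of comparison fast-deep-equal performs for subscription filters, here is a simplified, self-contained stand-in for its default export `equal(a, b)` (the real library also handles Dates, RegExps, Maps, Sets, and typed arrays); the `Filter` shape and usage are hypothetical, not taken from realtime.ts:

```typescript
// Simplified stand-in for fast-deep-equal's `equal(a, b)`:
// structural recursion over plain objects and arrays.
function deepEqual(a: unknown, b: unknown): boolean {
  if (a === b) return true;
  if (typeof a !== "object" || typeof b !== "object" || a === null || b === null) {
    return false;
  }
  const keysA = Object.keys(a as object);
  const keysB = Object.keys(b as object);
  if (keysA.length !== keysB.length) return false;
  return keysA.every((k) =>
    deepEqual((a as Record<string, unknown>)[k], (b as Record<string, unknown>)[k])
  );
}

// Hypothetical subscription-filter comparison, sketching the realtime use case:
type Filter = { table: string; where?: Record<string, unknown> };
const current: Filter = { table: "users", where: { role: "admin" } };
const incoming: Filter = { table: "users", where: { role: "admin" } };

console.log(deepEqual(current, incoming));           // true: reuse the subscription
console.log(deepEqual(current, { table: "posts" })); // false: new subscription needed
```

Reference equality (`===`) would report the two structurally identical filters above as different, which is why a deep comparison is needed here.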
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@betterbase/templates/base/package.json` at line 15: replace the deprecated
z.string().email() usage with the new standalone validator z.email() in the user
input schema defined for the user routes (look for the user schema variable or
createUser/createUserSchema in the users route handler). Locate every occurrence
of z.string().email() in that schema and change it to z.email(), keeping the
rest of the validation chain intact; no import changes are required since
z.email() is exported from Zod itself.
Summary by CodeRabbit
Release Notes
New Features
Improvements
Documentation
Dependencies