
feat: Sprint 2 — Shared types, database layer, and credential vault#40

Merged
Work90210 merged 5 commits into master from feat/sprint-2-db-layer on Mar 19, 2026

Conversation

Owner

@Work90210 Work90210 commented Mar 19, 2026

Summary

  • Shared types package (packages/types/src/) — 11 type files with branded PlaintextKey, discriminated union ApiResponse, derived CredentialAuthType, BaseEvent shared base, HttpMethod union, and createPlaintextKey runtime factory
  • Drizzle ORM schema (apps/web/lib/db/schema/) — 6 tables (specs, mcp_servers, mcp_tools, credentials, usage_events, request_logs) with CHECK constraints, composite unique indexes, partial indexes on is_active, FK cascades, and updatedAt triggers
  • Repository pattern (apps/web/lib/db/repositories/) — 6 repositories with userId scoping on every query, transaction-wrapped mutations, LIKE pattern escaping, explicit field allowlists, query limits, and safeColumns projection excluding encryptedKey
  • AES-256-GCM credential vault (apps/web/lib/vault/) — 600k-iteration PBKDF2-SHA256, per-installation salt, versioned ciphertext envelope (v1), constant-time cache comparison via timingSafeEqual, key buffer zeroing, and uniform error messages
  • Vault rotation script (scripts/rotate-vault-secret.ts) — per-batch transactions, pre-derived keys, round-trip verification before commit, cursor-based pagination, key zeroing in finally
  • Infrastructure hardening — nginx TLS with HTTP→HTTPS redirect, CSP header, HSTS, X-Request-ID forwarding, secure Redis healthcheck, VAULT_SALT in docker-compose
  • 56 tests (34 vault + 22 types) — all passing
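As a rough illustration of the branded-type approach mentioned above (the actual `packages/types` implementation may differ — the brand field name and validation rules here are assumptions), a branded `PlaintextKey` with a runtime factory can be sketched as:

```typescript
// Hypothetical sketch of the branded PlaintextKey type and its factory.
// The __brand field name and the length checks are illustrative only.
type PlaintextKey = string & { readonly __brand: 'PlaintextKey' };

// Runtime factory: the only sanctioned way to produce a PlaintextKey,
// so unvalidated strings cannot flow into vault APIs by accident.
function createPlaintextKey(value: string): PlaintextKey {
  if (value.length === 0 || value.length > 4096) {
    throw new Error('Invalid plaintext key');
  }
  return value as PlaintextKey;
}
```

Because the brand exists only at the type level, the factory adds no runtime cost beyond validation, while APIs typed against `PlaintextKey` reject raw strings at compile time.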

Security design

  • Row-level access control: every query scoped by userId; tool/credential ownership verified via server join
  • TOCTOU prevention: ownership re-verified in DML WHERE clauses within transactions
  • Credential isolation: encryptedKey excluded from all list/detail projections via safeColumns; only getDecryptedKey accesses it with expiry enforcement
  • Input safety: LIKE metacharacters escaped, filter lengths capped at 200 chars, query results limited (100 default / 500 max)
  • Vault: AES-256-GCM with random 12-byte IV per encryption, versioned ciphertext envelope, constant-time cache comparison, key material zeroed on eviction
  • Infrastructure: TLS-only with HSTS preload, CSP, rate limiting, no exposed database/Redis ports

Test plan

  • pnpm --filter @model-translator/types test — 22 tests passing
  • pnpm --filter @model-translator/web test — 34 tests passing
  • pnpm --filter @model-translator/types build — clean
  • pnpm --filter @model-translator/web typecheck — clean
  • pnpm --filter @model-translator/types lint — clean
  • Run migration against real Postgres: psql < apps/web/lib/db/migrations/0001_initial_schema.sql
  • Verify VAULT_SECRET and VAULT_SALT are set in deployment environment
  • Integration tests against real Postgres (future sprint)

KyleFuehri added 2 commits March 19, 2026 07:38
- Remove NODE_AUTH_TOKEN from release workflow (no more npm tokens)
- npm Trusted Publisher configured for Work90210/model-translator + release.yml
- All future publishes authenticated via OIDC with provenance
…ntial vault

Sprint 2 implementation — shared types package, Drizzle ORM schema with 6 tables,
repository pattern with row-level security, AES-256-GCM credential vault with
PBKDF2 key derivation, and vault secret rotation script.

Key deliverables:
- 11 shared type files (spec, server, tool, credential, usage, log, api, errors,
  pagination, base-event, index) with branded PlaintextKey type, discriminated
  union ApiResponse, and derived CredentialAuthType
- 6 Drizzle table schemas with full indexes, CHECK constraints, composite uniques,
  partial indexes on is_active, and updatedAt triggers
- 6 repository files (base + spec, server, tool, credential, log) with userId
  scoping on all queries, transaction-wrapped mutations, LIKE pattern escaping,
  explicit field allowlists, and query limits
- AES-256-GCM vault with 600k-iteration PBKDF2, versioned ciphertext envelope,
  constant-time cache comparison, key buffer zeroing, and uniform error messages
- Vault rotation script with per-batch transactions, pre-derived keys, round-trip
  verification, and cursor-based pagination
- 56 tests (34 vault, 22 types) covering round-trip encryption, tamper detection,
  wrong key rejection, error uniformity, API response narrowing, and type safety
- Nginx hardened with TLS, CSP, HSTS, X-Request-ID, and rate limiting
- Docker Compose updated with VAULT_SALT env var and secure Redis healthcheck

Security: 5 rounds of multi-agent audit — all CRITICAL, HIGH, and MEDIUM issues resolved.
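For orientation, the versioned AES-256-GCM envelope described above can be sketched as follows. The iteration count, IV size, and `v1` version tag come from the PR description; the envelope layout (`version.iv.tag.ciphertext`) and function shapes are assumptions, not the repo's exact code:

```typescript
// Illustrative sketch of a versioned AES-256-GCM envelope with PBKDF2-SHA256
// key derivation. Envelope layout is an assumption for this sketch.
import { createCipheriv, createDecipheriv, pbkdf2Sync, randomBytes } from 'node:crypto';

const VERSION = 'v1';
const ITERATIONS = 600_000; // PBKDF2-SHA256 iterations, per the PR
const IV_LENGTH = 12;       // random 96-bit IV per encryption

function deriveKey(secret: string, salt: string): Buffer {
  return pbkdf2Sync(secret, salt, ITERATIONS, 32, 'sha256');
}

function encrypt(plaintext: string, key: Buffer): string {
  const iv = randomBytes(IV_LENGTH);
  const cipher = createCipheriv('aes-256-gcm', key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  const tag = cipher.getAuthTag();
  // Versioned envelope: version.iv.tag.ciphertext (base64 fields)
  return [VERSION, iv.toString('base64'), tag.toString('base64'), ciphertext.toString('base64')].join('.');
}

function decrypt(envelope: string, key: Buffer): string {
  const [version, iv, tag, ciphertext] = envelope.split('.');
  if (version !== VERSION) throw new Error('Decryption failed'); // uniform error
  const decipher = createDecipheriv('aes-256-gcm', key, Buffer.from(iv, 'base64'));
  decipher.setAuthTag(Buffer.from(tag, 'base64'));
  try {
    return Buffer.concat([decipher.update(Buffer.from(ciphertext, 'base64')), decipher.final()]).toString('utf8');
  } catch {
    throw new Error('Decryption failed'); // uniform error, no tamper oracle
  }
}
```

The version prefix lets a future `v2` format (new KDF parameters or cipher) coexist with old rows, and the uniform `'Decryption failed'` message avoids leaking whether the key, IV, or auth tag was at fault.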

coderabbitai Bot commented Mar 19, 2026

Warning

Rate limit exceeded

@Work90210 has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 5 minutes and 18 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

Run ID: 88045179-489f-42bb-bbf7-26430f472e9c

📥 Commits

Reviewing files that changed from the base of the PR and between a4af398 and a968ac3.

📒 Files selected for processing (7)
  • apps/web/lib/db/repositories/credential.repository.ts
  • apps/web/lib/db/repositories/server.repository.ts
  • apps/web/lib/db/repositories/spec.repository.ts
  • apps/web/lib/db/repositories/tool.repository.ts
  • apps/web/lib/vault/__tests__/vault.test.ts
  • packages/types/src/credential.ts
  • packages/types/src/index.ts
📝 Walkthrough

Walkthrough

Adds a Drizzle/Postgres persistence layer (schemas, migration, repositories), a vault crypto module (PBKDF2 key derivation, AES‑GCM encrypt/decrypt, validation, caching), a secret rotation script, expanded TypeScript domain types and tests, and CI/infra/env updates for vault salt and pool sizing.

Changes

Cohort / File(s) Summary
Environment & CI
`.env.example`, `infra/docker-compose.yml`, `turbo.json`, `.github/workflows/release.yml`
Replaced hardcoded dev secrets with REPLACE_ME, added DATABASE_POOL_MAX and VAULT_SALT, adjusted Docker healthcheck auth, and changed the publish step to set NPM_CONFIG_PROVENANCE=true instead of exposing token envs.
DB Config & Client
apps/web/lib/db/drizzle.config.ts, apps/web/lib/db/index.ts
Added Drizzle Kit config and a lazily-initialized Postgres-js + Drizzle client honoring DATABASE_URL and DATABASE_POOL_MAX with PgBouncer-friendly options.
Migrations
apps/web/lib/db/migrations/0001_initial_schema.sql
Added initial migration creating tables (specs, mcp_servers, mcp_tools, credentials, usage_events, request_logs), indexes, constraints, and update_updated_at() triggers.
Schema Definitions
apps/web/lib/db/schema/*.ts, apps/web/lib/db/schema/index.ts
Defined Drizzle pgTable schemas and re-export index for specs, servers, tools, credentials, usage events, and request logs (FKs, defaults, UUID PKs, indexes).
Repository Base & Constants
apps/web/lib/db/repositories/base.repository.ts, .../constants.ts
Added generic BaseRepository (typed CRUD signatures, freeze helpers) and query constants plus escapeLikePattern() helper.
Repository Implementations
apps/web/lib/db/repositories/*.ts
Added Spec/Server/Tool/Credential/Log repositories with user-scoped queries, ownership checks, transactional create/update flows, append-only logs, and credential encrypt/decrypt integration.
Vault Crypto Module
apps/web/lib/vault/*.ts, apps/web/lib/vault/__tests__/vault.test.ts
Implemented PBKDF2 deriveKey with secure cache/zeroing, AES‑256‑GCM encrypt/decrypt with versioning and auth-tag checks, validators for secret/salt, high-level helpers, and comprehensive Vitest tests (round‑trip, tamper, caching, validation).
Secret Rotation Script
scripts/rotate-vault-secret.ts
Added batch rotation script to re-encrypt credentials rows from old→new secret+salt with verification, transactional updates, and zeroing of key material.
Type Definitions
packages/types/src/*.ts, packages/types/src/index.ts
Added domain types (Spec, McpServer, McpTool, Credential, BaseEvent, UsageEvent, RequestLog), pagination/api helpers, error codes/http mapping, and value constructors (e.g., createPlaintextKey, createSuccessResponse).
Testing & Tooling
apps/web/vitest.config.ts, apps/web/package.json, packages/types/package.json, packages/types/src/__tests__/*.ts
Added Vitest config and scripts, test suites for types and vault, and added Drizzle/postgres/zod and Vitest dev deps.
Infra: nginx
infra/nginx/nginx.conf
Enabled HTTPS (listen 443 ssl), added HTTP→HTTPS redirect, HSTS and CSP headers, removed obsolete map block, and forwarded X-Request-ID to upstreams.

Sequence Diagram(s)

sequenceDiagram
  participant Client as "Client"
  participant API as "Web API"
  participant Vault as "Vault Module"
  participant DB as "Postgres DB"

  Client->>API: POST /credentials (plaintextKey, metadata)
  API->>DB: verify server belongs to user (SELECT)
  DB-->>API: server row
  API->>Vault: deriveKey(secret,salt)
  Vault-->>API: key
  API->>Vault: encrypt(plaintextKey, key)
  Vault-->>API: ciphertext
  API->>DB: INSERT credential (ciphertext, metadata) within transaction
  DB-->>API: inserted row
  API-->>Client: 201 Created (credential metadata, no plaintext)

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~75 minutes

Possibly related PRs

Poem

🐰 I hopped through schemas, salts, and keys,

I cached, I zeroed, and danced with ease,
I rotated secrets, ran tests through night,
Now vaults are snug and migrations right,
A rabbit’s thump—your data sleeps tight. ✨

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
  • Docstring Coverage — ⚠️ Warning: docstring coverage is 4.35%, below the required 80.00% threshold. Resolution: write docstrings for the functions missing them.

✅ Passed checks (2 passed)
  • Title check — ✅ Passed: the title accurately summarizes the main change: adding a shared types package, database layer with Drizzle ORM, and credential vault infrastructure for Sprint 2.
  • Description check — ✅ Passed: the description provides comprehensive detail about the changeset, including summaries of each major component, security design principles, test coverage, and outstanding manual steps.


- Remove unused `beforeEach` and `vi` imports from vault tests
- Remove unused `PlaintextKey` type import from types tests
- Add blank lines between import groups (node: vs local)
- Sort local imports alphabetically (constants before modules)
- Replace inline `import()` type annotation with top-level import
Contributor

github-actions Bot commented Mar 19, 2026

Coverage Report for packages/transformer

  • Lines: 96.52% (🎯 80%) — 584 / 605
  • Statements: 96.52% (🎯 80%) — 584 / 605
  • Functions: 100% (🎯 80%) — 35 / 35
  • Branches: 94.67% (🎯 80%) — 231 / 244

File coverage: no changed files found.
Generated in workflow #45 for commit a968ac3 by the Vitest Coverage Report Action


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 23

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@apps/web/lib/db/migrations/0001_initial_schema.sql`:
- Around line 23-36: The migration currently only references parent ids (e.g.,
specs(id)) so tenant ownership (user_id) isn’t enforced; update the parent
tables (e.g., add UNIQUE or PRIMARY KEY constraint on (id, user_id) in specs and
any other parent tables) and change child table foreign keys (e.g., in
mcp_servers) to reference the composite (id, user_id) columns—i.e., add user_id
to the referenced key and create FOREIGN KEY (spec_id, user_id) REFERENCES
specs(id, user_id) (and similarly for other child tables mentioned around lines
60–70) so Postgres enforces tenant scoping at the DB level.

In `@apps/web/lib/db/repositories/credential.repository.ts`:
- Around line 111-125: The update method allows an empty UpdateCredentialInput
which leads to updateValues being empty and executing .set(updateValues) with no
changes; after building updateValues in the async update(userId: string, id:
string, input: UpdateCredentialInput) transaction, add a guard that checks if
updateValues is empty (e.g., Object.keys(updateValues).length === 0) and reject
the operation by throwing a suitable error (BadRequestError or Error with a
clear message like "no fields to update") so the repository does not execute
tx.update(credentials).set({}) and the caller receives a failure for empty
patches.
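The guard the reviewer asks for can be sketched in isolation — the input shape here is a hypothetical subset of `UpdateCredentialInput`, and the surrounding transaction code is elided:

```typescript
// Minimal sketch of an empty-patch guard before calling .set().
// Field names are illustrative; the real UpdateCredentialInput may differ.
interface UpdateCredentialInput {
  label?: string;
  expiresAt?: Date | null;
}

function buildUpdateValues(input: UpdateCredentialInput): Record<string, unknown> {
  const updateValues: Record<string, unknown> = {};
  if (input.label !== undefined) updateValues.label = input.label;
  if (input.expiresAt !== undefined) updateValues.expiresAt = input.expiresAt;
  // Reject before reaching the DB, so callers get a clear validation error
  // instead of Drizzle failing on .set({}).
  if (Object.keys(updateValues).length === 0) {
    throw new Error('No fields to update');
  }
  return updateValues;
}
```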

In `@apps/web/lib/db/repositories/log.repository.ts`:
- Around line 40-41: The query in findAll (where rows are cast with "as
RequestLog[]") is using an unsafe type assertion because Drizzle's inferred row
type doesn't match RequestLog; update the code to either: 1) align the Drizzle
table/schema definitions to match the RequestLog interface so the query result
is already typed as RequestLog[], or 2) change the query to explicitly select
columns and map them to RequestLog (similar to ToolRepository.findAll) before
calling freezeAll, and then return freezeAll(mappedRows) without the "as" cast;
adjust the function's return type if needed and remove the unsafe assertion
around rows.

In `@apps/web/lib/db/repositories/server.repository.ts`:
- Around line 94-114: The update method in server.repository.ts (async update in
the ServerRepository using mcpServers and freeze) must handle an empty
UpdateServerInput and align with ToolRepository.update's behavior: if
updateValues is empty (Object.keys(updateValues).length === 0) either throw a
validation error (e.g., "No fields to update") or return the existing server
row, and to be consistent consider wrapping the operation in the same
transaction+pre-check pattern used by ToolRepository.update (start a
transaction, select the mcpServers row for the given id+userId, throw if not
found, then perform the update and returning) so access checks and atomicity
match other repositories.

In `@apps/web/lib/db/repositories/spec.repository.ts`:
- Around line 24-28: The capped query in spec.repository.ts uses
.limit(DEFAULT_QUERY_LIMIT) without ordering, causing non-deterministic results;
add a stable ORDER BY before the LIMIT (e.g., .orderBy(specs.created_at, 'desc')
or .orderBy(specs.id, 'asc') on the same query chain that contains
this.db.select().from(specs).where(and(...conditions))). Also update the API to
support deterministic pagination (cursor or offset) so callers can page through
results reliably instead of relying on a moving subset.
- Around line 60-72: The update method builds updateValues and may call
.set(updateValues) with an empty object which makes Drizzle ORM fail; add a
validation in SpecRepository.update right after constructing updateValues to
check if Object.keys(updateValues).length === 0 and throw a clear
repository-level error (e.g., new Error or a BadRequest/ValidationError)
indicating "empty spec patch" so callers receive a meaningful validation error
instead of a DB error before calling .set().

In `@apps/web/lib/db/repositories/tool.repository.ts`:
- Around line 108-128: The update path can call
tx.update(mcpTools).set(updateValues) with an empty updateValues when input has
no defined fields; validate that at least one updatable field is present before
calling the DB. In the function containing updateValues, check whether
updateValues is empty (no keys) after populating it from
input.name/description/inputSchema/isActive and, if empty, return an appropriate
response or throw a validation error (e.g., "No fields to update") instead of
executing tx.update(...).set({}); this prevents relying on Drizzle's behavior
and keeps the subsequent re-verification/this.freeze(rows[0]) logic safe.

In `@apps/web/lib/db/schema/credentials.ts`:
- Line 14: The schema currently defines authType via text('auth_type', { enum:
['api_key', 'bearer'] }).notNull() but lacks a Drizzle-level CHECK constraint;
update the table definition that declares authType to include a constraints
callback and add check('auth_type_check', sql`${table.authType} IN ('api_key',
'bearer')`) so the CHECK is visible in the Drizzle schema (reference the table
variable where authType is declared and the authType symbol itself when adding
the check).

In `@apps/web/lib/db/schema/request-logs.ts`:
- Around line 9-12: The request-logs schema allows toolId to reference mcpTools
independently of serverId, so a log row can point to a tool that belongs to a
different server; update the schema to enforce the server-tool pairing by either
adding a composite foreign key constraint on (server_id, tool_id) that
references the corresponding composite key in mcp_tools or, if your ORM/schema
layer doesn’t support composite references, add a validation step in the insert
logic to check that the referenced tool (mcpTools.id) has mcpTools.server_id ===
serverId before inserting; specifically modify the request-logs definition
around serverId and toolId (and/or the insert/create function that writes logs)
to implement one of these fixes so serverId and toolId are always consistent.

In `@apps/web/lib/db/schema/specs.ts`:
- Around line 16-19: Add a composite unique index on (userId, name, version) to
prevent duplicate specs per user by updating the index block that currently
defines userIdIdx and userNameIdx; define a new index (e.g.,
uidx_specs_user_name_version) using the same index(...) builder and mark it
unique, targeting table.userId, table.name, and table.version so the combination
is enforced as unique (leave as-is only if duplicates are intentional).

In `@apps/web/lib/db/schema/tools.ts`:
- Line 13: The field definition inputSchema uses
jsonb('input_schema').notNull().$type<Record<string, unknown>>() which is too
permissive; replace the generic Record type with the project's recursive JSON
serializable type (the same recursive Json/JsonValue/JsonObject alias used
elsewhere) so only JSON-serializable values are allowed at compile time; update
the $type<> parameter on jsonb('input_schema').notNull() to that recursive JSON
type and run type checks to ensure Date/Map/BigInt etc. are rejected.

In `@apps/web/lib/db/schema/usage-events.ts`:
- Around line 9-12: The usage_events schema currently allows a toolId that
belongs to a different server than serverId; enforce the pair by adding a
composite foreign-key constraint on (serverId, toolId) referencing the mcpTools
compound key (server_id, id) or, if mcpTools doesn't expose server_id as a key,
validate/lookup in the insertion path: in the DB schema change add a composite
FK referencing mcpTools(server_id, id) with the desired onDelete behavior (or
alter mcpTools to expose a unique (server_id,id) key), or alternatively update
the insert logic that writes to usage_events to query mcpTools by id and compare
its server_id to serverId and reject the insert when they mismatch (affecting
the code that constructs usage_events rows).

In `@apps/web/lib/vault/__tests__/vault.test.ts`:
- Around line 209-220: The test currently verifies derivation works after
clearKeyCache but not that the cached key buffer was zeroed; update the code and
test so clearKeyCache actually zeroes the internal cached buffer and the test
can assert it: ensure the implementation of clearKeyCache iterates over the
internal cached key Buffer (the module-level cache used by deriveKey) and
overwrites bytes with 0, and either expose a test-only accessor (e.g.,
getCachedKeyForTest) or have clearKeyCache return the previous buffer so the
test can assert every byte is 0; update the test to call that accessor or
inspect the returned buffer and assert buffer.every(b => b === 0).

In `@apps/web/lib/vault/decrypt.ts`:
- Around line 22-31: Derive the AES key as before with deriveKey(secret, salt)
but move createDecipheriv(ALGORITHM, key, iv) and decipher.setAuthTag(authTag)
outside the decryption try/catch so any setup errors propagate; then wrap only
the decipher.update(encrypted)/decipher.final() call in a try/catch that throws
the existing 'Decryption failed' error on failure, and add a finally block that
securely zeroes the derived key buffer (e.g. key.fill(0)) to wipe key material
before returning or rethrowing.

In `@apps/web/lib/vault/encrypt.ts`:
- Around line 5-19: encrypt and decrypt both call deriveKey() which returns a
fresh Buffer of sensitive key material that must be zeroed after use; wrap the
key usage in a try/finally so the derived key (from deriveKey(secret, salt)) is
always wiped via key.fill(0) in the finally block. Update the encrypt function
(and the corresponding decrypt function) to obtain key, use it to
createCipheriv/createDecipheriv and perform crypto ops inside the try, then in
finally call key.fill(0) to clear the buffer, preserving existing behavior
around IV_LENGTH, AUTH_TAG_LENGTH and CIPHERTEXT_VERSION.
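The try/finally wiping pattern described in the two comments above can be sketched like this, using node:crypto directly; the function name and envelope layout are illustrative, not the repo's actual code:

```typescript
// Sketch of key zeroing with try/finally: the derived key buffer is wiped
// on every exit path, including throws from the cipher operations.
import { createCipheriv, pbkdf2Sync, randomBytes } from 'node:crypto';

function encryptWithEphemeralKey(plaintext: string, secret: string, salt: string): string {
  const key = pbkdf2Sync(secret, salt, 600_000, 32, 'sha256');
  try {
    const iv = randomBytes(12);
    const cipher = createCipheriv('aes-256-gcm', key, iv);
    const ct = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
    return [iv, cipher.getAuthTag(), ct].map((b) => b.toString('base64')).join('.');
  } finally {
    key.fill(0); // always wipe derived key material, even on throw
  }
}
```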

In `@infra/docker-compose.yml`:
- Line 63: The redis healthcheck fails because the container does not have
REDIS_PASSWORD in its environment; add an environment section to the redis
service so the container exposes REDIS_PASSWORD (e.g., set REDIS_PASSWORD:
${REDIS_PASSWORD}) so the healthcheck command (the test using
REDISCLI_AUTH=$$REDIS_PASSWORD redis-cli ping) can read the password at runtime;
update the redis service block to include the environment mapping and keep the
existing healthcheck test unchanged.

In `@infra/nginx/nginx.conf`:
- Line 49: The nginx config is setting a stricter Content-Security-Policy via
the add_header Content-Security-Policy directive which conflicts with the app's
CSP in next.config.mjs (missing script-src 'unsafe-inline' needed for Next.js
hydration); either remove the add_header Content-Security-Policy line from
nginx.conf so the app's CSP is used, or update that add_header value to match
the app's policy (include script-src 'self' 'unsafe-inline' and other directives
exactly) so both proxies and the app send the same CSP until you migrate to
nonces.

In `@packages/types/package.json`:
- Around line 18-19: The project is using ESLint 10 which requires the flat
config format, so replace the legacy packages/types/.eslintrc.js with a
flat-style config file named packages/types/eslint.config.js following the
ESLint migration guide: convert the exported legacy configuration into the new
flat array/module format (using eslint.config.js style exports), ensure plugins,
parserOptions, env, rules, and any overrides are translated into the flat config
API, update any custom rule or plugin resolution if needed, and verify the
existing "lint" script ("eslint src/") still points to the package config by
running eslint locally to confirm no config errors.

In `@packages/types/src/__tests__/types.test.ts`:
- Around line 11-25: Remove the unused PlaintextKey import from the type import
list (the import statement containing ApiResponse, Spec, McpServer, McpTool,
Credential, UsageEvent, RequestLog, BaseEvent, PlaintextKey, CredentialAuthType,
AuthMode, TransportType, HttpMethod) and add a blank line between different
import groups so ESLint grouping rules are satisfied; update the single "import
type { ... } from '../index.js';" statement by deleting PlaintextKey and ensure
there's an empty line separating this type import group from any adjacent
imports.
- Around line 195-200: The test uses a dynamic type import (type CreateKeys =
keyof import('../spec.js').CreateSpecInput) which ESLint forbids; replace it
with a static type import by adding "import type { CreateSpecInput } from
'../spec.js';" at the top of the test file and then change the test to use
CreateSpecInput directly (e.g., type CreateKeys = keyof CreateSpecInput) so the
test references the CreateSpecInput type without using import().

In `@packages/types/src/credential.ts`:
- Around line 15-25: The public credential shape currently includes encryptedKey
which is omitted by safeColumns; fix by splitting types: keep a persistence type
(e.g., CredentialRow or rename current Credential) that includes readonly
encryptedKey: string and add an exported public/read model type (e.g.,
CredentialPublic or CredentialSafe) that omits encryptedKey (encryptedKey?:
never or simply omitted) and uses the same other fields (id, serverId, userId,
label, authType, expiresAt, createdAt, updatedAt). Update repository method
signatures (findAll(), findById(), create(), update()) and any usages to return
or accept the appropriate type (public/read methods return CredentialPublic,
persistence methods accept/return CredentialRow) and ensure safeColumns remains
the source of truth for what the repo exposes so callers no longer need unsafe
casts.

In `@packages/types/src/spec.ts`:
- Line 7: Replace the loose Record<string, unknown> types with a recursive JSON
type (e.g., JsonObject / JsonValue) so the TypeScript types match JSONB storage
and disallow non-JSON values like Date/Map/BigInt; add or import a shared
JsonValue/JsonObject definition and change the type of the rawSpec property
(readonly rawSpec), and the other two occurrences mentioned in the comment (the
other spec-related properties at lines referenced) to use JsonObject (or
JsonValue as appropriate) instead of Record<string, unknown>, and ensure those
Json types are exported from a shared types file so callers can use them
consistently.

In `@scripts/rotate-vault-secret.ts`:
- Around line 98-130: The rotation code currently rewrites rows in-place (using
decryptDirect/encryptDirect on credentials.encryptedKey inside the transaction),
which breaks readers still using oldKey; change to a dual-key/versioned rollout:
add a key_version or new_encrypted_key field on the credentials row, update the
rotation loop (the batch/tx block around decryptDirect/encryptDirect) to write
the reEncrypted value into a separate column (e.g., encryptedKeyV2 or set
key_version = 'v2' + encryptedKey = reEncrypted) without deleting the old
encryptedKey, and update runtime decryption logic to attempt decryptDirect(...,
newKey) first and fall back to decryptDirect(..., oldKey) if that fails; once
all rows have the new column/version and the app is switched to use newKey, run
a cleanup to remove the old encryptedKey column. Ensure references:
decryptDirect, encryptDirect, credentials.encryptedKey, the batch loop and
tx.update are changed accordingly.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

Run ID: 6109612f-3850-4b78-bcdc-5578c8c5e0b2

📥 Commits

Reviewing files that changed from the base of the PR and between b27462f and 9008a60.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (45)
  • .env.example
  • .github/workflows/release.yml
  • apps/web/lib/db/drizzle.config.ts
  • apps/web/lib/db/index.ts
  • apps/web/lib/db/migrations/0001_initial_schema.sql
  • apps/web/lib/db/repositories/base.repository.ts
  • apps/web/lib/db/repositories/constants.ts
  • apps/web/lib/db/repositories/credential.repository.ts
  • apps/web/lib/db/repositories/log.repository.ts
  • apps/web/lib/db/repositories/server.repository.ts
  • apps/web/lib/db/repositories/spec.repository.ts
  • apps/web/lib/db/repositories/tool.repository.ts
  • apps/web/lib/db/schema/credentials.ts
  • apps/web/lib/db/schema/index.ts
  • apps/web/lib/db/schema/request-logs.ts
  • apps/web/lib/db/schema/servers.ts
  • apps/web/lib/db/schema/specs.ts
  • apps/web/lib/db/schema/tools.ts
  • apps/web/lib/db/schema/usage-events.ts
  • apps/web/lib/vault/__tests__/vault.test.ts
  • apps/web/lib/vault/constants.ts
  • apps/web/lib/vault/decrypt.ts
  • apps/web/lib/vault/derive-key.ts
  • apps/web/lib/vault/encrypt.ts
  • apps/web/lib/vault/index.ts
  • apps/web/lib/vault/validate.ts
  • apps/web/package.json
  • apps/web/vitest.config.ts
  • infra/docker-compose.yml
  • infra/nginx/nginx.conf
  • packages/types/package.json
  • packages/types/src/__tests__/types.test.ts
  • packages/types/src/api.ts
  • packages/types/src/base-event.ts
  • packages/types/src/credential.ts
  • packages/types/src/errors.ts
  • packages/types/src/index.ts
  • packages/types/src/log.ts
  • packages/types/src/pagination.ts
  • packages/types/src/server.ts
  • packages/types/src/spec.ts
  • packages/types/src/tool.ts
  • packages/types/src/usage.ts
  • scripts/rotate-vault-secret.ts
  • turbo.json

Comment thread apps/web/lib/db/migrations/0001_initial_schema.sql
Comment thread apps/web/lib/db/repositories/credential.repository.ts Outdated
Comment thread apps/web/lib/db/repositories/log.repository.ts
Comment thread apps/web/lib/db/repositories/server.repository.ts
Comment thread apps/web/lib/db/repositories/spec.repository.ts
Comment thread packages/types/src/__tests__/types.test.ts
Comment thread packages/types/src/__tests__/types.test.ts Outdated
Comment thread packages/types/src/credential.ts Outdated
readonly name: string;
readonly version: string;
readonly sourceUrl: string | null;
readonly rawSpec: Record<string, unknown>;

@coderabbitai coderabbitai Bot Mar 19, 2026


⚠️ Potential issue | 🟠 Major

Use a JSON type instead of Record<string, unknown>.

These properties are persisted as JSONB, but Record<string, unknown> still accepts non-JSON values like Date, Map, or BigInt. That lets invalid payloads typecheck here and then fail or serialize lossy at runtime. A shared recursive JsonObject / JsonValue type would make the contract match the storage format.

Also applies to: 17-17, 24-24

🧰 Tools
🪛 GitHub Actions: CI

[error] Lint step failed: 'eslint src/' (command exited non-zero due to error)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/types/src/spec.ts` at line 7, Replace the loose Record<string,
unknown> types with a recursive JSON type (e.g., JsonObject / JsonValue) so the
TypeScript types match JSONB storage and disallow non-JSON values like
Date/Map/BigInt; add or import a shared JsonValue/JsonObject definition and
change the type of the rawSpec property (readonly rawSpec), and the other two
occurrences mentioned in the comment (the other spec-related properties at lines
referenced) to use JsonObject (or JsonValue as appropriate) instead of
Record<string, unknown>, and ensure those Json types are exported from a shared
types file so callers can use them consistently.

Owner Author


Record<string, unknown> is the standard approach for storing arbitrary JSON in TypeScript. A stricter recursive JSON type would be overly restrictive given that rawSpec stores the full OpenAPI/MCP spec, which varies significantly.
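For context, the recursive JSON type the bot proposes is only a few lines. A minimal sketch — the JsonPrimitive/JsonValue/JsonObject names and the trimmed Spec shape are illustrative, not the PR's actual definitions:

```typescript
// Only values that survive a JSON.stringify/parse round-trip typecheck.
type JsonPrimitive = string | number | boolean | null;
type JsonValue = JsonPrimitive | JsonValue[] | { [key: string]: JsonValue };
type JsonObject = { [key: string]: JsonValue };

// Trimmed stand-in for the PR's Spec type, using the JSON type for JSONB columns.
interface Spec {
  readonly rawSpec: JsonObject;
}

const ok: Spec = { rawSpec: { openapi: '3.1.0', paths: {} } };
// const bad: Spec = { rawSpec: { fetched: new Date() } }; // compile error: Date is not a JsonValue
```

The trade-off is real on both sides: the recursive type rejects Date/Map/BigInt at compile time, at the cost of some friction when assigning values typed by third-party OpenAPI libraries.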


Comment on lines +98 to +130
while (processed < totalCount) {
  const batch = await db
    .select({ id: credentials.id, encryptedKey: credentials.encryptedKey })
    .from(credentials)
    .where(gt(credentials.id, lastId))
    .orderBy(credentials.id)
    .limit(BATCH_SIZE);

  if (batch.length === 0) break;

  await db.transaction(async (tx) => {
    for (const cred of batch) {
      let plaintext: string;
      try {
        plaintext = decryptDirect(cred.encryptedKey, oldKey);
      } catch {
        throw new Error(`Decryption failed for credential ${cred.id}`);
      }

      const reEncrypted = encryptDirect(plaintext, newKey);

      // Verify round-trip before committing
      const verified = decryptDirect(reEncrypted, newKey);
      if (verified !== plaintext) {
        throw new Error(`Round-trip verification failed for credential ${cred.id}`);
      }

      await tx
        .update(credentials)
        .set({ encryptedKey: reEncrypted })
        .where(eq(credentials.id, cred.id));
    }
  });

@coderabbitai coderabbitai Bot Mar 19, 2026


⚠️ Potential issue | 🔴 Critical

Do not rotate in place while the app is using a single active vault key.

After Line 125 rewrites a row, any running process that still decrypts with OLD_VAULT_SECRET can no longer read that credential. Concurrent inserts/updates can also leave old-key rows behind, and a mid-run failure leaves a mixed table that reruns cannot resume from because Line 112 always decrypts with oldKey. This needs either a maintenance window that stops credential traffic or a dual-key/versioned rollout before release.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/rotate-vault-secret.ts` around lines 98 - 130, The rotation code
currently rewrites rows in-place (using decryptDirect/encryptDirect on
credentials.encryptedKey inside the transaction), which breaks readers still
using oldKey; change to a dual-key/versioned rollout: add a key_version or
new_encrypted_key field on the credentials row, update the rotation loop (the
batch/tx block around decryptDirect/encryptDirect) to write the reEncrypted
value into a separate column (e.g., encryptedKeyV2 or set key_version = 'v2' +
encryptedKey = reEncrypted) without deleting the old encryptedKey, and update
runtime decryption logic to attempt decryptDirect(..., newKey) first and fall
back to decryptDirect(..., oldKey) if that fails; once all rows have the new
column/version and the app is switched to use newKey, run a cleanup to remove
the old encryptedKey column. Ensure references: decryptDirect, encryptDirect,
credentials.encryptedKey, the batch loop and tx.update are changed accordingly.

Owner Author


The rotation script is designed to run during a maintenance window. It uses per-batch transactions with cursor-based pagination and round-trip verification. The app should be stopped or put in read-only mode during rotation — this is documented in the script header.
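If a dual-key window were ever needed instead of a maintenance window, the fallback reader the bot describes is short. A minimal sketch, assuming an AES-256-GCM envelope like the PR's — the helper names deriveKey/encryptDirect/decryptDirect mirror the script's, but the bodies here are simplified stand-ins (scrypt with a fixed demo salt instead of the vault's 600k-iteration PBKDF2, and no version byte):

```typescript
import { createCipheriv, createDecipheriv, randomBytes, scryptSync } from 'node:crypto';

const ALGORITHM = 'aes-256-gcm';
const IV_LENGTH = 12;
const TAG_LENGTH = 16;

// Stand-in KDF; the real script derives via PBKDF2-SHA256 with a per-installation salt.
function deriveKey(secret: string): Buffer {
  return scryptSync(secret, 'demo-salt', 32);
}

// Envelope layout: iv || ciphertext || authTag, base64-encoded.
function encryptDirect(plaintext: string, key: Buffer): string {
  const iv = randomBytes(IV_LENGTH);
  const cipher = createCipheriv(ALGORITHM, key, iv);
  const ct = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  return Buffer.concat([iv, ct, cipher.getAuthTag()]).toString('base64');
}

function decryptDirect(envelope: string, key: Buffer): string {
  const buf = Buffer.from(envelope, 'base64');
  const iv = buf.subarray(0, IV_LENGTH);
  const tag = buf.subarray(buf.length - TAG_LENGTH);
  const ct = buf.subarray(IV_LENGTH, buf.length - TAG_LENGTH);
  const decipher = createDecipheriv(ALGORITHM, key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ct), decipher.final()]).toString('utf8');
}

// Dual-key fallback: try the new key first, fall back to the old one, so
// readers keep working while rotation is mid-flight. GCM's auth tag makes a
// wrong-key attempt fail loudly rather than return garbage.
function decryptWithFallback(envelope: string, newKey: Buffer, oldKey: Buffer): string {
  try {
    return decryptDirect(envelope, newKey);
  } catch {
    return decryptDirect(envelope, oldKey);
  }
}
```

Once every row has been re-encrypted, the fallback branch goes dead and can be removed along with the old secret.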



@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 6

♻️ Duplicate comments (3)
apps/web/lib/vault/decrypt.ts (1)

23-33: ⚠️ Potential issue | 🟠 Major

Zero the derived key on every decryption path.

This caller also owns the fresh Buffer returned by deriveKey(). Right now both successful decrypts and "Decryption failed" paths leave that key material in memory until GC.

🔐 Proposed fix
   const encrypted = combined.subarray(1 + IV_LENGTH, combined.length - AUTH_TAG_LENGTH);
 
   const key = deriveKey(secret, salt);
-  const decipher = createDecipheriv(ALGORITHM, key, iv);
-  decipher.setAuthTag(authTag);
-
   try {
+    const decipher = createDecipheriv(ALGORITHM, key, iv);
+    decipher.setAuthTag(authTag);
+
     const decrypted = Buffer.concat([decipher.update(encrypted), decipher.final()]);
     return decrypted.toString('utf8');
   } catch {
     throw new Error('Decryption failed');
+  } finally {
+    key.fill(0);
   }
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/lib/vault/decrypt.ts` around lines 23 - 33, The derived key returned
by deriveKey(secret, salt) must be zeroed on every code path to avoid leaving
key material in memory; update the decrypt function (the block that calls
deriveKey, createDecipheriv, decipher.update and decipher.final) to zero the key
Buffer (e.g., key.fill(0) or equivalent secure zeroing) in a finally block so it
runs after successful decryption and also when the catch path throws "Decryption
failed" (ensure the key is cleared before returning or rethrowing).
apps/web/lib/vault/encrypt.ts (1)

6-19: ⚠️ Potential issue | 🟠 Major

Wipe the derived key buffer before returning.

deriveKey() returns a fresh Buffer copy, so this function owns raw AES key material that the cache scrubber will never clear later. As written, both the success path and any thrown crypto error leave that copy resident until GC.

🔐 Proposed fix
 export function encrypt(plaintext: string, secret: string, salt: string): string {
   const key = deriveKey(secret, salt);
-  const iv = randomBytes(IV_LENGTH);
-  const cipher = createCipheriv(ALGORITHM, key, iv);
-
-  const encrypted = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
-  const authTag = cipher.getAuthTag();
-
-  if (authTag.length !== AUTH_TAG_LENGTH) {
-    throw new Error('Unexpected auth tag length');
-  }
-
-  const combined = Buffer.concat([Buffer.from([CIPHERTEXT_VERSION]), iv, encrypted, authTag]);
-  return combined.toString('base64');
+  try {
+    const iv = randomBytes(IV_LENGTH);
+    const cipher = createCipheriv(ALGORITHM, key, iv);
+
+    const encrypted = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
+    const authTag = cipher.getAuthTag();
+
+    if (authTag.length !== AUTH_TAG_LENGTH) {
+      throw new Error('Unexpected auth tag length');
+    }
+
+    const combined = Buffer.concat([Buffer.from([CIPHERTEXT_VERSION]), iv, encrypted, authTag]);
+    return combined.toString('base64');
+  } finally {
+    key.fill(0);
+  }
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/lib/vault/encrypt.ts` around lines 6 - 19, encrypt currently calls
deriveKey(secret, salt) and leaves the raw key Buffer alive; wrap the encryption
logic in a try/finally so the derived key is zeroed before any return or thrown
error. Specifically, in encrypt() (which calls deriveKey, createCipheriv,
cipher.update/final, cipher.getAuthTag), store the key, perform all crypto
operations inside a try block, then in finally call key.fill(0) (or equivalent)
to overwrite the Buffer; ensure this covers both the success path and any thrown
errors so the derived AES key material is scrubbed from memory.
apps/web/lib/vault/__tests__/vault.test.ts (1)

210-220: ⚠️ Potential issue | 🟡 Minor

This test never inspects the buffer that clearKeyCache() wipes.

Line 211 already gets a caller-owned copy from deriveKey(), and Line 212 clones it again. apps/web/lib/vault/derive-key.ts:42-47 only zeroes the module cache, so this test can stay green even if the internal scrub disappears.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/lib/vault/__tests__/vault.test.ts` around lines 210 - 220, The test
currently copies the derived key (Buffer.from(key)) so it never inspects the
module's cached buffer that clearKeyCache() zeroes; instead, keep the original
buffer reference returned by deriveKey (do not clone it into keyBytes), call
clearKeyCache(), and then assert that that original buffer (e.g., the variable
currently named key or keyBytes if you rename it) has been zeroed; reference
deriveKey and clearKeyCache to locate where to change the test.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@apps/web/lib/db/index.ts`:
- Around line 27-41: The getDb() singleton creates a postgres client but never
exposes a way to close it; add a module-level variable to keep the raw postgres
client (e.g., sqlClient) alongside dbInstance and implement/export an async
shutdown function (e.g., closeDb or shutdownDb) that, if sqlClient is non-null,
calls the postgres client's termination method (e.g., sqlClient.end() or
equivalent), then sets sqlClient and dbInstance back to null; update getDb() to
assign that module-level sqlClient when creating the client so the shutdown
function can cleanly close the pool for tests and graceful shutdown.
- Line 43: Replace the untyped alias ReturnType<typeof drizzle> with an explicit
DrizzleClient type that includes the schema by importing/using
PostgresJsDatabase and the project's schema symbol (e.g., schema) so
DrizzleClient = PostgresJsDatabase<typeof schema>; update the DrizzleClient
alias in apps/web/lib/db/index.ts accordingly and ensure any usages of
DrizzleClient or drizzle-instantiated db still type-check against the new
explicitly typed DrizzleClient to restore full schema-aware autocomplete and
relational query APIs.

In `@apps/web/lib/vault/__tests__/vault.test.ts`:
- Around line 123-149: The test "should use identical error messages for all
failure modes" currently only asserts inside catch blocks so a non-throwing call
would still pass; update each case that calls decrypt (Wrong version, Tampered
data, Too short, Wrong key) to make the failure assertion mandatory by either
replacing the try/catch with Jest's expect(() =>
decrypt(...)).toThrow('Decryption failed') or by keeping the try/catch but
adding a explicit fail() (or throw) immediately after the decrypt call to ensure
the test fails if no error is thrown; reference the decrypt and encrypt usages
in this test to locate and update the four assertions.

In `@packages/types/src/__tests__/types.test.ts`:
- Around line 54-59: Tests use raw error-code strings; update them to use the
exported ErrorCodes enum/constant instead. Replace occurrences of 'NOT_FOUND'
(and other raw codes used in the same tests for createErrorResponse and related
assertions) with ErrorCodes.NOT_FOUND (import ErrorCodes from its module if not
already) and adjust expected object comparisons to reference
ErrorCodes.<CONSTANT> so tests assert against the canonical constants rather
than string literals; ensure createErrorResponse and any other helpers remain
the same and only the test assertions are updated.
- Around line 180-194: The test is only asserting runtime values and doesn't
ensure the id property is compile-time readonly; add a TypeScript-only
compile-time assertion helper (e.g., a type alias like AssertHasReadonlyId<T
extends { readonly id: string }> = T) and use it against the entity interfaces
in this test: Spec, McpServer, McpTool, Credential, UsageEvent, RequestLog (for
example declare const _assertSpec: AssertHasReadonlyId<Spec> = {} as Spec) so
the compiler will fail if any of those types no longer have a readonly id.
- Around line 196-199: The current runtime keys test can false-pass; replace it
with a compile-time/type-level assertion that fails if CreateSpecInput ever
includes 'toolCount'. Add a conditional type like type _EnsureNoToolCount =
'toolCount' extends keyof CreateSpecInput ? never : true and then bind it to a
const (e.g., const _ensureNoToolCount: _EnsureNoToolCount = true) inside the
test file so TypeScript will error at compile time if CreateSpecInput (the
symbol) gains a 'toolCount' key; remove the runtime keys array assertion in the
"Spec should not include toolCount in CreateSpecInput" test.
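The two compile-time assertions suggested above fit in a few lines. A sketch with trimmed stand-ins for the PR's entity types — note one caveat the prompt glosses over: TypeScript assignability ignores the readonly modifier, so the first check enforces the presence and type of id but cannot catch readonly being dropped.

```typescript
// Trimmed stand-ins for the PR's types (illustrative, not the real definitions).
interface Spec {
  readonly id: string;
  readonly name: string;
}
interface CreateSpecInput {
  name: string;
}

// Presence check: fails to compile if Spec ever loses `id: string`.
// (Assignability ignores `readonly`, so only presence and type are enforced.)
type AssertHasId<T extends { readonly id: string }> = T;
type _SpecHasId = AssertHasId<Spec>;

// Exclusion check: resolves to `never` — making the const below a compile
// error — if CreateSpecInput ever gains a `toolCount` key.
type EnsureNoToolCount = 'toolCount' extends keyof CreateSpecInput ? never : true;
const ensureNoToolCount: EnsureNoToolCount = true;
```

Because the failure mode is a compile error rather than a runtime assertion, these checks cost nothing at test time and cannot false-pass.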

---

Duplicate comments:
In `@apps/web/lib/vault/__tests__/vault.test.ts`:
- Around line 210-220: The test currently copies the derived key
(Buffer.from(key)) so it never inspects the module's cached buffer that
clearKeyCache() zeroes; instead, keep the original buffer reference returned by
deriveKey (do not clone it into keyBytes), call clearKeyCache(), and then assert
that that original buffer (e.g., the variable currently named key or keyBytes if
you rename it) has been zeroed; reference deriveKey and clearKeyCache to locate
where to change the test.

In `@apps/web/lib/vault/decrypt.ts`:
- Around line 23-33: The derived key returned by deriveKey(secret, salt) must be
zeroed on every code path to avoid leaving key material in memory; update the
decrypt function (the block that calls deriveKey, createDecipheriv,
decipher.update and decipher.final) to zero the key Buffer (e.g., key.fill(0) or
equivalent secure zeroing) in a finally block so it runs after successful
decryption and also when the catch path throws "Decryption failed" (ensure the
key is cleared before returning or rethrowing).

In `@apps/web/lib/vault/encrypt.ts`:
- Around line 6-19: encrypt currently calls deriveKey(secret, salt) and leaves
the raw key Buffer alive; wrap the encryption logic in a try/finally so the
derived key is zeroed before any return or thrown error. Specifically, in
encrypt() (which calls deriveKey, createCipheriv, cipher.update/final,
cipher.getAuthTag), store the key, perform all crypto operations inside a try
block, then in finally call key.fill(0) (or equivalent) to overwrite the Buffer;
ensure this covers both the success path and any thrown errors so the derived
AES key material is scrubbed from memory.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

Run ID: 61ea3801-4b80-41d2-9de0-83062b92f92d

📥 Commits

Reviewing files that changed from the base of the PR and between 9008a60 and 8b88692.

📒 Files selected for processing (6)
  • apps/web/lib/db/index.ts
  • apps/web/lib/vault/__tests__/vault.test.ts
  • apps/web/lib/vault/decrypt.ts
  • apps/web/lib/vault/derive-key.ts
  • apps/web/lib/vault/encrypt.ts
  • packages/types/src/__tests__/types.test.ts

Comment thread apps/web/lib/db/index.ts Outdated
Comment on lines +27 to +41
let dbInstance: ReturnType<typeof drizzle> | null = null;

export function getDb(): ReturnType<typeof drizzle> {
  if (dbInstance === null) {
    const sql = postgres(getConnectionString(), {
      max: getPoolSize(),
      idle_timeout: 20,
      connect_timeout: 10,
      // Disable prepared statements for PgBouncer transaction-mode compatibility
      prepare: false,
    });
    dbInstance = drizzle(sql, { schema });
  }
  return dbInstance;
}

@coderabbitai coderabbitai Bot Mar 19, 2026


🛠️ Refactor suggestion | 🟠 Major

Add a shutdown function to close the connection pool gracefully.

The singleton pattern works correctly, but there's no way to close the connection pool. This is important for:

  • Clean shutdown in production (avoiding connection leaks)
  • Test isolation (releasing connections between test suites)
  • Serverless/edge environments where connection cleanup matters
♻️ Proposed fix to add shutdown capability
 let dbInstance: ReturnType<typeof drizzle> | null = null;
+let sqlInstance: ReturnType<typeof postgres> | null = null;
 
 export function getDb(): ReturnType<typeof drizzle> {
   if (dbInstance === null) {
-    const sql = postgres(getConnectionString(), {
+    sqlInstance = postgres(getConnectionString(), {
       max: getPoolSize(),
       idle_timeout: 20,
       connect_timeout: 10,
       // Disable prepared statements for PgBouncer transaction-mode compatibility
       prepare: false,
     });
-    dbInstance = drizzle(sql, { schema });
+    dbInstance = drizzle(sqlInstance, { schema });
   }
   return dbInstance;
 }
+
+export async function closeDb(): Promise<void> {
+  if (sqlInstance !== null) {
+    await sqlInstance.end();
+    sqlInstance = null;
+    dbInstance = null;
+  }
+}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/lib/db/index.ts` around lines 27 - 41, The getDb() singleton creates
a postgres client but never exposes a way to close it; add a module-level
variable to keep the raw postgres client (e.g., sqlClient) alongside dbInstance
and implement/export an async shutdown function (e.g., closeDb or shutdownDb)
that, if sqlClient is non-null, calls the postgres client's termination method
(e.g., sqlClient.end() or equivalent), then sets sqlClient and dbInstance back
to null; update getDb() to assign that module-level sqlClient when creating the
client so the shutdown function can cleanly close the pool for tests and
graceful shutdown.

Owner Author


Good suggestion — a shutdown hook will be added when we implement graceful shutdown in the application lifecycle. For now, the lazy initialization pattern prevents premature connection creation.


Comment thread apps/web/lib/db/index.ts Outdated
Comment thread apps/web/lib/vault/__tests__/vault.test.ts
Comment thread packages/types/src/__tests__/types.test.ts
Comment thread packages/types/src/__tests__/types.test.ts Outdated
Comment thread packages/types/src/__tests__/types.test.ts Outdated
- Add empty update guards to all repository update methods
- Use PostgresJsDatabase<typeof schema> for typed DrizzleClient
- Replace try/catch with expect().toThrow for mandatory assertions
- Use ErrorCodes constants instead of raw strings in tests
- Add compile-time readonly and CreateSpecInput exclusion assertions
Owner Author

@Work90210 Work90210 left a comment


All feedback has been addressed in commits a4af398 and 8b88692. Resolving.


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 3

♻️ Duplicate comments (1)
apps/web/lib/db/repositories/spec.repository.ts (1)

24-28: ⚠️ Potential issue | 🟠 Major

Add a stable sort before capping findAll().

LIMIT without ORDER BY lets PostgreSQL return any matching DEFAULT_QUERY_LIMIT rows, so once a user has more than the cap the subset can shift between calls and make older specs unreachable. The same capped-list pattern appears in the new server/tool/credential repositories as well.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/lib/db/repositories/spec.repository.ts` around lines 24 - 28, The
query in findAll (using
this.db.select().from(specs).where(...).limit(DEFAULT_QUERY_LIMIT)) lacks a
deterministic ORDER BY, causing nondeterministic subsets when LIMIT is applied;
fix by adding a stable order before the limit (e.g., .orderBy(specs.id, 'asc')
or the canonical created_at/primary key column) so results are consistently
paged, and apply the same change to the analogous server/tool/credential
repository queries that use DEFAULT_QUERY_LIMIT.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@apps/web/lib/db/repositories/credential.repository.ts`:
- Around line 14-30: safeColumns is a narrowed projection that intentionally
omits encryptedKey but is being cast back to the full Credential type in
CredentialRepository methods (findAll, findById, create, update), which allows
callers to access credential.encryptedKey and get undefined at runtime; fix by
defining and returning a dedicated safe DTO/interface (e.g., SafeCredentialDto)
that matches safeColumns and use that type for repository return values and
projections, or alternatively include encryptedKey in the projection
consistently so Credential remains accurate; update method signatures and any
usages to consume the new SafeCredentialDto (or the full Credential if you add
encryptedKey) to ensure compile-time and runtime shape agreement.

In `@apps/web/lib/db/repositories/tool.repository.ts`:
- Around line 136-153: In the async delete method, after performing the DML
delete via tx.delete(mcpTools).where(and(eq(mcpTools.id, id),
eq(mcpTools.serverId, existing[0]!.serverId))), check the delete result to
ensure a row was actually removed and if not throw the same Error('Tool not
found or access denied'); specifically capture the delete call's return (the
result/count provided by the query builder) and validate it is > 0—if it equals
0, throw the error to preserve the pre-check's semantics and prevent silent
success on concurrent deletes.

In `@apps/web/lib/vault/__tests__/vault.test.ts`:
- Around line 187-200: The test currently uses wall-clock timing to assert
caching for deriveKey(TEST_SECRET, TEST_SALT); instead, modify the test to
deterministically assert the cached path by instrumenting or spying on the
expensive derivation call: either (a) add an optional observable hook/counter
parameter or exported internal (e.g., deriveKeyCache or deriveKeyComputeCounter)
to the deriveKey implementation so the test can assert the compute counter
increments only once and subsequent calls are cache hits, or (b) spy/mock the
underlying expensive function used by deriveKey (e.g., pbkdf2/computeDerivedKey)
and assert it was called once while deriveKey was invoked 101 times; update the
test to use TEST_SECRET/TEST_SALT and assert on the counter/spied-call count
instead of comparing warmTime and coldTime.
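The counter-based variant of that suggestion is small. A sketch with hypothetical names (deriveKeyCached, computeCount); scrypt stands in for the vault's 600k-iteration PBKDF2:

```typescript
import { scryptSync } from 'node:crypto';

let computeCount = 0; // observable hook: how many real derivations have run
const keyCache = new Map<string, Buffer>();

function deriveKeyCached(secret: string, salt: string): Buffer {
  const cacheKey = `${secret}:${salt}`;
  const hit = keyCache.get(cacheKey);
  if (hit) return hit; // cache hit: no derivation work
  computeCount += 1;
  const key = scryptSync(secret, salt, 32); // stand-in for the expensive KDF
  keyCache.set(cacheKey, key);
  return key;
}
```

A test can now call deriveKeyCached 101 times and assert computeCount is exactly 1 — deterministic on any machine, unlike a warm-vs-cold wall-clock comparison.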

---

Duplicate comments:
In `@apps/web/lib/db/repositories/spec.repository.ts`:
- Around line 24-28: The query in findAll (using
this.db.select().from(specs).where(...).limit(DEFAULT_QUERY_LIMIT)) lacks a
deterministic ORDER BY, causing nondeterministic subsets when LIMIT is applied;
fix by adding a stable order before the limit (e.g., .orderBy(specs.id, 'asc')
or the canonical created_at/primary key column) so results are consistently
paged, and apply the same change to the analogous server/tool/credential
repository queries that use DEFAULT_QUERY_LIMIT.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

Run ID: 367c00ce-e829-47d0-9b6b-11d938f1e5f1

📥 Commits

Reviewing files that changed from the base of the PR and between 8b88692 and a4af398.

📒 Files selected for processing (7)
  • apps/web/lib/db/index.ts
  • apps/web/lib/db/repositories/credential.repository.ts
  • apps/web/lib/db/repositories/server.repository.ts
  • apps/web/lib/db/repositories/spec.repository.ts
  • apps/web/lib/db/repositories/tool.repository.ts
  • apps/web/lib/vault/__tests__/vault.test.ts
  • packages/types/src/__tests__/types.test.ts

Comment thread apps/web/lib/db/repositories/credential.repository.ts
Comment thread apps/web/lib/db/repositories/tool.repository.ts Outdated
Comment thread apps/web/lib/vault/__tests__/vault.test.ts
- Add SafeCredential type — repository returns SafeCredential (no encryptedKey)
  instead of casting full Credential, eliminating runtime shape mismatch
- Verify delete result in ToolRepository.delete to prevent silent no-ops
- Add deterministic ORDER BY (createdAt) to all findAll queries
- Replace timing-based cache test with deterministic copy-mutation assertion
@Work90210 Work90210 merged commit 910a79c into master Mar 19, 2026
6 checks passed