
Feat/one2b agent schema #97

Merged
Prajjawalk merged 4 commits into main from feat/one2b-agent-schema on Apr 28, 2026

Conversation


@Prajjawalk Prajjawalk commented Apr 27, 2026

Description

Additive schema changes preparing for the one2b-internal-agent merge. No data is moved; this PR only widens schemas so the next phase can populate them.

Existing tables modified

| Table | Column | Type | Default / Constraint | Notes |
| --- | --- | --- | --- | --- |
| User | phone_encrypted | BYTEA | nullable | Encrypted phone, mirrors CrmContact.phone |
| User | preferences | JSONB | nullable | Reminder time, brief delivery channel, Drive prefs |
| User | isServiceAccount | BOOLEAN | false | Flags bot users (e.g. bot+one2b@one2b.io) |
| Action | sourceType | TEXT | nullable | meeting \| email \| whatsapp \| manual |
| Action | sourceId | TEXT | nullable | FK-by-convention to source record |
| Action | lastUpdatedBy | TEXT | nullable | AGENT \| USER_EMAIL \| USER_WHATSAPP \| USER_UI |
| Action | lastUpdatedSource | TEXT | nullable | Message ID, email Message-Id, etc. |
| TranscriptionSession | firefliesId | TEXT | unique, nullable | Idempotent webhook processing |
| TranscriptionSession | analyticsJson | JSONB | nullable | Fireflies analytics blob |
| TranscriptionSession | sentencesJson | JSONB | nullable | Structured sentences with timestamps + speakers |
| TranscriptionSession | videoUrl | TEXT | nullable | |
| TranscriptionSession | durationSeconds | INTEGER | nullable | |
| TranscriptionSession | participantCount | INTEGER | nullable | |
| Integration | webhookId | TEXT | unique, nullable | Powers per-workspace webhook URLs |
| Integration | providerConfig | JSONB | nullable | Generic provider-specific config blob |
| WhatsAppConfig | provider | TEXT | 'META_CLOUD' | Discriminator: META_CLOUD \| DIALOG_360 |
| WhatsAppConfig | providerConfig | JSONB | nullable | Provider-specific config |
| WhatsAppConfig | phoneNumberId | TEXT | relaxed to nullable | Meta-only field |
| WhatsAppConfig | businessAccountId | TEXT | relaxed to nullable | Meta-only field |
| WhatsAppConfig | webhookVerifyToken | TEXT | relaxed to nullable | Meta-only field |
| KnowledgeChunk | workspaceId | TEXT | nullable (NOT NULL in follow-up) | Workspace isolation; backfill pending |
| KnowledgeChunk | embeddingProvider | TEXT | nullable | openai, jina, etc. |
| KnowledgeChunk | embeddingModel | TEXT | nullable | e.g. text-embedding-3-small |
| KnowledgeChunk | embeddingDim | INTEGER | nullable | e.g. 1536 |
| KnowledgeChunk | embeddingGeneratedAt | TIMESTAMP(3) | nullable | For re-embedding detection |
| KnowledgeChunk | speakerName | TEXT | nullable | Transcript-chunk only |
| KnowledgeChunk | speakerEmail | TEXT | nullable | Transcript-chunk only |
| KnowledgeChunk | startTimeMs | INTEGER | nullable | Transcript-chunk only |
| KnowledgeChunk | endTimeMs | INTEGER | nullable | Transcript-chunk only |
| KnowledgeChunk | userId | TEXT | relaxed to nullable | Workspace-only chunks need no user |
| CrmCommunication | reviewStatus | TEXT | nullable | NEEDS_REVIEW \| IN_REVIEW \| REVISING \| APPROVED |
| CrmCommunication | sourceType | TEXT | nullable | What meeting/action this draft came from |
| CrmCommunication | sourceId | TEXT | nullable | |
| CrmCommunication | agentGenerated | BOOLEAN | false | |
| CrmCommunication | revisionCount | INTEGER | 0 | |
| CrmCommunication | revisionHistory | JSONB | nullable | Array of {snapshot, generatedAt, prompt} |
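The unique, nullable firefliesId on TranscriptionSession is what makes webhook processing idempotent: a replayed Fireflies event collapses into a no-op write. A minimal sketch of that upsert (the column list besides firefliesId is illustrative, not the actual table definition):

```sql
-- Hypothetical webhook write: a second delivery of the same Fireflies
-- transcript id inserts nothing instead of creating a duplicate session.
INSERT INTO "TranscriptionSession" (id, "firefliesId", title, "createdAt")
VALUES ('cses_01', 'ff_abc123', 'Weekly sync', now())
ON CONFLICT ("firefliesId") DO NOTHING;
```

Because Postgres treats NULLs as distinct in unique indexes, sessions without a firefliesId never conflict with each other, so non-Fireflies sessions are unaffected.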

New tables

| Table | Purpose | Workspace-scoped |
| --- | --- | --- |
| Document | Parent record for ingested files (Drive / S3 upload / URL / email attachment). KnowledgeChunk references via sourceId when sourceType='document'. | yes |
| TranscriptionSessionParticipant | Many-to-many between TranscriptionSession and (User OR CrmContact OR raw email). Carries the Fireflies speaker label and isHost. Lets us record meeting attendees who don't have exponential accounts. | yes |
| ActionParticipantAssignee | Sibling to ActionAssignee: links an Action to a TranscriptionSessionParticipant when the assignee email doesn't resolve to a workspace User. Avoids forcing every meeting participant to onboard. A backfill job will convert these to ActionAssignee if a participant later signs up. | yes |
| PreMeetingBrief | Per-user pre-meeting briefs scheduled against Google Calendar events. Tracks generation/delivery state and channel (EMAIL \| WHATSAPP). | yes |
| ReminderLog | System audit trail for action item reminders, daily digests, overdue notices, and pre-meeting briefs. Kept distinct from CrmCommunication because it's system-generated, not user-facing comms. | yes |
| PendingDriveUpload | WhatsApp file upload state machine: staging area while the user picks a Google Drive folder for inbound attachments. Has a TTL via expiresAt. | yes |

Indexes added

| Type | On | Columns |
| --- | --- | --- |
| Unique | Integration | (webhookId) |
| Unique | TranscriptionSession | (firefliesId) |
| Unique | TranscriptionSessionParticipant | (transcriptionSessionId, email) |
| Unique | PreMeetingBrief | (userId, calendarEventId) |
| Unique | ActionParticipantAssignee | (actionId, participantId) |
| B-tree | Action | (sourceType, sourceId) |
| B-tree | KnowledgeChunk | (workspaceId) |
| B-tree | TranscriptionSession | (firefliesId) |
| B-tree | CrmCommunication | (reviewStatus) |
| B-tree | every new table | per-FK indexes on workspaceId, userId, etc. |

Nullability relaxations on existing columns

| Table | Column | Reason |
| --- | --- | --- |
| KnowledgeChunk | userId | Workspace-level chunks have no per-user owner |
| WhatsAppConfig | phoneNumberId | Meta-specific; null for DIALOG_360 rows |
| WhatsAppConfig | businessAccountId | Meta-specific |
| WhatsAppConfig | webhookVerifyToken | Meta-specific |
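In migration SQL, relaxations like these are plain DROP NOT NULL statements; a sketch of the shape (not the verbatim generated file):

```sql
-- Meta-specific WhatsAppConfig fields become optional for DIALOG_360 rows
ALTER TABLE "WhatsAppConfig" ALTER COLUMN "phoneNumberId" DROP NOT NULL;
ALTER TABLE "WhatsAppConfig" ALTER COLUMN "businessAccountId" DROP NOT NULL;
ALTER TABLE "WhatsAppConfig" ALTER COLUMN "webhookVerifyToken" DROP NOT NULL;

-- Workspace-only knowledge chunks have no per-user owner
ALTER TABLE "KnowledgeChunk" ALTER COLUMN "userId" DROP NOT NULL;
```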

Type of Change

- [ ] Bug fix (non-breaking change which fixes an issue)
- [x] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] Documentation update
- [ ] Performance improvement
- [ ] Code refactoring

Database Changes

- [x] This PR includes database schema changes (migrations)
- [ ] NO database changes in this PR

If migrations are included:

  • Migration name(s): 20260426075815_merge_one2b_agent_schema, 20260427175749_add_action_participant_assignee
  • Schema changes: listed above
  • Coordinated with team: Yes

Testing

- [ ] Tested locally
- [ ] Tested in preview deployment
- [ ] Added/updated tests
- [ ] All tests passing

Checklist

- [x] My code follows the project's style guidelines
- [x] I have performed a self-review of my own code
- [x] I have commented my code in hard-to-understand areas
- [x] My changes generate no new warnings
- [x] Database migrations tested on test database
- [x] Coordinated with team on any migration conflicts

Deployment Notes

Any special deployment considerations?

Related Issues

Closes #

Summary by CodeRabbit

  • Chores

    • Upgraded Prisma dependencies to 6.19.3.
  • New Features

    • Expanded database schema: transcription session analytics, participant tracking, documents, pre-meeting briefs, reminders, pending uploads, and action–participant assignments.
    • Improved integration/webhook and WhatsApp provider configuration support.
    • Added embedding provenance and speaker/time metadata for knowledge content.
  • Bug Fixes / Reliability

    • Stronger WhatsApp config validation and encryption output compatibility updates.

Additive schema changes preparing for the one2b-internal-agent merge.
No data is moved here; this only widens schemas so the next phase can
populate them.

Existing models extended:
- User: encrypted phone, JSON preferences, isServiceAccount flag
- Action: source provenance (sourceType/sourceId) + lastUpdatedBy/Source
  for tracking which channel last updated a task (UI, agent, email, etc.)
- TranscriptionSession: Fireflies meeting metadata (firefliesId unique,
  analytics/sentences JSON, video URL, duration, participantCount)
- Integration: webhookId for URL-scoped webhook routing + providerConfig
- WhatsAppConfig: provider discriminator (META_CLOUD | DIALOG_360),
  per-provider config blob; Meta-specific fields relaxed to nullable
- KnowledgeChunk: workspaceId (nullable for now; backfill + NOT NULL
  in follow-up migration), embedding provenance (provider/model/dim/
  generatedAt) for re-embedding support, speaker fields for transcript
  chunks, userId relaxed to nullable
- CrmContact: reverse relation to TranscriptionSessionParticipant
- CrmCommunication: agent-driven follow-up draft workflow fields
  (reviewStatus separate from delivery status, source provenance,
  agentGenerated flag, revision history)
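The follow-up that tightens KnowledgeChunk.workspaceId could look roughly like this; the backfill source is a guess (a WorkspaceUser membership table is assumed here for illustration and may not match the real schema):

```sql
-- 1) Backfill: derive the workspace from the owning user's membership
--    ("WorkspaceUser" is a hypothetical join table)
UPDATE "KnowledgeChunk" kc
SET "workspaceId" = wu."workspaceId"
FROM "WorkspaceUser" wu
WHERE kc."workspaceId" IS NULL
  AND wu."userId" = kc."userId";

-- 2) Enforce the scope once every row is populated
ALTER TABLE "KnowledgeChunk" ALTER COLUMN "workspaceId" SET NOT NULL;
```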

New models:
- Document: parent for ingested files (Drive/upload/URL/email
  attachment); KnowledgeChunk references via sourceId when
  sourceType='document'
- TranscriptionSessionParticipant: join between meeting and
  User-or-CrmContact-or-raw-email, with speaker labels
- PreMeetingBrief: scheduled pre-meeting briefs per user per
  calendar event; unique on (userId, calendarEventId)
- ReminderLog: system audit trail for action item reminders, daily
  digests, overdue notices, pre-meeting briefs
- PendingDriveUpload: WhatsApp file upload state machine awaiting
  Drive folder selection from the user

All workspace-scoped. All relations use cascade-on-workspace-delete
or set-null-on-user-delete consistent with existing conventions.

Note: pgvector HNSW index on KnowledgeChunk.embedding to be added
manually to the generated migration SQL (Prisma cannot model pgvector
indexes natively).
Migration 20260426075815_merge_one2b_agent_schema:
Companion to the schema commit. Adds the new one2b agent columns and
tables, with a manually-appended HNSW pgvector index on
KnowledgeChunk.embedding (Prisma cannot model pgvector indexes
natively).
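The hand-appended statement would look roughly like this (the cosine operator class is assumed, matching the <=> queries in KnowledgeService; HNSW tuning parameters omitted):

```sql
-- Appended manually to the generated migration.sql; Prisma cannot express this
CREATE INDEX "KnowledgeChunk_embedding_hnsw_idx"
  ON "KnowledgeChunk"
  USING hnsw (embedding vector_cosine_ops);
```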

Migration 20260426081707_add_hnsw_index:
Inadvertent drift artifact — Prisma's `migrate dev` (without
--create-only) noticed the HNSW index in the DB but not in
schema.prisma and auto-generated a DROP. This was applied before
review. The index can be recreated when production embedding volume
justifies it; until then queries fall back to sequential scan, which
is fine for current data sizes.

Lesson: pgvector indexes will trigger this drift trap on every
unguarded `migrate dev`. Use `--create-only` for any migration in
this project and review the generated SQL for spurious DROP INDEX
statements before applying.

Prisma client + CLI bumped from 6.14.0 to 6.19.3 by `migrate dev`.

Lets the agent assign action items to meeting participants who are
not (yet) registered exponential Users. Sibling to ActionAssignee:
the agent picks ActionAssignee when the assignee email resolves to
a workspace User, otherwise links to a TranscriptionSessionParticipant.

Why this matters: forcing every meeting participant to onboard to
exponential just to be assigned a task is unwanted friction. With
this table, action items can flow through the agent end-to-end
(extract from transcript, assign, remind via WhatsApp/email) without
exponential accounts. If a participant later signs up, a backfill
job converts these assignments to ActionAssignee.
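That backfill could be sketched as follows; the ActionAssignee column list and id generation are assumptions, not the real table definition:

```sql
-- Hypothetical promotion: once a participant's email resolves to a User,
-- turn the participant-based assignment into a regular ActionAssignee.
INSERT INTO "ActionAssignee" (id, "actionId", "userId", "createdAt")
SELECT gen_random_uuid()::text, apa."actionId", u.id, now()
FROM "ActionParticipantAssignee" apa
JOIN "TranscriptionSessionParticipant" p ON p.id = apa."participantId"
JOIN "User" u ON u.email = p.email
ON CONFLICT DO NOTHING;

-- Then retire the promoted participant-based rows
DELETE FROM "ActionParticipantAssignee" apa
USING "TranscriptionSessionParticipant" p, "User" u
WHERE p.id = apa."participantId"
  AND u.email = p.email;
```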

Includes:
- New table with (actionId, participantId, workspaceId, createdAt)
- Unique constraint on (actionId, participantId)
- B-tree indexes on each FK column
- Cascade-on-delete from Action, TranscriptionSessionParticipant,
  and Workspace (consistent with the rest of one2b agent models)
- Reverse relations on Action, TranscriptionSessionParticipant,
  and Workspace

vercel Bot commented Apr 27, 2026

The latest updates on your projects.

| Project | Deployment | Actions | Updated (UTC) |
| --- | --- | --- | --- |
| exponential | Ready | Preview, Comment | Apr 28, 2026 10:42am |



coderabbitai Bot commented Apr 27, 2026

📝 Walkthrough

Upgrades Prisma to v6.19.3 and adds One2b agent schema: six new models, multiple field extensions (actions, transcription sessions, knowledge chunks, integrations, WhatsApp), several migrations (including pgvector index changes), utility and service adjustments for encryption and WhatsApp credential handling, and an optimistic UI payload alignment.

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| **Dependency Updates**<br>`package.json` | Bump `@prisma/client` and `prisma` from `^6.14.0` to `^6.19.3`. |
| **Main Schema Migration**<br>`prisma/migrations/20260426075815_merge_one2b_agent_schema/migration.sql`, `prisma/schema.prisma` | Adds models Document, TranscriptionSessionParticipant, PreMeetingBrief, ReminderLog, PendingDriveUpload, ActionParticipantAssignee; extends User, Workspace, Action, TranscriptionSession, Integration, WhatsAppConfig, KnowledgeChunk, CrmContact, CrmCommunication, and Ticket with new fields, indexes, and constraints; adds FK relations and a manual pgvector HNSW index on KnowledgeChunk.embedding. |
| **Index Removal**<br>`prisma/migrations/20260426081707_add_hnsw_index/migration.sql` | Removes the KnowledgeChunk_embedding_hnsw_idx index. |
| **Assignee Mapping Migration**<br>`prisma/migrations/20260427175749_add_action_participant_assignee/migration.sql` | Creates the ActionParticipantAssignee table with FK constraints, composite uniqueness on (actionId, participantId), and supporting indexes. |
| **Optimistic UI**<br>`src/app/_components/layout/GlobalAddTaskButton.tsx` | Adds sourceType, sourceId, lastUpdatedBy, lastUpdatedSource (initialized to null) to the optimistic api.action.create payload. |
| **WhatsApp Service**<br>`src/server/services/notifications/WhatsAppNotificationService.ts` | Fails early when the WhatsApp config lacks Meta-specific IDs; validates phoneNumberId before sending and returns a structured failure instead of proceeding with an invalid ID. |
| **WhatsApp Query**<br>`src/server/services/whatsapp/OptimizedQueries.ts` | getWhatsAppConfig now selects provider and providerConfig and caches the result under `whatsapp-config:${configId}`. |
| **Encryption Utils**<br>`src/server/utils/encryption.ts` | encryptString now returns a Uint8Array (ArrayBuffer-backed) to satisfy Prisma 6.19 Bytes expectations; encryptToBase64 converts that into a Buffer for base64 output. |

Sequence Diagram(s)

```mermaid
sequenceDiagram
  participant Client
  participant UI as Frontend UI
  participant Server
  participant DB as Database
  participant Notifier as WhatsAppService
  participant WhatsAppAPI as MetaWhatsApp

  Client->>UI: User creates Action (optimistic payload incl. source fields)
  UI->>Server: POST /api/action.create (payload)
  Server->>DB: INSERT Action (+ participantAssignees if any)
  DB-->>Server: Insert OK
  Server->>Notifier: prepare/send notification (load WhatsApp config)
  Notifier->>DB: SELECT WhatsAppConfig (includes provider, providerConfig)
  DB-->>Notifier: config (nullable IDs possible)
  Notifier-->>Notifier: validate config (phoneNumberId, businessAccountId)
  alt config invalid
    Notifier-->>Server: return structured failure (no send)
  else config valid
    Notifier->>WhatsAppAPI: send message (uses phoneNumberId)
    WhatsAppAPI-->>Notifier: send result
    Notifier-->>Server: delivery result
  end
```

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs

  • Feat: Config Fireflies Wizard #66: Related to adding Integration.webhookId and providerConfig and webhook/token handling for integrations (direct schema/feature overlap).
  • Feat fireflies wizard #64: Related changes to encryption helpers and base64/Bytes handling; overlaps with src/server/utils/encryption.ts adjustments.

Poem

🐰 I hopped through migrations, nibbling on schema seeds,

Six new tables sprouted from tidy developer deeds,
Keys cascade gently, embeddings take flight,
WhatsApp checks early to keep sends polite,
Prisma grew wiser — a carrot-shaped delight! 🥕

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

| Check name | Status | Explanation | Resolution |
| --- | --- | --- | --- |
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 40.00%, below the required threshold of 80.00%. | Write docstrings for the functions missing them to satisfy the coverage threshold. |

✅ Passed checks (4 passed)

| Check name | Status | Explanation |
| --- | --- | --- |
| Title check | ✅ Passed | The PR title 'Feat/one2b agent schema' clearly describes the main change: introducing a one2b agent schema layer with new database tables, extended models, and related code updates. |
| Description check | ✅ Passed | The PR description is comprehensive, including detailed tables of schema changes, new tables, indexes, and nullability relaxations. All major template sections are completed with checked boxes and specific technical details aligned with the changeset. |
| Linked Issues check | ✅ Passed | Skipped because no linked issues were found for this pull request. |
| Out of Scope Changes check | ✅ Passed | Skipped because no linked issues were found for this pull request. |



@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 3

🧹 Nitpick comments (1)
package.json (1)

62-62: Upgrade React and React DOM to v19 for Next.js 15 compatibility.

While touching this manifest, upgrade react and react-dom from ^18.3.1 to ^19.0.0 (or latest v19). The official Next.js 15 upgrade guide specifies React 19 as the minimum required version. Since you're already modifying dependencies here, align these to meet the Next.js 15 baseline per your coding guidelines.

Also applies to: 146-146

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@package.json` at line 62, Update the package.json dependency entries for
"react" and "react-dom" from ^18.3.1 to ^19.0.0 (or the latest v19) so the
project meets Next.js 15's React 19 requirement; locate the "react" and
"react-dom" entries in package.json (there are two occurrences referenced) and
change their version strings to ^19.0.0, then run your package manager to
refresh lockfile and verify no other dependency constraints break.
ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 66aefac7-357d-4bb8-a391-04e422570831

📥 Commits

Reviewing files that changed from the base of the PR and between 883dd2e and 6a6b40d.

⛔ Files ignored due to path filters (1)
  • package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (5)
  • package.json
  • prisma/migrations/20260426075815_merge_one2b_agent_schema/migration.sql
  • prisma/migrations/20260426081707_add_hnsw_index/migration.sql
  • prisma/migrations/20260427175749_add_action_participant_assignee/migration.sql
  • prisma/schema.prisma

Comment on lines +1 to +2
```sql
-- DropIndex
DROP INDEX "KnowledgeChunk_embedding_hnsw_idx";
```

⚠️ Potential issue | 🟠 Major

Don't ship the migration that drops the vector index.

This removes the manual HNSW index created in prisma/migrations/20260426075815_merge_one2b_agent_schema/migration.sql, but src/server/services/KnowledgeService.ts:448-531 still does ORDER BY kc.embedding <=> ... LIMIT queries against KnowledgeChunk. After this runs, semantic search falls back to a full scan as the table grows.

💡 Safer migration shape
```diff
 -- DropIndex
-DROP INDEX "KnowledgeChunk_embedding_hnsw_idx";
+CREATE INDEX IF NOT EXISTS "KnowledgeChunk_embedding_hnsw_idx"
+  ON "KnowledgeChunk"
+  USING hnsw (embedding vector_cosine_ops);
```
📝 Committable suggestion


Suggested change

```diff
 -- DropIndex
-DROP INDEX "KnowledgeChunk_embedding_hnsw_idx";
+CREATE INDEX IF NOT EXISTS "KnowledgeChunk_embedding_hnsw_idx"
+  ON "KnowledgeChunk"
+  USING hnsw (embedding vector_cosine_ops);
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@prisma/migrations/20260426081707_add_hnsw_index/migration.sql` around lines 1
- 2, The migration currently drops the HNSW vector index
"KnowledgeChunk_embedding_hnsw_idx" which breaks semantic search used by
KnowledgeService (refer to src/server/services/KnowledgeService.ts where it runs
ORDER BY kc.embedding <=> ... LIMIT); revert this by removing the DROP INDEX
from the migration (or instead add a CREATE INDEX ... USING hnsw_cosine/ivfflat
with the original options) so the "KnowledgeChunk_embedding_hnsw_idx" index
remains available for those ORDER BY ... <=> queries; ensure the migration
preserves or recreates the exact index name "KnowledgeChunk_embedding_hnsw_idx"
so existing queries in KnowledgeService continue to use the index.

Comment thread prisma/schema.prisma
Comment on lines +2182 to +2205
```prisma
  userId    String?
  projectId String?

  // Workspace scoping (initially nullable; backfill then a follow-up migration enforces NOT NULL)
  workspaceId String?

  // Embedding provenance (provider-agnostic; supports re-embedding)
  embeddingProvider    String?
  embeddingModel       String?
  embeddingDim         Int?
  embeddingGeneratedAt DateTime?

  // Speaker info for transcript chunks
  speakerName  String?
  speakerEmail String?
  startTimeMs  Int?
  endTimeMs    Int?

  createdAt DateTime @default(now())

  @@index([sourceType, sourceId])
  @@index([userId])
  @@index([projectId])
  @@index([workspaceId])
```

⚠️ Potential issue | 🟠 Major

This makes workspace chunks writable before they're readable.

Line 2182 makes KnowledgeChunk.userId nullable, but src/server/services/KnowledgeService.ts:448-531 still filters semantic search with WHERE kc."userId" = ${userId}. Any chunk written with only workspaceId set will disappear from retrieval immediately, and this model still doesn't declare a Workspace relation/FK to enforce the new scope.

Based on learnings: use Prisma with proper error handling and migrations, and always test schema changes before pushing to production.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@prisma/schema.prisma` around lines 2182 - 2205, The schema made
KnowledgeChunk.userId nullable while KnowledgeService still filters reads by
WHERE kc."userId" = ${userId}, causing chunks written with only workspaceId to
be invisible; fix by updating the read/query logic in
src/server/services/KnowledgeService.ts to also include workspace-scoped results
(e.g., include OR kc."workspaceId" = ${workspaceId} or add a combined
conditional that checks both userId and workspaceId), and update the Prisma
model KnowledgeChunk to declare a Workspace relation/foreign key for workspaceId
(e.g., add workspace Workspace? `@relation`(fields: [workspaceId], references:
[id])) and then create and run the migration to add the FK/index so writes and
reads are consistent.
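The widened read scope this prompt describes amounts to one extra predicate; a simplified sketch of the similarity query (parameter binding and the exact column list abbreviated):

```sql
-- Before: WHERE kc."userId" = $1 hides workspace-only chunks
-- After (sketch): include chunks owned by the user or scoped to the workspace
SELECT kc.id, kc.content
FROM "KnowledgeChunk" kc
WHERE kc."userId" = $1
   OR kc."workspaceId" = $2
ORDER BY kc.embedding <=> $3
LIMIT 10;
```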

Comment thread prisma/schema.prisma
Comment on lines +3882 to +3897
```prisma
model ActionParticipantAssignee {
  id            String   @id @default(cuid())
  actionId      String
  participantId String
  workspaceId   String // denormalized for workspace-scoped queries
  createdAt     DateTime @default(now())

  action      Action                          @relation(fields: [actionId], references: [id], onDelete: Cascade)
  participant TranscriptionSessionParticipant @relation(fields: [participantId], references: [id], onDelete: Cascade)
  workspace   Workspace                       @relation(fields: [workspaceId], references: [id], onDelete: Cascade)

  @@unique([actionId, participantId])
  @@index([actionId])
  @@index([participantId])
  @@index([workspaceId])
}
```

⚠️ Potential issue | 🟠 Major

workspaceId here can drift away from the parent records.

This table only enforces three independent FKs, so a row can point at an Action, a TranscriptionSessionParticipant, and a Workspace from different workspaces. Because workspaceId is explicitly denormalized for workspace-scoped queries, a bad write turns into cross-workspace leakage instead of a rejected insert.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@prisma/schema.prisma` around lines 3882 - 3897, The denormalized workspaceId
on the ActionParticipantAssignee model can diverge from the referenced Action
and TranscriptionSessionParticipant leading to cross-workspace leakage; either
remove workspaceId and derive workspace via the relations (use
action.workspaceId or participant.workspaceId for workspace-scoped queries) or
keep it but add DB-level constraints in the migration to guarantee consistency
(CREATE CHECK constraints or triggers that assert workspaceId = (SELECT
workspaceId FROM "Action" WHERE id = actionId) AND workspaceId = (SELECT
workspaceId FROM "TranscriptionSessionParticipant" WHERE id = participantId)).
Update the Prisma model ActionParticipantAssignee (fields: workspaceId,
actionId, participantId and relations action and participant) and add a
migration that implements the chosen fix so workspaceId cannot drift from
Action.workspaceId/TranscriptionSessionParticipant.workspaceId.
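One detail if the DB-level route is chosen: Postgres CHECK constraints cannot contain subqueries, so the consistency guarantee would have to be a trigger. A sketch (assumes TranscriptionSessionParticipant carries its own workspaceId, per the new-tables list above):

```sql
-- Reject writes whose denormalized workspaceId disagrees with either parent
CREATE OR REPLACE FUNCTION enforce_apa_workspace() RETURNS trigger AS $$
BEGIN
  IF NEW."workspaceId" IS DISTINCT FROM
       (SELECT "workspaceId" FROM "Action" WHERE id = NEW."actionId")
     OR NEW."workspaceId" IS DISTINCT FROM
       (SELECT "workspaceId" FROM "TranscriptionSessionParticipant"
        WHERE id = NEW."participantId")
  THEN
    RAISE EXCEPTION 'ActionParticipantAssignee workspaceId does not match parents';
  END IF;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER apa_workspace_consistency
  BEFORE INSERT OR UPDATE ON "ActionParticipantAssignee"
  FOR EACH ROW EXECUTE FUNCTION enforce_apa_workspace();
```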


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 3

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/server/services/notifications/WhatsAppNotificationService.ts`:
- Around line 93-96: The guard that returns "no credentials" is too strict: in
WhatsAppNotificationService (check the code that inspects
integration.whatsappConfig) only require phoneNumberId and accessToken for
send/test flows rather than also requiring businessAccountId; update the
conditional to return null only when phoneNumberId OR accessToken is missing
(i.e., require both phoneNumberId and accessToken), and remove the
businessAccountId dependency so configs with a null businessAccountId but valid
phoneNumberId+accessToken are allowed.

In `@src/server/utils/encryption.ts`:
- Line 22: Add a JSDoc block above the exported function encryptString
describing its purpose (e.g., "Encrypts a UTF-8 plaintext string using the
module's crypto routine") and specifying the return format clearly (e.g., that
it returns an encrypted byte buffer as a Uint8Array or ArrayBuffer, whichever
the implementation returns). Include parameter description for plaintext:
string, and a `@returns` line that states the exact type returned (Uint8Array or
ArrayBuffer) and that it represents the encrypted bytes. Ensure the comment is
concise and placed immediately above the encryptString declaration.
- Line 22: The exported function signature uses an invalid generic type; change
the return type of encryptString from Uint8Array<ArrayBuffer> to plain
Uint8Array and add JSDoc above the function describing its purpose, parameters,
and return value (e.g., `@param` plaintext: string, `@returns`: Uint8Array of
encrypted bytes) so the public API is documented; update the function
declaration for encryptString and place the JSDoc comment directly above it.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: d16763ac-b00d-4869-a798-5db30c2b9208

📥 Commits

Reviewing files that changed from the base of the PR and between 6a6b40d and abb1337.

📒 Files selected for processing (4)
  • src/app/_components/layout/GlobalAddTaskButton.tsx
  • src/server/services/notifications/WhatsAppNotificationService.ts
  • src/server/services/whatsapp/OptimizedQueries.ts
  • src/server/utils/encryption.ts
✅ Files skipped from review due to trivial changes (1)
  • src/server/services/whatsapp/OptimizedQueries.ts

Comment on lines +93 to +96
```ts
// Meta-specific fields are nullable (null for non-Meta providers like DIALOG_360)
if (!integration.whatsappConfig.phoneNumberId || !integration.whatsappConfig.businessAccountId) {
  return null;
}
```

⚠️ Potential issue | 🟠 Major

Over-strict credential gate can block valid sends

Line 94 currently requires both phoneNumberId and businessAccountId, but this service only needs phoneNumberId + accessToken for message send/test calls. Since businessAccountId is nullable in schema, this can cause false “No WhatsApp credentials found” failures for otherwise valid configs.

Suggested fix
- private async getCredentials(): Promise<{ accessToken: string; phoneNumberId: string; businessAccountId: string } | null> {
-   if (this.accessToken && this.phoneNumberId && this.businessAccountId) {
+ private async getCredentials(): Promise<{ accessToken: string; phoneNumberId: string; businessAccountId?: string } | null> {
+   if (this.accessToken && this.phoneNumberId) {
      return {
        accessToken: this.accessToken,
        phoneNumberId: this.phoneNumberId,
        businessAccountId: this.businessAccountId,
      };
    }
...
-   if (!integration.whatsappConfig.phoneNumberId || !integration.whatsappConfig.businessAccountId) {
+   if (!integration.whatsappConfig.phoneNumberId) {
      return null;
    }

    this.accessToken = accessTokenCred.key;
    this.phoneNumberId = integration.whatsappConfig.phoneNumberId;
-   this.businessAccountId = integration.whatsappConfig.businessAccountId;
+   this.businessAccountId = integration.whatsappConfig.businessAccountId ?? undefined;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/server/services/notifications/WhatsAppNotificationService.ts` around
lines 93 - 96, The guard that returns "no credentials" is too strict: in
WhatsAppNotificationService (check the code that inspects
integration.whatsappConfig) only require phoneNumberId and accessToken for
send/test flows rather than also requiring businessAccountId; update the
conditional to return null only when phoneNumberId OR accessToken is missing
(i.e., require both phoneNumberId and accessToken), and remove the
businessAccountId dependency so configs with a null businessAccountId but valid
phoneNumberId+accessToken are allowed.
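The relaxed guard the review suggests can be sketched as a standalone helper. The shape below is illustrative only — the field names mirror the schema columns in this PR, but the function and interface names are hypothetical, not the service's actual API:

```typescript
// Hypothetical credential shape: only accessToken and phoneNumberId are
// required for message send/test calls; businessAccountId stays optional
// because it is nullable in the schema (e.g. for DIALOG_360 configs).
interface WhatsAppCredentials {
  accessToken: string;
  phoneNumberId: string;
  businessAccountId?: string;
}

function toCredentials(config: {
  accessToken: string | null;
  phoneNumberId: string | null;
  businessAccountId: string | null;
}): WhatsAppCredentials | null {
  // Gate only on the fields actually needed for sends.
  if (!config.accessToken || !config.phoneNumberId) {
    return null;
  }
  return {
    accessToken: config.accessToken,
    phoneNumberId: config.phoneNumberId,
    // Coerce the schema's null to undefined for the optional field.
    businessAccountId: config.businessAccountId ?? undefined,
  };
}
```

With this shape, a Meta config with a null `businessAccountId` still yields usable credentials, while a missing `phoneNumberId` or `accessToken` correctly returns null.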

}

- export function encryptString(plaintext: string): Buffer {
+ export function encryptString(plaintext: string): Uint8Array<ArrayBuffer> {

⚠️ Potential issue | 🟡 Minor

Add JSDoc for exported encryptString.

This public function was modified and still lacks documentation. Please add a short JSDoc block (purpose + return format).

As per coding guidelines for `**/*.{ts,tsx,js,jsx}`: “Add JSDoc/documentation comments for public functions”.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/server/utils/encryption.ts` at line 22, Add a JSDoc block above the
exported function encryptString describing its purpose (e.g., "Encrypts a UTF-8
plaintext string using the module's crypto routine") and specifying the return
format clearly (e.g., that it returns an encrypted byte buffer as a Uint8Array
or ArrayBuffer, whichever the implementation returns). Include parameter
description for plaintext: string, and a `@returns` line that states the exact
type returned (Uint8Array or ArrayBuffer) and that it represents the encrypted
bytes. Ensure the comment is concise and placed immediately above the
encryptString declaration.

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "== TypeScript version from package.json files =="
fd -HI '^package.json$' | while read -r pkg; do
  echo "--- $pkg ---"
  jq -r '.devDependencies.typescript // .dependencies.typescript // "typescript not declared"' "$pkg"
done

echo
echo "== tsconfig compiler target/lib (if present) =="
fd -HI '^tsconfig.*\.json$' | while read -r cfg; do
  echo "--- $cfg ---"
  jq -r '.compilerOptions.target // "target: <unset>"' "$cfg"
  jq -r '.compilerOptions.lib // "lib: <unset>"' "$cfg"
done

echo
echo "== Generic Uint8Array usage in TS files =="
rg -n --type=ts 'Uint8Array<'

Repository: positonic/exponential

Length of output: 50460


🏁 Script executed:

#!/bin/bash

# Root tsconfig.json - TypeScript configuration
echo "=== Root tsconfig.json ==="
cat tsconfig.json 2>/dev/null || echo "Not found"

echo -e "\n=== Root package.json - TypeScript version ==="
jq '.devDependencies.typescript // .dependencies.typescript' package.json 2>/dev/null || echo "Not found"

# Read the specific file and lines around line 22
echo -e "\n=== src/server/utils/encryption.ts (lines 15-35) ==="
sed -n '15,35p' src/server/utils/encryption.ts 2>/dev/null || echo "File not found"

# Search for Uint8Array usage in src directory only (not node_modules)
echo -e "\n=== Uint8Array usage in src/ ==="
rg 'Uint8Array' src/ --type=ts -n 2>/dev/null | head -20

# Check if JSDoc exists above the function
echo -e "\n=== Check for JSDoc above encryptString (lines 1-25) ==="
head -25 src/server/utils/encryption.ts 2>/dev/null

Repository: positonic/exponential

Length of output: 4105


Fix TypeScript type error and add JSDoc to exported function.

Line 22 uses invalid syntax Uint8Array<ArrayBuffer>. The Uint8Array type is not generic; use Uint8Array as the return type instead. Additionally, add JSDoc documentation to this public exported function explaining its purpose and parameters.

Type error details: `Uint8Array<ArrayBuffer>` is invalid TypeScript here (error: "Type 'Uint8Array' is not generic"). The correct return type is simply `Uint8Array`. The ArrayBuffer backing store is implicit and does not require explicit generic syntax.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/server/utils/encryption.ts` at line 22, The exported function signature
uses an invalid generic type; change the return type of encryptString from
Uint8Array<ArrayBuffer> to plain Uint8Array and add JSDoc above the function
describing its purpose, parameters, and return value (e.g., `@param` plaintext:
string, `@returns`: Uint8Array of encrypted bytes) so the public API is
documented; update the function declaration for encryptString and place the
JSDoc comment directly above it.
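Putting both findings together, the corrected declaration would look roughly like the sketch below. The cipher choice, key handling, and output layout are assumptions for illustration — the actual `encryption.ts` module may differ:

```typescript
import { randomBytes, createCipheriv } from "node:crypto";

// Hypothetical key for illustration; the real module presumably loads
// its 32-byte key from configuration rather than generating one.
const KEY = randomBytes(32);

/**
 * Encrypts a UTF-8 plaintext string using AES-256-GCM.
 *
 * @param plaintext - The string to encrypt.
 * @returns The encrypted bytes as a plain Uint8Array, laid out as
 *          12-byte IV + 16-byte auth tag + ciphertext.
 */
export function encryptString(plaintext: string): Uint8Array {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", KEY, iv);
  const ciphertext = Buffer.concat([
    cipher.update(plaintext, "utf8"),
    cipher.final(),
  ]);
  // Return a plain Uint8Array, matching the documented signature.
  return new Uint8Array(Buffer.concat([iv, cipher.getAuthTag(), ciphertext]));
}
```

Note the return type is plain `Uint8Array`, and the JSDoc block sits directly above the exported declaration, satisfying both review comments.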

@Prajjawalk Prajjawalk merged commit 45e040d into main Apr 28, 2026
8 checks passed
@coderabbitai coderabbitai Bot mentioned this pull request May 2, 2026
22 tasks
