feat: add unique field constraints support with validation and model … (#52)
Conversation
@Special7ka is attempting to deploy a commit to the Yash Pouranik's projects Team on Vercel. A member of the Team first needs to authorize it.
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in your review settings.
📝 Walkthrough

Adds end-to-end support for per-field unique constraints: model/schema changes, index creation with duplicate detection and rollback, duplicate-key (11000) → HTTP 409 handling for data endpoints, and improved collection lifecycle/error recovery including jwtSecret initialization.
Sequence Diagram

sequenceDiagram
participant Client
participant SchemaCtrl as Schema Controller
participant ProjectModel as Project Model
participant ModelUtils as Model Utils
participant MongoDB as MongoDB
Client->>SchemaCtrl: POST /schema (payload with unique fields)
SchemaCtrl->>ProjectModel: persist collection config
ProjectModel-->>SchemaCtrl: config persisted
SchemaCtrl->>ModelUtils: getCompiledModel / compile model
ModelUtils->>MongoDB: compile model / create collection if needed
MongoDB-->>ModelUtils: model/collection ready
SchemaCtrl->>ModelUtils: createUniqueIndexes(Model, fields)
ModelUtils->>MongoDB: aggregate to detect duplicates
MongoDB-->>ModelUtils: duplicates found? (yes/no)
alt No duplicates
ModelUtils->>MongoDB: create unique indexes (partialFilterExpression if optional)
MongoDB-->>ModelUtils: indexes created
ModelUtils-->>SchemaCtrl: success
SchemaCtrl-->>Client: 201 Created
else Duplicates or index failure
SchemaCtrl->>ProjectModel: remove collection config (rollback)
SchemaCtrl->>ModelUtils: clearCompiledModel
SchemaCtrl->>MongoDB: drop collection if newly created
SchemaCtrl-->>Client: 400 Bad Request / error
end
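The "partialFilterExpression if optional" step in the diagram can be sketched against the MongoDB driver API. Note this is an illustrative assumption, not the PR's actual code: `createUniqueIndex`, the field shape, and the index naming are placeholders.

```javascript
// Sketch: unique index creation for one field. For optional (non-required)
// fields, a partialFilterExpression limits the index to documents where the
// field actually holds a value, so absent fields don't collide.
async function createUniqueIndex(collection, field) {
  const options = { unique: true, name: `uniq_${field.key}` };
  if (!field.required) {
    // Only index documents where the field exists with a concrete value.
    options.partialFilterExpression = {
      [field.key]: { $exists: true, $type: ["string", "number", "bool", "date"] },
    };
  }
  return collection.createIndex({ [field.key]: 1 }, options);
}
```

With this shape, a required field gets a plain unique index, while an optional field's index ignores documents that omit it.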
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Code Review
This pull request introduces support for unique constraints within collection schemas. Key changes include updating the Project model and Zod validation schemas to include a 'unique' property, implementing logic to create MongoDB unique indexes, and adding error handling for duplicate key violations (Mongo error 11000) in both the dashboard and public APIs. Additionally, a rollback mechanism was added to revert schema changes if unique index creation fails due to existing duplicate data. Feedback is provided regarding the duplication of the UNIQUE_SUPPORTED_TYPES constant across utility files, which should be moved to a shared constants file to improve maintainability.
```js
// FUNCTION - BUILD FIELD SCHEMA ZOD
const MAX_FIELD_DEPTH = 3;
const UNIQUE_SUPPORTED_TYPES = ["String", "Number", "Boolean", "Date"];
```
The UNIQUE_SUPPORTED_TYPES constant is duplicated. It is also defined in packages/common/src/utils/injectModel.js, and implemented differently (as a Set there, and an Array here).
This duplication can lead to inconsistencies if the list of supported types changes in the future and is only updated in one place.
To improve maintainability and ensure a single source of truth, this constant should be defined in a shared file (e.g., packages/common/src/utils/constants.js) and imported where needed.
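A minimal sketch of the suggested shared module. The path and helper name are placeholders; the PR may organize this differently:

```javascript
// Hypothetical packages/common/src/utils/schema.constants.js
// A Set gives O(1) membership checks; exposing a helper keeps call sites
// from caring whether the backing structure is a Set or an Array.
const UNIQUE_SUPPORTED_TYPES = new Set(["String", "Number", "Boolean", "Date"]);

const isUniqueSupportedType = (type) => UNIQUE_SUPPORTED_TYPES.has(type);

module.exports = { UNIQUE_SUPPORTED_TYPES, isUniqueSupportedType };
```

Both `injectModel.js` and `input.validation.js` could then import from this one module instead of redefining the list.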
Actionable comments posted: 6
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/dashboard-api/src/controllers/project.controller.js`:
- Around line 477-492: The current delete flow swallows all errors from
connection.db.dropCollection and proceeds to remove the collection from project
metadata (project.collections.splice) and update project storage (project.save),
which can leave the physical collection and caches stale; change the flow so
that dropCollection errors are treated as failures: if
connection.db.dropCollection(finalCollectionName) throws, propagate/return the
error (do not call project.collections.splice or project.save), otherwise on
successful drop proceed to remove the collection from project.collections and
await project.save(), and then perform cache invalidation for compiled models
and API-key caches (add/ call the appropriate cache-clear functions after
project.save and before calling deleteProjectById/setProjectById) to ensure no
stale schema or keys remain.
- Around line 398-428: The rollback can fail because you save the new collection
before creating indexes and you re-call getConnection() in the catch; fix by
obtaining the DB connection and creating the compiled model/indexes before
persisting the collection: call getConnection(projectId) and build the Model via
getCompiledModel/createUniqueIndexes first, then push to project.collections and
await project.save(); alternatively, if you must save first, reuse the same
connection variable (do not re-call getConnection in the catch) so
clearCompiledModel(connection, compiledCollectionName) can run even if
getConnection() would fail.
- Around line 710-722: The quota update is being persisted before the document
update and can remain changed if docToEdit.save() fails (e.g., duplicate key
11000); change the order so you check the limit and update project.databaseUsed
in memory but only call project.save() after docToEdit.save() succeeds (or wrap
both saves in a transaction/session if supported), i.e., keep the limit check
using project.resources.db.isExternal, project.databaseLimit and
project.databaseUsed as now, but move the project.save() call to after the
successful await docToEdit.save() (or perform both saves inside a transactional
session) to ensure quota is only persisted when the document update commits.
In `@apps/public-api/src/controllers/data.controller.js`:
- Around line 12-14: Update the isDuplicateKeyError helper so it only recognizes
Mongo duplicate-key errors by checking the numeric error code 11000; remove the
broad err.name === "MongoServerError" branch and return true only when err &&
err.code === 11000 (optionally handling string vs numeric codes if needed),
leaving other MongoServerError cases to propagate normally.
In `@apps/public-api/src/controllers/schema.controller.js`:
- Around line 93-123: The rollback currently re-awaits getConnection() inside
the catch, which can fail and prevent removing the persisted collection; call
getConnection(fullProject._id) before the try and reuse that connection in the
try/catch so rollback never depends on reconnecting, then in the catch
immediately clear the compiled model using the existing connection
(clearCompiledModel(connection, compiledCollectionName)) and remove the saved
entry from fullProject.collections and await fullProject.save(); ensure the code
uses the earlier connection and does not re-call getConnection in the catch path
(also consider moving creation of compiledCollectionName into the outer scope so
it’s available for rollback).
In `@packages/common/src/utils/input.validation.js`:
- Around line 108-131: The validation currently allows unique: true on nested
fields which are ignored later; update the .refine predicate(s) that reference
MAX_FIELD_DEPTH and UNIQUE_SUPPORTED_TYPES to reject unique when not at the
top-level by adding a check that data.unique === true is invalid unless depth
=== 0 (or equivalent root indicator); modify both occurrences (the current
refine block and the similar one at lines 198-231) so the condition becomes e.g.
if (data.unique === true && (depth > 0 ||
!UNIQUE_SUPPORTED_TYPES.includes(data.type))) return false, ensuring unique
flags are only accepted on top-level fields.
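One of the inline comments above recommends persisting the quota only after the document update commits. A stub-based sketch of that ordering (the `updateDocWithQuota` function and its shape are hypothetical, not the controller's real code):

```javascript
// Mutate the quota in memory, persist the document first, and only persist
// the quota once the document save has committed. If docToEdit.save() throws
// (e.g. a duplicate-key 11000), project.save() is never reached, so the
// stored quota stays consistent.
async function updateDocWithQuota(project, docToEdit, sizeDelta) {
  if (
    !project.resources.db.isExternal &&
    project.databaseUsed + sizeDelta > project.databaseLimit
  ) {
    throw new Error("Database limit exceeded");
  }
  project.databaseUsed += sizeDelta; // in-memory only at this point
  await docToEdit.save();            // may throw; quota not yet persisted
  await project.save();              // persist quota only after doc commit
}
```

Wrapping both saves in a MongoDB transaction/session, where supported, would make this fully atomic.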
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: d62cb57f-232d-4a12-9c6e-5998c72701bc
📒 Files selected for processing (7)
- apps/dashboard-api/src/controllers/project.controller.js
- apps/public-api/src/controllers/data.controller.js
- apps/public-api/src/controllers/schema.controller.js
- packages/common/src/index.js
- packages/common/src/models/Project.js
- packages/common/src/utils/injectModel.js
- packages/common/src/utils/input.validation.js
```js
fullProject.collections.push({ name, model: transformedFields });
await fullProject.save();

res.status(201).json({ message: "Schema created successfully", project: projectObj });
} catch (err) {
  if (err instanceof z.ZodError) return res.status(400).json({ error: err.errors });
  console.error(err);
  res.status(500).json({ error: err.message });

try {
  const collectionConfig = fullProject.collections.find(
    (c) => c.name === name,
  );

  const connection = await getConnection(fullProject._id);
  const Model = getCompiledModel(
    connection,
    collectionConfig,
    fullProject._id,
    fullProject.resources.db.isExternal,
  );

  await createUniqueIndexes(Model, collectionConfig.model);
} catch (error) {
  const compiledCollectionName = fullProject.resources.db.isExternal
    ? name
    : `${fullProject._id}_${name}`;

  const connection = await getConnection(fullProject._id);
  clearCompiledModel(connection, compiledCollectionName);

  fullProject.collections = fullProject.collections.filter(
    (c) => c.name !== name,
  );
  await fullProject.save();

  return res.status(400).json({ error: error.message });
```
Rollback shouldn’t depend on reconnecting to the same database.
Line 94 persists the new collection before index creation, but the catch does another getConnection() on Line 115 before it removes that saved entry. If the original failure is the connection/index step itself, that second await can fail too and the rollback never reaches Lines 118-121. This also leaves any backing collection that was already created in place.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/public-api/src/controllers/schema.controller.js` around lines 93 - 123,
The rollback currently re-awaits getConnection() inside the catch, which can
fail and prevent removing the persisted collection; call
getConnection(fullProject._id) before the try and reuse that connection in the
try/catch so rollback never depends on reconnecting, then in the catch
immediately clear the compiled model using the existing connection
(clearCompiledModel(connection, compiledCollectionName)) and remove the saved
entry from fullProject.collections and await fullProject.save(); ensure the code
uses the earlier connection and does not re-call getConnection in the catch path
(also consider moving creation of compiledCollectionName into the outer scope so
it’s available for rollback).
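The connection-reuse pattern this prompt describes can be sketched with stub dependencies. `createCollectionWithRollback` and the `deps` object are illustrative stand-ins for the controller's real helpers, not the PR's code:

```javascript
// Acquire the connection ONCE, before anything is persisted, so the catch
// block never has to reconnect in order to roll back.
async function createCollectionWithRollback(deps, project, name, transformedFields) {
  const connection = await deps.getConnection(project._id);
  const compiledCollectionName = project.resources.db.isExternal
    ? name
    : `${project._id}_${name}`;

  project.collections.push({ name, model: transformedFields });
  await project.save();

  try {
    const Model = deps.getCompiledModel(connection, { name, model: transformedFields });
    await deps.createUniqueIndexes(Model, transformedFields);
    return { ok: true };
  } catch (error) {
    // Reuse the existing connection; do NOT call getConnection() again here.
    deps.clearCompiledModel(connection, compiledCollectionName);
    project.collections = project.collections.filter((c) => c.name !== name);
    await project.save();
    return { ok: false, error: error.message };
  }
}
```

Because `connection` is captured in the outer scope, the rollback runs even when the original failure was the connection or index step itself.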
```js
// FUNCTION - BUILD FIELD SCHEMA ZOD
const MAX_FIELD_DEPTH = 3;
const UNIQUE_SUPPORTED_TYPES = ["String", "Number", "Boolean", "Date"];
```

```js
  .status(400)
  .json({ error: "Collection/Schema already exists" });

if (!fullProject.jwtSecret) fullProject.jwtSecret = uuidv4();
```
@copilot
@gemini
@coderabbitai
I think uuid is predictable; should we use 32 random bytes from the crypto module instead?
@yash-pouranik, great observation! Here's a breakdown of the tradeoffs:
Is uuidv4 predictable?
Not exactly — the uuid library's v4 implementation uses a CSPRNG (the Node.js crypto module) internally, so it isn't directly predictable. Both the uuid library (v4 variant) and nanoid use a CSPRNG and transform the bytes in an unbiased, secure way.
However, crypto.randomBytes(32) is still the better choice for a JWT secret, for two reasons:
1. Higher entropy: Although each UUID hex character can represent 4 bits of entropy, 6 bits of a v4 UUID's hex representation are fixed, leaving only 122 bits of entropy (not 128). By contrast, `crypto.randomBytes(32)` gives you a full 256 bits of entropy, double what a UUID provides and well above the NIST-recommended 128-bit floor for secrets.
2. No structural leakage: UUIDs have a well-known, fixed format (the version and variant nibbles are always the same), which marginally reduces the effective search space. A raw random byte string has no such structure.
Suggested change. Replace:

```js
if (!fullProject.jwtSecret) fullProject.jwtSecret = uuidv4();
```

With:

```js
const { randomBytes } = require("crypto");
// ...
if (!fullProject.jwtSecret)
  fullProject.jwtSecret = randomBytes(32).toString("hex");
```

This also removes any dependency on the uuid library if uuidv4 is no longer used elsewhere in the file.
Note: this same pattern applies to the equivalent code in apps/dashboard-api/src/controllers/project.controller.js if uuidv4() is used for jwtSecret generation there too.
@Special7ka use generate api key function to create jwt if not present
Seems like the humans are having a chat. I'll hop back into my burrow for now. If you need me again, just tag @coderabbitai in a new comment, and I'll come hopping out!
PR tries to solve #49
Pull request overview
This PR migrates support for unique field constraints into the monorepo, covering schema validation, persistence in project definitions, MongoDB unique index creation (with rollback), and translating duplicate-key writes into 409 Conflict responses across both public-api and dashboard-api.
Changes:
- Extend Zod schemas to accept `unique` on collection fields and validate allowed unique-capable types.
- Persist `unique` in `Project.collections[].model[]` field definitions.
- Add unique index creation + rollback logic in the shared mongoose model injection flow; map duplicate-key errors to 409.
Reviewed changes
Copilot reviewed 7 out of 7 changed files in this pull request and generated 7 comments.
Show a summary per file
| File | Description |
|---|---|
| packages/common/src/utils/input.validation.js | Adds unique to schema field validation and enforces allowed types. |
| packages/common/src/utils/injectModel.js | Adds duplicate detection + unique index creation with rollback; exposes createUniqueIndexes. |
| packages/common/src/models/Project.js | Persists unique on stored field definitions. |
| packages/common/src/index.js | Exports createUniqueIndexes from @urbackend/common. |
| apps/public-api/src/controllers/schema.controller.js | Creates unique indexes during collection creation; rolls back collection config on failure. |
| apps/public-api/src/controllers/data.controller.js | Returns 409 Conflict for duplicate-key errors on insert/update; enables runValidators on update. |
| apps/dashboard-api/src/controllers/project.controller.js | Mirrors unique index creation + rollback in dashboard collection creation and returns 409 on dup keys. |
```js
if (depth >= MAX_FIELD_DEPTH && (data.type === 'Object' || (data.type === 'Array' && data.items?.type === 'Object'))) return false;
const base = z
  .object({
    key: z.string().min(1, "Field name is required"),
```
key is only validated as a non-empty string. MongoDB field names containing . or starting with $ are problematic (can't be stored reliably and can break schema/index creation). Consider adding a stricter validation (e.g., reject $ prefix and .) for field keys (and API name) to prevent invalid schemas from being persisted.
```diff
- key: z.string().min(1, "Field name is required"),
+ key: z
+   .string()
+   .min(1, "Field name is required")
+   .regex(/^(?!\$)(?!.*\.)[^\s]+$/, {
+     message:
+       "Field name must not start with '$', contain '.', or include whitespace",
+   }),
```
```js
if (data.unique === true && !UNIQUE_SUPPORTED_TYPES.includes(data.type))
  return false;
```
unique is allowed at any nesting depth here, but index creation later only iterates the top-level collectionConfig.model fields (no recursion). This means nested fields can be marked unique and stored but will never be enforced. Either disallow unique when depth > 1, or implement recursive unique index creation for nested paths.
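The rule can be illustrated as a plain recursive check. The real code would express this as a Zod `.refine`, and the nested-field shape (`fields` / `items.fields`) below is an assumption about the schema layout:

```javascript
const UNIQUE_SUPPORTED_TYPES = new Set(["String", "Number", "Boolean", "Date"]);

// `unique: true` is only valid on top-level fields of a supported primitive
// type; nested unique flags are rejected instead of silently ignored.
function validateUniqueFlags(fields, depth = 0) {
  for (const field of fields) {
    if (field.unique === true) {
      if (depth > 0)
        return { ok: false, reason: `'${field.key}': unique not allowed on nested fields` };
      if (!UNIQUE_SUPPORTED_TYPES.has(field.type))
        return { ok: false, reason: `'${field.key}': type ${field.type} does not support unique` };
    }
    const nested = field.fields || (field.items && field.items.fields);
    if (nested) {
      const res = validateUniqueFlags(nested, depth + 1);
      if (!res.ok) return res;
    }
  }
  return { ok: true };
}
```

Rejecting up front keeps stored schemas consistent with what the top-level-only index builder will actually enforce.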
```js
return Model.aggregate([
  {
    $match: {
      [fieldKey]: { $exists: true, $ne: null },
```
findDuplicates excludes null values ($ne: null), but the unique index you create later will still index documents where the field exists with null (even with sparse: true). If multiple docs have { fieldKey: null }, createIndex can fail even though this pre-check passes. Consider aligning the pre-check with index behavior (e.g., include nulls, or use a partialFilterExpression that excludes null).
```diff
- [fieldKey]: { $exists: true, $ne: null },
+ [fieldKey]: { $exists: true },
```
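The effect of this change can be mirrored in plain JavaScript over an in-memory array, showing why dropping `$ne: null` matters (a sketch of the pipeline's semantics, not the Mongo implementation):

```javascript
// In-memory mirror of the duplicate pre-check pipeline:
//   { $match: { [fieldKey]: { $exists: true } } }   -> keep docs that have the field
//   { $group: { _id: `$${fieldKey}`, count: { $sum: 1 } } }
//   { $match: { count: { $gt: 1 } } }               -> values occurring more than once
// Unlike the original `$ne: null` pre-check, this counts nulls too, matching
// what a plain unique index would actually reject.
function findDuplicateValues(docs, fieldKey) {
  const counts = new Map();
  for (const doc of docs) {
    if (!(fieldKey in doc)) continue; // $exists: true
    const value = doc[fieldKey];
    counts.set(value, (counts.get(value) || 0) + 1);
  }
  return [...counts.entries()].filter(([, n]) => n > 1).map(([v]) => v);
}
```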
```js
for (const field of fields) {
  if (!field.unique) continue;
  if (!UNIQUE_SUPPORTED_TYPES.has(field.type)) continue;
```
createUniqueIndexes only walks the top-level fields array. Since schemas can include nested fields/items.fields, any nested unique: true flags will be silently ignored (no index created). If top-level-only is intended, validate/reject nested unique earlier; otherwise recurse and create indexes for nested paths.
```js
const isValidId = (id) => mongoose.Types.ObjectId.isValid(id);

const isDuplicateKeyError = (err) => {
  return err && (err.code === 11000 || err.name === "MongoServerError");
```
isDuplicateKeyError treats any MongoServerError as a duplicate-key conflict. Many non-duplicate server errors share that name, which would cause incorrect 409 responses. Recommend checking err.code === 11000 (and/or err.codeName === "DuplicateKey") rather than err.name.
```diff
- return err && (err.code === 11000 || err.name === "MongoServerError");
+ return err && (err.code === 11000 || err.codeName === "DuplicateKey");
```
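A tightened version of the helper along the lines of this suggestion (a sketch, not the PR's final code):

```javascript
// Only a Mongo duplicate-key violation (code 11000 / codeName "DuplicateKey")
// should map to HTTP 409; any other MongoServerError must propagate normally.
const isDuplicateKeyError = (err) =>
  Boolean(err) && (err.code === 11000 || err.codeName === "DuplicateKey");
```

In the controller's catch block this would gate the 409 response, e.g. `if (isDuplicateKeyError(err)) return res.status(409).json({ error: "Duplicate value for unique field" });`.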
```js
} catch (error) {
  const compiledCollectionName = fullProject.resources.db.isExternal
    ? name
    : `${fullProject._id}_${name}`;

  const connection = await getConnection(fullProject._id);
  clearCompiledModel(connection, compiledCollectionName);

  fullProject.collections = fullProject.collections.filter(
    (c) => c.name !== name,
  );
  await fullProject.save();
```
On unique-index creation failure, the rollback removes the collection config and clears the compiled model, but it doesn't attempt to drop the newly created MongoDB collection. Since createIndex can implicitly create the collection, this can leave an orphaned (likely empty) collection in the DB. Consider dropping compiledCollectionName in the rollback (ignore "ns not found" errors).
```js
} catch (error) {
  const compiledCollectionName = project.resources.db.isExternal
    ? collectionName
    : `${project._id}_${collectionName}`;

  const connection = await getConnection(projectId);
  clearCompiledModel(connection, compiledCollectionName);

  project.collections = project.collections.filter(
    (c) => c.name !== collectionName,
  );
  await project.save();

  return res.status(400).json({ error: error.message });
```
On unique-index creation failure, the rollback removes the collection config and clears the compiled model, but it doesn't drop the MongoDB collection that may have been created implicitly by createIndex. This can leave orphaned collections in the DB. Consider dropping compiledCollectionName during rollback (and ignore "ns not found" errors).
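A `dropCollectionIfExists` helper of the kind both comments assume might look like this. The name comes from the review; the error-code handling is a sketch:

```javascript
// Drop a collection during rollback, swallowing only "namespace not found"
// (the collection was never created) and rethrowing every other error.
async function dropCollectionIfExists(db, collectionName) {
  try {
    await db.dropCollection(collectionName);
    return true;
  } catch (err) {
    // MongoDB reports a missing namespace as code 26 / codeName "NamespaceNotFound".
    if (err && (err.code === 26 || err.codeName === "NamespaceNotFound")) return false;
    throw err;
  }
}
```

Swallowing only the missing-namespace case keeps the rollback idempotent without hiding real failures such as authorization errors.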
@Special7ka apply changes based on the comments in this thread |
Actionable comments posted: 4
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/dashboard-api/src/controllers/project.controller.js`:
- Around line 424-441: The code currently saves the incoming schema into
project.collections then reads it back to build the Mongoose model and indexes,
which lets unsupported "unique" flags slip through and can lose field metadata;
instead, validate top-level primitive-only fields for the unique constraint
against the original schema (the local variable schema) before calling
project.save(), strip or reject unsupported unique flags there, and use that
original schema when calling getCompiledModel(connection, schema, projectId,
project.resources.db.isExternal) and createUniqueIndexes(Model, schema) rather
than re-reading collectionConfig from project.collections; keep the persistence
(project.collections.push and await project.save()) after validation so saved
data is clean.
- Around line 463-465: The rollback unconditionally drops the physical
collection via clearCompiledModel(connection, compiledCollectionName) and
dropCollectionIfExists(connection, compiledCollectionName), which can delete
pre-existing data; add a boolean flag (e.g., collectionExistedBefore) determined
before creating the collection/indexes (check whether compiledCollectionName
exists on connection) and set it into the surrounding scope, then in the
catch/rollback path only call clearCompiledModel and dropCollectionIfExists when
collectionExistedBefore === false (i.e., the namespace was newly created by this
request); ensure any code paths that create indexes (createUniqueIndexes) or the
model check/update this flag so rollback only removes collections you created.
In `@apps/public-api/src/controllers/schema.controller.js`:
- Around line 168-170: The rollback currently always calls
clearCompiledModel(connection, compiledCollectionName) and await
dropCollectionIfExists(connection, compiledCollectionName) even if the
collection preexisted; change the flow to detect and record whether the
collection existed before the create flow (e.g., call
connection.listCollections({name: compiledCollectionName}) or use an exists
check right after you obtain connection/compiledCollectionName) and pass that
boolean into the error/rollback path; then in the rollback handler (the block
calling clearCompiledModel and dropCollectionIfExists) only call
dropCollectionIfExists when the recorded flag indicates the collection did not
exist prior to this request—still call clearCompiledModel unconditionally to
remove in-memory models. Ensure the flag is set early (before
createUniqueIndexes or model compilation) and propagated to the code that
performs the cleanup.
- Around line 86-95: The code currently persists any field.unique and later
calls createUniqueIndexes on the saved collection, which silently ignores unique
flags on nested/object/array/ref fields; update transformField and the schema
save flow to validate and reject unsupported unique shapes up-front: in
transformField (and the higher-level schema validation) ensure unique === true
is only allowed for top-level fields whose mapped type is one of
String/Number/Boolean/Date, reject (throw/return validation error) any field
where unique is set on nested/object/array/ref types or non-top-level paths, and
then pass the validated transformedFields (not the raw persisted entry) to
createUniqueIndexes/model/index creation so index generation is driven from the
validated transformedFields rather than the round-tripped project entry.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: b3d87003-d4ea-46c0-926b-3facb2dbcffa
⛔ Files ignored due to path filters (2)
- apps/web-dashboard/package-lock.json is excluded by !**/package-lock.json
- package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (7)
- apps/dashboard-api/src/controllers/project.controller.js
- apps/public-api/src/controllers/data.controller.js
- apps/public-api/src/controllers/schema.controller.js
- apps/web-dashboard/package.json
- packages/common/src/utils/injectModel.js
- packages/common/src/utils/input.validation.js
- packages/common/src/utils/schema.constants.js
✅ Files skipped from review due to trivial changes (2)
- apps/web-dashboard/package.json
- packages/common/src/utils/schema.constants.js
🚧 Files skipped from review as they are similar to previous changes (3)
- apps/public-api/src/controllers/data.controller.js
- packages/common/src/utils/injectModel.js
- packages/common/src/utils/input.validation.js
```js
project.collections.push({ name: collectionName, model: schema });
await project.save();
collectionWasPersisted = true;

const connection = await getConnection(projectId);
const collectionConfig = project.collections.find(
  (c) => c.name === collectionName,
);

const finalCollectionName = isExternal ? collectionName : `${project._id}_${collectionName}`;
connection = await getConnection(projectId);

try {
  await connection.db.dropCollection(finalCollectionName);
} catch (e) {
  console.warn("Failed to drop collection (might not exist):", finalCollectionName, e.message);
}
const Model = getCompiledModel(
  connection,
  collectionConfig,
  projectId,
  project.resources.db.isExternal,
);

project.collections.splice(collectionIndex, 1);
await project.save();
await createUniqueIndexes(Model, collectionConfig.model);
```
Build indexes from the original schema, not the round-tripped project entry.
Lines 424-441 save schema into project.collections and then immediately read it back for getCompiledModel() and createUniqueIndexes(). That still accepts unsupported unique flags, and it can lose field metadata the index builder relies on, such as required. Validate top-level primitive-only unique fields before save(), and use the original schema for model/index creation.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/dashboard-api/src/controllers/project.controller.js` around lines 424 -
441, The code currently saves the incoming schema into project.collections then
reads it back to build the Mongoose model and indexes, which lets unsupported
"unique" flags slip through and can lose field metadata; instead, validate
top-level primitive-only fields for the unique constraint against the original
schema (the local variable schema) before calling project.save(), strip or
reject unsupported unique flags there, and use that original schema when calling
getCompiledModel(connection, schema, projectId, project.resources.db.isExternal)
and createUniqueIndexes(Model, schema) rather than re-reading collectionConfig
from project.collections; keep the persistence (project.collections.push and
await project.save()) after validation so saved data is clean.
```js
if (connection && compiledCollectionName) {
  clearCompiledModel(connection, compiledCollectionName);
  await dropCollectionIfExists(connection, compiledCollectionName);
```
Rollback can delete an existing collection.
Lines 463-465 always drop the physical collection once a connection/model was created. If the target namespace already existed and createUniqueIndexes() fails because duplicate data is present, this catch removes data that was never created by this request. Guard the drop with a collectionExistedBefore flag so rollback only deletes newly-created namespaces.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/dashboard-api/src/controllers/project.controller.js` around lines 463 -
465, The rollback unconditionally drops the physical collection via
clearCompiledModel(connection, compiledCollectionName) and
dropCollectionIfExists(connection, compiledCollectionName), which can delete
pre-existing data; add a boolean flag (e.g., collectionExistedBefore) determined
before creating the collection/indexes (check whether compiledCollectionName
exists on connection) and set it into the surrounding scope, then in the
catch/rollback path only call clearCompiledModel and dropCollectionIfExists when
collectionExistedBefore === false (i.e., the namespace was newly created by this
request); ensure any code paths that create indexes (createUniqueIndexes) or the
model check/update this flag so rollback only removes collections you created.
Actionable comments posted: 3
♻️ Duplicate comments (1)
apps/dashboard-api/src/controllers/project.controller.js (1)
Lines 559-562: ⚠️ Potential issue | 🟠 Major: Rollback may drop pre-existing external collections.

For external databases (`project.resources.db.isExternal === true`), the physical collection might already exist with data before this request (e.g., manually created, orphaned, or in migration scenarios). If `createUniqueIndexes` fails on line 63, the rollback at lines 109-111 will unconditionally drop that pre-existing collection, causing data loss. This same vulnerability exists in apps/public-api/src/controllers/schema.controller.js (lines 168-170). Check if the collection existed before this request and only drop it if newly created:

🛡️ Suggested approach

```diff
  let collectionWasPersisted = false;
  let collectionNameForRollback;
+ let collectionExistedBefore = false;

  try {
    // ... existing validation ...

    compiledCollectionName = project.resources.db.isExternal
      ? collectionName
      : `${project._id}_${collectionName}`;

+   // Check if collection already exists (important for external DBs)
+   connection = await getConnection(projectId);
+   const collections = await connection.db.listCollections({ name: compiledCollectionName }).toArray();
+   collectionExistedBefore = collections.length > 0;

    project.collections.push({
      // ...
    });
```

Then in the catch block:

```diff
  if (connection && compiledCollectionName) {
    clearCompiledModel(connection, compiledCollectionName);
-   await dropCollectionIfExists(connection, compiledCollectionName);
+   if (!collectionExistedBefore) {
+     await dropCollectionIfExists(connection, compiledCollectionName);
+   }
  }
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@apps/dashboard-api/src/controllers/project.controller.js` around lines 559-562: The rollback unconditionally clears and drops the compiled collection (clearCompiledModel and dropCollectionIfExists) even for external DBs (project.resources.db.isExternal === true), which can delete pre-existing data; modify the create/deploy flow to detect whether the compiledCollectionName already existed before this request (e.g., check existence via collectionExists(connection, compiledCollectionName) prior to creating indexes) and record a flag (e.g., wasNewlyCreated) in the request scope, then in the catch/rollback path only call clearCompiledModel and dropCollectionIfExists when that flag indicates the collection was created by this operation; apply the same existence-check-and-conditional-drop pattern for the corresponding code path in apps/public-api/src/controllers/schema.controller.js.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/dashboard-api/src/controllers/project.controller.js`:
- Around line 462-464: The cache is being updated (deleteProjectById and
setProjectById) before the database save, risking cache/db inconsistency if
project.save() fails; change the order so you call await project.save() first
and only after it succeeds call deleteProjectById(projectId) and
setProjectById(projectId, project), or wrap the save in a try/catch and only
update the cache in the success path (and optionally revert cache changes in the
catch); locate the sequence around deleteProjectById, setProjectById, and
project.save to implement this reorder or conditional cache update.
- Around line 396-404: Replace inconsistent Zod error checks and invalid
property access by standardizing on z.ZodError and using the .issues array:
update the createProject handler (where it currently reads err.errors) to read
err.issues and format the first issue message, and change the
updateExternalConfig catch (which checks err.name === "ZodError") to use err
instanceof z.ZodError; ensure all Zod error branches across this file use the
same pattern (instanceof z.ZodError) and extract messages from
err.issues?.[0]?.message (falling back to a generic "Validation failed").
In `@packages/common/src/utils/injectModel.js`:
- Around line 71-82: In buildMongooseSchema, the local helper normalizeKey is
defined but never called — the code references an undefined variable
normalizedKey; fix by calling normalizeKey with the actual field key (e.g.,
normalizeKey(field.key) or normalizeKey(field?.key)) and assign the result to a
const normalizedKey before using it; ensure you skip entries when normalizedKey
is falsy and continue to use buildFieldDef(field) to populate schemaDef, then
return new mongoose.Schema(schemaDef, { timestamps: true, strict: false }) as
before.
---
Duplicate comments:
In `@apps/dashboard-api/src/controllers/project.controller.js`:
- Around line 559-562: The rollback unconditionally clears and drops the
compiled collection (clearCompiledModel and dropCollectionIfExists) even for
external DBs (project.resources.db.isExternal === true), which can delete
pre-existing data; modify the create/deploy flow to detect whether the
compiledCollectionName already existed before this request (e.g., call a
checkExistsLike collectionExists(connection, compiledCollectionName) prior to
creating indexes) and record a flag (e.g., wasNewlyCreated) in the request
scope, then in the catch/rollback path only call clearCompiledModel and
dropCollectionIfExists when that flag indicates the collection was created by
this operation; apply the same existence-check-and-conditional-drop pattern for
the corresponding code path in
apps/public-api/src/controllers/schema.controller.js.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 9c7a5395-0c5c-4e9b-b6a1-8c339f937aef
📒 Files selected for processing (6)
- apps/dashboard-api/src/controllers/project.controller.js
- apps/web-dashboard/package.json
- packages/common/src/index.js
- packages/common/src/models/Project.js
- packages/common/src/utils/injectModel.js
- packages/common/src/utils/input.validation.js
✅ Files skipped from review due to trivial changes (1)
- apps/web-dashboard/package.json
🚧 Files skipped from review as they are similar to previous changes (3)
- packages/common/src/index.js
- packages/common/src/models/Project.js
- packages/common/src/utils/input.validation.js
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/common/src/utils/injectModel.js`:
- Around line 79-83: Duplicate detection and index creation use the raw
field.key while schema paths are normalized via normalizeKey(field.key), causing
mismatches; update the unique-index flow to use the normalizedKey variable (the
same value assigned to schemaDef[normalizedKey] via buildFieldDef) wherever the
code currently references field.key for duplicate checks and when building
uniqueIndexes and index creation logic (the blocks that collect uniqueIndexes /
check duplicates around the 170-205 region). Ensure you compute normalizedKey
once (as done in fieldsArray.forEach) and use that normalizedKey for all
duplicate detection, index key names, and when adding entries to uniqueIndexes
so the index paths match the compiled schema.
- Around line 166-217: The rollback currently may drop pre-existing indexes
because createUniqueIndexes unconditionally pushes indexName into createdIndexes
after Model.collection.createIndex; fix by checking whether the index existed
before creating it (e.g., call Model.collection.indexExists or iterate
Model.collection.listIndexes() to detect an existing index for field.key) and
only push indexName into createdIndexes if the index did not already exist and
your createIndex call actually created it; alternatively, inspect the
createIndex/createIndexes result to determine creation vs no-op and only
record/drop those newly created indexes in the catch rollback loop.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: e2a5e901-f175-4bab-be3c-c5099b670181
📒 Files selected for processing (3)
- apps/dashboard-api/src/controllers/project.controller.js
- apps/public-api/src/controllers/schema.controller.js
- packages/common/src/utils/injectModel.js
✅ Files skipped from review due to trivial changes (1)
- apps/public-api/src/controllers/schema.controller.js
🚧 Files skipped from review as they are similar to previous changes (1)
- apps/dashboard-api/src/controllers/project.controller.js
```js
async function createUniqueIndexes(Model, fields = []) {
  const createdIndexes = [];

  try {
    for (const field of fields) {
      if (!field.unique) continue;
      if (!UNIQUE_SUPPORTED_TYPES_SET.has(field.type)) continue;

      const duplicates = await findDuplicates(
        Model,
        field.key,
        !!field.required,
      );

      if (duplicates.length > 0) {
        const examples = duplicates
          .slice(0, 3)
          .map((d) => JSON.stringify(d._id))
          .join(", ");

        throw new Error(
          `Cannot create unique index on '${field.key}'. ${duplicates.length} duplicate values exist.${examples ? ` Examples: ${examples}` : ""}`,
        );
      }

      const indexName = `unique_${field.key}_1`;

      const indexOptions = {
        unique: true,
        name: indexName,
      };

      if (!field.required) {
        indexOptions.partialFilterExpression = {
          [field.key]: { $exists: true, $ne: null },
        };
      }

      await Model.collection.createIndex({ [field.key]: 1 }, indexOptions);

      createdIndexes.push(indexName);
    }
    if (connection.models[collectionName]) {
      delete connection.models[collectionName];
    }
  } catch (error) {
    for (const indexName of createdIndexes) {
      try {
        await Model.collection.dropIndex(indexName);
      } catch (_) {
        // ignore rollback drop errors
      }
    }
    throw error;
  }
}
```
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Verify rollback currently tracks created indexes without checking pre-existing index names.
rg -n -A120 -B20 'async function createUniqueIndexes' packages/common/src/utils/injectModel.js

Repository: yash-pouranik/urBackend
Length of output: 2136
🌐 Web query:
In MongoDB (and via Mongoose's underlying collection.createIndex), if createIndex is called with an index that already exists with identical key/options, does it return the existing index name? If yes, would dropIndex(name) remove that pre-existing index?
💡 Result:
No, MongoDB's createIndex (and Mongoose's collection.createIndex, which wraps the MongoDB Node.js driver's createIndex) does not return the existing index name when called on an identical pre-existing index with matching keys and options. Instead, it succeeds silently without recreating the index (idempotent operation), typically returning the index name only if a new index is created. The official MongoDB docs confirm: "If you call db.collection.createIndex for an index that already exists, MongoDB does not recreate the index." Node.js driver docs show createIndex callback receives the new index name on success, but for existing identical indexes, it succeeds without error and returns the name (though some older behaviors varied). If different options are specified, it errors with "Index already exists with different options". However, if you specify the exact index name in options and it matches an existing index with identical spec, MongoDB treats it as existing and succeeds without rebuilding. Yes, dropIndex(name) will remove the pre-existing index if you call it with that name, regardless of how it was created or "returned". dropIndex drops by name, keys spec, or "*" for all.
Citations:
- 1: https://www.mongodb.com/docs/v6.0/reference/method/db.collection.createindex/
- 2: https://docs.mongodb.com/manual/reference/method/db.collection.createIndex/index.html
- 3: https://www.mongodb.com/docs/manual/reference/method/db.collection.createindex/
- 4: https://www.mongodb.com/docs/v6.0/reference/command/createindexes/
- 5: https://mongoosejs.com/docs/guide
- 6: https://mongodb.github.io/node-mongodb-native/markdown-docs/indexes.html
- 7: https://www.mongodb.com/docs/drivers/node/v6.15/indexes/
- 8: https://mongodb.github.io/node-mongodb-native/3.6/tutorials/create-indexes/
Rollback can drop pre-existing indexes that existed before this function call.
On line 206, createdIndexes.push(indexName) unconditionally records the index name after createIndex(). If that index already existed (from prior deployment, manual creation, or another source), a later failure will trigger rollback and dropIndex() will remove that pre-existing index, causing data integrity loss.
Proposed fix
async function createUniqueIndexes(Model, fields = []) {
+ const existingIndexes = await Model.collection.indexes();
+ const existingIndexNames = new Set(existingIndexes.map((idx) => idx.name));
const createdIndexes = [];
try {
for (const field of fields) {
if (!field.unique) continue;
if (!UNIQUE_SUPPORTED_TYPES_SET.has(field.type)) continue;
@@
- await Model.collection.createIndex({ [field.key]: 1 }, indexOptions);
-
- createdIndexes.push(indexName);
+ const createdOrExistingName = await Model.collection.createIndex(
+ { [field.key]: 1 },
+ indexOptions,
+ );
+
+ if (!existingIndexNames.has(createdOrExistingName)) {
+ createdIndexes.push(createdOrExistingName);
+ existingIndexNames.add(createdOrExistingName);
+ }
}
  } catch (error) {

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/common/src/utils/injectModel.js` around lines 166 - 217, The
rollback currently may drop pre-existing indexes because createUniqueIndexes
unconditionally pushes indexName into createdIndexes after
Model.collection.createIndex; fix by checking whether the index existed before
creating it (e.g., call Model.collection.indexExists or iterate
Model.collection.listIndexes() to detect an existing index for field.key) and
only push indexName into createdIndexes if the index did not already exist and
your createIndex call actually created it; alternatively, inspect the
createIndex/createIndexes result to determine creation vs no-op and only
record/drop those newly created indexes in the catch rollback loop.
Actionable comments posted: 1
🧹 Nitpick comments (1)
packages/common/src/utils/injectModel.js (1)
17-19: Normalize nested keys before adding them to sub-schemas. Only the top-level builder uses `normalizeKey()`. Line 18 and Line 39 still use the raw `f.key`, so BOM/whitespace cleanup changes behavior at depth 1 but not for nested objects or arrays of objects.

Suggested refactor

```diff
 if (field.type === "Object" && field.fields && field.fields.length > 0) {
   const subSchema = {};
   field.fields.forEach((f) => {
-    subSchema[f.key] = buildFieldDef(f);
+    const normalizedKey = normalizeKey(f.key);
+    if (!normalizedKey) return;
+    subSchema[normalizedKey] = buildFieldDef(f);
   });
   return { type: subSchema, required: !!field.required };
 }
@@
   const subSchema = {};
   field.items.fields.forEach((f) => {
-    subSchema[f.key] = buildFieldDef(f);
+    const normalizedKey = normalizeKey(f.key);
+    if (!normalizedKey) return;
+    subSchema[normalizedKey] = buildFieldDef(f);
   });
   return { type: [subSchema], required: !!field.required };
 }
```

Also applies to: 38-40
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/common/src/utils/injectModel.js` around lines 17 - 19, Nested field keys aren't normalized before being added to subSchema; update the code that iterates field.fields to call normalizeKey on each f.key (e.g., const key = normalizeKey(f.key)) and use that normalized key when assigning to subSchema (subSchema[key] = buildFieldDef(f)). Do the same normalization wherever nested keys are currently used (the array/object nested-field handling that uses f.key) so all nested keys go through normalizeKey before being stored or referenced; reference functions/identifiers: normalizeKey, buildFieldDef, subSchema, field.fields, and f.key.
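For reference, here is a minimal sketch of what a `normalizeKey` helper like the one discussed in this review could look like. The exact implementation in `injectModel.js` may differ; the BOM-and-whitespace stripping shown here is an assumption based on the review's description, not the repository's actual code.

```javascript
// Hypothetical sketch of a normalizeKey helper: strips a leading UTF-8 BOM
// and surrounding whitespace from a field key, returning "" for non-strings
// so callers can skip invalid entries.
function normalizeKey(key) {
  if (typeof key !== "string") return "";
  return key.replace(/^\uFEFF/, "").trim();
}
```

Callers would then skip a field whenever `normalizeKey(f.key)` returns a falsy value, which is exactly the guard the suggested refactor above adds.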
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/common/src/utils/injectModel.js`:
- Around line 139-145: getUniqueFieldFilter currently returns { [fieldKey]: {
$exists: true } } for isRequired, which excludes documents missing the field and
lets createIndex fail later; update getUniqueFieldFilter so the required-field
precheck also matches documents where the field is missing or null (so duplicate
"missing/null" values are detected). In practice, change the isRequired branch
in getUniqueFieldFilter to include $exists: false and null cases (e.g. use an
$or that matches missing or null values in addition to present values) so
findDuplicates() will catch those documents before createIndex() runs.
---
Nitpick comments:
In `@packages/common/src/utils/injectModel.js`:
- Around line 17-19: Nested field keys aren't normalized before being added to
subSchema; update the code that iterates field.fields to call normalizeKey on
each f.key (e.g., const key = normalizeKey(f.key)) and use that normalized key
when assigning to subSchema (subSchema[key] = buildFieldDef(f)). Do the same
normalization wherever nested keys are currently used (the array/object
nested-field handling that uses f.key) so all nested keys go through
normalizeKey before being stored or referenced; reference functions/identifiers:
normalizeKey, buildFieldDef, subSchema, field.fields, and f.key.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: cefda5c2-bd8d-471f-8bc7-14f260a3a7f0
📒 Files selected for processing (1)
packages/common/src/utils/injectModel.js
```js
function getUniqueFieldFilter(fieldKey, isRequired) {
  if (isRequired) {
    return { [fieldKey]: { $exists: true } };
  }

  return { [fieldKey]: { $exists: true, $ne: null } };
}
```
Include missing documents in the required-field duplicate precheck.
Line 140 filters required fields with { $exists: true }, so legacy docs where the field is missing never reach findDuplicates(). A full unique index still treats repeated missing values as duplicate nulls, so the precheck can pass and createIndex() then fail anyway.
Proposed fix
```diff
 function getUniqueFieldFilter(fieldKey, isRequired) {
   if (isRequired) {
-    return { [fieldKey]: { $exists: true } };
+    return {};
   }
   return { [fieldKey]: { $exists: true, $ne: null } };
 }
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```js
function getUniqueFieldFilter(fieldKey, isRequired) {
  if (isRequired) {
    return {};
  }
  return { [fieldKey]: { $exists: true, $ne: null } };
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/common/src/utils/injectModel.js` around lines 139 - 145,
getUniqueFieldFilter currently returns { [fieldKey]: { $exists: true } } for
isRequired, which excludes documents missing the field and lets createIndex fail
later; update getUniqueFieldFilter so the required-field precheck also matches
documents where the field is missing or null (so duplicate "missing/null" values
are detected). In practice, change the isRequired branch in getUniqueFieldFilter
to include $exists: false and null cases (e.g. use an $or that matches missing
or null values in addition to present values) so findDuplicates() will catch
those documents before createIndex() runs.
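To illustrate the precheck under discussion: a duplicate-detection aggregation pipeline could be assembled as below once the required-field branch scans all documents. The function name and pipeline shape are illustrative assumptions, not the repository's actual `findDuplicates` implementation.

```javascript
// Build an aggregation pipeline that finds values occurring more than once
// for a given field. For required fields the $match is empty so documents
// missing the field (which a full unique index treats as duplicate nulls)
// are included; for optional fields, missing/null values are excluded to
// mirror the partialFilterExpression on the index.
function buildDuplicatePipeline(fieldKey, isRequired) {
  const match = isRequired
    ? {}
    : { [fieldKey]: { $exists: true, $ne: null } };
  return [
    { $match: match },
    { $group: { _id: `$${fieldKey}`, count: { $sum: 1 } } },
    { $match: { count: { $gt: 1 } } },
    { $limit: 10 }, // a few examples are enough for the error message
  ];
}
```

Running `Model.aggregate(buildDuplicatePipeline(key, required))` would then return the duplicate values to surface in the error before `createIndex()` is attempted.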
@coderabbitai create a follow up issue for this one, we will work on that later, I think this should be merge now
Let me know what do you think
🧩 Analysis chain
@yash-pouranik, I've created the follow-up issue for the getUniqueFieldFilter precheck gap. Here's my take on merging:
The PR looks good to merge. The flagged edge case is a correctness concern only in a very specific scenario — when an existing collection already has legacy documents that are missing a field that's now being marked as required + unique. For greenfield projects or collections where the field has always been present, this path is never hit. The core functionality (unique index creation, 409 conflict handling, validation schema support) is solid.
A few things that give me confidence to merge now:
- The duplicate detection and index creation path is gated behind `field.unique === true`, so it only activates on explicit opt-in.
- The partial filter expression for non-required fields is correctly handled.
- The 409 handling in both `dashboard-api` and `public-api` is consistent.
The follow-up issue will track the edge-case fix so it doesn't get lost. ✅
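The partial-filter behavior mentioned above boils down to how the index options are assembled. A rough sketch, with illustrative names (the PR's actual helper may be structured differently):

```javascript
// Build options for a unique index. Optional fields get a
// partialFilterExpression so documents that omit the field (or store null)
// do not collide with each other under the unique constraint.
function buildUniqueIndexOptions(fieldKey, isRequired) {
  const options = { unique: true, name: `unique_${fieldKey}_1` };
  if (!isRequired) {
    options.partialFilterExpression = {
      [fieldKey]: { $exists: true, $ne: null },
    };
  }
  return options;
}
```

Without the partial filter, a full unique index would treat every missing value as the same `null` and reject the second document that omits an optional unique field.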
🏁 Script executed:
gh issue create \
--repo yash-pouranik/urBackend \
--title "fix(injectModel): include missing-field documents in required-field duplicate precheck before unique index creation" \
--assignee yash-pouranik \
--body "## Summary
In \`packages/common/src/utils/injectModel.js\`, the \`getUniqueFieldFilter\` function currently filters required fields with \`{ \$exists: true }\`. This excludes legacy documents where the field is missing entirely.
A full unique index still treats repeated missing values as duplicate \`null\`s, so the precheck can pass and \`createIndex()\` may then fail at the DB level — bypassing the intended duplicate-detection guard.
## Proposed Fix
\`\`\`diff
function getUniqueFieldFilter(fieldKey, isRequired) {
if (isRequired) {
- return { [fieldKey]: { \$exists: true } };
+ return {};
}
return { [fieldKey]: { \$exists: true, \$ne: null } };
}
\`\`\`
By returning an empty filter \`{}\` for required fields, \`findDuplicates()\` will scan all documents (including those missing the field), so duplicate missing/null values are detected before \`createIndex()\` runs.
## References
- PR: https://github.com/yash-pouranik/urBackend/pull/52
- Review comment: https://github.com/yash-pouranik/urBackend/pull/52#discussion_r3024988891
Requested by `@yash-pouranik`"
Length of output: 52
@yash-pouranik, the follow-up issue has been created: #67 🎉
The precheck edge case is now tracked there and won't be lost. The PR is good to go ahead and merge whenever you're ready!
|
The latest updates on your projects. Learn more about Vercel for GitHub.
|
|
@coderabbitai so I should merge this?
|
Tip: For best results, initiate chat on the files or code changes.

🧩 Analysis chain

Let me check the current state of the PR and the review comments directly.

🏁 Script executed:

Length of output: 111851
🔴 Critical
🟠 Major
🧹 Nitpick
Please ask
|
@coderabbitai can you push the required changes?
|
Tip: For best results, initiate chat on the files or code changes.
If you'd like, I can plan fixes for all the outstanding issues:
Just reply "yes, plan the changes" and I'll kick off the full planning process.
|
Last things you have to work on before merging. @Special7ka
|
Hey @coderabbitai, there's a merge conflict in one file only; I will review that. Other than that, please review: may I merge this with the existing main branch? Like, it will not affect existing features/services?
|
Tip: For best results, initiate chat on the files or code changes.

🧩 Analysis chain

Let me pull the latest PR state to give you an accurate answer.

🏁 Script executed:

Length of output: 84632
🔴 Still unresolved (non-outdated)

1. Proposed fix:

```js
// Snapshot existing index names BEFORE the loop
const existingIndexes = await Model.collection.indexes();
const existingIndexNames = new Set(existingIndexes.map((idx) => idx.name));

// Then inside the loop, after createIndex:
const createdName = await Model.collection.createIndex({ [normalizedKey]: 1 }, indexOptions);
if (!existingIndexNames.has(createdName)) {
  createdIndexes.push(createdName);
}
```

2. Proposed fix:

```js
function getUniqueFieldFilter(fieldKey, isRequired) {
  if (isRequired) {
    return {}; // scan ALL docs, including those missing the field
  }
  return { [fieldKey]: { $exists: true, $ne: null } };
}
```
|
| Change | Risk |
|---|---|
| `runValidators: true` added to `findByIdAndUpdate` in public-api/data.controller.js | Previously, validators were skipped on updates. Existing update calls that pass partial/invalid data (valid before) may now return errors. Low-risk if schemas are tight, but worth testing. |
| `userSignupSchema` changed from `.passthrough()` → plain `z.object()` | Extra fields in the signup payload that were silently accepted before will now be rejected. If any client sends extra fields, signups would break. Worth verifying no client sends extra fields. |
Everything else — the unique field defaulting to false, the new 409 responses for duplicate keys, index creation on new collections — is purely additive and won't affect existing data or functionality.
Bottom line: Fix items 1 & 2 above (both in injectModel.js), resolve the merge conflict, and verify the two behavioral notes above. After that, I'd be comfortable signing off. Tag me once @Special7ka pushes the fixes and I'll do a final check.
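For context, the "409 responses for duplicate keys" behavior above can be sketched as a small, framework-agnostic helper. This is an illustrative shape, not the PR's exact controller code; `err.keyValue` is the field-to-value map the MongoDB driver attaches to duplicate-key (E11000) errors.

```javascript
// Map a MongoDB duplicate-key error (code 11000) to an HTTP 409 payload.
// Returns null for any other error so callers can fall through to their
// generic error handling.
function duplicateKeyToHttp(err) {
  if (err && err.code === 11000) {
    const field = Object.keys(err.keyValue || {})[0];
    return {
      status: 409,
      message: field
        ? `Duplicate value for unique field '${field}'`
        : "Duplicate value for a unique field",
    };
  }
  return null;
}
```

In an Express controller this would sit in the catch block: if the helper returns a payload, respond with `res.status(409)`; otherwise rethrow or return 500.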
|
@coderabbitai look, actually this PR is too old, and the changes are getting bigger and bigger; this will be too tough for me to review manually as well. Can we do this in the near future?
|
Tip For best results, initiate chat on the files or code changes.
Whenever you do pick it back up, the two most critical fixes needed before merge are in injectModel.js.

Let me know how you'd like to proceed! 🙂
|
@coderabbitai I am thinking of merging this and then completing it myself? Or create a follow-up issue and I will create a PR?
|
Tip: For best results, initiate chat on the files or code changes.
Merging as-is + follow-up issue is the better path

The two remaining issues are edge-case failure paths, not happy-path bugs:

The core feature — creating unique indexes, returning 409 on duplicate keys — works correctly for the normal flow. Merging and tracking the two edge cases as a follow-up is a reasonable call.

Recommendation: Merge → follow-up issue

The follow-up issue should cover:
Would you like me to open a follow-up GitHub issue right now with the full details of both fixes, assigned to you? Just say the word and I'll create it.
|
@coderabbitai create issue, then I will merge.
|
Tip: For best results, initiate chat on the files or code changes.

🧩 Analysis chain

🏁 Script executed:

Length of output: 52
It covers both items:
You're good to merge PR
|
OK. Again, thank you @Special7ka for the intense work and this detailed PR. All the best ❤️
|
Also don't forget to star this repo, @Special7ka
Summary
This PR migrates unique field constraints support to the new monorepo structure.
Changes
- `unique` support to field validation schemas
- `unique` in project field definitions
- Duplicate key errors return `409 Conflict` in `public-api` and `dashboard-api`

Notes

- Supported field types for unique constraints: `String`, `Number`, `Boolean`, `Date`
New Features
Bug Fixes
Improvements