
Refactor table use cases and utilities for improved readability and maintainability#1687

Merged
Artuomka merged 2 commits into main from backend_refactoring_ts
Mar 25, 2026

Conversation


Artuomka (Collaborator) commented Mar 24, 2026

  • Simplified connection validation logic in GetTableStructureUseCase, ImportCSVInTableUseCase, and UpdateRowInTableUseCase by introducing validateConnection utility.
  • Extracted user email retrieval for agent connections into getUserEmailForAgent utility.
  • Modularized foreign key handling by creating extractForeignKeysFromWidgets, filterForeignKeysByReadPermission, and attachForeignColumnNames utilities.
  • Enhanced referenced table processing with filterReferencedTablesByPermission and enrichReferencedTablesWithDisplayNames utilities.
  • Created new utilities for building table settings for responses and saving table info in the database.
  • Improved error handling and code clarity across various use cases.
  • Added new utility functions for validating table rows and finding filtering and ordering fields.

Summary by CodeRabbit

Release Notes

  • Bug Fixes

    • Corrected data structure naming to improve code clarity and consistency.
    • Fixed field name typo in CSV import configuration.
  • Improvements

    • Enhanced type safety across table API endpoints and data models for greater stability.
    • Refined pagination parameter handling and validation.
    • Improved foreign key and referenced table permission processing.
    • Consolidated table settings construction for consistent response formatting.

Copilot AI review requested due to automatic review settings March 24, 2026 16:44

coderabbitai bot commented Mar 24, 2026

📝 Walkthrough


A comprehensive refactoring of the table entity backend that extracts common connection validation, user email resolution, foreign key processing, and table settings logic from multiple use cases into eight new reusable utilities. Additionally, parameter types in the controller and data structures are improved for better type safety and consistency.

Changes

Cohort / File(s) Summary
Data Structure Type Corrections
found-table-rows.ds.ts, import-scv-in-table.ds.ts, table-datastructures.ts
Fixed typo: OrderingFiledDs → OrderingFieldDs; fixed property name: materPwd → masterPwd; narrowed column_default type from any to string | number | boolean | null.
Controller Parameter Typing
table.controller.ts
Enhanced type safety: page/perPage parameters changed from any to string; query parameters properly typed as Record<string, string> or Record<string, unknown>; body parameters correctly typed; added pagination parsing with validation.
Connection & User Validation Utilities
validate-connection.util.ts
New utility centralizing connection null-check (throws CONNECTION_NOT_FOUND) and frozen-connection check (throws CONNECTION_IS_FROZEN). Also exports getUserEmailForAgent() helper for conditional agent email resolution.
Foreign Key Processing Utilities
extract-foreign-keys-from-widgets.util.ts, filter-foreign-keys-by-permission.util.ts, attach-foreign-column-names.util.ts
Extracted foreign key extraction from widgets (with JSON5 parsing), permission-based filtering, and autocomplete column enrichment into reusable utilities replacing duplicated inline logic across use cases.
Referenced Table Processing Utility
process-referenced-tables.util.ts
New utility exporting two functions: filterReferencedTablesByPermission() for filtering referenced_by entries by read permission, and enrichReferencedTablesWithDisplayNames() for augmenting tables with display metadata.
Table Settings & Display Utilities
add-display-names-for-tables.util.ts, build-table-settings-for-response.util.ts
New utilities for populating display names/icons from settings and constructing normalized table_settings response objects with proper defaults.
Database Persistence Orchestrator
save-table-info-in-database-orchestrator.util.ts
New orchestrator managing concurrent table structure fetching, entity creation, field persistence, and connection update within a controlled queue, replacing inline logic in use cases.
Existing Utility Updates
find-ordering-field.util.ts, find-filtering-fields.util.ts, hash-passwords-in-row.util.ts, validate-table-row.util.ts
Updated type references (OrderingFiledDs → OrderingFieldDs), narrowed parameter types (Record<string, any> → Record<string, unknown>), improved type assertions, and fixed row-key lowercasing logic.
Use Case Refactoring
add-row-in-table.use.case.ts, bulk-update-rows-in-table.use.case.ts, delete-row-from-table.use.case.ts, delete-rows-from-table.use.case.ts, export-csv-from-table.use.case.ts, find-tables-in-connection.use.case.ts, find-tables-in-connection-v2.use.case.ts, get-row-by-primary-key.use.case.ts, get-table-rows.use.case.ts, get-table-structure.use.case.ts, import-csv-in-table-user.case.ts, update-row-in-table.use.case.ts
Delegated connection validation, agent user email resolution, foreign key extraction/filtering/enrichment, referenced table processing, and table settings construction to new utilities; removed duplicate inline logic and private methods; reduced net lines while preserving control flow.

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Poem

🐰 Refactored tables, neat and clean,
Extracted helpers, code pristine,
From duplication, we are freed,
With types that guard what we all need,
Let validation utilities lead! 🌟

🚥 Pre-merge checks | ✅ 2 | ❌ 2

❌ Failed checks (2 warnings)

  • Docstring Coverage — ⚠️ Warning. Docstring coverage is 0.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.
  • Security Check — ⚠️ Warning. The pull request introduces a critical password encryption vulnerability by not validating input types before encryption, allowing plaintext passwords to persist in the database on encryption failure. Resolution: add explicit typeof validation for string type before encryption in hash-passwords-in-row.util.ts line 26; wrap database saves in transactions; convert silent error handling to propagate errors.
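The recommended type guard can be sketched as follows; `encryptValueIfString` and its `encrypt` callback are hypothetical stand-ins for illustration, not the actual code in hash-passwords-in-row.util.ts:

```typescript
// Hypothetical sketch: validate that the value is a string before encrypting,
// so a non-string value fails loudly instead of persisting unencrypted.
function encryptValueIfString(value: unknown, encrypt: (v: string) => string): string {
  if (typeof value !== 'string') {
    throw new TypeError('password widget value must be a string before encryption');
  }
  return encrypt(value);
}
```

Propagating the thrown error (rather than swallowing it) lets the caller abort the row save instead of writing a plaintext value.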
✅ Passed checks (2 passed)

  • Description Check — ✅ Passed. Check skipped: CodeRabbit's high-level summary is enabled.
  • Title Check — ✅ Passed. The PR title accurately summarizes the main objective of the changeset, which is refactoring table use cases and utilities to improve code quality, readability, and maintainability.




Copilot AI left a comment


Pull request overview

This PR refactors multiple table-related use cases by extracting repeated logic into shared utilities (connection validation, agent user email lookup, foreign key/referenced-table processing, response table-settings shaping, and saving table metadata), while also tightening some typings.

Changes:

  • Introduces reusable utilities: validateConnection, getUserEmailForAgent, foreign key extraction/filter/enrichment helpers, referenced-table permission/display-name helpers, and table-settings response builder.
  • Rewires several table use cases to use the new utilities and removes duplicated private helpers.
  • Updates a few data structures/typings and controller query parsing.

Reviewed changes

Copilot reviewed 28 out of 28 changed files in this pull request and generated 5 comments.

Summary per file
backend/src/entities/table/utils/validate-table-row.util.ts Refactors row key normalization when validating table rows.
backend/src/entities/table/utils/validate-connection.util.ts Adds shared connection validation + agent email lookup helpers.
backend/src/entities/table/utils/save-table-info-in-database-orchestrator.util.ts Extracts “save tables info” orchestration into a utility.
backend/src/entities/table/utils/process-referenced-tables.util.ts Adds helpers to filter referenced tables by permission and enrich with display names.
backend/src/entities/table/utils/hash-passwords-in-row.util.ts Tightens a cast when encrypting password widget values.
backend/src/entities/table/utils/find-ordering-field.util.ts Fixes typo in Ordering DS name usage.
backend/src/entities/table/utils/find-filtering-fields.util.ts Tightens input typing and refactors parsing of body-provided filters.
backend/src/entities/table/utils/filter-foreign-keys-by-permission.util.ts Adds helper to filter foreign keys by read permission.
backend/src/entities/table/utils/extract-foreign-keys-from-widgets.util.ts Adds helper to extract FK metadata from widget params.
backend/src/entities/table/utils/build-table-settings-for-response.util.ts Centralizes building table_settings response object.
backend/src/entities/table/utils/attach-foreign-column-names.util.ts Extracts helper to attach autocomplete columns for foreign keys.
backend/src/entities/table/utils/add-display-names-for-tables.util.ts Extracts helper to add display names/icons to found tables list.
backend/src/entities/table/use-cases/update-row-in-table.use.case.ts Replaces inline logic with new utilities (connection validation, FK/references processing, response shaping).
backend/src/entities/table/use-cases/import-csv-in-table-user.case.ts Uses shared connection validation and agent email helper; fixes masterPwd naming.
backend/src/entities/table/use-cases/get-table-structure.use.case.ts Uses shared connection validation and FK helpers; removes local FK enrichment method.
backend/src/entities/table/use-cases/get-table-rows.use.case.ts Uses shared connection validation, FK helpers, and response table-settings builder.
backend/src/entities/table/use-cases/get-row-by-primary-key.use.case.ts Uses shared connection validation, FK + referenced-tables helpers, and response builder.
backend/src/entities/table/use-cases/find-tables-in-connection.use.case.ts Extracts “save table info” + display-name logic into utilities; simplifies sorting.
backend/src/entities/table/use-cases/find-tables-in-connection-v2.use.case.ts Same extraction/simplification as v1 use case.
backend/src/entities/table/use-cases/export-csv-from-table.use.case.ts Uses shared connection validation and agent email helper.
backend/src/entities/table/use-cases/delete-rows-from-table.use.case.ts Uses shared connection validation and agent email helper.
backend/src/entities/table/use-cases/delete-row-from-table.use.case.ts Uses shared connection validation and agent email helper.
backend/src/entities/table/use-cases/bulk-update-rows-in-table.use.case.ts Uses shared connection validation and agent email helper.
backend/src/entities/table/use-cases/add-row-in-table.use.case.ts Uses shared connection validation + FK/referenced-tables helpers + response builder.
backend/src/entities/table/table.controller.ts Tightens controller query/body typings and refactors pagination parsing.
backend/src/entities/table/table-datastructures.ts Narrows column_default type for API structures.
backend/src/entities/table/application/data-structures/import-scv-in-table.ds.ts Fixes typo: materPwd → masterPwd.
backend/src/entities/table/application/data-structures/found-table-rows.ds.ts Fixes typo: OrderingFiledDs → OrderingFieldDs.
Comments suppressed due to low confidence (5)

backend/src/entities/table/utils/validate-table-row.util.ts:10

  • keys are lowercased, but later you search with keys.indexOf(structure.at(i).column_name) without normalizing column_name. If the DB returns mixed/quoted-case column names, this will fail to detect the provided key and may skip the auto-generated-value validation. Consider normalizing both sides (e.g., compare using structure.at(i).column_name.toLowerCase()) or avoid lowercasing keys altogether.
	const keys = Object.keys(row).map((key) => key.toLowerCase());

	for (let i = 0; i < structure.length; i++) {
		try {
			const index = keys.indexOf(structure.at(i).column_name);
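The fix amounts to normalizing both sides before the lookup; a minimal sketch (the helper name is illustrative, not from the codebase):

```typescript
// Compare row keys and column names case-insensitively, so mixed-case or
// quoted-case column names returned by the DB still match provided row keys.
function findColumnIndex(row: Record<string, unknown>, columnName: string): number {
  const keys = Object.keys(row).map((key) => key.toLowerCase());
  return keys.indexOf(columnName.toLowerCase());
}
```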

backend/src/entities/table/table.controller.ts:219

  • The pagination validation uses truthiness checks: (parsedPage && parsedPage <= 0) / (parsedPerPage && parsedPerPage <= 0). This will not reject 0 (because 0 is falsy) and also won’t reject NaN from parseInt. Consider validating with Number.isInteger(parsedPage) / Number.isInteger(parsedPerPage) and checking parsedPage <= 0 / parsedPerPage <= 0 directly, and return a 400 if parsing fails.
		let parsedPage: number;
		let parsedPerPage: number;
		if (page && perPage) {
			parsedPage = parseInt(page, 10);
			parsedPerPage = parseInt(perPage, 10);
			if ((parsedPage && parsedPage <= 0) || (parsedPerPage && parsedPerPage <= 0)) {
				throw new HttpException(
					{
						message: Messages.PAGE_AND_PERPAGE_INVALID,
					},
					HttpStatus.BAD_REQUEST,
				);
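A validation along these lines might look like the sketch below; `parsePositiveInt` is a hypothetical helper, and the plain Error stands in for the controller's HttpException:

```typescript
// Parse one pagination param independently; reject 0, negatives, NaN, and
// partially numeric strings like '1foo' (Number(), unlike parseInt, yields NaN).
function parsePositiveInt(value: string | undefined): number | undefined {
  if (value === undefined || value === '') {
    return undefined; // param omitted: caller decides the default
  }
  const parsed = Number(value);
  if (!Number.isInteger(parsed) || parsed <= 0) {
    throw new Error('PAGE_AND_PERPAGE_INVALID');
  }
  return parsed;
}
```

Parsing each parameter on its own also handles the case where only one of `page`/`perPage` is supplied.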

backend/src/entities/table/table.controller.ts:283

  • Same pagination parsing issue as above: 0 and NaN values can bypass the current truthiness-based validation. Recommend using Number.isInteger(...) and direct <= 0 checks (and handling parse failures) before passing values into GetTableRowsDs.
		let parsedPage: number;
		let parsedPerPage: number;
		if (page && perPage) {
			parsedPage = parseInt(page, 10);
			parsedPerPage = parseInt(perPage, 10);
			if ((parsedPage && parsedPage <= 0) || (parsedPerPage && parsedPerPage <= 0)) {
				throw new HttpException(
					{
						message: Messages.PAGE_AND_PERPAGE_INVALID,
					},
					HttpStatus.BAD_REQUEST,
				);
			}

backend/src/entities/table/table.controller.ts:295

  • parsedPage / parsedPerPage may be left undefined (when page/perPage are not provided), but GetTableRowsDs.page and .perPage are declared as required numbers. Either make them optional in the DS or default them here before calling the use case.
		const inputData: GetTableRowsDs = {
			connectionId: connectionId,
			masterPwd: masterPwd,
			page: parsedPage,
			perPage: parsedPerPage,
			query: query,
			searchingFieldValue: searchingFieldValue,
			tableName: tableName,
			userId: userId,
			filters: body?.filters,
		};

backend/src/entities/table/table.controller.ts:231

  • parsedPage / parsedPerPage can remain undefined when query params are omitted, but GetTableRowsDs.page and .perPage are typed as required numbers. Either make these fields optional in GetTableRowsDs (and handle defaults in the use case/DAO) or set explicit defaults here to avoid passing undefined into dao.getRowsFromTable(...).
		const inputData: GetTableRowsDs = {
			connectionId: connectionId,
			masterPwd: masterPwd,
			page: parsedPage,
			perPage: parsedPerPage,
			query: query,
			searchingFieldValue: searchingFieldValue,
			tableName: tableName,
			userId: userId,
		};


Comment on lines +8 to +16
.filter((widget) => widget.widget_type === WidgetTypeEnum.Foreign_key)
.reduce<Array<ForeignKeyDSInfo>>((acc, widget) => {
if (widget.widget_params) {
try {
acc.push(JSON5.parse(widget.widget_params) as ForeignKeyDSInfo);
} catch (_e) {
// skip malformed widget params
}
}

Copilot AI Mar 24, 2026


widget.widget_params is not always a string — elsewhere in the codebase it’s treated as string | object (e.g., S3 widget code uses typeof widget.widget_params === 'string' ? JSON5.parse(...) : widget.widget_params). This helper always calls JSON5.parse(widget.widget_params), which will throw for already-parsed objects and silently skip those widgets, dropping foreign key metadata. Consider handling both cases: parse only when it’s a string, otherwise accept/cast the object shape.
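The dual-shape handling could look like the sketch below; `parseWidgetParams` is an illustrative name, and `JSON.parse` stands in for `JSON5.parse` so the example is self-contained:

```typescript
// Accept widget_params as either a raw string or an already-parsed object.
function parseWidgetParams<T>(params: string | T): T | undefined {
  if (typeof params === 'string') {
    try {
      return JSON.parse(params) as T; // the codebase uses JSON5.parse here
    } catch {
      return undefined; // malformed string params are skipped
    }
  }
  return params; // already an object: pass it through unchanged
}
```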

Copilot uses AI. Check for mistakes.
userRepository: { getUserEmailOrReturnNull(userId: string): Promise<string> },
): Promise<string | undefined> {
if (isConnectionTypeAgent(connection.type)) {
return userRepository.getUserEmailOrReturnNull(userId);

Copilot AI Mar 24, 2026


getUserEmailOrReturnNull can actually return null (see user-custom-repository-extension.ts), but the injected userRepository type here is Promise<string> and getUserEmailForAgent returns only string | undefined. This can leak a null at runtime and undermines the intent of the helper. Suggest updating the repository type to Promise<string | null> and the return type to Promise<string | null | undefined>, or normalize null to undefined before returning.

Suggested change
-	return userRepository.getUserEmailOrReturnNull(userId);
+	const email = await userRepository.getUserEmailOrReturnNull(userId);
+	return email === null ? undefined : email;
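Equivalently, the nullish coalescing operator normalizes the value at the boundary. A self-contained sketch with a stubbed repository (the stub and email value are invented for illustration):

```typescript
// Stub repository illustrating a null-returning lookup.
const userRepository = {
  async getUserEmailOrReturnNull(userId: string): Promise<string | null> {
    return userId === 'known' ? 'user@example.com' : null;
  },
};

// Normalize null to undefined so callers only ever see string | undefined.
async function getUserEmail(userId: string): Promise<string | undefined> {
  return (await userRepository.getUserEmailOrReturnNull(userId)) ?? undefined;
}
```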

Comment on lines +6 to +20
export async function attachForeignColumnNames(
foreignKey: ForeignKeyDS,
userEmail: string,
connectionId: string,
dao: IDataAccessObject | IDataAccessObjectAgent,
findTableSettings: (
connectionId: string,
tableName: string,
) => Promise<{ autocomplete_columns?: Array<string> } | null>,
): Promise<ForeignKeyWithAutocompleteColumnsDS> {
try {
const [foreignTableSettings, foreignTableStructure] = await Promise.all([
findTableSettings(connectionId, foreignKey.referenced_table_name),
dao.getTableStructure(foreignKey.referenced_table_name, userEmail),
]);

Copilot AI Mar 24, 2026


userEmail is typed as string, but most call sites pass the result of getUserEmailForAgent(...) which can be undefined (and can be null in practice). This helper should accept string | null | undefined and pass it through to dao.getTableStructure(...), or explicitly coerce to undefined to avoid accidentally calling the DAO with null.

Comment on lines +21 to +28
const decryptedConnection = await dbContext.connectionRepository.findAndDecryptConnection(
connectionId,
masterPwd,
);
const tableNames: Array<string> = tables.map((table) => table.tableName);
const queue = new PQueue({ concurrency: 2 });
const dao = getDataAccessObject(decryptedConnection);
const tablesStructures: Array<{

Copilot AI Mar 24, 2026


findAndDecryptConnection(...) can return null; the current code passes its result directly into getDataAccessObject(...), which will throw and rely on the outer catch. Consider adding an explicit if (!decryptedConnection) return; (or reuse validateConnection) before constructing the DAO to avoid noisy error capture/logging for a benign race (connection deleted) case.

});
}),
)) as Array<TableInfoEntity>;
foundConnection.saved_table_info = ++foundConnection.saved_table_info;

Copilot AI Mar 24, 2026


foundConnection.saved_table_info = ++foundConnection.saved_table_info; is harder to read than necessary and easy to misinterpret. Prefer foundConnection.saved_table_info += 1 for clarity.

Suggested change
-	foundConnection.saved_table_info = ++foundConnection.saved_table_info;
+	foundConnection.saved_table_info += 1;


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 7

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (3)
backend/src/entities/table/utils/validate-table-row.util.ts (1)

6-10: ⚠️ Potential issue | 🟠 Major

Normalize both sides of key matching to prevent validation misses.

Line 6 lowercases incoming row keys, but Line 10 compares against raw column_name. For mixed-case column names, this can miss matches and skip autogenerated-value validation.

Suggested fix
-			const index = keys.indexOf(structure.at(i).column_name);
+			const index = keys.indexOf(structure.at(i).column_name.toLowerCase());
backend/src/entities/table/use-cases/find-tables-in-connection.use.case.ts (1)

105-112: ⚠️ Potential issue | 🟠 Major

Pass the resolved agent email into the async table-info warmup.

The new saveTableInfoInDatabase() path no longer receives userEmail, but backend/src/entities/table/utils/save-table-info-in-database-orchestrator.util.ts (Lines 9-64) loads each structure via dao.getTableStructure(tableName, undefined). That drops the agent identity you resolved at Lines 77-79, so tables_info backfill can fail or write incomplete metadata for agent-backed connections.

🛠️ Suggested call-site change
-				saveTableInfoInDatabase(connection.id, tables, masterPwd, this._dbContext);
+				void saveTableInfoInDatabase(connection.id, tables, masterPwd, this._dbContext, userEmail);

saveTableInfoInDatabase() then needs the same extra parameter so it can call dao.getTableStructure(tableName, userEmail).

backend/src/entities/table/table.controller.ts (1)

208-221: ⚠️ Potential issue | 🟠 Major

Pagination parsing now drops valid input and still accepts invalid numbers.

These blocks only parse inside if (page && perPage), so ?page=2 or ?perPage=50 are silently ignored. The truthy guard also lets "0", NaN, empty strings, and partially numeric values like "1foo" through. Parse each parameter independently and validate with Number.isInteger(...) && > 0 before building GetTableRowsDs.

🧮 Suggested validation pattern
-		let parsedPage: number;
-		let parsedPerPage: number;
-		if (page && perPage) {
-			parsedPage = parseInt(page, 10);
-			parsedPerPage = parseInt(perPage, 10);
-			if ((parsedPage && parsedPage <= 0) || (parsedPerPage && parsedPerPage <= 0)) {
+		const parsedPage = page === undefined ? undefined : Number(page);
+		const parsedPerPage = perPage === undefined ? undefined : Number(perPage);
+		if (
+			(parsedPage !== undefined && (!Number.isInteger(parsedPage) || parsedPage <= 0)) ||
+			(parsedPerPage !== undefined && (!Number.isInteger(parsedPerPage) || parsedPerPage <= 0))
+		) {
 				throw new HttpException(
 					{
 						message: Messages.PAGE_AND_PERPAGE_INVALID,
 					},
 					HttpStatus.BAD_REQUEST,
 				);
-			}
 		}

Also applies to: 271-284, 634-647

🧹 Nitpick comments (7)
backend/src/entities/table/utils/extract-foreign-keys-from-widgets.util.ts (1)

9-18: Silent error handling and unsafe type assertion may hide data issues.

Two concerns:

  1. Silent catch: Swallowing parse errors without logging makes debugging difficult when widgets have malformed params.

  2. Unsafe cast: JSON5.parse(...) as ForeignKeyDSInfo doesn't validate the parsed object's shape. If the JSON is valid but has incorrect/missing properties, this will silently produce malformed data.

♻️ Suggested improvement with validation and logging
 export function extractForeignKeysFromWidgets(tableWidgets: Array<TableWidgetEntity>): Array<ForeignKeyDSInfo> {
 	return tableWidgets
 		.filter((widget) => widget.widget_type === WidgetTypeEnum.Foreign_key)
 		.reduce<Array<ForeignKeyDSInfo>>((acc, widget) => {
 			if (widget.widget_params) {
 				try {
-					acc.push(JSON5.parse(widget.widget_params) as ForeignKeyDSInfo);
-				} catch (_e) {
-					// skip malformed widget params
+					const parsed = JSON5.parse(widget.widget_params);
+					if (parsed && typeof parsed.referenced_table_name === 'string' && typeof parsed.column_name === 'string') {
+						acc.push(parsed as ForeignKeyDSInfo);
+					}
+				} catch (e) {
+					// Log or handle malformed widget params as needed
 				}
 			}
 			return acc;
 		}, []);
 }
backend/src/entities/table/utils/build-table-settings-for-response.util.ts (1)

4-7: Extract the tableSettings parameter shape into a named interface.

This inline object type is now part of a reusable util signature. A named interface will keep the contract discoverable and avoid re-declaring the same shape elsewhere. As per coding guidelines, "Use interfaces for object shapes and type for unions and primitives."

backend/src/entities/table/utils/filter-foreign-keys-by-permission.util.ts (1)

11-18: Deduplicate permission checks before fan-out.

This collapses the result into a Set by referenced_table_name, but the Cedar check still runs once per foreign key. If several keys point at the same table, you're paying for the same authorization lookup repeatedly. Check unique table names first and then filter the original array.

♻️ Suggested refactor
-	const canReadResults = await Promise.all(
-		foreignKeys.map(async (fk) => ({
-			tableName: fk.referenced_table_name,
-			canRead: await cedarPermissions.improvedCheckTableRead(userId, connectionId, fk.referenced_table_name, masterPwd),
-		})),
-	);
+	const referencedTableNames = [...new Set(foreignKeys.map((fk) => fk.referenced_table_name))];
+	const canReadResults = await Promise.all(
+		referencedTableNames.map(async (tableName) => ({
+			tableName,
+			canRead: await cedarPermissions.improvedCheckTableRead(userId, connectionId, tableName, masterPwd),
+		})),
+	);
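The deduplication idea can be sketched end to end; the names mirror the suggestion, but the permission checker is a stand-in callback rather than the real Cedar service:

```typescript
interface ForeignKeyLike {
  referenced_table_name: string;
}

// Check each unique referenced table once, then filter the original keys,
// so N foreign keys pointing at the same table cost one permission lookup.
async function filterByReadPermission<T extends ForeignKeyLike>(
  foreignKeys: T[],
  canReadTable: (tableName: string) => Promise<boolean>,
): Promise<T[]> {
  const uniqueNames = [...new Set(foreignKeys.map((fk) => fk.referenced_table_name))];
  const results = await Promise.all(
    uniqueNames.map(async (name) => [name, await canReadTable(name)] as const),
  );
  const readable = new Set(results.filter(([, ok]) => ok).map(([name]) => name));
  return foreignKeys.filter((fk) => readable.has(fk.referenced_table_name));
}
```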
backend/src/entities/table/utils/process-referenced-tables.util.ts (1)

12-28: Cache Cedar checks by ref.table_name.

The nested Promise.all issues one read-permission call per referenced_by entry. If multiple columns reference the same table, this repeats the same authorization lookup and fans out unnecessary load. A small per-table cache inside this helper would keep the output the same with fewer Cedar calls.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/src/entities/table/utils/process-referenced-tables.util.ts` around
lines 12 - 28, The nested Promise.all is calling
cedarPermissions.improvedCheckTableRead for every referenced_by entry causing
duplicate checks for the same table; inside the processing of referencedTables
(the referencedTables.map async) create a small per-referencedTable Map keyed by
ref.table_name to cache results, before calling
cedarPermissions.improvedCheckTableRead check the cache and reuse the boolean if
present, otherwise call improvedCheckTableRead(userId, connectionId,
ref.table_name, masterPwd), store the result in the Map, and then use that
boolean to decide whether to keep ref in referencedTable.referenced_by to
preserve the original output while reducing redundant Cedar calls.
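One way to express the per-table cache, sketched independently of the Cedar client (the checker signature is an assumption):

```typescript
type AsyncTableCheck = (tableName: string) => Promise<boolean>;

// Memoize a per-table permission check for the lifetime of one request.
// Caching the promise (not the resolved boolean) also collapses concurrent
// lookups for the same table issued from inside Promise.all.
function memoizeTableCheck(check: AsyncTableCheck): AsyncTableCheck {
  const cache = new Map<string, Promise<boolean>>();
  return (tableName) => {
    let pending = cache.get(tableName);
    if (!pending) {
      pending = check(tableName);
      cache.set(tableName, pending);
    }
    return pending;
  };
}
```

Wrapping the bound Cedar call in `memoizeTableCheck` at the top of the helper keeps the output identical while issuing at most one check per distinct `ref.table_name`.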
backend/src/entities/table/use-cases/get-table-structure.use.case.ts (1)

77-83: Drop the extra .catch() on attachForeignColumnNames().

backend/src/entities/table/utils/attach-foreign-column-names.util.ts (Lines 5-36) already normalizes DAO/settings failures to autocomplete_columns: []. This fallback is redundant, and if it ever runs it returns el without the autocomplete_columns field the response type promises.

♻️ Suggested simplification
-				transformedTableForeignKeys = await Promise.all(
-					tableForeignKeys.map((el) =>
-						attachForeignColumnNames(
-							el, userEmail, connectionId, dao,
-							this._dbContext.tableSettingsRepository.findTableSettings.bind(this._dbContext.tableSettingsRepository),
-						).catch(() => el as ForeignKeyWithAutocompleteColumnsDS),
-					),
-				);
+				transformedTableForeignKeys = await Promise.all(
+					tableForeignKeys.map((el) =>
+						attachForeignColumnNames(
+							el,
+							userEmail,
+							connectionId,
+							dao,
+							this._dbContext.tableSettingsRepository.findTableSettings.bind(this._dbContext.tableSettingsRepository),
+						),
+					),
+				);

The same cleanup applies to the identical map in the row read/update use cases.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/src/entities/table/use-cases/get-table-structure.use.case.ts` around
lines 77 - 83, The map over tableForeignKeys is redundantly attaching a .catch()
to attachForeignColumnNames which already handles DAO/settings failures and
normalizes results (see attachForeignColumnNames); remove the .catch(() => el as
ForeignKeyWithAutocompleteColumnsDS) so transformedTableForeignKeys awaits
Promise.all(tableForeignKeys.map(el => attachForeignColumnNames(el, userEmail,
connectionId, dao,
this._dbContext.tableSettingsRepository.findTableSettings.bind(this._dbContext.tableSettingsRepository))))
and let attachForeignColumnNames provide the fallback autocomplete_columns: []
itself; apply the same removal to the identical map in the row read/update use
cases.
backend/src/entities/table/use-cases/add-row-in-table.use.case.ts (1)

129-133: Remove the cast-based fallback.

The shared helper already normalizes failures to autocomplete_columns: []. This extra .catch(() => el as ForeignKeyWithAutocompleteColumnsDS) can still return an object without that field, and the cast hides that from TypeScript.

♻️ Suggested cleanup
-				foreignKeysWithKeysFromWidgets.map((el) =>
-					attachForeignColumnNames(
-						el, userEmail, connectionId, dao,
-						this._dbContext.tableSettingsRepository.findTableSettings.bind(this._dbContext.tableSettingsRepository),
-					).catch(() => el as ForeignKeyWithAutocompleteColumnsDS),
-				),
+				foreignKeysWithKeysFromWidgets.map((el) =>
+					attachForeignColumnNames(
+						el,
+						userEmail,
+						connectionId,
+						dao,
+						this._dbContext.tableSettingsRepository.findTableSettings.bind(this._dbContext.tableSettingsRepository),
+					),
+				),
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/src/entities/table/use-cases/add-row-in-table.use.case.ts` around
lines 129 - 133, The array mapping over foreignKeysWithKeysFromWidgets should
not use a cast-based fallback; remove the .catch(() => el as
ForeignKeyWithAutocompleteColumnsDS) after attachForeignColumnNames so failures
are handled by attachForeignColumnNames itself (which normalizes to
autocomplete_columns: []). Update the map to call attachForeignColumnNames(el,
userEmail, connectionId, dao,
this._dbContext.tableSettingsRepository.findTableSettings.bind(this._dbContext.tableSettingsRepository))
without the catch so TypeScript correctly sees the normalized shape.
backend/src/entities/table/use-cases/get-table-rows.use.case.ts (1)

181-185: Drop the weaker outer fallback here.

attachForeignColumnNames() already catches its own failures and returns autocomplete_columns: []. The extra .catch(() => el) reintroduces the raw foreign-key shape, so an unexpected rejection can leak a response item without autocomplete_columns.

♻️ Suggested cleanup
-					tableForeignKeys.map((el) =>
-						attachForeignColumnNames(
-							el, userEmail, connectionId, dao,
-							this._dbContext.tableSettingsRepository.findTableSettingsPure.bind(this._dbContext.tableSettingsRepository),
-						).catch(() => el),
-					),
+					tableForeignKeys.map((el) =>
+						attachForeignColumnNames(
+							el,
+							userEmail,
+							connectionId,
+							dao,
+							this._dbContext.tableSettingsRepository.findTableSettingsPure.bind(this._dbContext.tableSettingsRepository),
+						),
+					),
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/src/entities/table/use-cases/get-table-rows.use.case.ts` around lines
181 - 185, The outer .catch on the tableForeignKeys.map pipeline is
reintroducing raw FK objects and bypassing attachForeignColumnNames()'s internal
failure handling; remove the trailing .catch(() => el) so that
attachForeignColumnNames(el, userEmail, connectionId, dao,
this._dbContext.tableSettingsRepository.findTableSettingsPure.bind(this._dbContext.tableSettingsRepository))
handles errors itself (it already returns autocomplete_columns: [] on failure).
Locate the tableForeignKeys.map usage and drop the outer catch, and quickly scan
for any callers that relied on the previous fallback to ensure no behavior
change is required elsewhere.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@backend/src/entities/table/use-cases/get-row-by-primary-key.use.case.ts`:
- Around line 161-165: The top-level can_delete/can_update/can_add fields are
reading directly from tableSettings and can disagree with the normalized values
produced by buildTableSettingsForResponse; update the response construction in
get-row-by-primary-key.use.case.ts to derive those top-level flags from the
normalized settings returned by buildTableSettingsForResponse (or apply the same
nullish-coalescing defaults like "?? true" using builtDAOsTableSettings) so
can_delete/can_update/can_add are consistent with the values inside
table_settings; reference the variables tableSettings, builtDAOsTableSettings
and the function buildTableSettingsForResponse when making the change.

In `@backend/src/entities/table/use-cases/update-row-in-table.use.case.ts`:
- Around line 216-219: Top-level permission flags (can_delete, can_update,
can_add) must match the defaults used in table_settings; instead of falling back
to true when tableSettings is missing, calculate the response defaults via
buildTableSettingsForResponse(builtDAOsTableSettings, tableSettings) and use
those resulting fields for the top-level flags (e.g. set can_delete =
table_settings_response.can_delete, can_update =
table_settings_response.can_update, can_add = table_settings_response.can_add)
so the sibling fields and table_settings are consistent.

In `@backend/src/entities/table/utils/build-table-settings-for-response.util.ts`:
- Around line 8-20: The builder is returning undefined/null for ordering,
ordering_field, and identity_column while the DTO TableSettingsInRowsDS declares
them as required non-nullable; update either TableSettingsInRowsDS to make
ordering, ordering_field, and identity_column optional/nullable or change the
builder (in build-table-settings-for-response util, e.g.,
buildTableSettingsForResponse) to always emit concrete values (e.g., default
empty string or default sentinel) for ordering, ordering_field and
identity_column to match the DTO; additionally extract the inline parameter type
currently passed as tableSettings into a named interface (e.g.,
TableSettingsInput) and replace the inline type usage with that interface to
follow coding guidelines.

In `@backend/src/entities/table/utils/find-filtering-fields.util.ts`:
- Around line 102-111: The code casts filtersDataFromBody[rowName] to
Record<string, string> but then checks for null and assigns possibly-null values
into FilteringFieldsDs; update the types and checks for consistency by changing
the cast to Record<string, string | null> (or adjust the source type), update
the loop to treat values as string | null, keep the isValueNull check using
FilterCriteriaEnum.eq, and when pushing into filteringItems (used by
FilteringFieldsDs) ensure the value field accepts string | null or convert
non-null values to string (e.g., String(filterData[key])) so the assigned value
matches FilteringFieldsDs.value; verify validateStringWithEnum and
FilterCriteriaEnum usage remain unchanged.

In `@backend/src/entities/table/utils/hash-passwords-in-row.util.ts`:
- Around line 25-26: The current check before encrypting a password field allows
non-string values (0, false, objects, arrays) to reach the unsafe cast and call
Encryptor.processDataWithAlgorithm, which can throw and leave
row[widget.field_name] unencrypted; update the condition in the hashing routine
in hash-passwords-in-row.util.ts (the block that reads fieldValue,
widgetParams.encrypt and calls Encryptor.processDataWithAlgorithm with
widgetParams.algorithm) to include an explicit typeof fieldValue === 'string'
check (skip/enforce handling when not a string), so only real strings are passed
to Encryptor.processDataWithAlgorithm and non-string values are either ignored
or handled safely before assignment to row[widget.field_name].
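A minimal sketch of the guard, with `encryptValue` standing in for `Encryptor.processDataWithAlgorithm` (which is not reproduced here):

```typescript
// Only real, non-empty strings reach the encryptor; 0, false, objects,
// and arrays are left untouched instead of throwing mid-row and leaving
// the field unencrypted.
function hashPasswordField(
  row: Record<string, unknown>,
  fieldName: string,
  encryptValue: (plain: string) => string,
): void {
  const fieldValue = row[fieldName];
  if (typeof fieldValue !== 'string' || fieldValue.length === 0) {
    return;
  }
  row[fieldName] = encryptValue(fieldValue);
}
```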

In
`@backend/src/entities/table/utils/save-table-info-in-database-orchestrator.util.ts`:
- Around line 45-64: The metadata save loop that uses queue.add with
buildTableInfoEntity, tableInfoRepository.save and tableFieldInfoRepository.save
must be executed inside a single database transaction so partial commits cannot
occur; refactor the block so that all saves (creating newTableInfo, saving
newTableFieldsInfos, assigning foundConnection.tables_info and incrementing
foundConnection.saved_table_info) are performed within
dbContext.manager.transaction (or your ORM’s transactional API) and on error the
transaction is rolled back and the exception rethrown (instead of only logging)
so callers can retry safely; alternatively, if transactions aren’t available,
implement explicit cleanup that deletes any partially saved TableInfoEntity and
its TableFieldInfoEntity records on failure before returning.
- Around line 10-15: The saveTableInfoInDatabase function currently hardcodes
undefined for agent-backed DAOs when calling getTableStructure and swallows
errors; update its signature to accept a userEmail parameter and, inside
saveTableInfoInDatabase, call getDataAccessObject and detect if the returned DAO
implements IDataAccessObjectAgent so you pass the userEmail into
getTableStructure (otherwise keep existing behavior), and change the error
handling so failures are surfaced (either rethrow the caught error or change the
return type from Promise<void> to Promise<boolean> and return success/failure)
so callers can detect when the operation fails; reference functions/types
saveTableInfoInDatabase, getDataAccessObject, IDataAccessObjectAgent, and
getTableStructure in your changes.
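The rollback-and-rethrow shape the first bullet asks for can be sketched generically; a real implementation would delegate to the ORM's transactional API (e.g. TypeORM's `dataSource.transaction(async (manager) => { ... })`), and the `FakeTx` type here is purely illustrative:

```typescript
interface FakeTx {
  saved: string[];
}

// All-or-nothing sketch: every save runs inside one callback; any failure
// discards the partial writes and rethrows so the caller can retry,
// instead of only logging and returning.
async function runInTransaction<T>(work: (tx: FakeTx) => Promise<T>): Promise<T> {
  const tx: FakeTx = { saved: [] };
  try {
    const result = await work(tx);
    return result; // commit point in a real ORM
  } catch (err) {
    tx.saved.length = 0; // rollback: no partially saved table info survives
    throw err;
  }
}
```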

---

Outside diff comments:
In `@backend/src/entities/table/table.controller.ts`:
- Around line 208-221: The pagination parsing currently only runs when both
query params are truthy and uses parseInt without robust validation; change it
to parse and validate page and perPage independently: for each of page and
perPage, if the param is provided (not undefined/null/empty string), parse with
Number(...) or parseInt, then validate using Number.isInteger(value) && value >
0, set parsedPage/parsedPerPage only when valid, and throw the same
HttpException with Messages.PAGE_AND_PERPAGE_INVALID when a provided value is
invalid; ensure the validated values are assigned into the GetTableRowsDs (or
equivalent DTO) and repeat the same independent-parameter validation fix for the
other similar blocks that use parsedPage/parsedPerPage.
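Sketched as a small standalone parser; `HttpException` and `Messages.PAGE_AND_PERPAGE_INVALID` from the real controller are replaced by a plain `Error` here:

```typescript
// Parse one pagination query param independently of the other: absent or
// empty means "not provided"; anything provided must be a positive integer.
function parsePositiveIntParam(raw: string | null | undefined): number | undefined {
  if (raw === undefined || raw === null || raw === '') {
    return undefined;
  }
  const value = Number(raw);
  if (!Number.isInteger(value) || value <= 0) {
    // in the controller: throw new HttpException(Messages.PAGE_AND_PERPAGE_INVALID, ...)
    throw new Error('PAGE_AND_PERPAGE_INVALID');
  }
  return value;
}
```

Calling this once for `page` and once for `perPage` means a valid `page` still applies when `perPage` is absent, and a provided-but-invalid value fails loudly instead of being silently skipped.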

In `@backend/src/entities/table/use-cases/find-tables-in-connection.use.case.ts`:
- Around line 105-112: The async warm-up call to saveTableInfoInDatabase is
dropping the resolved agent identity so DAO calls use undefined userEmail;
update the call site in find-tables-in-connection.use.case (where you resolved
the agent email) to pass the resolved userEmail into saveTableInfoInDatabase,
and update saveTableInfoInDatabase's signature and all callers so it forwards
that userEmail into dao.getTableStructure(tableName, userEmail) (ensure
parameter name matches, e.g., userEmail or agentEmail, and propagate the change
through save-table-info-in-database-orchestrator.util.ts and any other usages).

In `@backend/src/entities/table/utils/validate-table-row.util.ts`:
- Around line 6-10: The validation lowers incoming row keys into the keys array
but compares them to raw column names, so mixed-case column_name values can fail
matching; update the lookup in the loop that uses structure.at(i).column_name
(the code computing index via keys.indexOf(...)) to compare against a normalized
form (e.g., call .toLowerCase() on structure.at(i).column_name or precompute a
lowercased list from structure) so both sides use the same case before checking
index and performing autogenerated-value validation.
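A minimal sketch of the normalized lookup (names are illustrative; the real code indexes into `structure` with `.at(i)`):

```typescript
interface ColumnStructure {
  column_name: string;
}

// Lowercase both sides before matching, so a row key like "username" still
// resolves against a column named "UserName".
function findRowKeyIndex(rowKeys: string[], column: ColumnStructure): number {
  const loweredKeys = rowKeys.map((key) => key.toLowerCase());
  return loweredKeys.indexOf(column.column_name.toLowerCase());
}
```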

---

Nitpick comments:
In `@backend/src/entities/table/use-cases/add-row-in-table.use.case.ts`:
- Around line 129-133: The array mapping over foreignKeysWithKeysFromWidgets
should not use a cast-based fallback; remove the .catch(() => el as
ForeignKeyWithAutocompleteColumnsDS) after attachForeignColumnNames so failures
are handled by attachForeignColumnNames itself (which normalizes to
autocomplete_columns: []). Update the map to call attachForeignColumnNames(el,
userEmail, connectionId, dao,
this._dbContext.tableSettingsRepository.findTableSettings.bind(this._dbContext.tableSettingsRepository))
without the catch so TypeScript correctly sees the normalized shape.

In `@backend/src/entities/table/use-cases/get-table-rows.use.case.ts`:
- Around line 181-185: The outer .catch on the tableForeignKeys.map pipeline is
reintroducing raw FK objects and bypassing attachForeignColumnNames()'s internal
failure handling; remove the trailing .catch(() => el) so that
attachForeignColumnNames(el, userEmail, connectionId, dao,
this._dbContext.tableSettingsRepository.findTableSettingsPure.bind(this._dbContext.tableSettingsRepository))
handles errors itself (it already returns autocomplete_columns: [] on failure).
Locate the tableForeignKeys.map usage and drop the outer catch, and quickly scan
for any callers that relied on the previous fallback to ensure no behavior
change is required elsewhere.

In `@backend/src/entities/table/use-cases/get-table-structure.use.case.ts`:
- Around line 77-83: The map over tableForeignKeys is redundantly attaching a
.catch() to attachForeignColumnNames which already handles DAO/settings failures
and normalizes results (see attachForeignColumnNames); remove the .catch(() =>
el as ForeignKeyWithAutocompleteColumnsDS) so transformedTableForeignKeys awaits
Promise.all(tableForeignKeys.map(el => attachForeignColumnNames(el, userEmail,
connectionId, dao,
this._dbContext.tableSettingsRepository.findTableSettings.bind(this._dbContext.tableSettingsRepository))))
and let attachForeignColumnNames provide the fallback autocomplete_columns: []
itself; apply the same removal to the identical map in the row read/update use
cases.

In `@backend/src/entities/table/utils/build-table-settings-for-response.util.ts`:
- Around line 4-7: Extract the inline object type used by the tableSettings
parameter into a named interface (e.g., TableSettingsInput) and update the
buildTableSettingsForResponse signature to use that interface instead of the
inline shape; update any references/usages within this file (and export the
interface if needed for reuse) so other utilities can import the same contract
rather than redeclare the shape. Ensure the interface contains the same optional
boolean properties: allow_csv_export, allow_csv_import, can_delete, can_update,
can_add, and replace the inline type with TableSettingsInput | null in the
function parameter.

In `@backend/src/entities/table/utils/extract-foreign-keys-from-widgets.util.ts`:
- Around line 9-18: The reduce block silently swallows JSON5.parse errors and
unsafely casts parsed results to ForeignKeyDSInfo which can allow malformed
objects; update the logic in extract-foreign-keys-from-widgets.util.ts to catch
and log parse errors (include widget id or context from widget) instead of
ignoring them, then validate the parsed value before pushing (e.g., add an
isValidForeignKeyDSInfo or use an existing validator/schema to check required
fields on the object returned from JSON5.parse(widget.widget_params)); only push
into the accumulator when validation passes and otherwise log a warning with
details so malformed or shape-mismatched widget_params are discoverable.
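The validation gate can be sketched as a user-defined type guard; the `ForeignKeyDSInfo` fields shown are assumptions about the real shape:

```typescript
interface ForeignKeyDSInfo {
  referenced_table_name: string;
  referenced_column_name: string;
}

// Trust JSON5.parse output only after checking the fields the rest of the
// pipeline actually reads; callers log and skip anything that fails.
function isValidForeignKeyDSInfo(value: unknown): value is ForeignKeyDSInfo {
  if (typeof value !== 'object' || value === null) {
    return false;
  }
  const candidate = value as Record<string, unknown>;
  return (
    typeof candidate.referenced_table_name === 'string' &&
    typeof candidate.referenced_column_name === 'string'
  );
}
```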

In `@backend/src/entities/table/utils/filter-foreign-keys-by-permission.util.ts`:
- Around line 11-18: The current implementation runs
cedarPermissions.improvedCheckTableRead for every element in foreignKeys;
instead, first extract the unique referenced_table_name values from foreignKeys,
run improvedCheckTableRead once per unique table (using userId, connectionId,
masterPwd), collect the names that returned true into readableSet, and finally
filter the original foreignKeys array by checking
readableSet.has(fk.referenced_table_name); update the canReadResults/readableSet
logic to operate over unique table names to avoid duplicated permission checks.

In `@backend/src/entities/table/utils/process-referenced-tables.util.ts`:
- Around line 12-28: The nested Promise.all is calling
cedarPermissions.improvedCheckTableRead for every referenced_by entry causing
duplicate checks for the same table; inside the processing of referencedTables
(the referencedTables.map async) create a small per-referencedTable Map keyed by
ref.table_name to cache results, before calling
cedarPermissions.improvedCheckTableRead check the cache and reuse the boolean if
present, otherwise call improvedCheckTableRead(userId, connectionId,
ref.table_name, masterPwd), store the result in the Map, and then use that
boolean to decide whether to keep ref in referencedTable.referenced_by to
preserve the original output while reducing redundant Cedar calls.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 2af02305-1ad8-4088-b99d-96bc31ca7bd0

📥 Commits

Reviewing files that changed from the base of the PR and between 6f20a9c and b16f227.

📒 Files selected for processing (28)
  • backend/src/entities/table/application/data-structures/found-table-rows.ds.ts
  • backend/src/entities/table/application/data-structures/import-scv-in-table.ds.ts
  • backend/src/entities/table/table-datastructures.ts
  • backend/src/entities/table/table.controller.ts
  • backend/src/entities/table/use-cases/add-row-in-table.use.case.ts
  • backend/src/entities/table/use-cases/bulk-update-rows-in-table.use.case.ts
  • backend/src/entities/table/use-cases/delete-row-from-table.use.case.ts
  • backend/src/entities/table/use-cases/delete-rows-from-table.use.case.ts
  • backend/src/entities/table/use-cases/export-csv-from-table.use.case.ts
  • backend/src/entities/table/use-cases/find-tables-in-connection-v2.use.case.ts
  • backend/src/entities/table/use-cases/find-tables-in-connection.use.case.ts
  • backend/src/entities/table/use-cases/get-row-by-primary-key.use.case.ts
  • backend/src/entities/table/use-cases/get-table-rows.use.case.ts
  • backend/src/entities/table/use-cases/get-table-structure.use.case.ts
  • backend/src/entities/table/use-cases/import-csv-in-table-user.case.ts
  • backend/src/entities/table/use-cases/update-row-in-table.use.case.ts
  • backend/src/entities/table/utils/add-display-names-for-tables.util.ts
  • backend/src/entities/table/utils/attach-foreign-column-names.util.ts
  • backend/src/entities/table/utils/build-table-settings-for-response.util.ts
  • backend/src/entities/table/utils/extract-foreign-keys-from-widgets.util.ts
  • backend/src/entities/table/utils/filter-foreign-keys-by-permission.util.ts
  • backend/src/entities/table/utils/find-filtering-fields.util.ts
  • backend/src/entities/table/utils/find-ordering-field.util.ts
  • backend/src/entities/table/utils/hash-passwords-in-row.util.ts
  • backend/src/entities/table/utils/process-referenced-tables.util.ts
  • backend/src/entities/table/utils/save-table-info-in-database-orchestrator.util.ts
  • backend/src/entities/table/utils/validate-connection.util.ts
  • backend/src/entities/table/utils/validate-table-row.util.ts

Comment on lines 161 to 165
can_delete: tableSettings ? tableSettings.can_delete : true,
can_update: tableSettings ? tableSettings.can_update : true,
can_add: tableSettings ? tableSettings.can_add : true,
table_settings: {
sortable_by: builtDAOsTableSettings?.sortable_by?.length > 0 ? builtDAOsTableSettings.sortable_by : [],
ordering: builtDAOsTableSettings.ordering ? builtDAOsTableSettings.ordering : undefined,
identity_column: builtDAOsTableSettings.identity_column ? builtDAOsTableSettings.identity_column : null,
list_fields: builtDAOsTableSettings?.list_fields?.length > 0 ? builtDAOsTableSettings.list_fields : [],
allow_csv_export: allowCsvExport,
allow_csv_import: allowCsvImport,
can_delete: can_delete,
can_update: can_update,
can_add: can_add,
columns_view: builtDAOsTableSettings?.columns_view ? builtDAOsTableSettings.columns_view : [],
ordering_field: builtDAOsTableSettings.ordering_field ? builtDAOsTableSettings.ordering_field : undefined,
},
table_settings: buildTableSettingsForResponse(builtDAOsTableSettings, tableSettings),
};

⚠️ Potential issue | 🟡 Minor

Keep top-level can_* flags aligned with table_settings.

backend/src/entities/table/utils/build-table-settings-for-response.util.ts (Lines 3-20) now defaults missing booleans with ?? true, but the top-level can_delete/can_update/can_add fields here still read directly from tableSettings. For partially populated settings records, the same response can now disagree with itself.

🔧 Suggested fix
-			can_delete: tableSettings ? tableSettings.can_delete : true,
-			can_update: tableSettings ? tableSettings.can_update : true,
-			can_add: tableSettings ? tableSettings.can_add : true,
+			can_delete: tableSettings?.can_delete ?? true,
+			can_update: tableSettings?.can_update ?? true,
+			can_add: tableSettings?.can_add ?? true,
 			table_settings: buildTableSettingsForResponse(builtDAOsTableSettings, tableSettings),
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/src/entities/table/use-cases/get-row-by-primary-key.use.case.ts`
around lines 161 - 165, The top-level can_delete/can_update/can_add fields are
reading directly from tableSettings and can disagree with the normalized values
produced by buildTableSettingsForResponse; update the response construction in
get-row-by-primary-key.use.case.ts to derive those top-level flags from the
normalized settings returned by buildTableSettingsForResponse (or apply the same
nullish-coalescing defaults like "?? true" using builtDAOsTableSettings) so
can_delete/can_update/can_add are consistent with the values inside
table_settings; reference the variables tableSettings, builtDAOsTableSettings
and the function buildTableSettingsForResponse when making the change.
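The inconsistency the review is pointing at shows up on a partially populated settings record: the ternary only defaults when the whole object is missing, while `??` also defaults a missing field:

```typescript
interface PartialTableSettings {
  can_delete?: boolean;
  can_update?: boolean;
  can_add?: boolean;
}

// A settings row that exists but never had can_update populated.
const tableSettings: PartialTableSettings = { can_delete: false };

// Old pattern: defaults only when the whole record is absent.
const ternaryCanUpdate = tableSettings ? tableSettings.can_update : true; // undefined

// New pattern: defaults per field, matching buildTableSettingsForResponse.
const coalescedCanUpdate = tableSettings?.can_update ?? true; // true
```

Explicitly stored `false` values survive either way; only the treatment of absent fields differs, which is why the top-level flags and `table_settings` can currently disagree.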

Comment on lines 216 to +219
can_delete: tableSettings ? tableSettings.can_delete : true,
can_update: tableSettings ? tableSettings.can_update : true,
can_add: tableSettings ? tableSettings.can_add : true,
table_settings: {
sortable_by: builtDAOsTableSettings?.sortable_by?.length > 0 ? builtDAOsTableSettings.sortable_by : [],
ordering: builtDAOsTableSettings.ordering ? builtDAOsTableSettings.ordering : undefined,
identity_column: builtDAOsTableSettings.identity_column ? builtDAOsTableSettings.identity_column : null,
list_fields: builtDAOsTableSettings?.list_fields?.length > 0 ? builtDAOsTableSettings.list_fields : [],
allow_csv_export: allowCsvExport,
allow_csv_import: allowCsvImport,
can_delete: can_delete,
can_update: can_update,
can_add: can_add,
columns_view: builtDAOsTableSettings?.columns_view ? builtDAOsTableSettings.columns_view : [],
ordering_field: builtDAOsTableSettings.ordering_field ? builtDAOsTableSettings.ordering_field : undefined,
},
table_settings: buildTableSettingsForResponse(builtDAOsTableSettings, tableSettings),

⚠️ Potential issue | 🟡 Minor

Match the top-level permission flags to the new table_settings defaults.

Right now table_settings.can_* falls back to true, while the sibling top-level fields can still be undefined. That creates two different answers in the same payload for partially populated settings rows.

🔧 Suggested fix
-				can_delete: tableSettings ? tableSettings.can_delete : true,
-				can_update: tableSettings ? tableSettings.can_update : true,
-				can_add: tableSettings ? tableSettings.can_add : true,
+				can_delete: tableSettings?.can_delete ?? true,
+				can_update: tableSettings?.can_update ?? true,
+				can_add: tableSettings?.can_add ?? true,
 				table_settings: buildTableSettingsForResponse(builtDAOsTableSettings, tableSettings),
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/src/entities/table/use-cases/update-row-in-table.use.case.ts` around
lines 216 - 219, Top-level permission flags (can_delete, can_update, can_add)
must match the defaults used in table_settings; instead of falling back to true
when tableSettings is missing, calculate the response defaults via
buildTableSettingsForResponse(builtDAOsTableSettings, tableSettings) and use
those resulting fields for the top-level flags (e.g. set can_delete =
table_settings_response.can_delete, can_update =
table_settings_response.can_update, can_add = table_settings_response.can_add)
so the sibling fields and table_settings are consistent.

Comment on lines +8 to +20
return {
sortable_by: builtDAOsTableSettings?.sortable_by?.length > 0 ? builtDAOsTableSettings.sortable_by : [],
ordering: builtDAOsTableSettings.ordering ? builtDAOsTableSettings.ordering : undefined,
identity_column: builtDAOsTableSettings.identity_column ? builtDAOsTableSettings.identity_column : null,
list_fields: builtDAOsTableSettings?.list_fields?.length > 0 ? builtDAOsTableSettings.list_fields : [],
allow_csv_export: tableSettings?.allow_csv_export ?? true,
allow_csv_import: tableSettings?.allow_csv_import ?? true,
can_delete: tableSettings?.can_delete ?? true,
can_update: tableSettings?.can_update ?? true,
can_add: tableSettings?.can_add ?? true,
columns_view: builtDAOsTableSettings?.columns_view ? builtDAOsTableSettings.columns_view : [],
ordering_field: builtDAOsTableSettings.ordering_field ? builtDAOsTableSettings.ordering_field : undefined,
};

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

find . -name "build-table-settings-for-response.util.ts" -type f 2>/dev/null | head -20

Repository: rocket-admin/rocketadmin

Length of output: 144


🏁 Script executed:

cat -n ./backend/src/entities/table/utils/build-table-settings-for-response.util.ts

Repository: rocket-admin/rocketadmin

Length of output: 1599


🏁 Script executed:

cat -n ./backend/src/entities/table/application/data-structures/found-table-rows.ds.ts

Repository: rocket-admin/rocketadmin

Length of output: 4532


🏁 Script executed:

# Check if there are any type override comments or if TypeScript configuration allows this
grep -r "ts-ignore\|@ts-expect-error\|any" ./backend/src/entities/table/utils/build-table-settings-for-response.util.ts



🏁 Script executed:

# Check the QueryOrderingEnum to understand the type better
find . -name "*.ts" -type f -exec grep -l "enum QueryOrderingEnum\|export.*QueryOrderingEnum" {} \; 2>/dev/null | head -5



🏁 Script executed:

cat -n ./backend/src/enums/query-ordering.enum.ts



🏁 Script executed:

# Check TableSettingsDS from shared-code to understand the source data structure
find . -path "*/shared-code/*/table-settings.ds.*" -type f 2>/dev/null | head -5



🏁 Script executed:

cat -n ./shared-code/src/data-access-layer/shared/data-structures/table-settings.ds.ts



🏁 Script executed:

# Verify how this function is used in the codebase
rg "buildTableSettingsForResponse" --type ts -A 2 -B 2 | head -50



🏁 Script executed:

# Check if there are type definitions/interfaces for the tableSettings parameter elsewhere
rg "allow_csv_export|allow_csv_import" --type ts -B 3 -A 3 ./backend/src/entities/table | grep -E "interface|type " | head -10



🏁 Script executed:

# Check if there are any other uses or patterns that might clarify the intent
rg "TableSettingsInRowsDS" --type ts -B 2 -A 2 ./backend/src | head -40



Fix the type mismatch between the builder return and TableSettingsInRowsDS.

The function returns ordering: undefined, ordering_field: undefined, and identity_column: null, but TableSettingsInRowsDS declares all three as required, non-nullable fields (lines 17, 20, 23). This creates a runtime divergence from the DTO contract and Swagger schema. Make these fields optional/nullable in TableSettingsInRowsDS, or ensure the builder never emits undefined/null for them.

Additionally, extract the inline tableSettings parameter object type (line 6) into a named interface per the coding guidelines.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/src/entities/table/utils/build-table-settings-for-response.util.ts`
around lines 8 - 20, The builder is returning undefined/null for ordering,
ordering_field, and identity_column while the DTO TableSettingsInRowsDS declares
them as required non-nullable; update either TableSettingsInRowsDS to make
ordering, ordering_field, and identity_column optional/nullable or change the
builder (in build-table-settings-for-response util, e.g.,
buildTableSettingsForResponse) to always emit concrete values (e.g., default
empty string or default sentinel) for ordering, ordering_field and
identity_column to match the DTO; additionally extract the inline parameter type
currently passed as tableSettings into a named interface (e.g.,
TableSettingsInput) and replace the inline type usage with that interface to
follow coding guidelines.
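To make the first option concrete, here is a hedged sketch of relaxing the DTO so it matches what the builder can actually emit. The names mirror `buildTableSettingsForResponse` and `TableSettingsInRowsDS` from the PR, but the shapes are simplified stand-ins, not the real backend types:

```typescript
enum QueryOrderingEnum {
  ASC = 'ASC',
  DESC = 'DESC',
}

// Contested fields made explicitly optional/nullable so the type
// reflects runtime reality (illustrative subset of the real DTO).
interface TableSettingsInRows {
  sortable_by: string[];
  ordering?: QueryOrderingEnum;   // was required
  ordering_field?: string;        // was required
  identity_column: string | null; // was non-nullable
  can_delete: boolean;
  can_update: boolean;
  can_add: boolean;
}

function buildTableSettingsForResponse(
  dao: Partial<TableSettingsInRows>,
  overrides: Partial<Pick<TableSettingsInRows, 'can_delete' | 'can_update' | 'can_add'>> = {},
): TableSettingsInRows {
  return {
    sortable_by: dao.sortable_by?.length ? dao.sortable_by : [],
    ordering: dao.ordering,             // stays undefined when absent — now type-legal
    ordering_field: dao.ordering_field,
    identity_column: dao.identity_column ?? null,
    can_delete: overrides.can_delete ?? true,
    can_update: overrides.can_update ?? true,
    can_add: overrides.can_add ?? true,
  };
}
```

The alternative is to keep the DTO strict and have the builder emit concrete defaults (e.g. an empty string sentinel) for all three fields; either way, the type and the runtime value must agree so the Swagger schema stays truthful.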

Comment on lines +102 to +111
const filterData = filtersDataFromBody[rowName] as Record<string, string>;
for (const key in filterData) {
if (!validateStringWithEnum(key, FilterCriteriaEnum)) {
throw new Error(`Invalid filter criteria: "${key}".`);
}
-			const isValueNull = filtersDataFromBody[rowName][key] === null && key === FilterCriteriaEnum.eq;
+			const isValueNull = filterData[key] === null && key === FilterCriteriaEnum.eq;
filteringItems.push({
field: rowName,
criteria: isValueNull ? FilterCriteriaEnum.empty : (key as FilterCriteriaEnum),
-				value: filtersDataFromBody[rowName][key],
+				value: filterData[key],

⚠️ Potential issue | 🟡 Minor

Type inconsistency: cast to Record<string, string> but code checks for null values.

Line 107 checks filterData[key] === null, but the cast on line 102 claims all values are string. This is contradictory. Additionally, when isValueNull is false, filterData[key] (potentially null) is assigned to value, which is typed as string in FilteringFieldsDs.

♻️ Suggested fix for type consistency
-			const filterData = filtersDataFromBody[rowName] as Record<string, string>;
+			const filterData = filtersDataFromBody[rowName] as Record<string, string | null>;
 			for (const key in filterData) {
 				if (!validateStringWithEnum(key, FilterCriteriaEnum)) {
 					throw new Error(`Invalid filter criteria: "${key}".`);
 				}
 				const isValueNull = filterData[key] === null && key === FilterCriteriaEnum.eq;
 				filteringItems.push({
 					field: rowName,
 					criteria: isValueNull ? FilterCriteriaEnum.empty : (key as FilterCriteriaEnum),
-					value: filterData[key],
+					value: filterData[key] ?? '',
 				});
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
const filterData = filtersDataFromBody[rowName] as Record<string, string>;
for (const key in filterData) {
if (!validateStringWithEnum(key, FilterCriteriaEnum)) {
throw new Error(`Invalid filter criteria: "${key}".`);
}
-const isValueNull = filtersDataFromBody[rowName][key] === null && key === FilterCriteriaEnum.eq;
+const isValueNull = filterData[key] === null && key === FilterCriteriaEnum.eq;
filteringItems.push({
field: rowName,
criteria: isValueNull ? FilterCriteriaEnum.empty : (key as FilterCriteriaEnum),
-value: filtersDataFromBody[rowName][key],
+value: filterData[key],
const filterData = filtersDataFromBody[rowName] as Record<string, string | null>;
for (const key in filterData) {
if (!validateStringWithEnum(key, FilterCriteriaEnum)) {
throw new Error(`Invalid filter criteria: "${key}".`);
}
const isValueNull = filterData[key] === null && key === FilterCriteriaEnum.eq;
filteringItems.push({
field: rowName,
criteria: isValueNull ? FilterCriteriaEnum.empty : (key as FilterCriteriaEnum),
value: filterData[key] ?? '',
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/src/entities/table/utils/find-filtering-fields.util.ts` around lines
102 - 111, The code casts filtersDataFromBody[rowName] to Record<string, string>
but then checks for null and assigns possibly-null values into
FilteringFieldsDs; update the types and checks for consistency by changing the
cast to Record<string, string | null> (or adjust the source type), update the
loop to treat values as string | null, keep the isValueNull check using
FilterCriteriaEnum.eq, and when pushing into filteringItems (used by
FilteringFieldsDs) ensure the value field accepts string | null or convert
non-null values to string (e.g., String(filterData[key])) so the assigned value
matches FilteringFieldsDs.value; verify validateStringWithEnum and
FilterCriteriaEnum usage remain unchanged.
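The null-to-empty mapping the suggested fix performs can be sketched in isolation. `FilterCriteriaEnum` here is a two-value stand-in for the real enum, and `validateStringWithEnum` is approximated with an `Object.values` membership check:

```typescript
// Illustrative subset of the real enum.
enum FilterCriteriaEnum {
  eq = 'eq',
  empty = 'empty',
}

interface FilteringField {
  field: string;
  criteria: FilterCriteriaEnum;
  value: string;
}

function toFilteringFields(
  rowName: string,
  filterData: Record<string, string | null>, // value may legitimately be null
): FilteringField[] {
  const items: FilteringField[] = [];
  for (const key of Object.keys(filterData)) {
    if (!Object.values(FilterCriteriaEnum).includes(key as FilterCriteriaEnum)) {
      throw new Error(`Invalid filter criteria: "${key}".`);
    }
    // `eq: null` is reinterpreted as an "is empty" criterion.
    const isValueNull = filterData[key] === null && key === FilterCriteriaEnum.eq;
    items.push({
      field: rowName,
      criteria: isValueNull ? FilterCriteriaEnum.empty : (key as FilterCriteriaEnum),
      value: filterData[key] ?? '', // never push null into a string-typed field
    });
  }
  return items;
}
```

With this shape, `{ eq: null }` yields `criteria: 'empty'` with an empty-string value, while `{ eq: 'Alice' }` passes through unchanged.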

Comment on lines 25 to +26
if (fieldValue !== undefined && fieldValue !== null && fieldValue !== '' && widgetParams.encrypt) {
-				row[widget.field_name] = await Encryptor.processDataWithAlgorithm(fieldValue as any, widgetParams.algorithm);
+				row[widget.field_name] = await Encryptor.processDataWithAlgorithm(fieldValue as string, widgetParams.algorithm);

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

find . -type f -name "hash-passwords-in-row.util.ts" | head -5



🏁 Script executed:

cat -n ./backend/src/entities/table/utils/hash-passwords-in-row.util.ts



Add runtime string validation before encrypting password fields.

Line 25's conditions (!== undefined && !== null && !== '') do not validate that fieldValue is actually a string type. Non-string values like 0, false, objects, or arrays would pass all checks and reach the unsafe cast on line 26. If Encryptor.processDataWithAlgorithm() throws on a non-string input, the exception is silently caught at lines 28–30, leaving the original unencrypted value in row—a password encryption bypass.

Add an explicit typeof fieldValue !== 'string' check before calling the encryptor:

Suggested fix
 			if (fieldValue !== undefined && fieldValue !== null && fieldValue !== '' && widgetParams.encrypt) {
+				if (typeof fieldValue !== 'string') {
+					throw new TypeError(`Password field "${widget.field_name}" must be a string`);
+				}
 				row[widget.field_name] = await Encryptor.processDataWithAlgorithm(fieldValue as string, widgetParams.algorithm);
 			}
📝 Committable suggestion


Suggested change
if (fieldValue !== undefined && fieldValue !== null && fieldValue !== '' && widgetParams.encrypt) {
-row[widget.field_name] = await Encryptor.processDataWithAlgorithm(fieldValue as any, widgetParams.algorithm);
+row[widget.field_name] = await Encryptor.processDataWithAlgorithm(fieldValue as string, widgetParams.algorithm);
if (fieldValue !== undefined && fieldValue !== null && fieldValue !== '' && widgetParams.encrypt) {
if (typeof fieldValue !== 'string') {
throw new TypeError(`Password field "${widget.field_name}" must be a string`);
}
row[widget.field_name] = await Encryptor.processDataWithAlgorithm(fieldValue as string, widgetParams.algorithm);
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/src/entities/table/utils/hash-passwords-in-row.util.ts` around lines
25 - 26, The current check before encrypting a password field allows non-string
values (0, false, objects, arrays) to reach the unsafe cast and call
Encryptor.processDataWithAlgorithm, which can throw and leave
row[widget.field_name] unencrypted; update the condition in the hashing routine
in hash-passwords-in-row.util.ts (the block that reads fieldValue,
widgetParams.encrypt and calls Encryptor.processDataWithAlgorithm with
widgetParams.algorithm) to include an explicit typeof fieldValue === 'string'
check (skip/enforce handling when not a string), so only real strings are passed
to Encryptor.processDataWithAlgorithm and non-string values are either ignored
or handled safely before assignment to row[widget.field_name].
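A runnable sketch of the guard, failing loudly before anything reaches the encryptor. `Encryptor` is replaced by a placeholder async function since the real implementation lives in the backend; everything else is illustrative:

```typescript
// Placeholder, not real encryption — stands in for
// Encryptor.processDataWithAlgorithm in the backend.
async function encryptStandIn(value: string, algorithm: string): Promise<string> {
  return `${algorithm}:${value}`;
}

async function hashPasswordField(
  row: Record<string, unknown>,
  fieldName: string,
  algorithm: string,
): Promise<void> {
  const fieldValue = row[fieldName];
  // Empty/absent values are skipped, as in the original code.
  if (fieldValue === undefined || fieldValue === null || fieldValue === '') return;
  if (typeof fieldValue !== 'string') {
    // Fail loudly instead of letting a catch block leave plaintext behind.
    throw new TypeError(`Password field "${fieldName}" must be a string`);
  }
  row[fieldName] = await encryptStandIn(fieldValue, algorithm);
}
```

With the guard in place, a non-string value like `0` raises a `TypeError` rather than being silently left unencrypted in the row.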

Comment on lines +10 to +15
export async function saveTableInfoInDatabase(
connectionId: string,
tables: Array<TableDS>,
masterPwd: string,
dbContext: IGlobalDatabaseContext,
): Promise<void> {

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
rg -n "saveTableInfoInDatabase\(" backend/src/entities/table/use-cases
rg -n "getTableStructure\(" shared-code/src/shared/interfaces backend/src/entities/table/utils/save-table-info-in-database-orchestrator.util.ts



🏁 Script executed:

cat -n backend/src/entities/table/utils/save-table-info-in-database-orchestrator.util.ts | head -50



🏁 Script executed:

cat -n shared-code/src/shared/interfaces/data-access-object-agent.interface.ts | head -80



🏁 Script executed:

cat -n shared-code/src/shared/interfaces/data-access-object.interface.ts | head -80



🏁 Script executed:

cat -n backend/src/entities/table/use-cases/find-tables-in-connection.use.case.ts | sed -n '100,120p'



🏁 Script executed:

cat -n backend/src/entities/table/utils/save-table-info-in-database-orchestrator.util.ts | tail -30



Pass the agent user identity through this util.

getDataAccessObject() can return an IDataAccessObjectAgent, and that contract requires userEmail for getTableStructure(...). Hardcoding undefined here means agent-backed connections cannot safely populate table info. This util needs a userEmail parameter, or it needs to branch on agent vs non-agent DAOs before fetching schema.

Additionally, the error handling at lines 61–64 silently swallows failures—the function returns Promise<void> regardless of success or failure, leaving callers unable to detect when the operation fails. Either rethrow the error or return a result indicating success/failure.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@backend/src/entities/table/utils/save-table-info-in-database-orchestrator.util.ts`
around lines 10 - 15, The saveTableInfoInDatabase function currently hardcodes
undefined for agent-backed DAOs when calling getTableStructure and swallows
errors; update its signature to accept a userEmail parameter and, inside
saveTableInfoInDatabase, call getDataAccessObject and detect if the returned DAO
implements IDataAccessObjectAgent so you pass the userEmail into
getTableStructure (otherwise keep existing behavior), and change the error
handling so failures are surfaced (either rethrow the caught error or change the
return type from Promise<void> to Promise<boolean> and return success/failure)
so callers can detect when the operation fails; reference functions/types
saveTableInfoInDatabase, getDataAccessObject, IDataAccessObjectAgent, and
getTableStructure in your changes.
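One way to thread the identity through, sketched with simplified stand-ins for the DAO contracts (the real `IDataAccessObject` / `IDataAccessObjectAgent` interfaces differ; the `isAgent` flag and names here are illustrative assumptions):

```typescript
interface TableStructureItem {
  column_name: string;
}

// Simplified stand-in merging the agent and non-agent DAO contracts.
interface DataAccessObject {
  isAgent: boolean;
  getTableStructure(tableName: string, userEmail?: string): Promise<TableStructureItem[]>;
}

async function fetchStructure(
  dao: DataAccessObject,
  tableName: string,
  userEmail?: string,
): Promise<TableStructureItem[]> {
  // Agent-backed DAOs require the user identity; surface a clear error
  // instead of silently passing undefined.
  if (dao.isAgent && !userEmail) {
    throw new Error(`userEmail is required for agent connections (table "${tableName}")`);
  }
  return dao.getTableStructure(tableName, dao.isAgent ? userEmail : undefined);
}
```

The same branching could instead live at the call site in `saveTableInfoInDatabase`, as long as the email parameter is added to its signature so callers can supply it.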

Comment on lines +45 to +64
foundConnection.tables_info = (await Promise.all(
tablesStructures.map(async (tableStructure) => {
return await queue.add(async () => {
const newTableInfo = buildTableInfoEntity(tableStructure.tableName, foundConnection);
const savedTableInfo = await dbContext.tableInfoRepository.save(newTableInfo);
const newTableFieldsInfos = tableStructure.structure.map((el) =>
buildTableFieldInfoEntity(el, savedTableInfo),
);
newTableInfo.table_fields_info = await dbContext.tableFieldInfoRepository.save(newTableFieldsInfos);
await dbContext.tableInfoRepository.save(newTableInfo);
return newTableInfo;
});
}),
)) as Array<TableInfoEntity>;
foundConnection.saved_table_info = ++foundConnection.saved_table_info;
await dbContext.connectionRepository.saveUpdatedConnection(foundConnection);
} catch (e) {
Sentry.captureException(e);
console.error(e);
}

⚠️ Potential issue | 🟠 Major

Wrap the metadata save phase in a transaction.

By the time one queued save() fails, other TableInfoEntity / field-info inserts may already have committed. Because the outer catch only logs and returns, retries can leave duplicate or orphaned metadata while foundConnection.tables_info never gets synchronized. This write batch needs transactional protection or explicit cleanup on failure.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@backend/src/entities/table/utils/save-table-info-in-database-orchestrator.util.ts`
around lines 45 - 64, The metadata save loop that uses queue.add with
buildTableInfoEntity, tableInfoRepository.save and tableFieldInfoRepository.save
must be executed inside a single database transaction so partial commits cannot
occur; refactor the block so that all saves (creating newTableInfo, saving
newTableFieldsInfos, assigning foundConnection.tables_info and incrementing
foundConnection.saved_table_info) are performed within
dbContext.manager.transaction (or your ORM’s transactional API) and on error the
transaction is rolled back and the exception rethrown (instead of only logging)
so callers can retry safely; alternatively, if transactions aren’t available,
implement explicit cleanup that deletes any partially saved TableInfoEntity and
its TableFieldInfoEntity records on failure before returning.
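The all-or-nothing behavior a transaction provides can be illustrated with an in-memory stand-in; a real fix would use the ORM's transactional API (e.g. TypeORM's `dataSource.transaction`) rather than the manual snapshot/restore shown here:

```typescript
interface TableInfo {
  id: number;
  name: string;
}

// In-memory stand-in for tableInfoRepository.
class InMemoryRepo {
  rows: TableInfo[] = [];
  save(row: TableInfo): void {
    if (!row.name) throw new Error(`invalid table info (id=${row.id})`);
    this.rows.push(row);
  }
}

// Snapshot plays the role of BEGIN, restoring it the role of ROLLBACK,
// and the rethrow lets callers observe the failure instead of a silent log.
function saveAllOrNothing(repo: InMemoryRepo, batch: TableInfo[]): void {
  const snapshot = [...repo.rows];
  try {
    for (const row of batch) repo.save(row);
  } catch (e) {
    repo.rows = snapshot; // undo partial inserts
    throw e;              // surface the failure to the caller
  }
}
```

If one save in the batch fails, no rows from that batch remain, which is exactly the property the per-table `queue.add` saves in the current code cannot guarantee.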

@Artuomka Artuomka merged commit 646f295 into main Mar 25, 2026
18 of 20 checks passed
@Artuomka Artuomka deleted the backend_refactoring_ts branch March 25, 2026 08:58