UN-3197 [MISC] Remove Celery file processing workers and dead code #1777

Merged
Deepak-Kesavan merged 15 commits into main from feat/UN-3197-MISC_remove_celery_file_processing_workers
Feb 27, 2026

Conversation


@muhammad-ali-e muhammad-ali-e commented Feb 4, 2026

What

  • Remove legacy Celery file-processing and file-processing-callback workers entirely
  • Delete file_execution_tasks.py (all remaining methods were dead code after task removal)
  • Remove FileExecutionTasks import and usage from workflow_helper.py
  • Clean up related Celery service registrations and task definitions
  • Consolidate queue constants into utils/constants.py
  • Remove old worker services (worker, worker-logging, worker-file-processing, worker-file-processing-callback) from docker-compose
  • Promote all v2 unified workers from opt-in (workers-v2 profile) to default services
  • Remove --workers-v2 flag from run-platform.sh

Why

  • Workers V2 architecture replaces the old Celery-based file processing pipeline
  • After removing the two Celery tasks (process_file_batch, process_batch_callback), all 18 remaining methods in FileExecutionTasks had zero external callers — pure dead code
  • Keeping dead worker code increases maintenance burden and confusion
  • The old backend/workers/ package (file_processing, file_processing_callback) is fully superseded

How

  • Deleted backend/backend/workers/ package entirely (file_processing + file_processing_callback workers)
  • Deleted backend/workflow_manager/workflow_v2/file_execution_tasks.py (1200+ lines of dead code)
  • Removed Celery task registrations from celery_task.py and celery_service.py
  • Removed webhook notification helper that was only called from deleted code
  • Moved FileProcessingQueue constants to backend/utils/constants.py
  • Simplified workflow_helper.py to remove all references to deleted tasks
  • Updated run-platform.sh and CI workflow to remove old worker services
  • Removed old worker dependencies from pyproject.toml
  • Removed 4 legacy worker service definitions (worker, worker-logging, worker-file-processing, worker-file-processing-callback) from docker/docker-compose.yaml
  • Removed profiles: [workers-v2] from 8 v2 worker services, making them start by default
  • Cleaned celery-flower depends_on to reference only remaining services
  • Removed --workers-v2 flag and conditional profile logic from run-platform.sh
  • Removed old WORKER_*_AUTOSCALE env vars from docker/sample.env
  • Removed old worker override entries from docker/sample.compose.override.yaml
  • Removed "V2 Workers (Optional)" section from docker/README.md
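
The queue-routing consolidation described above can be sketched as follows. The four constant names come from this PR's FileProcessingQueue class in utils/constants.py; the string values and the helper function are illustrative assumptions, not the actual workflow_helper.py code:

```python
class FileProcessingQueue:
    """Queue name constants consolidated into utils/constants.py by this PR.

    The constant names are from the PR; the string values are assumed here
    for illustration.
    """

    FILE_PROCESSING = "file_processing"
    API_FILE_PROCESSING = "api_file_processing"
    FILE_PROCESSING_CALLBACK = "file_processing_callback"
    API_FILE_PROCESSING_CALLBACK = "api_file_processing_callback"


def select_queues(is_api: bool) -> tuple[str, str]:
    """Pick the batch queue and callback queue for a workflow run.

    API-triggered workflows route to the api_* queues; everything else
    uses the plain file-processing queues.
    """
    if is_api:
        return (
            FileProcessingQueue.API_FILE_PROCESSING,
            FileProcessingQueue.API_FILE_PROCESSING_CALLBACK,
        )
    return (
        FileProcessingQueue.FILE_PROCESSING,
        FileProcessingQueue.FILE_PROCESSING_CALLBACK,
    )
```

In the real helper these queue names would be passed to `celery_app.signature(..., queue=...)` when building the batch tasks.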

Can this PR break any existing features? If yes, please list possible items. If no, please explain why. (PS: Admins, do not merge the PR without this section filled.)

  • No. All deleted code is dead — the Celery tasks were the only entry points, and those are removed. Workers V2 handles file processing now.
  • Verified: zero imports of FileExecutionTasks or file_execution_tasks remain in the codebase
  • FileProcessingQueue constants are preserved in utils/constants.py for continued use by workflow_helper.py
  • V2 workers are promoted to default — they were already functional under the workers-v2 profile

Database Migrations

  • None required — no model changes

Env Config

  • Old worker-specific env vars (CELERY_FILE_PROCESSING_*, CELERY_FILE_PROCESSING_CALLBACK_*) are no longer needed but their presence is harmless
  • WORKER_*_AUTOSCALE variables for old workers are removed from sample.env

Relevant Docs

Related Issues or PRs

  • Part of the Celery-to-Workers-V2 migration initiative

Dependencies Versions

  • No dependency changes (removed unused ones from pyproject.toml)

Notes on Testing

  • Pre-commit hooks pass on all changed files
  • No remaining imports/references to deleted modules confirmed via grep
  • V2 workers start by default without needing --workers-v2 flag
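
The "no remaining imports" grep check mentioned above can be reproduced with a small stdlib script. The two needle strings are the names deleted by this PR; the scanner itself is an illustrative sketch:

```python
from pathlib import Path


def find_references(root: Path, needles: tuple[str, ...]) -> list[str]:
    """Return 'path:line' hits for any needle in Python sources under root."""
    hits: list[str] = []
    for path in sorted(root.rglob("*.py")):
        for lineno, line in enumerate(path.read_text().splitlines(), start=1):
            if any(needle in line for needle in needles):
                hits.append(f"{path}:{lineno}")
    return hits


# After this PR, scanning the backend tree for the deleted module should
# return no hits:
# find_references(Path("backend"), ("FileExecutionTasks", "file_execution_tasks"))
```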

Screenshots

N/A

Checklist

I have read and understood the Contribution Guidelines.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

coderabbitai bot commented Feb 4, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.


Walkthrough

This pull request consolidates the Celery worker infrastructure by removing legacy v1 worker implementations (worker, worker-logging, worker-file-processing, worker-file-processing-callback), eliminating the FileExecutionTasks orchestration layer, and refactoring task invocations to use unified celery_app.send_task patterns. It also updates the deployment matrix and environment configuration accordingly.

Changes

Cohort / File(s) — Summary

  • Production Build & Deployment — .github/workflows/production-build.yaml: Added worker-unified to the deployment matrix and incremented TOTAL_SERVICES from 6 to 7.
  • Worker Configuration & Task Definitions — backend/pyproject.toml: Removed the Celery Flower dependency and eight Poe task definitions (worker, worker-logging, worker-file-processing, worker-api-file-processing, worker-file-processing-callback, worker-api-file-processing-callback, and flower) associated with legacy v1 workers.
  • Legacy Celery Configuration Cleanup — backend/backend/celery_service.py, backend/backend/celery_task.py, backend/backend/workers/constants.py: Removed the TaskRegistry import and invocation; deleted the TaskRegistry class with its log_consumer task; removed the CeleryWorkerNames class and FILE_PROCESSING constants.
  • File Processing Worker (V1) Removal — backend/backend/workers/file_processing/__init__.py, backend/backend/workers/file_processing/celery_config.py, backend/backend/workers/file_processing/constants.py, backend/backend/workers/file_processing/file_processing.py: Eliminated the entire file processing worker module, including Celery app instantiation, queue configuration, and QueueNames constants (FILE_PROCESSING, API_FILE_PROCESSING).
  • File Processing Callback Worker (V1) Removal — backend/backend/workers/file_processing_callback/__init__.py, backend/backend/workers/file_processing_callback/celery_config.py, backend/backend/workers/file_processing_callback/constants.py, backend/backend/workers/file_processing_callback/file_processing_callback.py: Removed the entire callback worker module, including the Celery app, queue configuration, and QueueNames constants (FILE_PROCESSING_CALLBACK, API_FILE_PROCESSING_CALLBACK).
  • Task Orchestration Refactoring — backend/workflow_manager/workflow_v2/file_execution_tasks.py, backend/workflow_manager/workflow_v2/workflow_helper.py: Deleted the FileExecutionTasks orchestration class (1217 lines) that handled batch file processing, workflow state reconstruction, and the execution lifecycle. Updated WorkflowHelper to use FileProcessingQueue-driven celery_app.signature calls, added a _handle_execution_failure helper, and removed the execute_bin static method.
  • Notification Task Dispatch Refactoring — backend/notification_v2/internal_views.py, backend/notification_v2/provider/webhook/webhook.py, backend/utils/constants.py: Switched webhook notification dispatch from direct task method calls to celery_app.send_task with task name resolution; removed the send_webhook_notification task definition from webhook.py. Added a FileProcessingQueue class with four queue name constants (FILE_PROCESSING, API_FILE_PROCESSING, FILE_PROCESSING_CALLBACK, API_FILE_PROCESSING_CALLBACK).
  • Docker Compose Infrastructure — docker/docker-compose.yaml: Removed four legacy v1 worker services (worker, worker-logging, worker-file-processing, worker-file-processing-callback); removed the workers-v2 profile from eight v2 worker services; updated Celery Flower dependencies; changed the section header to a generic "WORKER SERVICES".
  • Environment & Documentation — docker/sample.env, docker/README.md, run-platform.sh: Updated autoscale configuration variables (removed WORKER_LOGGING_AUTOSCALE, WORKER_AUTOSCALE, WORKER_FILE_PROCESSING_CALLBACK_AUTOSCALE; updated WORKER_FILE_PROCESSING_AUTOSCALE to 8,2; added WORKER_API_DEPLOYMENT_AUTOSCALE, WORKER_CALLBACK_AUTOSCALE, WORKER_GENERAL_AUTOSCALE). Removed the v2 workers documentation and the -w/--workers-v2 option from the deployment script.
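
The dispatch-by-name refactoring summarized above (calling celery_app.send_task with a task name rather than importing the task function) is what lets webhook.py's task definition be deleted without touching its callers. A minimal sketch with a plain registry standing in for the Celery app — only the task name "send_webhook_notification" comes from this PR; the registry, decorator, and handler are illustrative:

```python
# Registry mapping task names to callables, standing in for Celery's
# task registry in this sketch.
TASKS: dict[str, object] = {}


def task(name: str):
    """Register a function under a task name (illustrative decorator)."""
    def register(fn):
        TASKS[name] = fn
        return fn
    return register


def send_task(name: str, args=(), kwargs=None):
    """Look the task up by name at call time, like celery_app.send_task.

    The caller only needs the string name, so the module defining the
    task can be moved or deleted without changing call sites.
    """
    return TASKS[name](*args, **(kwargs or {}))


@task("send_webhook_notification")
def _send_webhook(url: str, payload: dict) -> str:
    # Hypothetical handler: a real one would POST the payload.
    return f"POST {url} <- {payload}"


# The caller never imports _send_webhook directly:
result = send_task(
    "send_webhook_notification",
    args=("https://example.com/hook", {"ok": True}),
)
```

In production this decoupling also requires both sides to agree on the registered task name string, which is why the PR moves name resolution next to the send_task call.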

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~65 minutes

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage — ⚠️ Warning: Docstring coverage is 50.00%, which is below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.

✅ Passed checks (2 passed)

  • Title check — ✅ Passed: The title clearly summarizes the main change: removing legacy Celery file processing workers and associated dead code, which aligns with the comprehensive changeset.
  • Description check — ✅ Passed: The description fully completes the template with detailed What, Why, How sections, addresses the critical breaking-changes section, covers database migrations, environment config, related issues, testing notes, and includes the required contribution guidelines checklist.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


…' of github-muhammad:Zipstack/unstract into feat/UN-3197-MISC_remove_celery_file_processing_workers

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `@backend/workflow_manager/workflow_v2/workflow_helper.py`:
- Around line 244-275: The failure handler _handle_execution_failure currently
updates only the WorkflowExecution and logs errors, which can leave the pipeline
run stuck if chord enqueue exceptions bypass the outer run_workflow path; import
and call PipelineUtils.update_pipeline_status (or the equivalent pipeline status
updater) from inside _handle_execution_failure using the pipeline/pipeline_run
identifier available on workflow_execution (e.g.,
workflow_execution.pipeline_run.id or workflow_execution.workflow.pipeline.id)
to mark the pipeline run as failed/errored and include the error message, or
alternatively re-raise the exception after cleanup so the outer run_workflow
path updates pipeline status; ensure you reference
PipelineUtils.update_pipeline_status and _handle_execution_failure in the
change.
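
The re-raise alternative the comment mentions can be sketched with stdlib code. Everything here is hypothetical in shape (the Execution class and function names are stand-ins), mirroring only the idea that the helper records the failure and then re-raises so the outer run_workflow path can update pipeline status:

```python
class Execution:
    """Stand-in for a WorkflowExecution record (illustrative only)."""

    def __init__(self) -> None:
        self.status = "PENDING"
        self.error: str | None = None


def _handle_execution_failure(execution: Execution, error: Exception) -> None:
    # Mirrors the PR's helper in shape only: persist status and message.
    execution.status = "ERROR"
    execution.error = f"Error while processing files: {error!s}"


def enqueue_with_failure_handling(execution: Execution, enqueue):
    """Run the chord-enqueue step; on failure, record it and re-raise.

    Re-raising lets the caller's own error path (here, the hypothetical
    outer run_workflow) update pipeline status instead of this helper.
    """
    try:
        return enqueue()
    except Exception as error:
        _handle_execution_failure(execution, error)
        raise
```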

@chandrasekharan-zipstack

@muhammad-ali-e what about changes to docker-compose.yaml and removing the profile for v2-workers?

muhammad-ali-e and others added 3 commits February 16, 2026 10:19
Remove legacy worker, worker-logging, worker-file-processing, and
worker-file-processing-callback services from docker-compose. Promote
v2 unified workers from opt-in profile to default. Clean up related
flags, env vars, docs, and override configs.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…' of github-muhammad:Zipstack/unstract into feat/UN-3197-MISC_remove_celery_file_processing_workers

@coderabbitai coderabbitai bot left a comment


Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
docker/sample.env (1)

10-18: ⚠️ Potential issue | 🟡 Minor

Rename WORKER_FILE_PROCESSING_NEW_AUTOSCALE to WORKER_FILE_PROCESSING_AUTOSCALE.

Line 15 inconsistently includes a _NEW suffix that doesn't appear in any other autoscale variable. All other autoscale variables follow the pattern WORKER_{TYPE}_AUTOSCALE, and the file processing worker's other variables (e.g., WORKER_FILE_PROCESSING_CONCURRENCY on line 85) don't use this suffix either. Remove the _NEW suffix to align with the naming convention.

Move `get_plugin` import to top-level and rename
WORKER_FILE_PROCESSING_NEW_AUTOSCALE to WORKER_FILE_PROCESSING_AUTOSCALE.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (3)
backend/workflow_manager/workflow_v2/workflow_helper.py (2)

209-232: Hoist is_api out of the loop — it's computed redundantly and creates a fragile cross-scope reference.

is_api is assigned inside the for batch in batches: loop (lines 210–212) but then referenced after the loop at line 230 (inside the try block). source.endpoint.connection_type is constant across all iterations, so re-evaluating it on every pass is wasteful. More importantly, if batches is ever empty (e.g., if the early-return guard at line 165 is removed, or get_file_batches is changed), is_api will be undefined when line 230 executes, raising a NameError.

♻️ Proposed fix: hoist `is_api` before the loop
+        # Determine the appropriate queue based on connection type
+        is_api = (
+            source.endpoint.connection_type == WorkflowEndpoint.ConnectionType.API
+        )
+        file_processing_queue = (
+            FileProcessingQueue.API_FILE_PROCESSING
+            if is_api
+            else FileProcessingQueue.FILE_PROCESSING
+        )
+
         for batch in batches:
             # Convert all UUIDs to strings in batch_data
             file_data = FileData(...)
             batch_data = FileBatchData(files=batch, file_data=file_data)
 
-            # Determine the appropriate queue based on connection type
-            is_api = (
-                source.endpoint.connection_type == WorkflowEndpoint.ConnectionType.API
-            )
-            file_processing_queue = (
-                FileProcessingQueue.API_FILE_PROCESSING
-                if is_api
-                else FileProcessingQueue.FILE_PROCESSING
-            )
 
             # Send each batch to the dedicated file_processing queue
             batch_tasks.append(
                 celery_app.signature(
                     "process_file_batch",
                     args=[batch_data.to_dict()],
                     queue=file_processing_queue,
                 )
             )
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/workflow_manager/workflow_v2/workflow_helper.py` around lines 209 -
232, The variable is_api is computed inside the for batch in batches: loop
(using source.endpoint.connection_type) but is later referenced after the loop
when selecting FileProcessingQueue.* for file_processing_callback_queue, which
can raise NameError if batches is empty; move the is_api computation (is_api =
source.endpoint.connection_type == WorkflowEndpoint.ConnectionType.API) to
before the loop so it is evaluated once outside the loop and then use that
single value when building batch_tasks (celery_app.signature
"process_file_batch") and when selecting
FileProcessingQueue.API_FILE_PROCESSING_CALLBACK vs FILE_PROCESSING_CALLBACK;
ensure no other logic depends on per-batch evaluation.
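
The hazard this comment describes is easy to reproduce: a name bound only inside a for loop is undefined after the loop when the iterable is empty (inside a function this surfaces as UnboundLocalError, a NameError subclass). A minimal illustration with generic names, not the workflow_helper.py code:

```python
def choose_queue(batches, connection_is_api: bool) -> str:
    """Buggy shape: the flag is bound inside the loop but read after it."""
    for _batch in batches:
        is_api = connection_is_api  # only bound if the loop runs at least once
    return "api_queue" if is_api else "default_queue"  # fails for batches == []


def choose_queue_fixed(batches, connection_is_api: bool) -> str:
    """Hoisted: the flag is evaluated once, before the loop."""
    is_api = connection_is_api
    for _batch in batches:
        pass  # per-batch work goes here
    return "api_queue" if is_api else "default_queue"
```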

256-272: Address static analysis warnings in _handle_execution_failure.

Four Ruff findings in the new method:

  • Lines 256 & 272: str(error) inside f-strings should use the !s conversion flag (RUF010).
  • Line 268: Bare except Exception (BLE001) — catch a more specific exception type if possible, or at minimum acknowledge the broad catch.
  • Line 269: logger.error inside an except block should be logger.exception so the traceback is captured automatically (TRY400).
♻️ Proposed fix
     workflow_execution.update_execution(
         status=ExecutionStatus.ERROR,
-        error=f"Error while processing files: {str(error)}",
+        error=f"Error while processing files: {error!s}",
     )

     organization_id = workflow_execution.workflow.organization.organization_id
     subscription_usage_plugin = get_plugin("subscription_usage")
     if subscription_usage_plugin:
         try:
             service = subscription_usage_plugin["service_class"]()
             service.handle_workflow_execution_failure(
                 organization_id=organization_id,
                 execution_id=str(workflow_execution.id),
             )
-        except Exception as e:
-            logger.error(f"Error in subscription usage plugin failure handler: {e}")
+        except Exception as e:  # noqa: BLE001
+            logger.exception(f"Error in subscription usage plugin failure handler: {e}")

     logger.error(
-        f"Execution {workflow_execution.id} failed: {str(error)}", exc_info=True
+        f"Execution {workflow_execution.id} failed: {error!s}", exc_info=True
     )
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/workflow_manager/workflow_v2/workflow_helper.py` around lines 256 -
272, In _handle_execution_failure update the f-strings that currently use
str(error) to use the !s conversion (e.g., f"...{error!s}...") for both the
"Error while processing files" and "Execution {workflow_execution.id} failed"
logs; in the subscription_usage_plugin block replace the bare "except Exception
as e" with a more explicit catch if possible or at minimum keep "except
Exception as e" but change the inner logger call to logger.exception(...) so the
traceback is recorded when service.handle_workflow_execution_failure raises
(reference workflow_execution, subscription_usage_plugin,
service.handle_workflow_execution_failure, and logger.error/logger.exception).
docker/sample.env (1)

10-11: Clarify the comment — WORKER_*_AUTOSCALE and *_AUTOSCALE are distinct variable names.

The phrase "matches hierarchical configuration below" is slightly misleading: the top block (lines 12–18) uses WORKER_*_AUTOSCALE prefixed variables, while the hierarchical section (lines 77–100) uses unprefixed *_AUTOSCALE names (e.g., WORKER_GENERAL_AUTOSCALE=6,2 vs GENERAL_AUTOSCALE=6,2). They share the same values, but they're separate env vars consumed by different layers (Docker Compose vs Celery config hierarchy). A comment like # Docker Compose–level autoscale limits would be less ambiguous.

✏️ Suggested comment clarification
-#
-# Worker autoscaling (matches hierarchical configuration below)
+#
+# Docker Compose–level autoscale limits (separate from per-worker Celery AUTOSCALE vars below)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docker/sample.env` around lines 10 - 11, The comment "matches hierarchical
configuration below" is ambiguous because Docker Compose uses prefixed
environment variables (WORKER_*_AUTOSCALE) while the Celery/hierarchical config
uses unprefixed variables (*_AUTOSCALE) even though they hold the same values;
update the comment to explicitly differentiate these two sets (e.g., "Docker
Compose–level autoscale limits" for WORKER_*_AUTOSCALE and note that separate
unprefixed *_AUTOSCALE vars are used by the Celery/hierarchy), so readers know
they are distinct env var names consumed by different layers (refer to
WORKER_*_AUTOSCALE and *_AUTOSCALE).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Duplicate comments:
In `@backend/workflow_manager/workflow_v2/workflow_helper.py`:
- Around line 245-246: The exception handler currently only calls
cls._handle_execution_failure(workflow_execution, e) but does not update/persist
the pipeline/workflow_execution status for chord enqueue failures; modify the
except block so after (or inside) cls._handle_execution_failure it also sets the
workflow_execution status to the failed state (e.g., call
workflow_execution.update_status('FAILED') or workflow_execution.mark_failed())
and persists/saves the change so the pipeline status reflects the failure;
ensure you reference the existing workflow_execution object and keep the call to
cls._handle_execution_failure to preserve existing failure handling logic.


@github-actions

Test Results

Summary
  • Runner Tests: 11 passed, 0 failed (11 total)
  • SDK1 Tests: 63 passed, 0 failed (63 total)

Runner Tests - Full Report
| filepath | function | passed | SUBTOTAL |
| --- | --- | --- | --- |
| runner/src/unstract/runner/clients/test_docker.py | test_logs | 1 | 1 |
| runner/src/unstract/runner/clients/test_docker.py | test_cleanup | 1 | 1 |
| runner/src/unstract/runner/clients/test_docker.py | test_cleanup_skip | 1 | 1 |
| runner/src/unstract/runner/clients/test_docker.py | test_client_init | 1 | 1 |
| runner/src/unstract/runner/clients/test_docker.py | test_get_image_exists | 1 | 1 |
| runner/src/unstract/runner/clients/test_docker.py | test_get_image | 1 | 1 |
| runner/src/unstract/runner/clients/test_docker.py | test_get_container_run_config | 1 | 1 |
| runner/src/unstract/runner/clients/test_docker.py | test_get_container_run_config_without_mount | 1 | 1 |
| runner/src/unstract/runner/clients/test_docker.py | test_run_container | 1 | 1 |
| runner/src/unstract/runner/clients/test_docker.py | test_get_image_for_sidecar | 1 | 1 |
| runner/src/unstract/runner/clients/test_docker.py | test_sidecar_container | 1 | 1 |
| TOTAL | | 11 | 11 |
SDK1 Tests - Full Report
| filepath | function | passed | SUBTOTAL |
| --- | --- | --- | --- |
| tests/test_platform.py | TestPlatformHelperRetry.test_success_on_first_attempt | 2 | 2 |
| tests/test_platform.py | TestPlatformHelperRetry.test_retry_on_connection_error | 2 | 2 |
| tests/test_platform.py | TestPlatformHelperRetry.test_non_retryable_http_error | 1 | 1 |
| tests/test_platform.py | TestPlatformHelperRetry.test_retryable_http_errors | 3 | 3 |
| tests/test_platform.py | TestPlatformHelperRetry.test_post_method_retry | 1 | 1 |
| tests/test_platform.py | TestPlatformHelperRetry.test_retry_logging | 1 | 1 |
| tests/test_prompt.py | TestPromptToolRetry.test_success_on_first_attempt | 1 | 1 |
| tests/test_prompt.py | TestPromptToolRetry.test_retry_on_errors | 2 | 2 |
| tests/test_prompt.py | TestPromptToolRetry.test_wrapper_methods_retry | 4 | 4 |
| tests/utils/test_retry_utils.py | TestIsRetryableError.test_connection_error_is_retryable | 1 | 1 |
| tests/utils/test_retry_utils.py | TestIsRetryableError.test_timeout_is_retryable | 1 | 1 |
| tests/utils/test_retry_utils.py | TestIsRetryableError.test_http_error_retryable_status_codes | 3 | 3 |
| tests/utils/test_retry_utils.py | TestIsRetryableError.test_http_error_non_retryable_status_codes | 5 | 5 |
| tests/utils/test_retry_utils.py | TestIsRetryableError.test_http_error_without_response | 1 | 1 |
| tests/utils/test_retry_utils.py | TestIsRetryableError.test_os_error_retryable_errno | 5 | 5 |
| tests/utils/test_retry_utils.py | TestIsRetryableError.test_os_error_non_retryable_errno | 1 | 1 |
| tests/utils/test_retry_utils.py | TestIsRetryableError.test_other_exception_not_retryable | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCalculateDelay.test_exponential_backoff_without_jitter | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCalculateDelay.test_exponential_backoff_with_jitter | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCalculateDelay.test_max_delay_cap | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCalculateDelay.test_max_delay_cap_with_jitter | 1 | 1 |
| tests/utils/test_retry_utils.py | TestRetryWithExponentialBackoff.test_successful_call_first_attempt | 1 | 1 |
| tests/utils/test_retry_utils.py | TestRetryWithExponentialBackoff.test_retry_after_transient_failure | 1 | 1 |
| tests/utils/test_retry_utils.py | TestRetryWithExponentialBackoff.test_max_retries_exceeded | 1 | 1 |
| tests/utils/test_retry_utils.py | TestRetryWithExponentialBackoff.test_retry_with_custom_predicate | 1 | 1 |
| tests/utils/test_retry_utils.py | TestRetryWithExponentialBackoff.test_no_retry_with_predicate_false | 1 | 1 |
| tests/utils/test_retry_utils.py | TestRetryWithExponentialBackoff.test_exception_not_in_tuple_not_retried | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCreateRetryDecorator.test_default_configuration | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCreateRetryDecorator.test_environment_variable_configuration | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCreateRetryDecorator.test_invalid_max_retries | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCreateRetryDecorator.test_invalid_base_delay | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCreateRetryDecorator.test_invalid_multiplier | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCreateRetryDecorator.test_jitter_values | 2 | 2 |
| tests/utils/test_retry_utils.py | TestCreateRetryDecorator.test_custom_exceptions_only | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCreateRetryDecorator.test_custom_predicate_only | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCreateRetryDecorator.test_both_exceptions_and_predicate | 1 | 1 |
| tests/utils/test_retry_utils.py | TestCreateRetryDecorator.test_exceptions_match_but_predicate_false | 1 | 1 |
| tests/utils/test_retry_utils.py | TestPreconfiguredDecorators.test_retry_platform_service_call_exists | 1 | 1 |
| tests/utils/test_retry_utils.py | TestPreconfiguredDecorators.test_retry_prompt_service_call_exists | 1 | 1 |
$$\textcolor{#23d18b}{\tt{tests/utils/test\_retry\_utils.py}}$$ $$\textcolor{#23d18b}{\tt{TestPreconfiguredDecorators.test\_platform\_service\_decorator\_retries\_on\_connection\_error}}$$ $$\textcolor{#23d18b}{\tt{1}}$$ $$\textcolor{#23d18b}{\tt{1}}$$
$$\textcolor{#23d18b}{\tt{tests/utils/test\_retry\_utils.py}}$$ $$\textcolor{#23d18b}{\tt{TestPreconfiguredDecorators.test\_prompt\_service\_decorator\_retries\_on\_timeout}}$$ $$\textcolor{#23d18b}{\tt{1}}$$ $$\textcolor{#23d18b}{\tt{1}}$$
$$\textcolor{#23d18b}{\tt{tests/utils/test\_retry\_utils.py}}$$ $$\textcolor{#23d18b}{\tt{TestRetryLogging.test\_warning\_logged\_on\_retry}}$$ $$\textcolor{#23d18b}{\tt{1}}$$ $$\textcolor{#23d18b}{\tt{1}}$$
$$\textcolor{#23d18b}{\tt{tests/utils/test\_retry\_utils.py}}$$ $$\textcolor{#23d18b}{\tt{TestRetryLogging.test\_info\_logged\_on\_success\_after\_retry}}$$ $$\textcolor{#23d18b}{\tt{1}}$$ $$\textcolor{#23d18b}{\tt{1}}$$
$$\textcolor{#23d18b}{\tt{tests/utils/test\_retry\_utils.py}}$$ $$\textcolor{#23d18b}{\tt{TestRetryLogging.test\_exception\_logged\_on\_giving\_up}}$$ $$\textcolor{#23d18b}{\tt{1}}$$ $$\textcolor{#23d18b}{\tt{1}}$$
$$\textcolor{#23d18b}{\tt{TOTAL}}$$ $$\textcolor{#23d18b}{\tt{63}}$$ $$\textcolor{#23d18b}{\tt{63}}$$
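
The retry helpers exercised by the tests above can be sketched roughly as follows. This is a minimal illustration inferred from the test identifiers (`calculate_delay`, `retry_with_exponential_backoff`, delay cap, jitter, custom retry predicate); the names, signatures, and defaults are assumptions, not the repository's actual implementation:

```python
import random
import time


def calculate_delay(attempt: int, base_delay: float = 1.0, multiplier: float = 2.0,
                    max_delay: float = 30.0, jitter: bool = True) -> float:
    """Exponential backoff delay, capped at max_delay, with optional full jitter."""
    delay = min(base_delay * (multiplier ** attempt), max_delay)
    if jitter:
        # Full jitter: pick a random delay in [0, delay] to spread out retries
        delay = random.uniform(0, delay)
    return delay


def retry_with_exponential_backoff(max_retries: int = 3, base_delay: float = 1.0,
                                   multiplier: float = 2.0, max_delay: float = 30.0,
                                   exceptions: tuple = (Exception,),
                                   retry_predicate=None):
    """Decorator: retry the wrapped call on matching transient failures."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            for attempt in range(max_retries + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions as exc:
                    # An optional predicate can veto the retry (e.g. only retry 5xx)
                    should_retry = retry_predicate(exc) if retry_predicate else True
                    if attempt == max_retries or not should_retry:
                        raise
                    time.sleep(calculate_delay(attempt, base_delay, multiplier, max_delay))
        return wrapper
    return decorator
```

In this sketch, `test_max_delay_cap_with_jitter` would correspond to the `min(..., max_delay)` cap, and `test_no_retry_with_predicate_false` to the predicate veto that re-raises immediately.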

Deepak-Kesavan merged commit 9fcb723 into main on Feb 27, 2026
8 checks passed
Deepak-Kesavan deleted the feat/UN-3197-MISC_remove_celery_file_processing_workers branch on February 27, 2026 10:05
pk-zipstack added a commit that referenced this pull request Mar 2, 2026
…tadata

Resolve conflict: accept deletion of file_execution_tasks.py (dead code
removed in #1777 after workers v2 migration). The API metadata enrichment
change from that file is no longer needed as workers v2 handles destination
processing differently.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
5 participants