
Fix health check to use HTTPS#320

Merged
samuelvkwong merged 2 commits into main from fix-healthcheck-https
Apr 2, 2026

Conversation

@samuelvkwong (Collaborator) commented Apr 2, 2026

Summary

  • The Docker Swarm health check was using curl -f http://localhost/health/, but SECURE_SSL_REDIRECT returns a 301 redirect to HTTPS
  • This caused health checks to fail, leading Swarm to restart replicas until max_attempts was exhausted
  • Changed to curl -fk https://localhost/health/ to hit the HTTPS endpoint directly (-k skips cert verification for localhost)

Test plan

  • Exec into a running web container and verify curl -fk https://localhost/health/ returns {"status": "ok"}
  • Deploy and confirm docker ps shows containers as (healthy)
  • Verify no restart loops in docker service ps

🤖 Generated with Claude Code

Summary by CodeRabbit

  • New Features

    • Added Mass Transfer module for transferring large imaging volumes over specified date ranges with reusable filters
    • Added DICOM server max search results configuration
    • Added mass transfer job CSV export functionality
  • Infrastructure

    • Added staging environment Docker Compose configuration
    • Added mass transfer worker service for processing background tasks

@coderabbitai Bot (Contributor) commented Apr 2, 2026

Warning

Rate limit exceeded

@samuelvkwong has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 6 minutes and 53 seconds before requesting another review.

Your organization is not enrolled in usage-based pricing. Contact your admin to enable usage-based pricing to continue reviews beyond the rate limit, or try again in 6 minutes and 53 seconds.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: e7a17c3e-134c-4bb1-9a4f-9a5c6a42b2fd

📥 Commits

Reviewing files that changed from the base of the PR and between 5dab95d and 6983c46.

📒 Files selected for processing (1)
  • docker-compose.prod.yml
📝 Walkthrough

Walkthrough

A comprehensive new "Mass Transfer" feature is added to ADIT, enabling bulk DICOM transfer over time ranges with configurable filters. This includes a complete Django app with job/task/volume models, background processing, forms, views, utilities, Docker worker configuration, and extensive test coverage. Core utilities are enhanced to support persistent connections, improved pseudonymization, and refined transfer syntax negotiation.

Changes

Cohort / File(s) Summary
Docker & CI Build Configuration
.github/workflows/ci.yml, docker-compose.base.yml, docker-compose.dev.yml, docker-compose.prod.yml, docker-compose.staging.yml, cli.py
Added mass_transfer_worker Docker service and image tag; introduced new staging environment for development-mode testing with configurable worker replicas; added CLI commands for staging stack deploy/remove.
Project Dependencies & Settings
pyproject.toml, adit/settings/base.py, .vscode/settings.json, AGENTS.md
Added django-codemirror and pydantic dependencies; registered mass_transfer app and CodeMirror configuration; added session cookie customization and mass transfer job priority settings; documented assertion standards.
Core DICOM Models & Migrations
adit/core/models.py, adit/core/migrations/0016_add_max_search_results_to_dicomserver.py, adit/core/migrations/0017_review_fixes.py
Added max_search_results field to DicomServer to cap C-FIND results; updated DicomTask.queued_job_id type annotation from int to int | None.
Core Task & Background Processing
adit/core/tasks.py, adit/core/management/commands/cleanup_jobs_and_tasks.py
Extracted _run_dicom_task helper; added configurable process timeout and CancelledError handling; made retry strategy module-level constant; extended cleanup command to handle MassTransferTask/Job alongside existing DICOM tasks.
Core DICOM Connection & Operator Utilities
adit/core/utils/dimse_connector.py, adit/core/utils/dicom_operator.py, adit/core/utils/presentation_contexts.py
Implemented auto_close/persistent connection management with per-service association tracking; added close() method to DicomOperator; split presentation contexts into image/non-image groups with separate transfer syntax negotiation; improved error logging for C-GET store failures.
Core Pseudonymization & Manipulation
adit/core/utils/pseudonymizer.py, adit/core/utils/dicom_manipulator.py
Added compute_pseudonym function for deterministic pseudonym generation; extended Pseudonymizer with optional seed/anonymizer dependency injection; updated DicomManipulator to accept injected pseudonymizer.
Core Views & Utilities
adit/core/views.py, adit/dicom_web/utils/wadors_utils.py, adit/core/templates/core/home.html
Enhanced DicomJobCancelView to clear queued_job_id on pending tasks and handle in-progress task cancellation separately; improved wado_retrieve sentinel-based completion handling; added Mass Transfer home page entry.
Core Tests
adit/core/tests/utils/test_dimse_connector.py, adit/core/tests/utils/test_pseudonymizer.py
Added DimseConnector connection lifecycle test across service changes; added compute_pseudonym tests validating deterministic output, length/format constraints.
Mass Transfer Models & Migrations
adit/mass_transfer/models.py, adit/mass_transfer/migrations/*
Created MassTransferJob, MassTransferTask, MassTransferVolume, and MassTransferSettings models with comprehensive fields for date ranges, partitioning, pseudonymization, and filter specifications; five migrations handle initial schema, field restructuring, SQL constraints, and validation.
Mass Transfer Views & Forms
adit/mass_transfer/views.py, adit/mass_transfer/forms.py
Implemented job list/detail/create views with preference persistence; added CSV export endpoint; created comprehensive form with Pydantic-validated filters, cross-field validation, and task partition generation on save.
Mass Transfer Processing & Tasks
adit/mass_transfer/processors.py, adit/mass_transfer/tasks.py
Implemented full MassTransferTaskProcessor orchestrating DICOM discovery (with time-window splitting and cross-midnight handling), volume creation, pseudonymization, transfer to folder/server, optional NIfTI conversion, and comprehensive error/retry logic; added background tasks for processing and queuing.
Mass Transfer Admin & Filters
adit/mass_transfer/admin.py, adit/mass_transfer/filters.py, adit/mass_transfer/tables.py
Registered models with Django admin; created filter classes for job/task/volume; defined table renderers with status formatting and optional field normalization.
Mass Transfer Templates & Static Assets
adit/mass_transfer/templates/mass_transfer/*, adit/mass_transfer/static/mass_transfer/*
Added job/task detail and creation templates; integrated CodeMirror for JSON filter editor; created CSS for form layout and JavaScript for preference persistence and destination type handling.
Mass Transfer Utilities & Factories
adit/mass_transfer/utils/partitions.py, adit/mass_transfer/utils/testing_helpers.py, adit/mass_transfer/factories.py, adit/mass_transfer/mixins.py
Implemented partition window builder for daily/weekly splitting; added factory classes for test data generation; created MassTransferLockedMixin and testing helper for group creation.
Mass Transfer Tests
adit/mass_transfer/tests/*
Comprehensive test suite including acceptance tests across transfer protocols (C-MOVE/C-GET/DICOMweb) with server/folder destinations and NIfTI conversion; unit tests for forms, partitioning, processor logic, queue management, and volume creation.
Mass Transfer Configuration & URLs
adit/mass_transfer/urls.py, adit/mass_transfer/apps.py, adit/mass_transfer/__init__.py
Registered 16 URL endpoints for job/task CRUD and actions; implemented AppConfig with menu registration, processor wiring, job-stats collector, and database initialization.
Project Configuration & Scripts
adit/urls.py, globals.d.ts, example.env, scripts/csv_to_mass_transfer_filters.py
Added mass-transfer route to main URL config; added TypeScript global for updatePreferences; extended environment variables for staging ports/replicas; added CSV-to-JSON filter conversion utility script.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant Web as Web View
    participant Queue as Procrastinate Queue
    participant Processor as MassTransferTaskProcessor
    participant PACS as Source DICOM Server
    participant Destination as Destination (Server/Folder)

    User->>Web: Submit mass transfer job form
    Web->>Web: Validate filters & create job + partitions
    Web->>Web: Create MassTransferTask per partition (PENDING)
    Web->>Queue: Enqueue queue_mass_transfer_tasks job

    Queue->>Processor: load job, queue pending tasks
    Processor->>Queue: For each task: enqueue process_mass_transfer_task

    Queue->>Processor: process_mass_transfer_task(job_id, task_id)
    Processor->>Processor: load filters, split time window
    Processor->>PACS: discover studies (C-FIND with time splits)
    PACS-->>Processor: study list
    Processor->>Processor: filter by age, modality, criteria
    Processor->>Processor: create MassTransferVolume records (PENDING)

    loop For each discovered series
        Processor->>PACS: fetch series (C-GET/C-MOVE)
        PACS-->>Processor: DICOM instances
        Processor->>Processor: apply pseudonymization & trial protocol
        Processor->>Destination: send/write instances
        Destination-->>Processor: ack/success
        Processor->>Processor: update volume (CONVERTED/SUCCESS)
    end

    Processor->>Processor: update task status & summary
    Processor->>Web: mark job/task complete
    Web->>User: Display results (status, volume count, errors)

Estimated code review effort

🎯 5 (Critical) | ⏱️ ~120 minutes

The diff introduces a substantial new feature spanning multiple architectural layers: complex form validation with Pydantic, a sophisticated processor orchestrating DICOM discovery/filtering/transfer with retry/error handling logic, fundamental changes to connection lifecycle management (dimse_connector.py), multiple database migrations, comprehensive templating/static assets, and extensive test coverage across unit/acceptance/integration levels. The sheer heterogeneity—spanning models, views, forms, processors, utilities, templates, migrations, Docker config, and tests—combined with logic density in the processor (1080+ lines) and core utility refactors, demands careful reasoning across each domain.

Poem

🐰 A worker awakens to transfer masses vast,
With filters and partitions spanning time's expanse,
Through PACS it hops, pseudonyms in hand,
Converting and shuffling with organizational flair,
The mass transfer dance begins—hop by hop! 🎉


@claude Bot commented Apr 2, 2026

Code Review

Thanks for this contribution! The PR title/description says "Fix health check to use HTTPS", but this is actually a much larger change — it adds a full Mass Transfer app and several core infrastructure improvements. It would help to split these into separate PRs, or at minimum update the description to reflect what's actually here. That said, here's a detailed review:


Health Check Fix

The actual fix (-fk https://localhost/health/) is correct: SECURE_SSL_REDIRECT returns 301, so using HTTP fails. The -k to skip cert verification is fine for a localhost health check.

One minor note: the dev compose adds http://localhost:8000/health/ (port 8000, no SSL), which is consistent with the dev environment not using the SSL redirect. Good.


Core Infrastructure Changes

adit/core/tasks.py — Task refactoring

Good: Extracting DICOM_TASK_RETRY_STRATEGY as a module-level constant is clean and enables reuse by mass_transfer.

Concern — Retry with IN_PROGRESS status: The assertion was relaxed from PENDING only to also accept IN_PROGRESS:

assert dicom_task.status in (DicomTask.Status.PENDING, DicomTask.Status.IN_PROGRESS)

The comment explains the reasoning (killed worker), but this creates an idempotency gap: if a mass transfer task was halfway through processing 1,000 volumes and the worker is killed, a retry will re-discover and re-process all volumes from scratch. There's no resume checkpoint. For large transfers this could mean hours of duplicated work or, worse, duplicate DICOM data at the destination.
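
A resume checkpoint could be as simple as filtering out already-successful volumes before reprocessing. The sketch below is illustrative only; Volume and Status are stand-ins for the real Django models, not ADIT's actual API:

```python
# Hypothetical sketch of retry idempotency: on a second attempt, skip
# volumes that already completed instead of re-processing the whole
# partition from scratch.
from dataclasses import dataclass
from enum import Enum


class Status(Enum):
    PENDING = "pending"
    SUCCESS = "success"
    ERROR = "error"


@dataclass
class Volume:
    series_uid: str
    status: Status = Status.PENDING


def volumes_to_process(volumes: list[Volume]) -> list[Volume]:
    """On a retry, only PENDING/ERROR volumes need (re-)processing."""
    return [v for v in volumes if v.status is not Status.SUCCESS]


volumes = [
    Volume("1.2.3", Status.SUCCESS),  # finished before the worker died
    Volume("1.2.4", Status.PENDING),  # never started
    Volume("1.2.5", Status.ERROR),    # failed, worth retrying
]
remaining = volumes_to_process(volumes)
assert [v.series_uid for v in remaining] == ["1.2.4", "1.2.5"]
```

Since MassTransferVolume rows already persist per-series status, a check like this on task start would bound the duplicated work to at most one in-flight volume.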

Concern — CancelledError handling: The new CancelledError except block sets status to CANCELED but does not call job.post_process(), unlike the other exception paths. This could leave the parent job stuck in IN_PROGRESS forever.
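
A minimal sketch of the suggested fix, using stub Task/Job classes rather than the real models: running post_process in a finally block ensures every terminal path, including cancellation, updates the parent job.

```python
# Sketch only: Task, Job, and the status strings are illustrative
# stand-ins for the real Django models and DicomTask.Status values.
from asyncio import CancelledError


class Job:
    def __init__(self):
        self.tasks: list["Task"] = []
        self.status = "IN_PROGRESS"

    def post_process(self):
        # In the real code this re-evaluates the job status from its
        # tasks; here we just leave IN_PROGRESS once no task is running.
        if all(t.status != "IN_PROGRESS" for t in self.tasks):
            self.status = "FINISHED"


class Task:
    def __init__(self, job: Job):
        self.job = job
        self.status = "IN_PROGRESS"
        job.tasks.append(self)


def run_task(task: Task, work) -> None:
    try:
        work()
        task.status = "SUCCESS"
    except CancelledError:
        task.status = "CANCELED"
        raise
    finally:
        # Runs on success, failure, and cancellation alike, so the
        # parent job cannot be left stuck in IN_PROGRESS.
        task.job.post_process()


job = Job()
task = Task(job)


def cancelled_work():
    raise CancelledError()


try:
    run_task(task, cancelled_work)
except CancelledError:
    pass

assert task.status == "CANCELED"
assert job.status == "FINISHED"  # post_process ran despite cancellation
```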


adit/core/utils/dimse_connector.py — Connection lifecycle

Good: The auto_close parameter and _current_service tracking are clean additions. The GeneratorExit handler is a thoughtful safety net.

Concern — Service-switch closes connection aggressively: When a persistent operator switches from C-FIND to C-GET, connect_to_server detects the service change and closes the connection. This is correct for correctness, but for mass transfer processing thousands of studies:

series 1: C-FIND open → close → C-GET open → close
series 2: C-FIND open → close → C-GET open → close
...

Each series causes multiple associations. Consider holding separate connectors per service, or accept the reconnect cost if it is tolerable for this use case.
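
The per-service idea can be sketched with a small connector pool; Connector here is a trivial stand-in for DimseConnector, and the class-level counter only exists to make the saved reconnects visible.

```python
# Illustrative sketch: one persistent association per DIMSE service, so
# alternating C-FIND/C-GET calls don't tear each other down.
class Connector:
    opened = 0  # counts associations opened, for demonstration only

    def __init__(self, service: str):
        self.service = service
        Connector.opened += 1

    def close(self):
        pass


class ConnectorPool:
    def __init__(self):
        self._by_service: dict[str, Connector] = {}

    def get(self, service: str) -> Connector:
        # Reuse the association for this service if one is already open.
        if service not in self._by_service:
            self._by_service[service] = Connector(service)
        return self._by_service[service]

    def close_all(self):
        for conn in self._by_service.values():
            conn.close()
        self._by_service.clear()


pool = ConnectorPool()
for _ in range(3):          # three series
    pool.get("C-FIND")      # query phase
    pool.get("C-GET")       # fetch phase
pool.close_all()
assert Connector.opened == 2  # one association per service, not six
```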

Concern — GeneratorExit not tested: There's no test for the GeneratorExit path. The comment says "callers should fully consume the generator" but this is easy to violate accidentally (e.g., using next(), breaking early from a for loop). A test + a note in DimseConnector docstring would help.

Fix appreciated: The stale-association cleanup in open_connection:

if not self.assoc.is_alive():
    self.assoc = None
    self._current_service = None

This is a real bug fix that will help in production.


adit/core/models.py

Good: Making queued_job_id: int | None explicit (previously typed as int but backed by nullable FK) is a correctness fix.

Good: max_search_results field on DicomServer with a MinValueValidator(1) is well-modeled.


adit/core/management/commands/cleanup_jobs_and_tasks.py

Bug fix appreciated: The previous code hard-coded SelectiveTransferTask.Status.FAILURE and SelectiveTransferJob.objects.get() even when cleaning up BatchTransferTask. This is a real bug. The refactored job_model lookup via _meta.get_field is correct.


Mass Transfer App

Models

Concern — Pseudonym length as mode discriminator:

_DETERMINISTIC_PSEUDONYM_LENGTH = 14
_RANDOM_PSEUDONYM_LENGTH = 15  # So the two modes can be distinguished by length

Using string length to encode the pseudonymization mode is a side-channel. Anyone with read access to the DB can infer linkage strategy for each record. Consider storing the mode explicitly (e.g., a dedicated field or metadata) rather than encoding it in length.
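
One way to store the mode explicitly, sketched below. The walkthrough mentions SHA-256-based deterministic pseudonyms, but the functions and field names here are hypothetical, not the PR's actual compute_pseudonym API:

```python
# Sketch: keep both modes at the same length and record the mode in a
# dedicated field instead of leaking it through string length.
import hashlib
import secrets
from dataclasses import dataclass

PSEUDONYM_LENGTH = 14  # identical for both modes


def deterministic_pseudonym(patient_id: str, salt: str) -> str:
    digest = hashlib.sha256(f"{salt}:{patient_id}".encode()).hexdigest()
    return digest[:PSEUDONYM_LENGTH].upper()


def random_pseudonym() -> str:
    return secrets.token_hex(PSEUDONYM_LENGTH // 2).upper()


@dataclass
class PseudonymRecord:
    pseudonym: str
    mode: str  # "deterministic" | "random" — explicit, not length-encoded


rec = PseudonymRecord(deterministic_pseudonym("PAT001", "s3cret"), "deterministic")
assert len(rec.pseudonym) == PSEUDONYM_LENGTH
assert len(random_pseudonym()) == PSEUDONYM_LENGTH
assert rec.pseudonym == deterministic_pseudonym("PAT001", "s3cret")  # stable
```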

Concern — Partition unique constraint race condition: MassTransferVolume has a UniqueConstraint on (job, series_instance_uid). In processors.py, partitions are processed by separate workers and each bulk-deletes then bulk-creates volumes for its partition key:

MassTransferVolume.objects.filter(job=job, partition_key=...).delete()
...
MassTransferVolume.objects.bulk_create(...)

If two workers process overlapping partitions concurrently (or if a series appears in two partition windows due to edge cases), this would cause a unique constraint violation. A SELECT FOR UPDATE on the job or partition-level locking would help.
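
The locking idea, sketched as an sqlite3 analogy since the real code uses the Django ORM: BEGIN IMMEDIATE plays the role of select_for_update() on the job row inside transaction.atomic(), serializing the delete-then-bulk-create so overlapping reruns cannot race into the unique constraint.

```python
# sqlite3 analogy of partition-level serialization; table and column
# names mirror the described models but are illustrative.
import sqlite3

con = sqlite3.connect(":memory:", isolation_level=None)  # manual txns
con.execute(
    "CREATE TABLE volume (job_id INT, series_uid TEXT, "
    "UNIQUE (job_id, series_uid))"
)


def replace_partition(con, job_id, series_uids):
    # BEGIN IMMEDIATE takes the write lock up front: the sqlite
    # analogue of SELECT ... FOR UPDATE on the job row.
    con.execute("BEGIN IMMEDIATE")
    try:
        con.execute("DELETE FROM volume WHERE job_id = ?", (job_id,))
        con.executemany(
            "INSERT INTO volume (job_id, series_uid) VALUES (?, ?)",
            [(job_id, uid) for uid in series_uids],
        )
        con.execute("COMMIT")
    except Exception:
        con.execute("ROLLBACK")
        raise


replace_partition(con, 1, ["1.2.3", "1.2.4"])
replace_partition(con, 1, ["1.2.3", "1.2.5"])  # overlapping rerun, no IntegrityError
rows = {uid for (uid,) in con.execute("SELECT series_uid FROM volume")}
assert rows == {"1.2.3", "1.2.5"}
```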

Processors

Concern — Hardcoded delays:

_DELAY_BETWEEN_SERIES = 0.5
_DELAY_RETRY_FETCH = 3

These empirical values work for one PACS but may not suit others. These should be configurable settings (similar to how retry attempts are in Django settings), or at least constants documented with reasoning.
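
Making the delays configurable could follow the usual Django settings pattern with documented defaults; the setting names below are hypothetical, and a stub object stands in for django.conf.settings.

```python
# Sketch: read pacing delays from settings with the current empirical
# values as fallbacks, mirroring how retry attempts are configured.
class FakeSettings:
    """Stand-in for django.conf.settings."""
    MASS_TRANSFER_DELAY_BETWEEN_SERIES = 0.1  # per-site override


settings = FakeSettings()

DELAY_BETWEEN_SERIES = getattr(
    settings, "MASS_TRANSFER_DELAY_BETWEEN_SERIES", 0.5
)
DELAY_RETRY_FETCH = getattr(settings, "MASS_TRANSFER_DELAY_RETRY_FETCH", 3)

assert DELAY_BETWEEN_SERIES == 0.1  # overridden for a faster PACS
assert DELAY_RETRY_FETCH == 3       # falls back to the documented default
```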

Concern — Broad exception catching:

except Exception as err:
    volume.status = MassTransferVolume.Status.ERROR
    volume.log = str(err)

This does not catch KeyboardInterrupt or SystemExit (they're BaseException subclasses, so that part is fine), but it does wrap legitimate programming errors in silently-logged volume failures. Consider narrowing to DicomError or a defined hierarchy.
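
A sketch of what a narrowed hierarchy might look like; DicomError and RetriableDicomError are hypothetical names, not ADIT's existing exception classes.

```python
# Sketch: catch only expected DICOM-level failures per volume and let
# programming errors propagate to fail the task loudly.
class DicomError(Exception):
    """Base for expected DICOM failures (network errors, PACS refusals)."""


class RetriableDicomError(DicomError):
    pass


def process_volume(volume: dict, fetch) -> None:
    try:
        fetch()
        volume["status"] = "SUCCESS"
    except DicomError as err:
        # Expected operational failure: record it and move on.
        volume["status"] = "ERROR"
        volume["log"] = str(err)
    # TypeError, KeyError, ... propagate instead of being logged as
    # volume failures.


def fail_dicom():
    raise RetriableDicomError("association timeout")


def fail_bug():
    raise TypeError("programming error")


vol = {"status": "PENDING"}
process_volume(vol, fail_dicom)
assert vol["status"] == "ERROR" and vol["log"] == "association timeout"

vol2 = {"status": "PENDING"}
try:
    process_volume(vol2, fail_bug)
except TypeError:
    pass
assert vol2["status"] == "PENDING"  # the bug was not swallowed
```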

Concern — _MIN_SPLIT_WINDOW = timedelta(minutes=30) appears defined but unused in the diff. Is this dead code?

Tests

The acceptance tests and unit tests look thorough for the happy path. Some gaps:

  1. Retry idempotency: No test verifies that retrying an interrupted task doesn't re-process already-completed volumes.
  2. Connection loss mid-transfer: No test for PACS dropping the association mid-series.
  3. GeneratorExit in DIMSE: As mentioned above.
  4. Concurrent partition workers: No test for the unique constraint race condition scenario.

Migrations

Five migrations in the initial PR suggest the schema evolved significantly during development. Before merge, consider squashing the mass_transfer migrations into a single clean migration (since this is a new app with no production data yet). It simplifies bisect and makes the schema intent clearer.


Minor / Nits

  • adit/core/utils/dimse_connector.py: Trailing whitespace on the line with self._current_service = None inside abort_connection; uv run cli lint will flag this.
  • adit/core/models.py: Extra blank line before queue_pending_task method (line 184 in diff).
  • The .vscode/settings.json change (containers.containers.label) is editor-local config that probably shouldn't be checked into the repo.
  • AGENTS.md gets the assertions section added — was this meant for CLAUDE.md instead? (CLAUDE.md already has this section in the project instructions.)

Summary

| Area | Severity | Issue |
| --- | --- | --- |
| Task retry | High | No idempotency — retried tasks restart from scratch, risking duplicate transfers |
| CancelledError | Medium | Parent job not post-processed, may stay stuck in IN_PROGRESS |
| Pseudonym length | Medium | Leaks pseudonymization mode as side-channel |
| Partition race | Medium | Concurrent workers can violate unique constraint |
| Hardcoded delays | Low | Not adjustable per PACS behavior |
| GeneratorExit | Low | No test coverage for this new code path |
| Migrations | Low | Consider squashing 5 new-app migrations before first release |

The health check fix itself is correct. The DIMSE connector improvements (stale association cleanup, auto_close, service tracking) are solid. The main concerns are around the mass transfer processor's retry/idempotency story and the pseudonym side-channel.

@gemini-code-assist Bot (Contributor) left a comment

Code Review

This pull request introduces a comprehensive 'Mass Transfer' feature, enabling bulk DICOM data exchange over specified time ranges using reusable JSON-based filters. The implementation includes a new Django app with dedicated models for tracking jobs, tasks, and individual volumes, along with background processing logic that utilizes recursive C-FIND splitting to handle large result sets. Key architectural enhancements include refactored DIMSE association management for persistent connections, deterministic pseudonymization using SHA-256, and improved task retry and cancellation handling. Review feedback highlighted critical correctness issues regarding 1-second gaps in time-window splitting, the need for explicit resource cleanup of DICOM associations during discovery, and opportunities to optimize transfer speed by adjusting pacing delays.

Comment thread adit/mass_transfer/processors.py Outdated
Comment on lines +893 to +894
left = self._find_studies(operator, mf, start, midnight - timedelta(seconds=1))
right = self._find_studies(operator, mf, midnight, end)

high

There is a potential correctness issue with the 1-second gap in the time window split. If a study occurs at a fractional second (e.g., 23:59:59.500), it will be missed by both the left and right queries. DICOM time ranges are inclusive, so it is safer to use overlapping boundaries (e.g., both ending and starting at midnight) and rely on the existing deduplication logic.

Suggested change
- left = self._find_studies(operator, mf, start, midnight - timedelta(seconds=1))
+ left = self._find_studies(operator, mf, start, midnight)
  right = self._find_studies(operator, mf, midnight, end)
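
The gap the reviewer describes is easy to demonstrate with plain datetime arithmetic:

```python
# A study timestamped at a fractional second before midnight falls into
# neither inclusive sub-range when the left window ends at midnight - 1s.
from datetime import datetime, timedelta

start = datetime(2026, 4, 1, 0, 0, 0)
end = datetime(2026, 4, 2, 12, 0, 0)
midnight = datetime(2026, 4, 2, 0, 0, 0)
study = datetime(2026, 4, 1, 23, 59, 59, 500000)  # 23:59:59.500

in_left = start <= study <= midnight - timedelta(seconds=1)
in_right = midnight <= study <= end
assert not in_left and not in_right  # missed by both queries

# With overlapping boundaries (both sub-ranges using midnight) the study
# is found, and deduplication handles a match exactly at midnight.
assert start <= study <= midnight
```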

Comment thread adit/mass_transfer/processors.py Outdated

mid = start + (end - start) / 2
left = self._find_studies(operator, mf, start, mid)
right = self._find_studies(operator, mf, mid + timedelta(seconds=1), end)

high

Similar to the cross-midnight split, adding timedelta(seconds=1) to the midpoint creates a 1-second gap where studies might be missed if the PACS supports sub-second precision or if the midpoint falls within a second. It is recommended to use the exact mid value for both halves and let the deduplication logic handle any studies occurring exactly at the split point.

Suggested change
- right = self._find_studies(operator, mf, mid + timedelta(seconds=1), end)
+ right = self._find_studies(operator, mf, mid, end)

Comment thread adit/mass_transfer/processors.py Outdated
Comment on lines +318 to +322
operator = DicomOperator(source_node.dicomserver, persistent=True)

# Discovery: query the source server for all matching series
discovered = self._discover_series(operator, filters)
operator.close()

medium

The DicomOperator instance is created but not explicitly closed if an exception occurs during _discover_series. While Python's garbage collection will eventually close the association, it is better to use a try...finally block or a context manager to ensure the DIMSE association is released immediately upon failure, especially in a long-running mass transfer task.

Suggested change
  operator = DicomOperator(source_node.dicomserver, persistent=True)
- # Discovery: query the source server for all matching series
- discovered = self._discover_series(operator, filters)
- operator.close()
+ try:
+     # Discovery: query the source server for all matching series
+     discovered = self._discover_series(operator, filters)
+ finally:
+     operator.close()

Comment thread adit/mass_transfer/processors.py Outdated
Comment on lines +443 to +451
# Pacing delay between consecutive C-GET/C-MOVE requests.
# Some PACS servers reject or drop associations under
# rapid-fire requests. Batch transfer does not need this
# because it processes fewer series per task. The 0.5s
# value was chosen empirically.
# TODO: Investigate if this is really needed and if the
# delay value is appropriate (was never necessary in mass
# transfer which also transfers series one by one).
time.sleep(_DELAY_BETWEEN_SERIES)

medium

The pacing delay of 0.5 seconds between series within the same study (and same association) may significantly slow down mass transfers involving thousands of series. Since the association is persistent, most PACS servers should handle consecutive C-GET/C-MOVE requests without issues. If pacing is required to avoid overwhelming the PACS with new associations, it should be placed between studies, not between series of the same study.
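
The reviewer's suggestion, sketched with a recording stub in place of time.sleep so the pacing behaviour is visible; the loop structure and names are illustrative, not the processor's actual code.

```python
# Sketch: pace between studies rather than between every series of the
# same study on a persistent association.
sleeps: list[float] = []


def pace(delay: float) -> None:
    sleeps.append(delay)  # stand-in for time.sleep(delay)


DELAY_BETWEEN_STUDIES = 0.5

studies = {"study-A": ["s1", "s2", "s3"], "study-B": ["s4"]}

for i, (study, series_list) in enumerate(studies.items()):
    if i > 0:
        pace(DELAY_BETWEEN_STUDIES)  # only when switching studies
    for series in series_list:
        pass  # fetch series over the persistent association

assert sleeps == [0.5]  # one pause for two studies, not four series
```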

The health check was using plain HTTP, but SECURE_SSL_REDIRECT causes a
301 redirect to HTTPS, making the probe fail and Swarm restart replicas.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@samuelvkwong samuelvkwong force-pushed the fix-healthcheck-https branch from 5dab95d to f87b048 Compare April 2, 2026 10:14
@coderabbitai Bot (Contributor) left a comment

Actionable comments posted: 15

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
.vscode/settings.json (1)

3-3: ⚠️ Potential issue | 🟠 Major

Remove both invalid container label settings

Neither docker.containers.label (Line 3) nor containers.containers.label (Line 80) is a valid VS Code setting key according to official Microsoft documentation. Valid container-related settings use documented prefixes like docker.commands.*, containers.commands.*, containers.containerClient, or containers.orchestratorClient. Container labels belong in task configurations or Docker configurations, not as top-level VS Code settings. Remove both keys from settings.json.

Also applies to: 80-80

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.vscode/settings.json at line 3, Remove the two invalid VS Code settings
keys "docker.containers.label" and "containers.containers.label" from
settings.json; locate and delete the entries for docker.containers.label and
containers.containers.label so that only valid VS Code settings remain
(container labels should instead be defined in task or Docker configuration
files, not as top-level settings).
🧹 Nitpick comments (9)
adit/mass_transfer/templates/mass_transfer/mass_transfer_job_form.html (1)

10-59: Consider moving the inline CodeMirror theme CSS into a static stylesheet.

This block is substantial; extracting it will improve reuse, caching, and template readability.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/templates/mass_transfer/mass_transfer_job_form.html`
around lines 10 - 59, The large inline CodeMirror theme CSS (selectors like
.CodeMirror, [data-bs-theme="dark"] .CodeMirror-*, .cm-s-default .cm-*) should
be moved into a static stylesheet for reuse and caching: extract this entire
<style> block into a new static file (e.g., css/codemirror-theme.css), remove
the inline <style> from mass_transfer_job_form.html, and add a link to the new
stylesheet in the template head (ensuring you use the app's static asset helper
to reference the file). Keep selector names and rules identical and ensure the
template still loads the stylesheet in pages that require the CodeMirror
styling.
cli.py (1)

40-72: Add structured logs around staging stack lifecycle actions.

These commands perform high-impact actions (build/deploy/remove) but currently emit no structured logs for action/state transitions.

Proposed refactor
 import os
 import shutil
+import logging
 from glob import glob
 from typing import Annotated
@@
 app = typer.Typer(
@@
 )
+logger = logging.getLogger(__name__)
@@
 def stack_deploy_staging():
@@
     stack_name = f"{helper.project_id}_staging"
+    logger.info("staging_stack_build_start", extra={"stack_name": stack_name})
@@
     helper.execute_cmd(build_cmd, env={**env, "COMPOSE_BAKE": "true"})
+    logger.info("staging_stack_build_done", extra={"stack_name": stack_name})
@@
     helper.execute_cmd(deploy_cmd, env=env)
+    logger.info("staging_stack_deploy_done", extra={"stack_name": stack_name})
@@
 def stack_rm_staging():
@@
     stack_name = f"{helper.project_id}_staging"
+    logger.info("staging_stack_remove_start", extra={"stack_name": stack_name})
     helper.execute_cmd(f"docker stack rm {stack_name}")
+    logger.info("staging_stack_remove_done", extra={"stack_name": stack_name})

As per coding guidelines, **/*.{js,ts,py}: Use structured logging for agent actions and state transitions.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@cli.py` around lines 40 - 72, Add structured logging around the high-impact
staging stack actions in stack_deploy_staging and stack_rm_staging: use the
CommandHelper logger (or add one) to emit structured INFO logs before and after
major steps (prepare_environment, load_config_from_env_file, build via
helper.execute_cmd, and deploy via helper.execute_cmd) including contextual
fields like project_id, stack_name, base_file, staging_file, and
PROJECT_VERSION, and emit ERROR logs with context on failures from
helper.execute_cmd; similarly log BEFORE and AFTER for stack_rm_staging around
helper.execute_cmd with stack_name context.
adit/core/utils/pseudonymizer.py (1)

45-46: _seed attribute appears unused after assignment.

self._seed = seed is stored but not referenced elsewhere in the class. Consider removing it if not needed for debugging or future use.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/core/utils/pseudonymizer.py` around lines 45 - 46, The assignment
self._seed = seed is unused; either remove this unused attribute or actually use
it — for example pass seed via self._seed into self._setup_anonymizer or into
other methods that need the seed; update the constructor in the class that
creates anonymizer (the anonymizer parameter, the _setup_anonymizer method and
any uses of anonymizer) accordingly and remove dead attribute if you choose the
remove path; run unit tests to ensure no behavior change.
adit/mass_transfer/utils/partitions.py (1)

46-47: Consider using end-of-day with microseconds for completeness.

time(23, 59, 59) excludes the final sub-second of each day. For most DICOM timestamp queries this is fine, but if precision matters, consider time(23, 59, 59, 999999).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/utils/partitions.py` around lines 46 - 47, The end_dt
construction currently uses time(23, 59, 59) which omits the final microsecond
of the day; update the end_dt line that calls
timezone.make_aware(datetime.combine(window_end_date, time(23, 59, 59)), tz) to
use time(23, 59, 59, 999999) so the range covers the full end-of-day precision
for DICOM timestamps (adjust the datetime.combine/time call that creates
end_dt).
adit/mass_transfer/tests/test_queue_pending_tasks.py (1)

125-143: Consider removing duplicate test.

test_queue_mass_transfer_tasks_sets_queued_job (lines 126-143) appears to test the same behavior as test_background_job_queues_all_pending_tasks (lines 35-52). Both verify that queue_mass_transfer_tasks sets queued_job on pending tasks and places them on the mass_transfer queue.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/tests/test_queue_pending_tasks.py` around lines 125 - 143,
This file contains a duplicate test; remove or consolidate the redundant test
`test_queue_mass_transfer_tasks_sets_queued_job` so we only keep
`test_background_job_queues_all_pending_tasks` (or vice versa). Locate the test
function named `test_queue_mass_transfer_tasks_sets_queued_job` that calls
`queue_mass_transfer_tasks(job_id=job.pk)` and assertions on
`MassTransferTaskFactory` instances and `ProcrastinateJob`, and delete it (or
merge any unique assertions into the remaining test) to avoid duplicate coverage
of `queue_mass_transfer_tasks`, `MassTransferJobFactory`,
`MassTransferTaskFactory`, and `ProcrastinateJob`.
adit/core/utils/dimse_connector.py (1)

251-252: Minor: trailing whitespace.

Line 252 has trailing whitespace after self._current_service = None.

Suggested fix
         self.assoc = None
         self._current_service = None
-    
+
     `@retry_dimse_find`
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/core/utils/dimse_connector.py` around lines 251 - 252, Remove the
trailing whitespace at the end of the line that sets self._current_service =
None in dimse_connector.py so the line ends cleanly; locate the assignment to
self._current_service within the class (the method/block where
self._current_service = None is set) and trim the extra space characters after
the None.
adit/mass_transfer/tests/test_forms.py (1)

42-67: Consider using the form_env fixture to reduce duplication.

Several tests (lines 42-67, 70-95, 98-124, 127-151, 154-178, 181-207, 210-236, 303-327, 366-389) manually recreate the user/source/destination/group setup that the form_env fixture already provides. Using form_env with the _make_form helper would reduce boilerplate and improve maintainability.

Example refactor for test_clean_clears_salt_when_pseudonymize_unchecked
 `@pytest.mark.django_db`
-def test_clean_clears_salt_when_pseudonymize_unchecked():
+def test_clean_clears_salt_when_pseudonymize_unchecked(form_env):
     """When pseudonymize is unchecked, the salt should be cleared."""
-    user = UserFactory.create()
-    source = DicomServerFactory.create()
-    destination = DicomFolderFactory.create()
-    group = GroupFactory.create()
-    add_user_to_group(user, group)
-    grant_access(group, source, source=True)
-    grant_access(group, destination, destination=True)
-
-    form = MassTransferJobForm(
-        data={
-            "source": source.pk,
-            "destination": destination.pk,
-            "start_date": "2024-01-01",
-            "end_date": "2024-01-03",
-            "partition_granularity": "daily",
-            "pseudonymize": False,
-            "pseudonym_salt": "should-be-cleared",
-            "filters_json": json.dumps([{"modality": "CT"}]),
-        },
-        user=user,
-    )
+    form = _make_form(form_env, pseudonymize=False, pseudonym_salt="should-be-cleared")
     assert form.is_valid(), form.errors
     assert form.cleaned_data["pseudonym_salt"] == ""
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/tests/test_forms.py` around lines 42 - 67, The test
test_clean_clears_salt_when_pseudonymize_unchecked duplicates setup that exists
in the form_env fixture; refactor the test to use form_env and its _make_form
helper instead of creating UserFactory, DicomServerFactory, DicomFolderFactory,
GroupFactory, add_user_to_group, and grant_access manually: call
form_env._make_form(...) or the provided helper to build a MassTransferJobForm
instance with pseudonymize=False and pseudonym_salt set, assert form.is_valid()
and that form.cleaned_data["pseudonym_salt"] == ""—this reduces boilerplate and
aligns this test with the other tests mentioned.
adit/mass_transfer/tests/acceptance/test_mass_transfer.py (1)

41-45: Consider using set_checked for cleaner checkbox handling.

Playwright's set_checked(checked: bool) method simplifies the conditional logic.

Suggested simplification
     pseudonymize_checkbox = page.get_by_label("Pseudonymize")
-    if pseudonymize and not pseudonymize_checkbox.is_checked():
-        pseudonymize_checkbox.click(force=True)
-    elif not pseudonymize and pseudonymize_checkbox.is_checked():
-        pseudonymize_checkbox.click(force=True)
+    pseudonymize_checkbox.set_checked(pseudonymize, force=True)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/tests/acceptance/test_mass_transfer.py` around lines 41 -
45, Replace the manual conditional toggling of the checkbox with Playwright's
Locator.set_checked to simplify and make intent explicit: use the existing
pseudonymize_checkbox locator and call set_checked(pseudonymize) (await if using
async test) instead of the current if/elif block that clicks conditionally.
scripts/csv_to_mass_transfer_filters.py (1)

67-74: Consider adding return type annotation.

For consistency with the coding guidelines, consider adding the return type hint.

Suggested fix
 def csv_to_filters(
     csv_path: Path,
     *,
     delimiter: str = ",",
     min_age: int | None = None,
     max_age: int | None = None,
     min_number_of_series_related_instances: int | None = None,
-) -> list[dict]:
+) -> list[dict[str, str | int]]:
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/csv_to_mass_transfer_filters.py` around lines 67 - 74, The function
csv_to_filters is missing an explicit return type annotation; update its
signature (function csv_to_filters) to include the appropriate return type
(e.g., list[dict] or a more specific TypedDict/List type) to match the actual
return value, and ensure any imported typing aliases are added if needed so the
annotation is valid.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.vscode/settings.json:
- Line 80: The commit introduced an unrelated workspace setting
"containers.containers.label" in .vscode/settings.json; remove or revert that
specific key from this branch so the PR focuses on the health-check HTTPS fix,
and then create a separate housekeeping PR that adds
"containers.containers.label": "ContainerName" (or any desired value) if still
needed; ensure only the health-check-related files remain changed before pushing
the fix.

In `@adit/core/utils/dicom_operator.py`:
- Around line 542-545: The logger.error call accesses ds.SOPInstanceUID directly
which can raise AttributeError for malformed datasets; update the exception
handler around logger.error to safely retrieve the SOP Instance UID (e.g. guard
for ds being None and use getattr(ds, "SOPInstanceUID", "<unknown>") or similar)
and pass that safe value into logger.error (keep exc_info=True) so the original
exception isn't masked; look for the logger.error invocation and the ds variable
in the store handler code to apply this change.

In `@adit/mass_transfer/apps.py`:
- Around line 48-49: The current check-then-create pattern using
MassTransferSettings.objects.exists() followed by
MassTransferSettings.objects.create() is racy; replace it with an atomic
get_or_create call on the model’s singleton identifying key (e.g., the unique PK
or other singleton constraint) so the database enforces uniqueness and only one
row is created (use MassTransferSettings.objects.get_or_create(defaults={...})
with the appropriate lookup kwargs for the singleton key).
- Around line 12-16: Guard register_app() and signal/collector registration
against repeated ready() invocations: in ready() ensure register_app() is only
executed once (e.g. set a one-time flag on the AppConfig instance or
module-level sentinel) so collect_job_stats()/register_job_stats_collector()
does not append duplicate callables into job_stats_collectors; also pass a
stable dispatch_uid when connecting post_migrate to init_db
(post_migrate.connect(init_db, sender=self, dispatch_uid=...)) to prevent
multiple signal registrations; finally, replace the exists()+create() pattern
inside init_db() with an atomic get_or_create() call to avoid duplicate
singleton creation under concurrent post-migrate signals.

In `@adit/mass_transfer/factories.py`:
- Around line 27-31: partition_start, partition_end and partition_key must be
generated consistently: keep partition_start as the base
(factory.LazyFunction(timezone.now)) but make partition_end and partition_key
derived fields using LazyAttribute so they use the same partition_start value;
specifically, change partition_end to use a LazyAttribute that returns
partition_start + timedelta(hours=23, minutes=59, seconds=59) and change
partition_key to use a LazyAttribute that formats partition_start.date() as
"%Y-%m-%d" (e.g., via .strftime) so all three fields (partition_start,
partition_end, partition_key) remain coherent and tests will not be flaky.
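The coherence requirement can be sketched without factory_boy; in the real factory the two derived values would be `factory.LazyAttribute` lambdas receiving the object under construction, but the invariant is the same (the helper name here is illustrative):

```python
from datetime import datetime, timedelta, timezone

def build_partition(partition_start: datetime) -> dict:
    # Both derived fields are computed from the same partition_start,
    # never from a second call to "now" — that is what keeps
    # partition_start, partition_end, and partition_key coherent.
    return {
        "partition_start": partition_start,
        "partition_end": partition_start + timedelta(hours=23, minutes=59, seconds=59),
        "partition_key": partition_start.date().strftime("%Y-%m-%d"),
    }

p = build_partition(datetime(2024, 1, 2, tzinfo=timezone.utc))
print(p["partition_key"])  # 2024-01-02
```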

In `@adit/mass_transfer/forms.py`:
- Around line 299-306: Wrap the job creation and task creation in a single DB
transaction so they either both succeed or both roll back: inside the save
method of the form (the save function that currently calls job.save() then
self._save_tasks(job)), import and use Django's transaction.atomic() to perform
job.save() and self._save_tasks(job) within the same atomic block when commit is
True, ensuring that any exception from _save_tasks causes the job save to be
rolled back; keep the behavior for commit=False unchanged (return the unsaved
job without saving or creating tasks).
- Around line 231-235: clean_destination currently only checks access but must
also reject unsupported DicomNode types so the form errors instead of letting
MassTransferTaskProcessor.process() fail later; update clean_destination to
inspect the node type on the destination DicomNode (e.g., check a type attribute
or helper methods like is_folder()/is_server() or compare destination.node_type
to allowed values) and raise ValidationError("Selected destination must be a
folder or server.") when it is not an allowed type, then return destination as
before.
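The all-or-nothing behavior that `transaction.atomic()` provides can be demonstrated with the stdlib sqlite3 driver, whose `with conn:` block follows the same commit-or-rollback pattern (the table names are invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE job (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE task (id INTEGER PRIMARY KEY, job_id INTEGER)")

try:
    with conn:  # like transaction.atomic(): commit on success, rollback on error
        conn.execute("INSERT INTO job (id) VALUES (1)")
        raise RuntimeError("simulated _save_tasks failure")
except RuntimeError:
    pass

# The job insert was rolled back together with the failed task creation.
print(conn.execute("SELECT COUNT(*) FROM job").fetchone()[0])  # 0
```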

In `@adit/mass_transfer/migrations/0002_move_source_destination_to_task.py`:
- Around line 22-32: The migration currently adds a destination ForeignKey to
MassTransferTask using a hardcoded default=1 which can fail if DicomNode pk=1
doesn't exist; replace the placeholder default with a two-step migration: first
add the destination field as nullable (migrations.AddField on MassTransferTask
with null=True and no default), then add a RunPython data migration that looks
up each MassTransferTask, finds its related MassTransferJob (or other source
mapping), sets destination to the appropriate DicomNode for that job, and saves
the task; finally add a migration to set null=False on destination and add the
ForeignKey constraint (or handle missing nodes by creating/validating the
expected DicomNode rows beforehand). Ensure the steps reference
MassTransferTask.destination, MassTransferJob, migrations.RunPython and the
AddField operations so reviewers can locate and apply the safe migration flow.

In `@adit/mass_transfer/processors.py`:
- Around line 318-322: The DicomOperator created as operator before calling
_discover_series can leak if discovery raises; wrap the discovery call in a
try/finally (or use a context manager) to ensure operator.close() runs
regardless of exceptions. Specifically, change the block around
DicomOperator(source_node.dicomserver, persistent=True) and the call to
self._discover_series(operator, filters) so operator.close() is invoked in a
finally block (mirror the same pattern for the dest_operator usage around the
related block at the _discover_series/_retrieve_series area referenced near
lines 337-339) to guarantee the persistent association is always closed on
error.
- Around line 789-796: The current age-filter logic allows studies with missing
PatientBirthDate or StudyDate to pass when mf.min_age or mf.max_age is set;
change it to fail closed by checking has_age_filter first and rejecting any
study missing either birth_date or study.StudyDate. Specifically, in the block
that uses birth_date, study.StudyDate, has_age_filter and _age_at_study, add a
guard so that if has_age_filter is true and either birth_date or study.StudyDate
is falsy you skip/continue the study; otherwise compute age with _age_at_study
and apply the existing mf.min_age / mf.max_age comparisons.
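The cleanup pattern requested for the operator looks like the following; `FakeOperator` and `discover` are stand-ins for `DicomOperator` and `_discover_series`, not the real API:

```python
class FakeOperator:
    """Stand-in for DicomOperator holding a persistent association."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

def discover(operator):
    raise ConnectionError("C-FIND failed mid-discovery")

operator = FakeOperator()
try:
    try:
        discover(operator)
    finally:
        operator.close()  # association released even when discovery raises
except ConnectionError:
    pass

print(operator.closed)  # True
```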
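The fail-closed age rule can be sketched as plain Python; the function and field names mirror the prompt but are illustrative, not the actual processor code:

```python
from datetime import date

def _age_at_study(birth_date: date, study_date: date) -> int:
    years = study_date.year - birth_date.year
    if (study_date.month, study_date.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def passes_age_filter(birth_date, study_date, min_age=None, max_age=None) -> bool:
    has_age_filter = min_age is not None or max_age is not None
    if has_age_filter and (not birth_date or not study_date):
        return False  # fail closed: age cannot be verified, so reject
    if not has_age_filter:
        return True
    age = _age_at_study(birth_date, study_date)
    if min_age is not None and age < min_age:
        return False
    if max_age is not None and age > max_age:
        return False
    return True

# Missing birth date with an active age filter is rejected, not passed through.
print(passes_age_filter(None, date(2024, 1, 1), min_age=18))  # False
```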

In `@adit/mass_transfer/templates/mass_transfer/mass_transfer_job_detail.html`:
- Around line 46-48: The template currently renders the secret value
job.pseudonym_salt in clear text in mass_transfer_job_detail.html; change the UI
to never display the salt itself—replace the {{ job.pseudonym_salt }} output
with a non-secret indicator (e.g., "Configured" or a boolean/lock icon) so the
template only conveys that a pseudonym salt exists without exposing its value;
update the dt/dd block referencing job.pseudonym_salt accordingly.

In `@adit/mass_transfer/templatetags/mass_transfer_extras.py`:
- Around line 10-19: The function volume_status_css_class currently calls
MassTransferVolume.Status(status) which raises ValueError for invalid values;
update volume_status_css_class to handle invalid status values by wrapping the
enum cast in a try/except (catch ValueError) and return the default
"text-secondary" on error, or alternatively check membership without casting
(e.g., compare against MassTransferVolume.Status members or map by raw status
string) so that css_classes.get uses a safe key; apply this change inside
volume_status_css_class referencing MassTransferVolume.Status and css_classes.
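A minimal sketch of the guarded template tag, with a local `Status` enum standing in for `MassTransferVolume.Status`:

```python
from enum import Enum

class Status(Enum):
    """Stand-in for MassTransferVolume.Status."""
    PENDING = "PE"
    SUCCESS = "SU"

css_classes = {Status.PENDING: "text-info", Status.SUCCESS: "text-success"}

def volume_status_css_class(status: str) -> str:
    try:
        key = Status(status)  # raises ValueError for unknown raw values
    except ValueError:
        return "text-secondary"
    return css_classes.get(key, "text-secondary")

print(volume_status_css_class("PE"))     # text-info
print(volume_status_css_class("bogus"))  # text-secondary
```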

In `@adit/mass_transfer/utils/testing_helpers.py`:
- Around line 5-9: The helper create_mass_transfer_group currently uses a fixed
Group name ("Radiologists") which can cause unique constraint failures; change
it to create a uniquely-named group (e.g., let GroupFactory.generate or
GroupFactory.create with no name or append a random/uuid suffix) so repeated
calls won't collide, and keep the same permission additions (add_permission
calls) using the created group's instance; update references to
GroupFactory.create and the Group.name usage in create_mass_transfer_group
accordingly.
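One way to generate collision-free names, sketched under the assumption that only the `Group.name` value needs to change:

```python
import uuid

def unique_group_name(base: str = "Radiologists") -> str:
    # An 8-char hex suffix keeps the name readable while avoiding the
    # unique-constraint collision a fixed "Radiologists" name causes
    # across repeated test runs.
    return f"{base} {uuid.uuid4().hex[:8]}"

a, b = unique_group_name(), unique_group_name()
print(a != b)  # True
```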

In `@adit/mass_transfer/views.py`:
- Around line 139-140: Remove the line that emits the secret pseudonym salt into
the CSV download: do not call response.write with job.pseudonym_salt (the check
for job.pseudonym_salt and the response.write(f"# Pseudonym salt:
{job.pseudonym_salt}\n") should be deleted). Ensure no other code in this view
or export path writes or exposes job.pseudonym_salt to the response or included
export metadata; if the salt must be retained for internal reuse, keep it only
in job storage and never serialize it to output.

In `@docker-compose.staging.yml`:
- Around line 43-55: Remove all one-off bootstrap commands from the replicated
web service command and put them into a new/non-replicated one-shot init service
so only one container runs migrations and seeding; specifically move these
commands: ./manage.py migrate, ./manage.py create_superuser, ./manage.py
create_example_users, ./manage.py create_example_groups, ./manage.py
populate_example_data, ./manage.py populate_orthancs and ./manage.py
retry_stalled_jobs (and their wait-for-it waits: postgres.local:5432 and
orthanc1.local:6501) into the init service command, keep the web service command
only as ./manage.py runserver 0.0.0.0:8000 (and any minimal readiness wait it
needs), ensure the init service uses the same image/env as web and is configured
as a one-shot (not replicated) service so WEB_STAGING_REPLICAS > 1 won’t re-run
bootstrap.

---

Outside diff comments:
In @.vscode/settings.json:
- Line 3: Remove the two invalid VS Code settings keys "docker.containers.label"
and "containers.containers.label" from settings.json; locate and delete the
entries for docker.containers.label and containers.containers.label so that only
valid VS Code settings remain (container labels should instead be defined in
task or Docker configuration files, not as top-level settings).

---

Nitpick comments:
In `@adit/core/utils/dimse_connector.py`:
- Around line 251-252: Remove the trailing whitespace at the end of the line
that sets self._current_service = None in dimse_connector.py so the line ends
cleanly; locate the assignment to self._current_service within the class (the
method/block where self._current_service = None is set) and trim the extra space
characters after the None.

In `@adit/core/utils/pseudonymizer.py`:
- Around line 45-46: The assignment self._seed = seed is unused; either remove
this unused attribute or actually use it — for example pass seed via self._seed
into self._setup_anonymizer or into other methods that need the seed; update the
constructor in the class that creates anonymizer (the anonymizer parameter, the
_setup_anonymizer method and any uses of anonymizer) accordingly and remove dead
attribute if you choose the remove path; run unit tests to ensure no behavior
change.

In `@adit/mass_transfer/templates/mass_transfer/mass_transfer_job_form.html`:
- Around line 10-59: The large inline CodeMirror theme CSS (selectors like
.CodeMirror, [data-bs-theme="dark"] .CodeMirror-*, .cm-s-default .cm-*) should
be moved into a static stylesheet for reuse and caching: extract this entire
<style> block into a new static file (e.g., css/codemirror-theme.css), remove
the inline <style> from mass_transfer_job_form.html, and add a link to the new
stylesheet in the template head (ensuring you use the app's static asset helper
to reference the file). Keep selector names and rules identical and ensure the
template still loads the stylesheet in pages that require the CodeMirror
styling.

In `@adit/mass_transfer/tests/acceptance/test_mass_transfer.py`:
- Around line 41-45: Replace the manual conditional toggling of the checkbox
with Playwright's Locator.set_checked to simplify and make intent explicit: use
the existing pseudonymize_checkbox locator and call set_checked(pseudonymize)
(await if using async test) instead of the current if/elif block that clicks
conditionally.

In `@adit/mass_transfer/tests/test_forms.py`:
- Around line 42-67: The test test_clean_clears_salt_when_pseudonymize_unchecked
duplicates setup that exists in the form_env fixture; refactor the test to use
form_env and its _make_form helper instead of creating UserFactory,
DicomServerFactory, DicomFolderFactory, GroupFactory, add_user_to_group, and
grant_access manually: call form_env._make_form(...) or the provided helper to
build a MassTransferJobForm instance with pseudonymize=False and pseudonym_salt
set, assert form.is_valid() and that form.cleaned_data["pseudonym_salt"] ==
""—this reduces boilerplate and aligns this test with the other tests mentioned.

In `@adit/mass_transfer/tests/test_queue_pending_tasks.py`:
- Around line 125-143: This file contains a duplicate test; remove or
consolidate the redundant test `test_queue_mass_transfer_tasks_sets_queued_job`
so we only keep `test_background_job_queues_all_pending_tasks` (or vice versa).
Locate the test function named `test_queue_mass_transfer_tasks_sets_queued_job`
that calls `queue_mass_transfer_tasks(job_id=job.pk)` and assertions on
`MassTransferTaskFactory` instances and `ProcrastinateJob`, and delete it (or
merge any unique assertions into the remaining test) to avoid duplicate coverage
of `queue_mass_transfer_tasks`, `MassTransferJobFactory`,
`MassTransferTaskFactory`, and `ProcrastinateJob`.

In `@adit/mass_transfer/utils/partitions.py`:
- Around line 46-47: The end_dt construction currently uses time(23, 59, 59)
which omits the final sub-second of the day; update the end_dt line that calls
timezone.make_aware(datetime.combine(window_end_date, time(23, 59, 59)), tz) to
use time(23, 59, 59, 999999) so the range covers the full end-of-day precision
for DICOM timestamps (adjust the datetime.combine/time call that creates
end_dt).

In `@cli.py`:
- Around line 40-72: Add structured logging around the high-impact staging stack
actions in stack_deploy_staging and stack_rm_staging: use the CommandHelper
logger (or add one) to emit structured INFO logs before and after major steps
(prepare_environment, load_config_from_env_file, build via helper.execute_cmd,
and deploy via helper.execute_cmd) including contextual fields like project_id,
stack_name, base_file, staging_file, and PROJECT_VERSION, and emit ERROR logs
with context on failures from helper.execute_cmd; similarly log BEFORE and AFTER
for stack_rm_staging around helper.execute_cmd with stack_name context.
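A minimal sketch of the BEFORE/AFTER logging shape, assuming an `execute_cmd` callable like the one on `CommandHelper`; the context field names are taken from the prompt and are not the real CLI signature:

```python
import logging

logger = logging.getLogger("cli")

def stack_deploy_staging(execute_cmd, stack_name: str, project_id: str) -> None:
    ctx = {"stack_name": stack_name, "project_id": project_id}
    logger.info("deploying staging stack", extra=ctx)
    try:
        execute_cmd(["docker", "stack", "deploy", stack_name])
    except Exception:
        # logger.exception records the traceback with the same context fields.
        logger.exception("staging stack deploy failed", extra=ctx)
        raise
    logger.info("staging stack deployed", extra=ctx)

calls: list[list[str]] = []
stack_deploy_staging(calls.append, "adit_staging", "adit")
print(calls[0])  # ['docker', 'stack', 'deploy', 'adit_staging']
```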

In `@scripts/csv_to_mass_transfer_filters.py`:
- Around line 67-74: The function csv_to_filters is missing an explicit return
type annotation; update its signature (function csv_to_filters) to include the
appropriate return type (e.g., list[dict] or a more specific TypedDict/List
type) to match the actual return value, and ensure any imported typing aliases
are added if needed so the annotation is valid.
🪄 Autofix (Beta)

Fix all unresolved CodeRabbit comments on this PR:

  • Push a commit to this branch (recommended)
  • Create a new PR with the fixes

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 50cc032d-0a9b-4db7-9a20-04b194f8a79d

📥 Commits

Reviewing files that changed from the base of the PR and between 4e9e13a and 5dab95d.

⛔ Files ignored due to path filters (8)
  • adit/mass_transfer/static/mass_transfer/vendor/codemirror/addon/edit/closebrackets.min.js is excluded by !**/*.min.js
  • adit/mass_transfer/static/mass_transfer/vendor/codemirror/addon/edit/matchbrackets.min.js is excluded by !**/*.min.js
  • adit/mass_transfer/static/mass_transfer/vendor/codemirror/addon/lint/json-lint.min.js is excluded by !**/*.min.js
  • adit/mass_transfer/static/mass_transfer/vendor/codemirror/addon/lint/lint.min.js is excluded by !**/*.min.js
  • adit/mass_transfer/static/mass_transfer/vendor/codemirror/codemirror.min.js is excluded by !**/*.min.js
  • adit/mass_transfer/static/mass_transfer/vendor/codemirror/jsonlint.min.js is excluded by !**/*.min.js
  • adit/mass_transfer/static/mass_transfer/vendor/codemirror/mode/javascript/javascript.min.js is excluded by !**/*.min.js
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (70)
  • .github/workflows/ci.yml
  • .vscode/settings.json
  • AGENTS.md
  • adit/core/management/commands/cleanup_jobs_and_tasks.py
  • adit/core/migrations/0016_add_max_search_results_to_dicomserver.py
  • adit/core/migrations/0017_review_fixes.py
  • adit/core/models.py
  • adit/core/tasks.py
  • adit/core/templates/core/home.html
  • adit/core/tests/utils/test_dimse_connector.py
  • adit/core/tests/utils/test_pseudonymizer.py
  • adit/core/utils/dicom_manipulator.py
  • adit/core/utils/dicom_operator.py
  • adit/core/utils/dimse_connector.py
  • adit/core/utils/presentation_contexts.py
  • adit/core/utils/pseudonymizer.py
  • adit/core/views.py
  • adit/dicom_web/utils/wadors_utils.py
  • adit/mass_transfer/__init__.py
  • adit/mass_transfer/admin.py
  • adit/mass_transfer/apps.py
  • adit/mass_transfer/factories.py
  • adit/mass_transfer/filters.py
  • adit/mass_transfer/forms.py
  • adit/mass_transfer/migrations/0001_initial.py
  • adit/mass_transfer/migrations/0002_move_source_destination_to_task.py
  • adit/mass_transfer/migrations/0003_fix_queued_job_on_delete.py
  • adit/mass_transfer/migrations/0004_sync_transfertask_fields.py
  • adit/mass_transfer/migrations/0005_add_partition_constraint.py
  • adit/mass_transfer/migrations/__init__.py
  • adit/mass_transfer/mixins.py
  • adit/mass_transfer/models.py
  • adit/mass_transfer/processors.py
  • adit/mass_transfer/static/mass_transfer/mass_transfer.css
  • adit/mass_transfer/static/mass_transfer/mass_transfer.js
  • adit/mass_transfer/static/mass_transfer/vendor/codemirror/addon/lint/lint.min.css
  • adit/mass_transfer/static/mass_transfer/vendor/codemirror/codemirror.min.css
  • adit/mass_transfer/tables.py
  • adit/mass_transfer/tasks.py
  • adit/mass_transfer/templates/mass_transfer/mass_transfer_job_detail.html
  • adit/mass_transfer/templates/mass_transfer/mass_transfer_job_form.html
  • adit/mass_transfer/templates/mass_transfer/mass_transfer_job_list.html
  • adit/mass_transfer/templates/mass_transfer/mass_transfer_layout.html
  • adit/mass_transfer/templates/mass_transfer/mass_transfer_task_detail.html
  • adit/mass_transfer/templatetags/__init__.py
  • adit/mass_transfer/templatetags/mass_transfer_extras.py
  • adit/mass_transfer/tests/__init__.py
  • adit/mass_transfer/tests/acceptance/__init__.py
  • adit/mass_transfer/tests/acceptance/conftest.py
  • adit/mass_transfer/tests/acceptance/test_mass_transfer.py
  • adit/mass_transfer/tests/test_forms.py
  • adit/mass_transfer/tests/test_partitions.py
  • adit/mass_transfer/tests/test_processor.py
  • adit/mass_transfer/tests/test_queue_pending_tasks.py
  • adit/mass_transfer/urls.py
  • adit/mass_transfer/utils/__init__.py
  • adit/mass_transfer/utils/partitions.py
  • adit/mass_transfer/utils/testing_helpers.py
  • adit/mass_transfer/views.py
  • adit/settings/base.py
  • adit/urls.py
  • cli.py
  • docker-compose.base.yml
  • docker-compose.dev.yml
  • docker-compose.prod.yml
  • docker-compose.staging.yml
  • example.env
  • globals.d.ts
  • pyproject.toml
  • scripts/csv_to_mass_transfer_filters.py

Comment thread .vscode/settings.json Outdated
"plist"
]
],
"containers.containers.label": "ContainerName"

⚠️ Potential issue | 🟡 Minor

Unrelated workspace setting in a focused production-fix PR

This change is not tied to the health-check HTTPS objective and adds avoidable scope to the PR. Please move it to a separate housekeeping PR.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.vscode/settings.json at line 80, The commit introduced an unrelated
workspace setting "containers.containers.label" in .vscode/settings.json; remove
or revert that specific key from this branch so the PR focuses on the
health-check HTTPS fix, and then create a separate housekeeping PR that adds
"containers.containers.label": "ContainerName" (or any desired value) if still
needed; ensure only the health-check-related files remain changed before pushing
the fix.

Comment thread adit/core/utils/dicom_operator.py Outdated
Comment on lines +542 to +545
logger.error(
"Store handler failed for SOP %s: %s",
ds.SOPInstanceUID, err, exc_info=True,
)

⚠️ Potential issue | 🟠 Major

Prevent secondary failure in exception handler by safely accessing DICOM attributes.

Accessing ds.SOPInstanceUID directly can raise AttributeError if the dataset is malformed, masking the original store failure. Use safe attribute access:

Fix
                 logger.error(
                     "Store handler failed for SOP %s: %s",
-                    ds.SOPInstanceUID, err, exc_info=True,
+                    getattr(ds, "SOPInstanceUID", "<unknown>"),
+                    err,
+                    exc_info=True,
                 )
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
logger.error(
"Store handler failed for SOP %s: %s",
ds.SOPInstanceUID, err, exc_info=True,
)
logger.error(
"Store handler failed for SOP %s: %s",
getattr(ds, "SOPInstanceUID", "<unknown>"),
err,
exc_info=True,
)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/core/utils/dicom_operator.py` around lines 542 - 545, The logger.error
call accesses ds.SOPInstanceUID directly which can raise AttributeError for
malformed datasets; update the exception handler around logger.error to safely
retrieve the SOP Instance UID (e.g. guard for ds being None and use getattr(ds,
"SOPInstanceUID", "<unknown>") or similar) and pass that safe value into
logger.error (keep exc_info=True) so the original exception isn't masked; look
for the logger.error invocation and the ds variable in the store handler code to
apply this change.

Comment thread adit/mass_transfer/apps.py Outdated
Comment on lines +12 to +16
def ready(self):
register_app()

# Put calls to db stuff in this signal handler
post_migrate.connect(init_db, sender=self)

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "Registration helpers:"
fd -i 'site.py' | xargs -r rg -n -C2 'def register_(main_menu_item|job_stats_collector|dicom_processor)|append\(|setdefault\(|if .* in '

echo
echo "AppConfig.ready hooks:"
fd -i 'apps.py' adit | xargs -r rg -n -C2 'def ready|register_app\(|post_migrate\.connect|dispatch_uid'

Repository: openradx/adit

Length of output: 4986


🏁 Script executed:

#!/bin/bash
set -euo pipefail

# Get the full register_app() function in mass_transfer
echo "=== mass_transfer/apps.py register_app() ==="
sed -n '19,50p' adit/mass_transfer/apps.py

echo
echo "=== Check register_main_menu_item implementation ==="
# Search for register_main_menu_item in adit_radis_shared or locally
rg -A 5 'def register_main_menu_item' --type py

echo
echo "=== Check init_db pattern in mass_transfer ==="
fd -i 'mass_transfer' -t f -name '*.py' | xargs rg -B2 -A8 'def init_db'

echo
echo "=== Check collect_job_stats function ==="
rg -B2 -A10 'collect_job_stats' adit/mass_transfer --type py

Repository: openradx/adit

Length of output: 1231


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "=== Search for register_main_menu_item in adit packages ==="
find adit -name '*.py' -type f | xargs rg 'def register_main_menu_item' -B2 -A10

echo
echo "=== Check adit_radis_shared imports in adit/core/site.py ==="
rg -B5 -A20 'register_main_menu_item|main_menu_items' adit/core/site.py

echo
echo "=== Confirm job_stats_collectors is append-only ==="
rg -B3 -A3 'job_stats_collectors.*=' adit/core/site.py | head -20

Repository: openradx/adit

Length of output: 118


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "=== adit/core/site.py full content ==="
cat -n adit/core/site.py

echo
echo "=== Confirm post_migrate dispatch_uid usage ==="
rg 'post_migrate\.connect' adit --type py -A1 -B1 | head -30

Repository: openradx/adit

Length of output: 2170


Guard register_app() against repeated ready() calls.

Django can invoke ready() more than once, and collect_job_stats() is recreated on each call. Since register_job_stats_collector() uses append-only registration, this stacks duplicate function objects in the job_stats_collectors list over time. Additionally, post_migrate.connect(init_db, sender=self) lacks a dispatch_uid parameter, so the signal handler can register multiple times on repeated ready() calls.

♻️ Suggested guard
+_mass_transfer_registered = False
+
 class MassTransferConfig(AppConfig):
     name = "adit.mass_transfer"

     def ready(self):
-        register_app()
-
-        # Put calls to db stuff in this signal handler
-        post_migrate.connect(init_db, sender=self)
+        global _mass_transfer_registered
+        if not _mass_transfer_registered:
+            register_app()
+            _mass_transfer_registered = True
+
+        post_migrate.connect(init_db, sender=self, dispatch_uid="adit.mass_transfer.init_db")

Also note: init_db() uses exists() followed by create() (lines 48–49), which is not atomic and can result in duplicate singleton creation under concurrent post-migrate signals. Consider using get_or_create() instead.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/apps.py` around lines 12 - 16, Guard register_app() and
signal/collector registration against repeated ready() invocations: in ready()
ensure register_app() is only executed once (e.g. set a one-time flag on the
AppConfig instance or module-level sentinel) so
collect_job_stats()/register_job_stats_collector() does not append duplicate
callables into job_stats_collectors; also pass a stable dispatch_uid when
connecting post_migrate to init_db (post_migrate.connect(init_db, sender=self,
dispatch_uid=...)) to prevent multiple signal registrations; finally, replace
the exists()+create() pattern inside init_db() with an atomic get_or_create()
call to avoid duplicate singleton creation under concurrent post-migrate
signals.

Comment thread adit/mass_transfer/apps.py Outdated
Comment on lines +48 to +49
if not MassTransferSettings.objects.exists():
MassTransferSettings.objects.create()

⚠️ Potential issue | 🟠 Major

Make the default settings insert atomic.

exists() + create() is a check-then-insert race. If two migrate processes start together, both can observe an empty table and create a row. Use get_or_create() against the model’s singleton key/constraint instead.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/apps.py` around lines 48 - 49, The current
check-then-create pattern using MassTransferSettings.objects.exists() followed
by MassTransferSettings.objects.create() is racy; replace it with an atomic
get_or_create call on the model’s singleton identifying key (e.g., the unique PK
or other singleton constraint) so the database enforces uniqueness and only one
row is created (use MassTransferSettings.objects.get_or_create(defaults={...})
with the appropriate lookup kwargs for the singleton key).
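The get_or_create() pattern works because it delegates uniqueness to the database rather than checking in application code. A minimal stdlib sketch of the same idea, using SQLite upsert syntax (the table and column names are illustrative, not the real MassTransferSettings schema):

```python
import sqlite3

# Let the database enforce the singleton instead of check-then-insert.
# A unique key on the singleton row makes concurrent inserts safe — this
# is what Django's get_or_create() relies on for the race-free path.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE settings (id INTEGER PRIMARY KEY CHECK (id = 1))")

def ensure_singleton(conn: sqlite3.Connection) -> None:
    # Both racing processes can run this; only one row ever exists.
    conn.execute("INSERT INTO settings (id) VALUES (1) ON CONFLICT (id) DO NOTHING")
    conn.commit()

ensure_singleton(conn)
ensure_singleton(conn)  # second call is a no-op, not an error
count = conn.execute("SELECT COUNT(*) FROM settings").fetchone()[0]
print(count)  # 1
```

In Django the equivalent one-liner would be `MassTransferSettings.objects.get_or_create(pk=1)` (or whatever the model's singleton key actually is).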

Comment thread adit/mass_transfer/factories.py Outdated
Comment on lines +27 to +31
partition_start = factory.LazyFunction(timezone.now)
partition_end = factory.LazyFunction(
lambda: timezone.now() + timedelta(hours=23, minutes=59, seconds=59)
)
partition_key = factory.Faker("date", pattern="%Y-%m-%d")

⚠️ Potential issue | 🟠 Major

Partition fields are generated independently and can become inconsistent.

partition_key should represent partition_start’s date, but it is currently random. partition_end should also be derived from the same partition_start value to keep partition metadata coherent and avoid flaky tests.

💡 Proposed fix
 class MassTransferTaskFactory(AbstractTransferTaskFactory[MassTransferTask]):
@@
     job = factory.SubFactory(MassTransferJobFactory)
     partition_start = factory.LazyFunction(timezone.now)
-    partition_end = factory.LazyFunction(
-        lambda: timezone.now() + timedelta(hours=23, minutes=59, seconds=59)
-    )
-    partition_key = factory.Faker("date", pattern="%Y-%m-%d")
+    partition_end = factory.LazyAttribute(
+        lambda obj: obj.partition_start + timedelta(hours=23, minutes=59, seconds=59)
+    )
+    partition_key = factory.LazyAttribute(lambda obj: obj.partition_start.date().isoformat())
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/factories.py` around lines 27 - 31, partition_start,
partition_end and partition_key must be generated consistently: keep
partition_start as the base (factory.LazyFunction(timezone.now)) but make
partition_end and partition_key derived fields using LazyAttribute so they use
the same partition_start value; specifically, change partition_end to use a
LazyAttribute that returns partition_start + timedelta(hours=23, minutes=59,
seconds=59) and change partition_key to use a LazyAttribute that formats
partition_start.date() as "%Y-%m-%d" (e.g., via .strftime) so all three fields
(partition_start, partition_end, partition_key) remain coherent and tests will
not be flaky.
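The LazyAttribute approach suggested above boils down to deriving the two dependent fields from a single base timestamp. A plain-Python sketch of that derivation, outside of factory_boy (function and field names are illustrative):

```python
from datetime import datetime, timedelta, timezone

def derive_partition_fields(partition_start: datetime) -> dict:
    """Derive partition_end and partition_key from one base timestamp,
    so the three values can never drift apart."""
    return {
        "partition_start": partition_start,
        "partition_end": partition_start + timedelta(hours=23, minutes=59, seconds=59),
        "partition_key": partition_start.date().isoformat(),
    }

start = datetime(2026, 4, 1, 8, 30, tzinfo=timezone.utc)
fields = derive_partition_fields(start)
print(fields["partition_key"])  # 2026-04-01
```

Generating each field with an independent `timezone.now()` or Faker call, as the current factory does, loses exactly this invariant whenever the clock ticks over a day boundary between calls.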

Comment on lines +46 to +48
{% if job.pseudonym_salt %}
<dt class="col-sm-3">Pseudonym Salt</dt>
<dd class="col-sm-9">{{ job.pseudonym_salt }}</dd>

⚠️ Potential issue | 🟠 Major

Do not render the pseudonym salt in clear text.

This exposes the linkage secret to anyone who can open the job detail page. If the UI needs to show that linked pseudonymization is enabled, render only that a salt is configured, not the value itself.

🔒 Suggested change
-        {% if job.pseudonym_salt %}
-        <dt class="col-sm-3">Pseudonym Salt</dt>
-        <dd class="col-sm-9">{{ job.pseudonym_salt }}</dd>
-        {% endif %}
+        {% if job.pseudonym_salt %}
+        <dt class="col-sm-3">Pseudonym Salt</dt>
+        <dd class="col-sm-9">Configured</dd>
+        {% endif %}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/templates/mass_transfer/mass_transfer_job_detail.html`
around lines 46 - 48, The template currently renders the secret value
job.pseudonym_salt in clear text in mass_transfer_job_detail.html; change the UI
to never display the salt itself—replace the {{ job.pseudonym_salt }} output
with a non-secret indicator (e.g., "Configured" or a boolean/lock icon) so the
template only conveys that a pseudonym salt exists without exposing its value;
update the dt/dd block referencing job.pseudonym_salt accordingly.

Comment on lines +10 to +19
@register.filter
def volume_status_css_class(status: str) -> str:
css_classes = {
MassTransferVolume.Status.PENDING: "text-secondary",
MassTransferVolume.Status.EXPORTED: "text-info",
MassTransferVolume.Status.CONVERTED: "text-success",
MassTransferVolume.Status.SKIPPED: "text-muted",
MassTransferVolume.Status.ERROR: "text-danger",
}
return css_classes.get(MassTransferVolume.Status(status), "text-secondary")

⚠️ Potential issue | 🟡 Minor

Potential ValueError if status is not a valid enum member.

MassTransferVolume.Status(status) is called before the .get() lookup. If status is an invalid value not in the enum, this will raise a ValueError instead of returning the default "text-secondary".

🛡️ Proposed defensive fix
 @register.filter
 def volume_status_css_class(status: str) -> str:
     css_classes = {
         MassTransferVolume.Status.PENDING: "text-secondary",
         MassTransferVolume.Status.EXPORTED: "text-info",
         MassTransferVolume.Status.CONVERTED: "text-success",
         MassTransferVolume.Status.SKIPPED: "text-muted",
         MassTransferVolume.Status.ERROR: "text-danger",
     }
-    return css_classes.get(MassTransferVolume.Status(status), "text-secondary")
+    try:
+        return css_classes.get(MassTransferVolume.Status(status), "text-secondary")
+    except ValueError:
+        return "text-secondary"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/templatetags/mass_transfer_extras.py` around lines 10 -
19, The function volume_status_css_class currently calls
MassTransferVolume.Status(status) which raises ValueError for invalid values;
update volume_status_css_class to handle invalid status values by wrapping the
enum cast in a try/except (catch ValueError) and return the default
"text-secondary" on error, or alternatively check membership without casting
(e.g., compare against MassTransferVolume.Status members or map by raw status
string) so that css_classes.get uses a safe key; apply this change inside
volume_status_css_class referencing MassTransferVolume.Status and css_classes.

Comment on lines +5 to +9
def create_mass_transfer_group():
group = GroupFactory.create(name="Radiologists")
add_permission(group, "mass_transfer", "add_masstransferjob")
add_permission(group, "mass_transfer", "view_masstransferjob")
return group

⚠️ Potential issue | 🟠 Major

Avoid fixed group name to prevent test collisions.

At Line 6, using a constant "Radiologists" can break when this helper is called multiple times in one DB lifecycle (unique Group.name).

Proposed fix
-def create_mass_transfer_group():
-    group = GroupFactory.create(name="Radiologists")
+def create_mass_transfer_group(name: str | None = None):
+    group = GroupFactory.create(name=name) if name else GroupFactory.create()
     add_permission(group, "mass_transfer", "add_masstransferjob")
     add_permission(group, "mass_transfer", "view_masstransferjob")
     return group
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/utils/testing_helpers.py` around lines 5 - 9, The helper
create_mass_transfer_group currently uses a fixed Group name ("Radiologists")
which can cause unique constraint failures; change it to create a uniquely-named
group (e.g., let GroupFactory.generate or GroupFactory.create with no name or
append a random/uuid suffix) so repeated calls won't collide, and keep the same
permission additions (add_permission calls) using the created group's instance;
update references to GroupFactory.create and the Group.name usage in
create_mass_transfer_group accordingly.

Comment thread adit/mass_transfer/views.py Outdated
Comment on lines +139 to +140
if job.pseudonym_salt:
response.write(f"# Pseudonym salt: {job.pseudonym_salt}\n")

⚠️ Potential issue | 🟠 Major

Don’t ship the pseudonym salt in the CSV export.

The salt is the secret that makes deterministic pseudonyms reusable across jobs. Emitting it in every download lets anyone with the CSV correlate future exports that reuse the same salt, which weakens the privacy boundary.

Suggested fix
         writer = csv.writer(response)
-        if job.pseudonym_salt:
-            response.write(f"# Pseudonym salt: {job.pseudonym_salt}\n")
         writer.writerow(self.COLUMNS)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@adit/mass_transfer/views.py` around lines 139 - 140, Remove the line that
emits the secret pseudonym salt into the CSV download: do not call
response.write with job.pseudonym_salt (the check for job.pseudonym_salt and the
response.write(f"# Pseudonym salt: {job.pseudonym_salt}\n") should be deleted).
Ensure no other code in this view or export path writes or exposes
job.pseudonym_salt to the response or included export metadata; if the salt must
be retained for internal reuse, keep it only in job storage and never serialize
it to output.

Comment thread docker-compose.staging.yml Outdated
Comment on lines +43 to +55
command: >
bash -c "
wait-for-it -s postgres.local:5432 -t 60 &&
./manage.py migrate &&
./manage.py create_superuser &&
./manage.py create_example_users &&
./manage.py create_example_groups &&
./manage.py populate_example_data &&
wait-for-it -s orthanc1.local:6501 -t 60 &&
./manage.py populate_orthancs &&
./manage.py retry_stalled_jobs &&
./manage.py runserver 0.0.0.0:8000
"

⚠️ Potential issue | 🟠 Major

Move one-off bootstrap out of the replicated web service.

When WEB_STAGING_REPLICAS is greater than 1, every web task will run migrations, user/group/example-data creation, Orthanc seeding, and retry_stalled_jobs before starting runserver. That will race and can duplicate bootstrap work; put those commands in the one-shot init service and let web only serve traffic.

🐳 Suggested split
   web:
     <<: *default-app
     image: adit_staging-web:latest
     ports:
       - ${WEB_STAGING_PORT:-8001}:8000
       - ${REMOTE_DEBUGGING_STAGING_PORT:-5679}:5678
     command: >
       bash -c "
         wait-for-it -s postgres.local:5432 -t 60 &&
-        ./manage.py migrate &&
-        ./manage.py create_superuser &&
-        ./manage.py create_example_users &&
-        ./manage.py create_example_groups &&
-        ./manage.py populate_example_data &&
-        wait-for-it -s orthanc1.local:6501 -t 60 &&
-        ./manage.py populate_orthancs &&
-        ./manage.py retry_stalled_jobs &&
         ./manage.py runserver 0.0.0.0:8000
       "

Move the removed commands into the init service so only one task performs bootstrap.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docker-compose.staging.yml` around lines 43 - 55, Remove all one-off
bootstrap commands from the replicated web service command and put them into a
new/non-replicated one-shot init service so only one container runs migrations
and seeding; specifically move these commands: ./manage.py migrate, ./manage.py
create_superuser, ./manage.py create_example_users, ./manage.py
create_example_groups, ./manage.py populate_example_data, ./manage.py
populate_orthancs and ./manage.py retry_stalled_jobs (and their wait-for-it
waits: postgres.local:5432 and orthanc1.local:6501) into the init service
command, keep the web service command only as ./manage.py runserver 0.0.0.0:8000
(and any minimal readiness wait it needs), ensure the init service uses the same
image/env as web and is configured as a one-shot (not replicated) service so
WEB_STAGING_REPLICAS > 1 won’t re-run bootstrap.
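The suggested split above removes the bootstrap commands from web but does not show the receiving service. A minimal sketch of the one-shot init service (the `*default-app` anchor and image tag are assumed from the existing compose file; under Swarm the one-shot semantics would instead be expressed via `deploy.restart_policy.condition: none`):

```yaml
  init:
    <<: *default-app
    image: adit_staging-web:latest
    restart: "no"  # one-shot: run bootstrap once, then exit
    command: >
      bash -c "
        wait-for-it -s postgres.local:5432 -t 60 &&
        ./manage.py migrate &&
        ./manage.py create_superuser &&
        ./manage.py create_example_users &&
        ./manage.py create_example_groups &&
        ./manage.py populate_example_data &&
        wait-for-it -s orthanc1.local:6501 -t 60 &&
        ./manage.py populate_orthancs &&
        ./manage.py retry_stalled_jobs
      "
```

With this in place, scaling web via WEB_STAGING_REPLICAS only multiplies runserver processes, never the migration or seeding work.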

@samuelvkwong samuelvkwong merged commit 32573b1 into main Apr 2, 2026
2 checks passed
@samuelvkwong samuelvkwong deleted the fix-healthcheck-https branch April 2, 2026 11:09