
feat(domain): implement hexagonal port traits (task 04) #4

Merged
mpiton merged 5 commits into main from feat/04-domain-ports
Apr 8, 2026

Conversation


@mpiton mpiton commented Apr 7, 2026

Summary

  • Define all 14 hexagonal architecture port traits (12 driven + 2 driving) in domain/ports/
  • Add 5 supporting domain model modules needed by port signatures (views, config, credential, http, meta)
  • CQRS separation: DownloadRepository (write, domain aggregates) vs DownloadReadRepository (read, flattened DTOs)
  • Driving ports use RPITIT async (Rust 1.75+, std-only, no external crates)
  • Custom Debug on Credential to redact passwords (same pattern as Url)
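The redaction pattern in the last bullet can be sketched as follows. This is a minimal illustration, not the PR's actual code: the field names, the constructor, and the `<redacted>` placeholder are assumptions.

```rust
use std::fmt;

pub struct Credential {
    pub username: String,
    password: String,
}

impl Credential {
    pub fn new(username: impl Into<String>, password: impl Into<String>) -> Self {
        Self { username: username.into(), password: password.into() }
    }

    pub fn password(&self) -> &str {
        &self.password
    }
}

// Hand-written Debug so `{:?}` output never leaks the password into logs.
impl fmt::Debug for Credential {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.debug_struct("Credential")
            .field("username", &self.username)
            .field("password", &"<redacted>")
            .finish()
    }
}

fn main() {
    let cred = Credential::new("alice", "fake-password-for-tests");
    let dbg = format!("{cred:?}");
    assert!(dbg.contains("<redacted>"));
    assert!(!dbg.contains("fake-password-for-tests"));
    println!("{dbg}");
}
```

Deriving `Debug` would print every field, so the struct opts into a manual implementation; the accessor keeps the raw password reachable for adapters that need it.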

Changes

New port traits (domain/ports/driven/):
DownloadRepository, DownloadReadRepository, HistoryRepository, StatsRepository, PluginLoader, CredentialStore, ConfigStore, FileStorage, HttpClient, DownloadEngine, EventBus, ClipboardObserver

New driving traits (domain/ports/driving/):
CommandHandler<C: Command>, QueryHandler<Q: Query>
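The RPITIT shape of these driving traits can be sketched as below. The `StartDownload` command, `LoggingHandler`, and the exact supertraits on `Command` are illustrative assumptions; the `block_on_ready` helper exists only so the example runs without an async runtime, keeping with the PR's std-only constraint.

```rust
use std::future::Future;
use std::pin::pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

/// Marker trait for commands (exact bounds in the PR may differ).
pub trait Command: Send + 'static {}

/// Driving port using return-position impl Trait in trait (RPITIT, Rust 1.75+).
pub trait CommandHandler<C: Command>: Send + Sync {
    type Output: Send;
    fn handle(&self, command: C) -> impl Future<Output = Self::Output> + Send;
}

/// Illustrative command; not taken from the PR.
pub struct StartDownload {
    pub url: String,
}
impl Command for StartDownload {}

pub struct LoggingHandler;
impl CommandHandler<StartDownload> for LoggingHandler {
    type Output = String;
    fn handle(&self, command: StartDownload) -> impl Future<Output = String> + Send {
        async move { format!("started {}", command.url) }
    }
}

// Std-only noop waker so an immediately-ready future can be polled without an executor.
const RAW_WAKER: RawWaker = RawWaker::new(std::ptr::null(), &VTABLE);
const VTABLE: RawWakerVTable = RawWakerVTable::new(|_| RAW_WAKER, |_| {}, |_| {}, |_| {});

pub fn block_on_ready<F: Future>(fut: F) -> F::Output {
    let waker = unsafe { Waker::from_raw(RAW_WAKER) };
    let mut cx = Context::from_waker(&waker);
    let mut fut = pin!(fut);
    match fut.as_mut().poll(&mut cx) {
        Poll::Ready(value) => value,
        Poll::Pending => panic!("future was not immediately ready"),
    }
}

fn main() {
    let handler = LoggingHandler;
    let out = block_on_ready(handler.handle(StartDownload { url: "https://example.com/f.iso".into() }));
    assert_eq!(out, "started https://example.com/f.iso");
    println!("{out}");
}
```

The key point is that `handle` returns `impl Future<... > + Send` directly in the trait, so no external crate (e.g. `async-trait`) is needed and the `Send` bound survives for multi-threaded executors.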

New domain types (domain/model/):
DownloadView, DownloadDetailView, HistoryEntry, StatsView, SegmentView, DownloadFilter, SortField, AppConfig, ConfigPatch, Credential, HttpResponse, DownloadMeta, SegmentMeta

Design decisions

  • Driven ports are synchronous: adapters handle async internally (e.g. tokio::spawn). Domain stays pure std.
  • DownloadEngine is fire-and-forget: start() spawns async work, outcomes reported via EventBus.
  • DomainError used as error type: AppError will be introduced in task 05.
  • View types in domain: needed for port signatures. Read-model DTOs with pub fields (data-only, no invariants).
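The fire-and-forget contract from the first two bullets can be sketched std-only as follows. This is an assumption-laden reduction: the real `DownloadEngine::start` takes a domain `Download` and a real adapter might use `tokio::spawn`; the `u32` id, channel-backed bus, and event variants here are illustrative.

```rust
use std::sync::mpsc::{channel, Sender};
use std::sync::{Arc, Mutex};
use std::thread;

#[derive(Debug, Clone, PartialEq)]
pub enum DomainEvent {
    Progress { id: u32, bytes: u64 },
    Complete { id: u32 },
}

pub trait EventBus: Send + Sync {
    fn publish(&self, event: DomainEvent);
}

/// Synchronous driven port: start() returns immediately; outcomes flow through the bus.
pub trait DownloadEngine: Send + Sync {
    fn start(&self, id: u32, bus: Arc<dyn EventBus>);
}

/// Bus adapter that forwards events onto an mpsc channel.
struct ChannelBus {
    tx: Mutex<Sender<DomainEvent>>,
}
impl EventBus for ChannelBus {
    fn publish(&self, event: DomainEvent) {
        self.tx.lock().unwrap().send(event).unwrap();
    }
}

/// Engine adapter that does its async work on a background thread.
struct ThreadEngine;
impl DownloadEngine for ThreadEngine {
    fn start(&self, id: u32, bus: Arc<dyn EventBus>) {
        thread::spawn(move || {
            bus.publish(DomainEvent::Progress { id, bytes: 1024 });
            bus.publish(DomainEvent::Complete { id });
        });
        // No join: the caller is not blocked; completion is reported via the bus.
    }
}

fn main() {
    let (tx, rx) = channel();
    ThreadEngine.start(7, Arc::new(ChannelBus { tx: Mutex::new(tx) }));
    // start() has already returned; events arrive asynchronously on the channel.
    let events: Vec<DomainEvent> = rx.iter().take(2).collect();
    assert_eq!(events[1], DomainEvent::Complete { id: 7 });
    println!("received {} events", events.len());
}
```

Because the port itself is synchronous, the domain crate never names a runtime; only adapters decide whether the background work is a thread, a tokio task, or something else.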

Test plan

  • 18 port-specific tests (mock implementations, Send+Sync bounds, functional CRUD)
  • 94 total tests pass (cargo test)
  • cargo clippy -- -D warnings clean
  • Zero external crate imports in domain/ports/
  • Adversarial review resolved (Credential debug redaction, 'static on EventBus, PluginLoader fallibility, SegmentView typing)

Summary by cubic

Implements all domain hexagonal ports and DTOs, setting clear CQRS read/write boundaries in a std-only domain ready for adapters. Also adds a sort direction, tri-state config patches, safer HTTP headers and defaults, and Debug redaction of sensitive values; removes specs/; and updates tests to use fake credentials to satisfy GitGuardian.

  • New Features

    • Driven ports: DownloadRepository/DownloadReadRepository, HistoryRepository, StatsRepository, PluginLoader, CredentialStore, ConfigStore, FileStorage, HttpClient, DownloadEngine, EventBus, ClipboardObserver.
    • Driving ports: CommandHandler<C: Command> and QueryHandler<Q: Query> using RPITIT async; all ports are Send + Sync, and driven ports are synchronous.
    • Types: views (DownloadView, DownloadDetailView, SegmentView, HistoryEntry, StatsView, filters + SortField + SortDirection + SortOrder), config (AppConfig, ConfigPatch), auth (Credential with redacted Debug), HTTP (HttpResponse with multi-value headers), resume meta (DownloadMeta with redacted URL, SegmentMeta).
    • Design: DownloadEngine is fire-and-forget with outcomes via EventBus; errors use DomainError. Tests include in-memory mocks, Send/Sync checks, and basic CRUD/flow coverage. Removed specs/.
  • Bug Fixes

    • Sorting: added SortDirection and SortOrder.
    • Config: AppConfig.download_dir is now Option<String>; ConfigPatch.download_dir supports tri-state updates.
    • HTTP: multi-value headers via HashMap<String, Vec<String>>; content-length parsing trims whitespace.
    • Docs/Tests/Security: clarified Unix seconds in HistoryRepository docs; ConfigPatch mock updates all fields; test credentials switched to obvious fakes to avoid GitGuardian false positives.
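The tri-state `ConfigPatch` semantics above can be illustrated with a reduced sketch (field set trimmed for brevity; the actual structs carry more fields). The convention: `None` means no change, `Some(None)` clears the value, `Some(Some(x))` sets it.

```rust
#[derive(Debug, Clone, PartialEq, Default)]
pub struct AppConfig {
    pub download_dir: Option<String>,
    pub speed_limit_bytes_per_sec: Option<u64>,
}

#[derive(Debug, Clone, Default)]
pub struct ConfigPatch {
    /// None = no change, Some(v) = set download_dir to v (possibly None).
    pub download_dir: Option<Option<String>>,
    /// None = no change, Some(None) = clear limit (unlimited), Some(Some(x)) = set to x bytes/sec.
    pub speed_limit_bytes_per_sec: Option<Option<u64>>,
}

impl AppConfig {
    /// Apply a patch: outer Option decides whether to touch the field at all.
    pub fn apply(&mut self, patch: ConfigPatch) {
        if let Some(dir) = patch.download_dir {
            self.download_dir = dir;
        }
        if let Some(limit) = patch.speed_limit_bytes_per_sec {
            self.speed_limit_bytes_per_sec = limit;
        }
    }
}

fn main() {
    let mut cfg = AppConfig {
        download_dir: Some("/tmp/downloads".into()),
        speed_limit_bytes_per_sec: Some(1_000_000),
    };
    // Clear the speed limit; leave download_dir untouched.
    cfg.apply(ConfigPatch { speed_limit_bytes_per_sec: Some(None), ..Default::default() });
    assert_eq!(cfg.speed_limit_bytes_per_sec, None);
    assert_eq!(cfg.download_dir.as_deref(), Some("/tmp/downloads"));
}
```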

Written for commit 4bdb073. Summary will update on new commits.

Summary by CodeRabbit

  • New Features
    • Persistent app settings, secure credential storage, clipboard URL monitoring, plugin-based URL handlers, resumable multi-segment downloads, download history & statistics, and richer list/detail download views.
  • Architecture / APIs
    • New domain interfaces to support HTTP, file I/O, eventing, download engine, repositories, plugin loading, and command/query handling.
  • Tests
    • Comprehensive in-memory mocks and unit tests for driven interfaces.
  • Documentation
    • Removed legacy project/specs README.

Define all driven and driving port traits for the hexagonal architecture:

Driven ports (12 traits):
- DownloadRepository (write), DownloadReadRepository (read/CQRS)
- HistoryRepository, StatsRepository
- PluginLoader, CredentialStore, ConfigStore
- FileStorage, HttpClient, DownloadEngine
- EventBus, ClipboardObserver

Driving ports (2 traits):
- CommandHandler<C: Command> with RPITIT async
- QueryHandler<Q: Query> with RPITIT async

Supporting domain types:
- views.rs: DownloadView, DownloadDetailView, HistoryEntry, StatsView
- config.rs: AppConfig, ConfigPatch
- credential.rs: Credential (custom Debug to redact password)
- http.rs: HttpResponse
- meta.rs: DownloadMeta, SegmentMeta

All traits use Send + Sync, std-only (no external crates in domain),
documented with /// doc comments. 18 tests verify mock implementations
compile and satisfy trait bounds.
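The Send + Sync bound checks mentioned above are typically done at compile time with a zero-cost helper function. A reduced sketch, assuming a simplified `ConfigStore` (the real port's method set differs):

```rust
use std::collections::HashMap;
use std::sync::Mutex;

pub trait ConfigStore: Send + Sync {
    fn get(&self, key: &str) -> Option<String>;
    fn set(&self, key: &str, value: &str);
}

/// In-memory mock using interior mutability so the port can take `&self`.
#[derive(Default)]
struct InMemoryConfigStore {
    values: Mutex<HashMap<String, String>>,
}

impl ConfigStore for InMemoryConfigStore {
    fn get(&self, key: &str) -> Option<String> {
        self.values.lock().unwrap().get(key).cloned()
    }
    fn set(&self, key: &str, value: &str) {
        self.values.lock().unwrap().insert(key.into(), value.into());
    }
}

/// Compile-time proof that a type satisfies the Send + Sync port bounds:
/// if the bound is not met, this line fails to compile rather than at runtime.
fn assert_send_sync<T: Send + Sync>() {}

fn main() {
    assert_send_sync::<InMemoryConfigStore>();
    let store = InMemoryConfigStore::default();
    store.set("theme", "dark");
    assert_eq!(store.get("theme").as_deref(), Some("dark"));
    println!("bounds satisfied, functional check passed");
}
```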
@gitguardian

gitguardian bot commented Apr 7, 2026

⚠️ GitGuardian has uncovered 1 secret following the scan of your pull request.

Please consider investigating the findings and remediating the incidents. Failure to do so may lead to compromising the associated services or software components.

🔎 Detected hardcoded secret in your pull request
  • GitGuardian id: 29818992 | status: Triggered | secret type: Generic Password | commit: d9ea78b | file: src-tauri/src/domain/ports/driven/tests.rs
🛠 Guidelines to remediate hardcoded secrets
  1. Understand the implications of revoking this secret by investigating where it is used in your code.
  2. Replace and store your secret safely, following secret-management best practices.
  3. Revoke and rotate this secret.
  4. If possible, rewrite git history. Rewriting git history is not a trivial act. You might completely break other contributing developers' workflow and you risk accidentally deleting legitimate data.

🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.

@github-actions github-actions bot added documentation Improvements or additions to documentation rust labels Apr 7, 2026
@coderabbitai

coderabbitai bot commented Apr 7, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.


No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 0ae78b85-e458-46a6-abf1-e3d5524d7576

📥 Commits

Reviewing files that changed from the base of the PR and between 07f6161 and 4bdb073.

📒 Files selected for processing (1)
  • src-tauri/src/domain/ports/driven/tests.rs
✅ Files skipped from review due to trivial changes (1)
  • src-tauri/src/domain/ports/driven/tests.rs

📝 Walkthrough

Walkthrough

Adds a new domain ports layer (driven + driving), many domain models and view DTOs, ~12 driven-port trait interfaces, async command/query handler traits, extensive in-memory mocks/tests, and deletes specs/README.md.

Changes

Cohort / File(s) / Summary

  • Documentation Removal (specs/README.md): Deleted the entire specs document (project overview, task checklist, diagrams, PRD mapping; 126 lines).
  • Domain Root (src-tauri/src/domain/mod.rs): Exported the new ports submodule.
  • Domain Model Exports (src-tauri/src/domain/model/mod.rs): Added new submodules and re-exports for config, credential, http, meta, and views types.
  • New Domain Models (src-tauri/src/domain/model/{config.rs,credential.rs,http.rs,meta.rs,views.rs}): Introduced AppConfig/ConfigPatch, Credential (password-redacted Debug), HttpResponse helpers, DownloadMeta/SegmentMeta, and numerous read-model/view DTOs (DownloadView, DownloadDetailView, StatsView, filters/sorting, etc.).
  • Ports Aggregators (src-tauri/src/domain/ports/mod.rs, .../driven/mod.rs, .../driving/mod.rs): New ports module structure; re-exports driving/driven submodules and their public traits/types.
  • Driven Ports, interfaces (src-tauri/src/domain/ports/driven/{clipboard_observer.rs,config_store.rs,credential_store.rs,download_engine.rs,download_read_repository.rs,download_repository.rs,event_bus.rs,file_storage.rs,history_repository.rs,http_client.rs,plugin_loader.rs,stats_repository.rs}): Added 12 trait-based driven ports for clipboard, config, credential store, download engine, read/write repos, event bus, file storage/meta, HTTP client (range/head), plugin loader, and history and stats repositories.
  • Driving Ports, CQRS (src-tauri/src/domain/ports/driving/{command_handler.rs,query_handler.rs}): Added Command/Query marker traits and generic async CommandHandler/QueryHandler traits with associated outputs and async handle signatures.
  • In-Memory Mocks & Tests (src-tauri/src/domain/ports/driven/tests.rs): Large #[cfg(test)] module with in-memory/mock implementations for driven ports and extensive unit tests exercising CRUD, metadata I/O, HTTP range behavior, config patching, credential handling, history/stats, clipboard draining, plugin lifecycle, and Send+Sync assertions.

Sequence Diagram(s)

sequenceDiagram
  participant UI as "UI / Caller"
  participant CmdH as "CommandHandler"
  participant Repo as "DownloadRepository"
  participant Engine as "DownloadEngine"
  participant FS as "FileStorage"
  participant HTTP as "HttpClient"
  participant Bus as "EventBus"

  UI->>CmdH: StartDownload(command)
  CmdH->>Repo: save(download)
  CmdH->>Engine: start(&download)
  Engine->>FS: create_file(path, size)
  Engine->>HTTP: head(url)
  Engine->>HTTP: supports_range(url)
  loop per segment
    Engine->>HTTP: get_range(url, start, end)
    Engine->>FS: write_segment(path, offset, data)
    Engine->>Bus: publish(DomainEvent::Progress)
  end
  Engine->>Bus: publish(DomainEvent::Complete)
  Bus->>Repo: (subscribers may update state)

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Poem

🐇 I hopped through traits and tiny specs today,

New models sprouted where the README went away.
Commands bound to handlers, ports opened wide,
Mocks whispered secrets of IO they hide.
A rabbit cheers this tidy hexagonal stride.

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage | ⚠️ Warning: coverage is 26.92%, below the required 80.00% threshold. Resolution: write docstrings for the functions missing them.
✅ Passed checks (2 passed)
  • Description Check | ✅ Passed (check skipped: CodeRabbit’s high-level summary is enabled).
  • Title Check | ✅ Passed: the title 'feat(domain): implement hexagonal port traits (task 04)' clearly and accurately describes the main addition of hexagonal port trait definitions in the domain layer.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


Comment @coderabbitai help to get the list of available commands and usage tips.

@greptile-apps

greptile-apps bot commented Apr 7, 2026

Greptile Summary

This PR introduces all 14 hexagonal-architecture port traits for the domain layer (12 driven, 2 driving) along with 5 supporting model types and 18 focused tests. The design is clean: driven ports are synchronous, driving ports use RPITIT async (Rust 1.75+), CQRS read/write split is well-motivated, and Credential debug redaction is correctly implemented. All remaining findings are minor P2 style suggestions.

Confidence Score: 5/5

Safe to merge — all remaining findings are P2 style suggestions with no correctness impact.

The implementation is structurally sound, well-tested (18 port-specific tests, 94 total passing), and clippy-clean. The three P2 issues (SortField missing direction, download_dir empty-string default, HashMap header loss) are design limitations that can be addressed without breaking changes before the adapter layer is written.

src-tauri/src/domain/model/views.rs and src-tauri/src/domain/model/config.rs have minor design choices worth revisiting before adapters consume them.

Important Files Changed

  • src-tauri/src/domain/ports/driven/tests.rs: Thorough 18-test mock suite covering all 12 driven ports with compile-time Send+Sync assertions.
  • src-tauri/src/domain/ports/driving/command_handler.rs: Clean RPITIT async CommandHandler driving port with correct Send bounds on the returned Future.
  • src-tauri/src/domain/ports/driving/query_handler.rs: Clean RPITIT async QueryHandler driving port, symmetric with CommandHandler.
  • src-tauri/src/domain/model/credential.rs: Custom Debug implementation correctly redacts the password field; a test verifies the redaction.
  • src-tauri/src/domain/model/views.rs: SortField enum lacks a sort direction; all view types (DownloadView, HistoryEntry, StatsView) are otherwise well-structured.
  • src-tauri/src/domain/model/config.rs: download_dir defaults to an empty string; ConfigPatch correctly uses Option<Option<u64>> for the nullable speed_limit.
  • src-tauri/src/domain/model/http.rs: Headers stored as HashMap<String, String> silently drop duplicate values; case-insensitive lookup re-allocates on every call.
  • src-tauri/src/domain/ports/driven/event_bus.rs: 'static subscriber bound is intentional; no unsubscribe is a documented limitation.
  • src-tauri/src/domain/ports/driven/clipboard_observer.rs: get_urls drains the buffer via &self using interior mutability (documented behavior, idiomatic Rust).
  • src-tauri/src/domain/ports/driven/download_repository.rs: Clean CQRS write-side repository port with upsert-style save; no issues.
  • src-tauri/src/domain/ports/driven/download_read_repository.rs: find_downloads lacks a sort direction parameter alongside SortField; otherwise a solid CQRS read port.
  • src-tauri/src/domain/ports/driven/download_engine.rs: Fire-and-forget synchronous port; async work is delegated to the adapter via tokio::spawn, with outcomes reported via EventBus.

Flowchart

%%{init: {'theme': 'neutral'}}%%
flowchart TD
    UI["Frontend / Tauri IPC"] -->|"Command / Query"| DrivingPorts

    subgraph DrivingPorts["Driving Ports (primary)"]
        CH["CommandHandler<C: Command>"]
        QH["QueryHandler<Q: Query>"]
    end

    CH --> AppLayer["Application Layer\n(task 05)"]
    QH --> AppLayer

    subgraph DrivenPorts["Driven Ports (secondary)"]
        DR["DownloadRepository"]
        DRR["DownloadReadRepository"]
        DE["DownloadEngine"]
        EB["EventBus"]
        FS["FileStorage"]
        HC["HttpClient"]
        CS["CredentialStore"]
        CFG["ConfigStore"]
        HR["HistoryRepository"]
        SR["StatsRepository"]
        PL["PluginLoader"]
        CO["ClipboardObserver"]
    end

    AppLayer -->|"orchestrates"| DrivenPorts
    EB -->|"DomainEvent"| UI



Comment on lines +121 to +130
#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
pub enum SortField {
#[default]
CreatedAt,
FileName,
FileSize,
Progress,
Speed,
State,
}

P2 SortField has no direction variant

find_downloads accepts Option<SortField> but there is no way for the caller to request descending order. Every sort will behave as ascending, making it impossible to implement "newest first" or "largest first" UI orderings without a breaking port change later.
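One non-breaking way to address this is to pair the field with a direction, as the PR's summary ultimately reports adding. A sketch, assuming a `SortOrder` pair would later be threaded through `find_downloads` (the names mirror the summary but the exact shapes are illustrative):

```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
pub enum SortField {
    #[default]
    CreatedAt,
    FileName,
    FileSize,
    Progress,
    Speed,
    State,
}

#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
pub enum SortDirection {
    #[default]
    Ascending,
    Descending,
}

/// Field + direction pair that a read repository could accept instead of a bare SortField.
#[derive(Debug, Clone, Copy, Default)]
pub struct SortOrder {
    pub field: SortField,
    pub direction: SortDirection,
}

fn main() {
    // "newest first" becomes expressible without a breaking port change later.
    let newest_first = SortOrder { field: SortField::CreatedAt, direction: SortDirection::Descending };
    assert_eq!(newest_first.direction, SortDirection::Descending);
    assert_eq!(SortOrder::default().direction, SortDirection::Ascending);
}
```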


Comment on lines +24 to +26
fn default() -> Self {
Self {
download_dir: String::new(),

P2 download_dir defaults to empty string

AppConfig::default() sets download_dir: String::new(). Any code that consumes this default without first checking for an empty path will silently resolve to the process's current working directory (or produce an unhelpful I/O error). Consider Option<String> or documenting that an empty value must be replaced before the store is used.


Comment on lines +14 to +19
#[derive(Debug, Clone, PartialEq)]
pub struct HttpResponse {
pub status_code: u16,
pub headers: HashMap<String, String>,
pub body: Vec<u8>,
}

P2 HashMap<String, String> silently drops duplicate HTTP headers

HTTP allows multiple values for the same header name (e.g., Set-Cookie, Link). Storing headers in a HashMap<String, String> forces adapters to silently discard extras. This is fine for content-length/accept-ranges today, but changing to HashMap<String, Vec<String>> now avoids a breaking port contract change if a plugin later needs multi-value headers.
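A sketch of the suggested multi-value representation follows. The `add_header`/`header` helpers are illustrative, not the PR's actual API; names are lowercased on insert so lookups stay consistent.

```rust
use std::collections::HashMap;

#[derive(Debug, Default)]
pub struct HttpResponse {
    pub status_code: u16,
    // Vec<String> per name keeps repeated headers like Set-Cookie intact.
    pub headers: HashMap<String, Vec<String>>,
    pub body: Vec<u8>,
}

impl HttpResponse {
    /// Append a value; a repeated name accumulates rather than overwrites.
    pub fn add_header(&mut self, name: &str, value: &str) {
        self.headers
            .entry(name.to_ascii_lowercase())
            .or_default()
            .push(value.to_string());
    }

    /// First value for a name, matched case-insensitively.
    pub fn header(&self, name: &str) -> Option<&str> {
        self.headers
            .get(&name.to_ascii_lowercase())
            .and_then(|values| values.first())
            .map(String::as_str)
    }
}

fn main() {
    let mut resp = HttpResponse { status_code: 200, ..Default::default() };
    resp.add_header("Set-Cookie", "a=1");
    resp.add_header("set-cookie", "b=2"); // second value is kept, not dropped
    assert_eq!(resp.headers["set-cookie"].len(), 2);
    assert_eq!(resp.header("SET-COOKIE"), Some("a=1"));
    println!("{} cookie values retained", resp.headers["set-cookie"].len());
}
```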


Specs are gitignored and should not be committed.

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 5

🧹 Nitpick comments (12)
src-tauri/src/domain/model/views.rs (1)

133-133: Use a fixed-width integer for state counts.

On Line 133, usize is architecture-dependent. For DTO/port contracts, a fixed-width type (u64) is safer and more portable.

Proposed change
-pub type StateCountMap = HashMap<DownloadState, usize>;
+pub type StateCountMap = HashMap<DownloadState, u64>;
src-tauri/src/domain/model/http.rs (1)

27-33: Avoid per-lookup allocations in header matching.

On Line 29 and Line 32, to_lowercase() allocates for every call/key. Prefer ASCII-insensitive comparison directly.

Proposed change
     pub fn header(&self, name: &str) -> Option<&str> {
-        let lower = name.to_lowercase();
         self.headers
             .iter()
-            .find(|(k, _)| k.to_lowercase() == lower)
+            .find(|(k, _)| k.eq_ignore_ascii_case(name))
             .map(|(_, v)| v.as_str())
     }
src-tauri/src/domain/ports/driven/http_client.rs (1)

15-23: Avoid a second round-trip for range capability checks.

Since head() already returns headers, supports_range() can default to parsing Accept-Ranges from that response. This reduces adapter duplication and inconsistent behavior between the two methods.

Proposed default implementation
 pub trait HttpClient: Send + Sync {
     /// Send a HEAD request to retrieve response headers without body.
     fn head(&self, url: &str) -> Result<HttpResponse, DomainError>;

     /// Fetch a byte range from the URL (inclusive start, inclusive end).
     fn get_range(&self, url: &str, start: u64, end: u64) -> Result<Vec<u8>, DomainError>;

     /// Check whether the server supports HTTP Range requests.
-    fn supports_range(&self, url: &str) -> Result<bool, DomainError>;
+    fn supports_range(&self, url: &str) -> Result<bool, DomainError> {
+        let response = self.head(url)?;
+        Ok(response
+            .header("accept-ranges")
+            .is_some_and(|v| v.eq_ignore_ascii_case("bytes")))
+    }
 }
src-tauri/src/domain/ports/driven/stats_repository.rs (1)

10-18: Clarify aggregation contract for StatsView completeness.

record_completed(bytes, avg_speed) only carries global counters, while StatsView also exposes daily_volumes and top_hosts. Please either (a) document that those fields are derived elsewhere (e.g., history queries) or (b) extend the input payload so adapters can update all aggregates consistently.

Proposed contract/doc tweak
-/// Statistics are typically computed from history data via SQL
-/// aggregations. The `record_completed` method updates running
-/// totals when a download finishes.
+/// Statistics may be computed from history data via SQL aggregations.
+/// `record_completed` is intended to update only global running totals
+/// (bytes/files/speeds). Per-day and per-host aggregates may be derived
+/// from history storage in `get_stats`.
src-tauri/src/domain/ports/driven/file_storage.rs (1)

23-29: Disambiguate path semantics for metadata methods.

For Line 23/26/29, it’s unclear whether path is the download file path (from which .vortex-meta is derived) or the metadata file path itself. Please make that explicit in docs (or parameter names) to avoid adapter divergence.

Proposed doc-level clarification
-    /// Read the `.vortex-meta` resume metadata for a download.
+    /// Read the `.vortex-meta` resume metadata derived from `download_path`.
     fn read_meta(&self, path: &Path) -> Result<Option<DownloadMeta>, DomainError>;

-    /// Write (or overwrite) the `.vortex-meta` resume metadata.
+    /// Write (or overwrite) the `.vortex-meta` resume metadata derived from `download_path`.
     fn write_meta(&self, path: &Path, meta: &DownloadMeta) -> Result<(), DomainError>;

-    /// Delete the `.vortex-meta` file (called after successful completion).
+    /// Delete the `.vortex-meta` file derived from `download_path` (after successful completion).
     fn delete_meta(&self, path: &Path) -> Result<(), DomainError>;
src-tauri/src/domain/model/meta.rs (1)

27-31: Clarify byte-range boundary semantics in SegmentMeta.

Line 30 (end_byte) is ambiguous (inclusive vs exclusive). Please document the convention explicitly to avoid off-by-one mismatches between adapters.

Proposed doc clarification
 /// Per-segment resume state within a download metadata file.
 #[derive(Debug, Clone, PartialEq)]
 pub struct SegmentMeta {
     pub id: u32,
+    /// Inclusive start offset (bytes).
     pub start_byte: u64,
+    /// Inclusive end offset (bytes).
     pub end_byte: u64,
     pub downloaded_bytes: u64,
     pub completed: bool,
 }
src-tauri/src/domain/ports/driven/download_read_repository.rs (1)

25-26: Use fixed-width integer types for pagination params.

Line 25 and Line 26 currently use usize; this makes the interface platform-width dependent. Prefer u64 (or u32) for stable cross-target semantics.

Proposed signature tweak
     fn find_downloads(
         &self,
         filter: Option<DownloadFilter>,
         sort: Option<SortField>,
-        limit: Option<usize>,
-        offset: Option<usize>,
+        limit: Option<u64>,
+        offset: Option<u64>,
     ) -> Result<Vec<DownloadView>, DomainError>;
src-tauri/src/domain/ports/driven/tests.rs (2)

138-144: Potential truncation on 32-bit platforms.

The cast size as usize on line 141 could truncate on 32-bit platforms where usize is 32-bit. This is unlikely to cause issues in tests, but for production mock code, consider using TryInto with appropriate error handling.


146-157: Silent no-op when write exceeds buffer bounds.

The write_segment implementation silently does nothing when end > file.len(). This could mask bugs in tests where callers expect the write to succeed or fail explicitly. Consider returning an error for out-of-bounds writes.

♻️ Proposed fix
         fn write_segment(&self, path: &Path, offset: u64, data: &[u8]) -> Result<(), DomainError> {
             let key = path.to_string_lossy().into_owned();
             let mut files = self.files.lock().unwrap();
             if let Some(file) = files.get_mut(&key) {
                 let start = offset as usize;
                 let end = start + data.len();
                 if end <= file.len() {
                     file[start..end].copy_from_slice(data);
+                } else {
+                    return Err(DomainError::Storage(format!(
+                        "write out of bounds: offset {} + len {} > file size {}",
+                        offset, data.len(), file.len()
+                    )));
                 }
+            } else {
+                return Err(DomainError::Storage(format!(
+                    "file not found: {}",
+                    key
+                )));
             }
             Ok(())
         }
src-tauri/src/domain/ports/driven/mod.rs (1)

1-1: Consider adding #[cfg(test)] to the mod tests declaration.

The mod tests; declaration isn't gated by #[cfg(test)], though the module contents are. Adding the attribute here would prevent the module from even being parsed in release builds, slightly improving compile times.

♻️ Proposed fix
+#[cfg(test)]
 mod tests;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src-tauri/src/domain/ports/driven/mod.rs` at line 1, The tests module is
declared with `mod tests;` but lacks a `#[cfg(test)]` gate, so add the
`#[cfg(test)]` attribute directly above the `mod tests;` declaration to prevent
the module from being parsed/compiled in non-test builds; locate the `mod
tests;` line in this file and prepend `#[cfg(test)]` so the module is only
included during `cargo test`.
src-tauri/src/domain/model/config.rs (2)

44-48: Document the Option<Option<u64>> semantics.

The speed_limit_bytes_per_sec: Option<Option<u64>> pattern enables three-way distinction, but this isn't documented. Consider adding a comment explaining:

  • None = no change
  • Some(None) = clear the limit (unlimited)
  • Some(Some(x)) = set limit to x bytes/sec
📝 Proposed documentation
     pub max_segments_per_download: Option<u32>,
+    /// Speed limit update.
+    /// - `None` → no change
+    /// - `Some(None)` → clear limit (unlimited)
+    /// - `Some(Some(x))` → set limit to `x` bytes/sec
     pub speed_limit_bytes_per_sec: Option<Option<u64>>,
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src-tauri/src/domain/model/config.rs` around lines 44 - 48, The field
speed_limit_bytes_per_sec on struct ConfigPatch uses an Option<Option<u64>> to
represent a three-way choice but lacks docs; add a brief comment (doc or inline)
above the ConfigPatch definition or immediately above the
speed_limit_bytes_per_sec field explaining the semantics: None = no change,
Some(None) = clear the limit (unlimited), Some(Some(x)) = set limit to x
bytes/sec, so callers and deserializers understand the tri-state intent; refer
to the ConfigPatch struct and the speed_limit_bytes_per_sec field when adding
the comment.
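As a sketch of those tri-state semantics, here is a minimal, self-contained reduction (the `AppConfig`/`ConfigPatch` bodies below are illustrative stand-ins, not the PR's actual structs):

```rust
// Illustrative stand-ins for the PR's config types (not the real structs).
#[derive(Debug, PartialEq)]
struct AppConfig {
    speed_limit_bytes_per_sec: Option<u64>,
}

struct ConfigPatch {
    /// None = no change, Some(None) = clear (unlimited), Some(Some(x)) = set.
    speed_limit_bytes_per_sec: Option<Option<u64>>,
}

fn apply(config: &mut AppConfig, patch: &ConfigPatch) {
    // The outer Option gates whether the field changes at all.
    if let Some(new_limit) = patch.speed_limit_bytes_per_sec {
        config.speed_limit_bytes_per_sec = new_limit;
    }
}

fn main() {
    let mut cfg = AppConfig { speed_limit_bytes_per_sec: Some(1024) };

    // None -> no change.
    apply(&mut cfg, &ConfigPatch { speed_limit_bytes_per_sec: None });
    assert_eq!(cfg.speed_limit_bytes_per_sec, Some(1024));

    // Some(None) -> clear the limit (unlimited).
    apply(&mut cfg, &ConfigPatch { speed_limit_bytes_per_sec: Some(None) });
    assert_eq!(cfg.speed_limit_bytes_per_sec, None);

    // Some(Some(x)) -> set the limit to x.
    apply(&mut cfg, &ConfigPatch { speed_limit_bytes_per_sec: Some(Some(2048)) });
    assert_eq!(cfg.speed_limit_bytes_per_sec, Some(2048));
}
```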

23-37: Empty download_dir default may cause issues.

The default download_dir is an empty string, which is likely not a valid path. Downstream code may need to handle this case specially (e.g., resolve to user's Downloads folder). Consider documenting this expectation or initializing to a sentinel value that clearly indicates "unset".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src-tauri/src/domain/model/config.rs` around lines 23 - 37, The Default impl
for AppConfig sets download_dir to an empty string which is not a valid path;
update the Default for AppConfig (the default() function) to initialize
download_dir to a clear sentinel (e.g. "UNSET" or "none") or to a resolved user
Downloads path at runtime (using a platform helper like dirs::download_dir()),
and document in the AppConfig comment or type that consumers must treat that
sentinel as "unset" and resolve it to the real download directory when needed
(refer to the download_dir field and the Default implementation).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@specs/01-mvp/04-domain-ports.md`:
- Around line 44-47: Update the spec to match the implemented driving-port
trait: in the DownloadReadRepository trait replace the outdated method name
find_downloads_with_progress(...) with the implemented find_downloads(...)
signature and ensure the return/error types match the implementation by
replacing any occurrences of AppError with DomainError (also update related
references such as find_download_detail and count_by_state if they mention
AppError) so the spec and the implemented trait (DownloadReadRepository,
find_downloads, find_download_detail, count_by_state) use the same method names
and DomainError type.
- Line 4: Fix French typography and spelling in the spec: replace "coeur" →
"cœur", "metier" → "métier", "adapters" → "adaptateurs", "implementent" →
"implémentent", "reseau" → "réseau" and consider "filesystem" → "système de
fichiers"; correct apostrophes to proper French forms (e.g., use l'application
with a straight apostrophe or typographic apostrophe consistently) and audit the
nearby sentences that mention driving/driven ports to ensure accents and
apostrophes are consistent across the document (search for strings "coeur",
"metier", "adapters", "implementent", "reseau", "filesystem" to locate all
occurrences).

In `@src-tauri/src/domain/model/http.rs`:
- Around line 37-38: The content_length method currently parses the
"content-length" header directly which fails if the header contains surrounding
whitespace; update content_length (and use the existing header("content-length")
call) to trim the header string before parsing (e.g., call trim() on the header
value) and then parse into u64, returning .ok() as before so values like " 123 "
succeed.

In `@src-tauri/src/domain/ports/driven/history_repository.rs`:
- Around line 24-25: The method signature delete_older_than(&self,
before_timestamp: u64) in history_repository.rs is ambiguous about the timestamp
unit; update its documentation and signature comment to state the expected unit
(e.g., "Unix epoch milliseconds" or "Unix epoch seconds") so adapter
implementers use the correct scale, and consider renaming the parameter to
before_timestamp_ms or before_timestamp_s (or add a TimestampUnit enum) to make
the contract explicit; ensure references to delete_older_than and DomainError in
the trait doc/comments reflect the chosen unit so implementing adapters will
delete the correct range.

In `@src-tauri/src/domain/ports/driven/tests.rs`:
- Around line 263-273: The mock update_config implementation only updates
download_dir and max_concurrent_downloads; update_config (in tests.rs) should
apply all ConfigPatch fields to the locked AppConfig so the mock matches
production behaviour. For the speed_limit_bytes_per_sec: Option<Option<u64>>
field, implement three-way handling: None -> leave existing value, Some(None) ->
clear the value on AppConfig, Some(Some(v)) -> set to v. Ensure every other
ConfigPatch field (use the ConfigPatch and AppConfig field names) is checked
with if let Some(...) and assigned to config so the mock faithfully mirrors real
update_config semantics and returns the cloned AppConfig or DomainError as
before.

---

Nitpick comments:

In `@src-tauri/src/domain/model/http.rs`:
- Around line 27-33: The header lookup in function header(&self, name: &str)
allocates by calling to_lowercase() on both the input and each header key;
change the comparison to an ASCII case-insensitive comparison to avoid
allocations — use the built-in eq_ignore_ascii_case (e.g., in the iterator find
use |(k, _)| k.eq_ignore_ascii_case(name)) so you only compare without creating
new Strings and keep the rest of header(&self, name: &str) unchanged.

In `@src-tauri/src/domain/model/meta.rs`:
- Around line 27-31: The SegmentMeta struct's byte-range fields are ambiguous
about whether end_byte is inclusive or exclusive; update the SegmentMeta
documentation to state the convention explicitly (e.g., that start_byte is
inclusive and end_byte is exclusive) and clarify how downloaded_bytes relates to
that range; add comments on the SegmentMeta struct and fields (start_byte,
end_byte, downloaded_bytes) so adapters and callers know the exact boundary
semantics and avoid off-by-one errors.

In `@src-tauri/src/domain/model/views.rs`:
- Line 133: The alias StateCountMap uses usize which is architecture-dependent;
change its value type to a fixed-width integer (u64) to make DTO/port contracts
portable—update the type alias declaration for StateCountMap (currently defined
as HashMap<DownloadState, usize>) to HashMap<DownloadState, u64> and adjust any
code that constructs or reads from StateCountMap (e.g., places that insert,
serialize, or sum counts) to use u64 accordingly.

In `@src-tauri/src/domain/ports/driven/download_read_repository.rs`:
- Around line 25-26: Change the platform-dependent pagination params from usize
to a fixed-width integer (use u64) for the limit and offset fields; update the
signature(s) where limit and offset are declared (e.g., the
DownloadReadRepository trait and any structs/impls that declare "limit:
Option<usize>" and "offset: Option<usize>") to "Option<u64>", then update all
implementations and call sites to accept/return u64 and perform safe conversions
where interacting with APIs or collections that require usize (use
TryFrom/try_into or checked casts rather than unchecked as when converting u64
-> usize). Ensure tests and function signatures that reference limit/offset are
adjusted accordingly.

In `@src-tauri/src/domain/ports/driven/file_storage.rs`:
- Around line 23-29: The three methods read_meta, write_meta, and delete_meta
currently take a Path parameter called path but the semantics are ambiguous;
update their documentation and parameter name to make it explicit that the
argument is the download file path (i.e., the file for which the `.vortex-meta`
resume metadata is derived/located), not the metadata file itself—for example,
rename path to download_path in the signatures (fn read_meta(&self,
download_path: &Path) ... etc.) and update the doc comments above each method to
state that the `.vortex-meta` is read/written/deleted alongside the provided
download file path so adapters implement consistent behavior.

In `@src-tauri/src/domain/ports/driven/http_client.rs`:
- Around line 15-23: Add a default implementation for supports_range in the
HttpClient trait that calls head(url) and parses the "accept-ranges" response
header (case-insensitive) to determine support (treat "bytes" as true, anything
else or missing as false), returning any DomainError from head() rather than
performing a second network request; use the existing head(&self, url: &str) ->
Result<HttpResponse, DomainError> and HttpResponse header accessors to implement
this logic and return Result<bool, DomainError>.


In `@src-tauri/src/domain/ports/driven/stats_repository.rs`:
- Around line 10-18: The trait StatsRepository's record_completed(bytes,
avg_speed) signature doesn't cover fields exposed on StatsView (daily_volumes,
top_hosts), causing ambiguity for adapters; either document that StatsView's
daily_volumes and top_hosts are produced by separate history/aggregation queries
or extend the repository API so adapters can update all aggregates atomically.
Update the StatsRepository contract by adding clear docs on StatsView
completeness referencing StatsView, daily_volumes, and top_hosts (if you choose
the doc path), or modify/overload record_completed (or add a new method like
record_completed_with_aggregates) to accept the additional aggregate data
required to update daily_volumes and top_hosts so adapters (implementations of
StatsRepository) can update all aggregates consistently.

In `@src-tauri/src/domain/ports/driven/tests.rs`:
- Around line 138-144: In create_file (fn create_file) avoid the unchecked cast
`size as usize` which can truncate on 32-bit targets; instead attempt a fallible
conversion (e.g., use `TryInto<usize>`) on the `size` parameter and handle the
Err by returning an appropriate DomainError, and only call vec![0u8;
converted_size] when the conversion succeeds; update the function to propagate a
clear error from the conversion so tests and mocks don't silently truncate on
32-bit platforms.
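Taken together, the `HttpResponse` nitpicks above (allocation-free header lookup, whitespace-tolerant `content_length`, and a header-based `supports_range`) can be sketched as follows — the header storage shape (`Vec` of pairs) is an assumption for illustration; the PR later moved to `HashMap<String, Vec<String>>` for multi-value support:

```rust
// Sketch of the suggested HttpResponse helpers; struct shape assumed.
struct HttpResponse {
    headers: Vec<(String, String)>,
}

impl HttpResponse {
    // Allocation-free, ASCII case-insensitive lookup via eq_ignore_ascii_case.
    fn header(&self, name: &str) -> Option<&str> {
        self.headers
            .iter()
            .find(|(k, _)| k.eq_ignore_ascii_case(name))
            .map(|(_, v)| v.as_str())
    }

    // Trim before parsing so values like " 123 " are accepted.
    fn content_length(&self) -> Option<u64> {
        self.header("content-length")?.trim().parse::<u64>().ok()
    }

    // supports_range derived from one HEAD response, no second request.
    fn supports_range(&self) -> bool {
        self.header("accept-ranges")
            .map(|v| v.trim().eq_ignore_ascii_case("bytes"))
            .unwrap_or(false)
    }
}

fn main() {
    let resp = HttpResponse {
        headers: vec![
            ("Content-Length".to_string(), " 123 ".to_string()),
            ("Accept-Ranges".to_string(), "bytes".to_string()),
        ],
    };
    assert_eq!(resp.content_length(), Some(123));
    assert!(resp.supports_range());
    assert_eq!(resp.header("CONTENT-length"), Some(" 123 "));
}
```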

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: c412d6a4-84c0-4fca-b2fe-69c3f67c2f6f

📥 Commits

Reviewing files that changed from the base of the PR and between 85c22aa and d9ea78b.

📒 Files selected for processing (27)
  • specs/01-mvp/04-domain-ports.md
  • specs/README.md
  • src-tauri/src/domain/mod.rs
  • src-tauri/src/domain/model/config.rs
  • src-tauri/src/domain/model/credential.rs
  • src-tauri/src/domain/model/http.rs
  • src-tauri/src/domain/model/meta.rs
  • src-tauri/src/domain/model/mod.rs
  • src-tauri/src/domain/model/views.rs
  • src-tauri/src/domain/ports/driven/clipboard_observer.rs
  • src-tauri/src/domain/ports/driven/config_store.rs
  • src-tauri/src/domain/ports/driven/credential_store.rs
  • src-tauri/src/domain/ports/driven/download_engine.rs
  • src-tauri/src/domain/ports/driven/download_read_repository.rs
  • src-tauri/src/domain/ports/driven/download_repository.rs
  • src-tauri/src/domain/ports/driven/event_bus.rs
  • src-tauri/src/domain/ports/driven/file_storage.rs
  • src-tauri/src/domain/ports/driven/history_repository.rs
  • src-tauri/src/domain/ports/driven/http_client.rs
  • src-tauri/src/domain/ports/driven/mod.rs
  • src-tauri/src/domain/ports/driven/plugin_loader.rs
  • src-tauri/src/domain/ports/driven/stats_repository.rs
  • src-tauri/src/domain/ports/driven/tests.rs
  • src-tauri/src/domain/ports/driving/command_handler.rs
  • src-tauri/src/domain/ports/driving/mod.rs
  • src-tauri/src/domain/ports/driving/query_handler.rs
  • src-tauri/src/domain/ports/mod.rs

# Task 04: Domain Ports

## Context
Ports are the Rust traits that the domain defines; adapters implement them. They are the contract between the business core and the outside world. Driving ports define how the application is entered (commands/queries). Driven ports define what the domain needs from the outside world (persistence, network, filesystem...).

⚠️ Potential issue | 🟡 Minor

Address flagged French typography/apostrophe issues in spec text.

There are minor but repeated text-quality issues (e.g., coeur and incorrect apostrophes) on Line 4 and Line 121–123. Worth cleaning up since this doc is a contract reference.

Also applies to: 121-123

🧰 Tools
🪛 LanguageTool

[typographical] ~4-~4: Variante typographique déconseillée de « cœur ».
Context: ...implementent. C'est le contrat entre le coeur metier et le monde exterieur. Les drivi...

(OE)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@specs/01-mvp/04-domain-ports.md` at line 4, Fix French typography and
spelling in the spec: replace "coeur" → "cœur", "metier" → "métier", "adapters"
→ "adaptateurs", "implementent" → "implémentent", "reseau" → "réseau" and
consider "filesystem" → "système de fichiers"; correct apostrophes to proper
French forms (e.g., use l'application with a straight apostrophe or typographic
apostrophe consistently) and audit the nearby sentences that mention
driving/driven ports to ensure accents and apostrophes are consistent across the
document (search for strings "coeur", "metier", "adapters", "implementent",
"reseau", "filesystem" to locate all occurrences).

Comment on lines +44 to +47
```rust
trait DownloadReadRepository: Send + Sync {
    fn find_downloads_with_progress(...) -> Result<Vec<DownloadView>>;
    fn find_download_detail(id) -> Result<Option<DownloadDetailView>>;
    fn count_by_state() -> Result<HashMap<DownloadState, usize>>;
}
```

⚠️ Potential issue | 🟠 Major

Spec drift with implemented trait contracts.

Line 45 still references find_downloads_with_progress(...), while the implemented port uses find_downloads(...).
Also, Line 99 and Line 104 specify AppError, but driving ports in this PR return DomainError. This mismatch can mislead follow-up tasks that implement against the spec.

Proposed spec alignment patch
 trait DownloadReadRepository: Send + Sync {
-    fn find_downloads_with_progress(...) -> Result<Vec<DownloadView>>;
+    fn find_downloads(...) -> Result<Vec<DownloadView>>;
     fn find_download_detail(id) -> Result<Option<DownloadDetailView>>;
     fn count_by_state() -> Result<HashMap<DownloadState, usize>>;
 }

 trait CommandHandler<C: Command> {
     type Output;
-    async fn handle(&self, cmd: C) -> Result<Self::Output, AppError>;
+    async fn handle(&self, cmd: C) -> Result<Self::Output, DomainError>;
 }

 trait QueryHandler<Q: Query> {
     type Output;
-    async fn handle(&self, query: Q) -> Result<Self::Output, AppError>;
+    async fn handle(&self, query: Q) -> Result<Self::Output, DomainError>;
 }

Also applies to: 97-105

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@specs/01-mvp/04-domain-ports.md` around lines 44 - 47, Update the spec to
match the implemented driving-port trait: in the DownloadReadRepository trait
replace the outdated method name find_downloads_with_progress(...) with the
implemented find_downloads(...) signature and ensure the return/error types
match the implementation by replacing any occurrences of AppError with
DomainError (also update related references such as find_download_detail and
count_by_state if they mention AppError) so the spec and the implemented trait
(DownloadReadRepository, find_downloads, find_download_detail, count_by_state)
use the same method names and DomainError type.
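For reference, the driving-port shape in the aligned spec compiles on stable Rust 1.75+ via async fn in traits (RPITIT); a std-only sketch with placeholder `Command`, `DomainError`, and `AddDownload` types (none of these bodies are the PR's actual code):

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

// Placeholder error/marker types; the PR's real DomainError/Command differ.
#[derive(Debug)]
struct DomainError(String);

trait Command {}

// Driving port per the aligned spec: async fn in trait (RPITIT, Rust 1.75+).
trait CommandHandler<C: Command> {
    type Output;
    async fn handle(&self, cmd: C) -> Result<Self::Output, DomainError>;
}

// Hypothetical command + handler, just to exercise the trait shape.
struct AddDownload {
    url: String,
}
impl Command for AddDownload {}

struct AddDownloadHandler;

impl CommandHandler<AddDownload> for AddDownloadHandler {
    type Output = String;
    async fn handle(&self, cmd: AddDownload) -> Result<String, DomainError> {
        if cmd.url.is_empty() {
            return Err(DomainError("empty url".into()));
        }
        Ok(format!("queued: {}", cmd.url))
    }
}

// Std-only noop waker + executor, enough for futures that never wait.
fn noop_raw_waker() -> RawWaker {
    fn clone(_: *const ()) -> RawWaker {
        noop_raw_waker()
    }
    fn noop(_: *const ()) {}
    static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, noop, noop, noop);
    RawWaker::new(std::ptr::null(), &VTABLE)
}

fn block_on<F: Future>(mut fut: F) -> F::Output {
    let waker = unsafe { Waker::from_raw(noop_raw_waker()) };
    let mut cx = Context::from_waker(&waker);
    // Safety: fut stays on this stack frame and is never moved after pinning.
    let mut fut = unsafe { Pin::new_unchecked(&mut fut) };
    loop {
        if let Poll::Ready(out) = fut.as_mut().poll(&mut cx) {
            return out;
        }
    }
}

fn main() {
    let handler = AddDownloadHandler;
    let ok = block_on(handler.handle(AddDownload { url: "https://example.com/a.iso".into() }));
    assert_eq!(ok.unwrap(), "queued: https://example.com/a.iso");
    let err = block_on(handler.handle(AddDownload { url: String::new() }));
    assert!(err.is_err());
}
```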


@cubic-dev-ai cubic-dev-ai bot left a comment


5 issues found across 27 files

Prompt for AI agents (unresolved issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="src-tauri/src/domain/model/http.rs">

<violation number="1" location="src-tauri/src/domain/model/http.rs:38">
P2: Trim the `content-length` header value before parsing so valid values with surrounding whitespace are accepted.</violation>
</file>

<file name="src-tauri/src/domain/model/meta.rs">

<violation number="1" location="src-tauri/src/domain/model/meta.rs:13">
P2: `DownloadMeta` derives `Debug` while storing `url` as raw `String`, which can leak full URL tokens/credentials in logs.</violation>
</file>

<file name="src-tauri/src/domain/ports/driven/tests.rs">

<violation number="1" location="src-tauri/src/domain/ports/driven/tests.rs:271">
P3: Apply all `ConfigPatch` fields in this mock `update_config`; handling only two fields can hide patching bugs in adapters and misses the tri-state semantics of `speed_limit_bytes_per_sec`.</violation>
</file>

<file name="specs/01-mvp/04-domain-ports.md">

<violation number="1" location="specs/01-mvp/04-domain-ports.md:45">
P3: Update the spec to use `find_downloads(...)` so it matches the implemented `DownloadReadRepository` contract.</violation>

<violation number="2" location="specs/01-mvp/04-domain-ports.md:99">
P2: The Task 04 spec references `AppError` before it is introduced (Task 05), which makes the task contract internally inconsistent.</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review, or fix all with cubic.

```rust
trait CommandHandler<C: Command> {
    type Output;
    async fn handle(&self, cmd: C) -> Result<Self::Output, AppError>;
}
```

@cubic-dev-ai cubic-dev-ai bot Apr 7, 2026


P2: The Task 04 spec references AppError before it is introduced (Task 05), which makes the task contract internally inconsistent.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At specs/01-mvp/04-domain-ports.md, line 99:

<comment>The Task 04 spec references `AppError` before it is introduced (Task 05), which makes the task contract internally inconsistent.</comment>

<file context>
@@ -0,0 +1,149 @@
+```rust
+trait CommandHandler<C: Command> {
+    type Output;
+    async fn handle(&self, cmd: C) -> Result<Self::Output, AppError>;
+}
+
</file context>

```rust
}

trait DownloadReadRepository: Send + Sync {
    fn find_downloads_with_progress(...) -> Result<Vec<DownloadView>>;
}
```

@cubic-dev-ai cubic-dev-ai bot Apr 7, 2026


P3: Update the spec to use find_downloads(...) so it matches the implemented DownloadReadRepository contract.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At specs/01-mvp/04-domain-ports.md, line 45:

<comment>Update the spec to use `find_downloads(...)` so it matches the implemented `DownloadReadRepository` contract.</comment>

<file context>
@@ -0,0 +1,149 @@
+}
+
+trait DownloadReadRepository: Send + Sync {
+    fn find_downloads_with_progress(...) -> Result<Vec<DownloadView>>;
+    fn find_download_detail(id) -> Result<Option<DownloadDetailView>>;
+    fn count_by_state() -> Result<HashMap<DownloadState, usize>>;
</file context>

- Add SortDirection enum and SortOrder struct for sort direction support
- Change AppConfig.download_dir to Option<String> (None = use platform default)
- Use HashMap<String, Vec<String>> for HTTP headers (multi-value support)
- Add trim() to content_length() parsing for whitespace tolerance
- Clarify timestamp unit (Unix seconds) in HistoryRepository doc
- Complete all ConfigPatch fields in test mock update_config
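The multi-value header change in that commit list could be sketched like so (the field shape and lowercase-key normalization are assumptions for illustration, not the PR's actual code):

```rust
use std::collections::HashMap;

// Sketch: multi-value headers as HashMap<String, Vec<String>>, with keys
// normalized to lowercase at insertion time (an assumed convention).
struct HttpResponse {
    headers: HashMap<String, Vec<String>>,
}

impl HttpResponse {
    fn add_header(&mut self, name: &str, value: &str) {
        self.headers
            .entry(name.to_ascii_lowercase())
            .or_default()
            .push(value.to_string());
    }

    // First value wins for single-value lookups such as content-length.
    fn header(&self, name: &str) -> Option<&str> {
        self.headers
            .get(&name.to_ascii_lowercase())?
            .first()
            .map(String::as_str)
    }
}

fn main() {
    let mut resp = HttpResponse { headers: HashMap::new() };
    resp.add_header("Set-Cookie", "a=1");
    resp.add_header("Set-Cookie", "b=2");
    assert_eq!(resp.headers["set-cookie"].len(), 2);
    assert_eq!(resp.header("SET-COOKIE"), Some("a=1"));
}
```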

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (2)
src-tauri/src/domain/model/views.rs (2)

141-145: SortOrder can derive Default to reduce call-site boilerplate.

Since both fields already have defaults, deriving Default here keeps usage cleaner.

Proposed diff
-#[derive(Debug, Clone, Copy, PartialEq, Eq)]
+#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
 pub struct SortOrder {
     pub field: SortField,
     pub direction: SortDirection,
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src-tauri/src/domain/model/views.rs` around lines 141 - 145, Add a Default
impl for SortOrder by deriving Default on the struct (i.e., change
#[derive(Debug, Clone, Copy, PartialEq, Eq)] to include Default) so callers can
use SortOrder::default(); ensure the constituent types SortField and
SortDirection already implement Default (if they do not, add Default impls for
those enums/types) so the derived Default compiles.

31-31: Add explicit timestamp units on view fields.

Please document units (e.g., Unix seconds vs milliseconds) for created_at, updated_at, and completed_at directly on these fields to avoid adapter/UI drift.

Also applies to: 57-58, 78-78

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src-tauri/src/domain/model/views.rs` at line 31, Add explicit unit
documentation to the timestamp fields in the view structs: add Rust doc comments
(///) directly above each field `created_at`, `updated_at`, and `completed_at`
in src-tauri/src/domain/model/views.rs stating the unit used (e.g., "Unix
timestamp in seconds" or "Unix timestamp in milliseconds") so adapters/UI
consumers cannot drift; ensure you update all occurrences of these fields (the
ones named `created_at`, `updated_at`, and `completed_at`) so the intent is
clear in the model.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src-tauri/src/domain/model/views.rs`:
- Line 4: The module-level doc claims the view structs in views.rs "are returned
by read repositories and never mutated" but all fields are public, so either
update the docstring to remove the immutability guarantee or enforce
immutability: make each view struct's fields private, add a public constructor
(e.g., impl FooView::new(...) for each struct name found in views.rs) that sets
fields, and expose read-only accessors (e.g., FooView::id(&self) -> &Type) for
needed fields; ensure any existing code using direct field access is updated to
call the new accessors or to use the constructor, so the documentation and API
match.

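If the enforce-immutability route described in that comment were chosen, the pattern looks roughly like this (`ExampleView` and its fields are placeholders, not the PR's actual view structs):

```rust
// Hypothetical read-only view: private fields, constructor, accessors,
// with timestamp units documented on the field (Unix seconds assumed).
pub struct ExampleView {
    id: u64,
    /// Unix timestamp in seconds.
    created_at: u64,
}

impl ExampleView {
    pub fn new(id: u64, created_at: u64) -> Self {
        Self { id, created_at }
    }

    pub fn id(&self) -> u64 {
        self.id
    }

    /// Unix timestamp in seconds.
    pub fn created_at(&self) -> u64 {
        self.created_at
    }
}

fn main() {
    let v = ExampleView::new(7, 1_700_000_000);
    assert_eq!(v.id(), 7);
    assert_eq!(v.created_at(), 1_700_000_000);
    // No setters: the fields cannot be mutated from outside this module.
}
```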

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: fc96738c-8f88-42ee-ac85-d7aa1fabe6e6

📥 Commits

Reviewing files that changed from the base of the PR and between 6aaa356 and 68fffe1.

📒 Files selected for processing (7)
  • src-tauri/src/domain/model/config.rs
  • src-tauri/src/domain/model/http.rs
  • src-tauri/src/domain/model/mod.rs
  • src-tauri/src/domain/model/views.rs
  • src-tauri/src/domain/ports/driven/download_read_repository.rs
  • src-tauri/src/domain/ports/driven/history_repository.rs
  • src-tauri/src/domain/ports/driven/tests.rs
✅ Files skipped from review due to trivial changes (4)
  • src-tauri/src/domain/ports/driven/history_repository.rs
  • src-tauri/src/domain/ports/driven/download_read_repository.rs
  • src-tauri/src/domain/ports/driven/tests.rs
  • src-tauri/src/domain/model/config.rs
🚧 Files skipped from review as they are similar to previous changes (1)
  • src-tauri/src/domain/model/http.rs


@cubic-dev-ai cubic-dev-ai bot left a comment


1 issue found across 7 files (changes from recent commits).

Prompt for AI agents (unresolved issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="src-tauri/src/domain/model/config.rs">

<violation number="1" location="src-tauri/src/domain/model/config.rs:14">
P2: Making `AppConfig.download_dir` nullable without updating `ConfigPatch` introduces a tri-state mismatch: patches can no longer represent "reset download dir to default (`None`)" once a custom path is set.</violation>
</file>


mpiton added 2 commits April 8, 2026 06:56
- Custom Debug on DownloadMeta to redact URL (credential leak prevention)
- Relax views.rs doc wording: pub fields are intentional for DTOs
- ConfigPatch.download_dir is now Option<Option<String>> for tri-state
  semantics (None=no change, Some(None)=reset, Some(Some(path))=set)
Use obviously fake values (test-user/test-value) instead of
pass123/super-secret to avoid GitGuardian false positive on
generic password detection.

Labels

documentation, rust