feat(domain): implement hexagonal port traits (task 04) #4
Conversation
Define all driven and driving port traits for the hexagonal architecture.

Driven ports (12 traits):
- DownloadRepository (write), DownloadReadRepository (read/CQRS)
- HistoryRepository, StatsRepository
- PluginLoader, CredentialStore, ConfigStore
- FileStorage, HttpClient, DownloadEngine
- EventBus, ClipboardObserver

Driving ports (2 traits):
- CommandHandler<C: Command> with RPITIT async
- QueryHandler<Q: Query> with RPITIT async

Supporting domain types:
- views.rs: DownloadView, DownloadDetailView, HistoryEntry, StatsView
- config.rs: AppConfig, ConfigPatch
- credential.rs: Credential (custom Debug to redact the password)
- http.rs: HttpResponse
- meta.rs: DownloadMeta, SegmentMeta

All traits are Send + Sync and std-only (no external crates in the domain), documented with /// doc comments. 18 tests verify that mock implementations compile and satisfy the trait bounds.
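The driving ports described above use RPITIT async (return-position `impl Trait` in traits, Rust 1.75+) and the domain is std-only. The sketch below shows one way such a `CommandHandler` can look; `Command`, `CommandHandler`, and `DomainError` are names from the PR, but the exact trait shapes, the `StartDownload` command, and the minimal `block_on` executor are illustrative assumptions, not the PR's code.

```rust
use std::future::Future;
use std::pin::pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

// Illustrative error type standing in for the PR's DomainError.
#[derive(Debug)]
pub enum DomainError {
    Invalid(String),
}

// Marker trait for CQRS commands; the Output associated type is an
// assumption about how handlers learn their result type.
pub trait Command {
    type Output;
}

// Driving port using RPITIT: no async_trait macro, no boxed futures.
pub trait CommandHandler<C: Command>: Send + Sync {
    fn handle(&self, command: C) -> impl Future<Output = Result<C::Output, DomainError>> + Send;
}

// Hypothetical command + stub handler to exercise the trait.
pub struct StartDownload {
    pub url: String,
}

impl Command for StartDownload {
    type Output = u64;
}

pub struct StubHandler;

impl CommandHandler<StartDownload> for StubHandler {
    fn handle(&self, command: StartDownload) -> impl Future<Output = Result<u64, DomainError>> + Send {
        async move {
            if command.url.is_empty() {
                Err(DomainError::Invalid("empty url".into()))
            } else {
                Ok(command.url.len() as u64)
            }
        }
    }
}

// Minimal std-only executor for ready futures, used only to run the
// example without pulling an async runtime into the domain layer.
pub fn block_on<F: Future>(fut: F) -> F::Output {
    const VTABLE: RawWakerVTable = RawWakerVTable::new(
        |_| RawWaker::new(std::ptr::null(), &VTABLE),
        |_| {},
        |_| {},
        |_| {},
    );
    let waker = unsafe { Waker::from_raw(RawWaker::new(std::ptr::null(), &VTABLE)) };
    let mut cx = Context::from_waker(&waker);
    let mut fut = pin!(fut);
    loop {
        if let Poll::Ready(value) = fut.as_mut().poll(&mut cx) {
            return value;
        }
    }
}
```

Because the future type is an opaque `impl Future` rather than a boxed trait object, the application layer (task 05) can await handlers with zero allocation.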
| GitGuardian id | GitGuardian status | Secret | Commit | Filename | |
|---|---|---|---|---|---|
| 29818992 | Triggered | Generic Password | d9ea78b | src-tauri/src/domain/ports/driven/tests.rs | View secret |
🛠 Guidelines to remediate hardcoded secrets
- Understand the implications of revoking this secret by investigating where it is used in your code.
- Replace and store your secret safely. Learn here the best practices.
- Revoke and rotate this secret.
- If possible, rewrite git history. Rewriting git history is not a trivial act. You might completely break other contributing developers' workflow and you risk accidentally deleting legitimate data.
To avoid such incidents in the future, consider:
- following these best practices for managing and storing secrets, including API keys and other credentials
- installing secret detection on pre-commit to catch secrets before they leave your machine and ease remediation.
🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the review settings.
No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info: 📒 Files selected for processing (1)
✅ Files skipped from review due to trivial changes (1)
📝 Walkthrough

Adds a new domain ports layer (driven + driving), many domain models and view DTOs, ~12 driven-port trait interfaces, async command/query handler traits, extensive in-memory mocks/tests, and deletes …

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant UI as "UI / Caller"
    participant CmdH as "CommandHandler"
    participant Repo as "DownloadRepository"
    participant Engine as "DownloadEngine"
    participant FS as "FileStorage"
    participant HTTP as "HttpClient"
    participant Bus as "EventBus"
    UI->>CmdH: StartDownload(command)
    CmdH->>Repo: save(download)
    CmdH->>Engine: start(&download)
    Engine->>FS: create_file(path, size)
    Engine->>HTTP: head(url)
    Engine->>HTTP: supports_range(url)
    loop per segment
        Engine->>HTTP: get_range(url, start, end)
        Engine->>FS: write_segment(path, offset, data)
        Engine->>Bus: publish(DomainEvent::Progress)
    end
    Engine->>Bus: publish(DomainEvent::Complete)
    Bus->>Repo: (subscribers may update state)
```

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes

Possibly related PRs
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed

❌ Failed checks (1 warning)

✅ Passed checks (2 passed)
Greptile Summary

This PR introduces all 14 hexagonal-architecture port traits for the domain layer (12 driven, 2 driving) along with 5 supporting model types and 18 focused tests. The design is clean: driven ports are synchronous, driving ports use RPITIT async (Rust 1.75+), and the CQRS read/write split is well-motivated.

Confidence Score: 5/5

Safe to merge — all remaining findings are P2 style suggestions with no correctness impact. The implementation is structurally sound, well-tested (18 port-specific tests, 94 total passing), and clippy-clean. The three P2 issues (SortField missing direction, download_dir empty-string default, HashMap header loss) are design limitations that can be addressed without breaking changes before the adapter layer is written. src-tauri/src/domain/model/views.rs and src-tauri/src/domain/model/config.rs have minor design choices worth revisiting before adapters consume them.

Important Files Changed
Flowchart

```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart TD
    UI["Frontend / Tauri IPC"] -->|"Command / Query"| DrivingPorts
    subgraph DrivingPorts["Driving Ports (primary)"]
        CH["CommandHandler<C: Command>"]
        QH["QueryHandler<Q: Query>"]
    end
    CH --> AppLayer["Application Layer\n(task 05)"]
    QH --> AppLayer
    subgraph DrivenPorts["Driven Ports (secondary)"]
        DR["DownloadRepository"]
        DRR["DownloadReadRepository"]
        DE["DownloadEngine"]
        EB["EventBus"]
        FS["FileStorage"]
        HC["HttpClient"]
        CS["CredentialStore"]
        CFG["ConfigStore"]
        HR["HistoryRepository"]
        SR["StatsRepository"]
        PL["PluginLoader"]
        CO["ClipboardObserver"]
    end
    AppLayer -->|"orchestrates"| DrivenPorts
    EB -->|"DomainEvent"| UI
```
Reviews (1): Last reviewed commit: "feat(domain): implement hexagonal port t..."
```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
pub enum SortField {
    #[default]
    CreatedAt,
    FileName,
    FileSize,
    Progress,
    Speed,
    State,
}
```
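The Greptile summary flags `SortField` as missing a sort direction. One hedged way to extend it without breaking the existing enum is to pair it with a direction in a wrapper; `SortDirection`, `SortSpec`, and the column names in `to_order_by` are illustrative assumptions, not part of the PR.

```rust
// SortField as shown in the review, plus a hypothetical direction wrapper.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
pub enum SortField {
    #[default]
    CreatedAt,
    FileName,
    FileSize,
    Progress,
    Speed,
    State,
}

#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
pub enum SortDirection {
    Ascending,
    #[default]
    Descending, // newest/largest first is the common default for download lists
}

#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
pub struct SortSpec {
    pub field: SortField,
    pub direction: SortDirection,
}

impl SortSpec {
    /// Map to an SQL ORDER BY fragment; the column names are assumptions.
    pub fn to_order_by(&self) -> String {
        let col = match self.field {
            SortField::CreatedAt => "created_at",
            SortField::FileName => "file_name",
            SortField::FileSize => "file_size",
            SortField::Progress => "progress",
            SortField::Speed => "speed",
            SortField::State => "state",
        };
        let dir = match self.direction {
            SortDirection::Ascending => "ASC",
            SortDirection::Descending => "DESC",
        };
        format!("{col} {dir}")
    }
}
```

Because `SortSpec` keeps `SortField` intact, an adapter could accept `Option<SortSpec>` later without changing the enum itself.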
src-tauri/src/domain/model/config.rs (Outdated)
```rust
fn default() -> Self {
    Self {
        download_dir: String::new(),
```
download_dir defaults to empty string
AppConfig::default() sets download_dir: String::new(). Any code that consumes this default without first checking for an empty path will silently resolve to the process's current working directory (or produce an unhelpful I/O error). Consider Option<String> or documenting that an empty value must be replaced before the store is used.
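One hedged way to act on this comment is to treat an empty `download_dir` as "unset" and resolve it against an explicit fallback instead of letting it silently become the current working directory; `resolve_download_dir` and its parameters are illustrative names, not code from the PR.

```rust
use std::path::{Path, PathBuf};

/// Hypothetical helper: an empty or whitespace-only `download_dir`
/// falls back to a caller-supplied default (e.g. the platform
/// Downloads folder) rather than resolving to the process CWD.
pub fn resolve_download_dir(configured: &str, fallback: &Path) -> PathBuf {
    let trimmed = configured.trim();
    if trimmed.is_empty() {
        fallback.to_path_buf()
    } else {
        PathBuf::from(trimmed)
    }
}
```

Keeping the resolution in one helper means the `AppConfig::default()` sentinel never reaches I/O code unchecked.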
```rust
#[derive(Debug, Clone, PartialEq)]
pub struct HttpResponse {
    pub status_code: u16,
    pub headers: HashMap<String, String>,
    pub body: Vec<u8>,
}
```
HashMap<String, String> silently drops duplicate HTTP headers
HTTP allows multiple values for the same header name (e.g., Set-Cookie, Link). Storing headers in a HashMap<String, String> forces adapters to silently discard extras. This is fine for content-length/accept-ranges today, but changing to HashMap<String, Vec<String>> now avoids a breaking port contract change if a plugin later needs multi-value headers.
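The multi-value shape the comment suggests can be sketched as follows; the `Vec<String>` values and the `push_header`/`header` helpers are assumptions layered on the PR's `HttpResponse` field names, not its actual code.

```rust
use std::collections::HashMap;

/// Sketch of the suggested multi-value header shape. Keys are stored
/// lowercased so lookups are case-insensitive without scanning.
#[derive(Debug, Clone, PartialEq, Default)]
pub struct HttpResponse {
    pub status_code: u16,
    pub headers: HashMap<String, Vec<String>>,
    pub body: Vec<u8>,
}

impl HttpResponse {
    /// Append a header value, preserving duplicates such as Set-Cookie.
    pub fn push_header(&mut self, name: &str, value: &str) {
        self.headers
            .entry(name.to_ascii_lowercase())
            .or_default()
            .push(value.to_string());
    }

    /// First value for a header (the common single-value case, e.g.
    /// content-length), looked up case-insensitively.
    pub fn header(&self, name: &str) -> Option<&str> {
        self.headers
            .get(&name.to_ascii_lowercase())
            .and_then(|values| values.first())
            .map(String::as_str)
    }
}
```

Single-value callers keep using `header()` unchanged, so switching the map type now would not ripple through the port contract later.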
Specs are gitignored and should not be committed.
Actionable comments posted: 5
🧹 Nitpick comments (12)
src-tauri/src/domain/model/views.rs (1)

133-133: Use a fixed-width integer for state counts.

On Line 133, `usize` is architecture-dependent. For DTO/port contracts, a fixed-width type (`u64`) is safer and more portable.

Proposed change:

```diff
-pub type StateCountMap = HashMap<DownloadState, usize>;
+pub type StateCountMap = HashMap<DownloadState, u64>;
```

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/model/views.rs` at line 133, the alias StateCountMap uses usize, which is architecture-dependent; change its value type to u64 (HashMap<DownloadState, u64>) to make DTO/port contracts portable, and adjust any code that constructs or reads from StateCountMap (inserts, serialization, sums) to use u64 accordingly.

src-tauri/src/domain/model/http.rs (1)
27-33: Avoid per-lookup allocations in header matching.

On Line 29 and Line 32, `to_lowercase()` allocates for every call/key. Prefer ASCII case-insensitive comparison directly.

Proposed change:

```diff
 pub fn header(&self, name: &str) -> Option<&str> {
-    let lower = name.to_lowercase();
     self.headers
         .iter()
-        .find(|(k, _)| k.to_lowercase() == lower)
+        .find(|(k, _)| k.eq_ignore_ascii_case(name))
         .map(|(_, v)| v.as_str())
 }
```

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/model/http.rs` around lines 27-33, the header lookup allocates by calling to_lowercase() on both the input and each header key; use the built-in eq_ignore_ascii_case in the find closure so no new Strings are created, keeping the rest of header() unchanged.

src-tauri/src/domain/ports/driven/http_client.rs (1)
15-23: Avoid a second round-trip for range capability checks.

Since `head()` already returns headers, `supports_range()` can default to parsing `Accept-Ranges` from that response. This reduces adapter duplication and inconsistent behavior between the two methods.

Proposed default implementation:

```diff
 pub trait HttpClient: Send + Sync {
     /// Send a HEAD request to retrieve response headers without body.
     fn head(&self, url: &str) -> Result<HttpResponse, DomainError>;

     /// Fetch a byte range from the URL (inclusive start, inclusive end).
     fn get_range(&self, url: &str, start: u64, end: u64) -> Result<Vec<u8>, DomainError>;

     /// Check whether the server supports HTTP Range requests.
-    fn supports_range(&self, url: &str) -> Result<bool, DomainError>;
+    fn supports_range(&self, url: &str) -> Result<bool, DomainError> {
+        let response = self.head(url)?;
+        Ok(response
+            .header("accept-ranges")
+            .is_some_and(|v| v.eq_ignore_ascii_case("bytes")))
+    }
 }
```

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/ports/driven/http_client.rs` around lines 15-23, add a default implementation for supports_range that calls head(url) and parses the "accept-ranges" response header case-insensitively (treat "bytes" as true, anything else or missing as false), returning any DomainError from head() rather than performing a second network request.

src-tauri/src/domain/ports/driven/stats_repository.rs (1)
10-18: Clarify the aggregation contract for StatsView completeness.

`record_completed(bytes, avg_speed)` only carries global counters, while `StatsView` also exposes `daily_volumes` and `top_hosts`. Please either (a) document that those fields are derived elsewhere (e.g., history queries) or (b) extend the input payload so adapters can update all aggregates consistently.

Proposed contract/doc tweak:

```diff
-/// Statistics are typically computed from history data via SQL
-/// aggregations. The `record_completed` method updates running
-/// totals when a download finishes.
+/// Statistics may be computed from history data via SQL aggregations.
+/// `record_completed` is intended to update only global running totals
+/// (bytes/files/speeds). Per-day and per-host aggregates may be derived
+/// from history storage in `get_stats`.
```

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/ports/driven/stats_repository.rs` around lines 10-18, either document that StatsView's daily_volumes and top_hosts are produced by separate history/aggregation queries, or extend record_completed (or add a method such as record_completed_with_aggregates) to accept the additional aggregate data so implementations can update all aggregates consistently.

src-tauri/src/domain/ports/driven/file_storage.rs (1)
23-29: Disambiguate `path` semantics for metadata methods.

For Lines 23/26/29, it's unclear whether `path` is the download file path (from which `.vortex-meta` is derived) or the metadata file path itself. Please make that explicit in docs (or parameter names) to avoid adapter divergence.

Proposed doc-level clarification:

```diff
-    /// Read the `.vortex-meta` resume metadata for a download.
+    /// Read the `.vortex-meta` resume metadata derived from `download_path`.
     fn read_meta(&self, path: &Path) -> Result<Option<DownloadMeta>, DomainError>;

-    /// Write (or overwrite) the `.vortex-meta` resume metadata.
+    /// Write (or overwrite) the `.vortex-meta` resume metadata derived from `download_path`.
     fn write_meta(&self, path: &Path, meta: &DownloadMeta) -> Result<(), DomainError>;

-    /// Delete the `.vortex-meta` file (called after successful completion).
+    /// Delete the `.vortex-meta` file derived from `download_path` (after successful completion).
     fn delete_meta(&self, path: &Path) -> Result<(), DomainError>;
```

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/ports/driven/file_storage.rs` around lines 23-29, the read_meta, write_meta, and delete_meta methods take an ambiguous `path` parameter; rename it to `download_path` and update the doc comments to state that the `.vortex-meta` file is derived from the provided download file path, so adapters implement consistent behavior.

src-tauri/src/domain/model/meta.rs (1)
27-31: Clarify byte-range boundary semantics in SegmentMeta.

Line 30 (`end_byte`) is ambiguous (inclusive vs. exclusive). Please document the convention explicitly to avoid off-by-one mismatches between adapters.

Proposed doc clarification:

```diff
 /// Per-segment resume state within a download metadata file.
 #[derive(Debug, Clone, PartialEq)]
 pub struct SegmentMeta {
     pub id: u32,
+    /// Inclusive start offset (bytes).
     pub start_byte: u64,
+    /// Inclusive end offset (bytes).
     pub end_byte: u64,
     pub downloaded_bytes: u64,
     pub completed: bool,
 }
```

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/model/meta.rs` around lines 27-31, document the byte-range convention explicitly (e.g., inclusive start_byte and end_byte) on the SegmentMeta struct and its fields (start_byte, end_byte, downloaded_bytes) so adapters and callers know the exact boundary semantics and avoid off-by-one errors.
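Under the inclusive convention this comment proposes, the segment arithmetic looks like the sketch below; the `total_bytes`/`remaining_bytes` helpers are illustrative additions, not methods from the PR.

```rust
/// Per-segment resume state, mirroring the PR's SegmentMeta fields.
/// Bounds follow the inclusive convention discussed in the review.
#[derive(Debug, Clone, PartialEq)]
pub struct SegmentMeta {
    pub id: u32,
    /// Inclusive start offset (bytes).
    pub start_byte: u64,
    /// Inclusive end offset (bytes).
    pub end_byte: u64,
    pub downloaded_bytes: u64,
    pub completed: bool,
}

impl SegmentMeta {
    /// Total bytes covered: inclusive bounds mean end - start + 1,
    /// matching a `Range: bytes=start-end` request.
    pub fn total_bytes(&self) -> u64 {
        self.end_byte - self.start_byte + 1
    }

    /// Bytes still missing for this segment.
    pub fn remaining_bytes(&self) -> u64 {
        self.total_bytes().saturating_sub(self.downloaded_bytes)
    }
}
```

Writing the `+ 1` in exactly one place is what the requested doc comments protect: adapters that disagree on inclusivity corrupt exactly one byte per segment boundary.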
src-tauri/src/domain/ports/driven/download_read_repository.rs (1)

25-26: Use fixed-width integer types for pagination params.

Lines 25 and 26 currently use `usize`; this makes the interface platform-width dependent. Prefer `u64` (or `u32`) for stable cross-target semantics.

Proposed signature tweak:

```diff
 fn find_downloads(
     &self,
     filter: Option<DownloadFilter>,
     sort: Option<SortField>,
-    limit: Option<usize>,
-    offset: Option<usize>,
+    limit: Option<u64>,
+    offset: Option<u64>,
 ) -> Result<Vec<DownloadView>, DomainError>;
```

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/ports/driven/download_read_repository.rs` around lines 25-26, change the pagination params limit and offset from Option<usize> to Option<u64>, update all implementations and call sites accordingly, and use TryFrom/try_into or checked casts (never unchecked `as`) when converting u64 to usize for APIs that require it; adjust tests and signatures that reference limit/offset.

src-tauri/src/domain/ports/driven/tests.rs (2)
138-144: Potential truncation on 32-bit platforms.

The cast `size as usize` on line 141 could truncate on 32-bit platforms where `usize` is 32-bit. This is unlikely to cause issues in tests, but for production mock code, consider using `TryInto` with appropriate error handling.

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/ports/driven/tests.rs` around lines 138-144, in create_file avoid the unchecked cast `size as usize`, which can truncate on 32-bit targets; attempt a fallible conversion (TryInto<usize>) on the `size` parameter, return an appropriate DomainError on failure, and only call `vec![0u8; converted_size]` when the conversion succeeds so mocks don't silently truncate.
146-157: Silent no-op when write exceeds buffer bounds.

The `write_segment` implementation silently does nothing when `end > file.len()`. This could mask bugs in tests where callers expect the write to succeed or fail explicitly. Consider returning an error for out-of-bounds writes.

♻️ Proposed fix:

```diff
 fn write_segment(&self, path: &Path, offset: u64, data: &[u8]) -> Result<(), DomainError> {
     let key = path.to_string_lossy().into_owned();
     let mut files = self.files.lock().unwrap();
     if let Some(file) = files.get_mut(&key) {
         let start = offset as usize;
         let end = start + data.len();
         if end <= file.len() {
             file[start..end].copy_from_slice(data);
+        } else {
+            return Err(DomainError::Storage(format!(
+                "write out of bounds: offset {} + len {} > file size {}",
+                offset, data.len(), file.len()
+            )));
         }
+    } else {
+        return Err(DomainError::Storage(format!("file not found: {}", key)));
     }
     Ok(())
 }
```

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/ports/driven/tests.rs` around lines 146-157, make write_segment return an Err(DomainError) for out-of-bounds writes instead of silently doing nothing: keep the locking/key lookup logic, and only perform the copy and return Ok(()) when the bounds check passes.

src-tauri/src/domain/ports/driven/mod.rs (1)
1-1: Consider adding `#[cfg(test)]` to the `mod tests` declaration.

The `mod tests;` declaration isn't gated by `#[cfg(test)]`, though the module contents are. Adding the attribute here would prevent the module from even being parsed in release builds, slightly improving compile times.

♻️ Proposed fix:

```diff
+#[cfg(test)]
 mod tests;
```

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/ports/driven/mod.rs` at line 1, prepend the `#[cfg(test)]` attribute to the `mod tests;` declaration so the module is only compiled during `cargo test`.

src-tauri/src/domain/model/config.rs (2)
44-48: Document the `Option<Option<u64>>` semantics.

The `speed_limit_bytes_per_sec: Option<Option<u64>>` pattern enables a three-way distinction, but this isn't documented. Consider adding a comment explaining:
- `None` = no change
- `Some(None)` = clear the limit (unlimited)
- `Some(Some(x))` = set limit to `x` bytes/sec

📝 Proposed documentation:

```diff
     pub max_segments_per_download: Option<u32>,
+    /// Speed limit update.
+    /// - `None` → no change
+    /// - `Some(None)` → clear limit (unlimited)
+    /// - `Some(Some(x))` → set limit to `x` bytes/sec
     pub speed_limit_bytes_per_sec: Option<Option<u64>>,
```

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/model/config.rs` around lines 44-48, add a doc comment above the speed_limit_bytes_per_sec field of ConfigPatch explaining the tri-state semantics (None = no change, Some(None) = clear the limit, Some(Some(x)) = set limit to x bytes/sec) so callers and deserializers understand the intent.
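The tri-state patch semantics described in this comment reduce to a single `if let` when applying a patch. The sketch below trims `AppConfig`/`ConfigPatch` to the one relevant field; the `apply` function is an illustrative assumption, not the PR's update logic.

```rust
/// Trimmed sketches of the PR's config types, one field each.
#[derive(Debug, Clone, PartialEq, Default)]
pub struct AppConfig {
    pub speed_limit_bytes_per_sec: Option<u64>,
}

#[derive(Debug, Clone, Default)]
pub struct ConfigPatch {
    /// None = no change; Some(None) = clear; Some(Some(x)) = set to x.
    pub speed_limit_bytes_per_sec: Option<Option<u64>>,
}

/// Apply the patch: the outer Option gates whether anything changes,
/// the inner Option becomes the new stored value.
pub fn apply(config: &mut AppConfig, patch: &ConfigPatch) {
    if let Some(limit) = patch.speed_limit_bytes_per_sec {
        config.speed_limit_bytes_per_sec = limit;
    }
}
```

A mock `ConfigStore::update_config` that follows this shape for every field stays faithful to production semantics, which is what the inline comment on the test mock asks for.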
23-37: Empty `download_dir` default may cause issues.

The default `download_dir` is an empty string, which is likely not a valid path. Downstream code may need to handle this case specially (e.g., resolve to the user's Downloads folder). Consider documenting this expectation or initializing to a sentinel value that clearly indicates "unset".

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `src-tauri/src/domain/model/config.rs` around lines 23-37, the Default impl for AppConfig sets download_dir to an empty string, which is not a valid path; initialize it to a clear sentinel or resolve a platform Downloads directory at runtime, and document that consumers must treat the value as "unset" and resolve it to the real download directory before use.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@specs/01-mvp/04-domain-ports.md`:
- Around line 44-47: Update the spec to match the implemented driving-port
trait: in the DownloadReadRepository trait replace the outdated method name
find_downloads_with_progress(...) with the implemented find_downloads(...)
signature and ensure the return/error types match the implementation by
replacing any occurrences of AppError with DomainError (also update related
references such as find_download_detail and count_by_state if they mention
AppError) so the spec and the implemented trait (DownloadReadRepository,
find_downloads, find_download_detail, count_by_state) use the same method names
and DomainError type.
- Line 4: Fix French typography and spelling in the spec: replace "coeur" →
"cœur", "metier" → "métier", "adapters" → "adaptateurs", "implementent" →
"implémentent", "reseau" → "réseau" and consider "filesystem" → "système de
fichiers"; correct apostrophes to proper French forms (e.g., use l'application
with a straight apostrophe or typographic apostrophe consistently) and audit the
nearby sentences that mention driving/driven ports to ensure accents and
apostrophes are consistent across the document (search for strings "coeur",
"metier", "adapters", "implementent", "reseau", "filesystem" to locate all
occurrences).
In `@src-tauri/src/domain/model/http.rs`:
- Around line 37-38: The content_length method currently parses the
"content-length" header directly which fails if the header contains surrounding
whitespace; update content_length (and use the existing header("content-length")
call) to trim the header string before parsing (e.g., call trim() on the header
value) and then parse into u64, returning .ok() as before so values like " 123 "
succeed.
In `@src-tauri/src/domain/ports/driven/history_repository.rs`:
- Around line 24-25: The method signature delete_older_than(&self,
before_timestamp: u64) in history_repository.rs is ambiguous about the timestamp
unit; update its documentation and signature comment to state the expected unit
(e.g., "Unix epoch milliseconds" or "Unix epoch seconds") so adapters
implementers use the correct scale, and consider renaming the parameter to
before_timestamp_ms or before_timestamp_s (or add a TimestampUnit enum) to make
the contract explicit; ensure references to delete_older_than and DomainError in
the trait doc/comments reflect the chosen unit so implementing adapters will
delete the correct range.
In `@src-tauri/src/domain/ports/driven/tests.rs`:
- Around line 263-273: The mock update_config implementation only updates
download_dir and max_concurrent_downloads; update_config (in tests.rs) should
apply all ConfigPatch fields to the locked AppConfig so the mock matches
production behaviour. For the speed_limit_bytes_per_sec: Option<Option<u64>>
field, implement three-way handling: None -> leave existing value, Some(None) ->
clear the value on AppConfig, Some(Some(v)) -> set to v. Ensure every other
ConfigPatch field (use the ConfigPatch and AppConfig field names) is checked
with if let Some(...) and assigned to config so the mock faithfully mirrors real
update_config semantics and returns the cloned AppConfig or DomainError as
before.
---
Nitpick comments:
In `@src-tauri/src/domain/model/config.rs`:
- Around line 44-48: The field speed_limit_bytes_per_sec on struct ConfigPatch
uses an Option<Option<u64>> to represent a three-way choice but lacks docs; add
a brief comment (doc or inline) above the ConfigPatch definition or immediately
above the speed_limit_bytes_per_sec field explaining the semantics: None = no
change, Some(None) = clear the limit (unlimited), Some(Some(x)) = set limit to x
bytes/sec, so callers and deserializers understand the tri-state intent; refer
to the ConfigPatch struct and the speed_limit_bytes_per_sec field when adding
the comment.
- Around line 23-37: The Default impl for AppConfig sets download_dir to an
empty string which is not a valid path; update the Default for AppConfig (the
default() function) to initialize download_dir to a clear sentinel (e.g. "UNSET"
or "none") or to a resolved user Downloads path at runtime (using a platform
helper like dirs::download_dir()), and document in the AppConfig comment or type
that consumers must treat that sentinel as "unset" and resolve it to the real
download directory when needed (refer to the download_dir field and the Default
implementation).
In `@src-tauri/src/domain/model/http.rs`:
- Around line 27-33: The header lookup in function header(&self, name: &str)
allocates by calling to_lowercase() on both the input and each header key;
change the comparison to an ASCII case-insensitive comparison to avoid
allocations — use the built-in eq_ignore_ascii_case (e.g., in the iterator find
use |(k, _)| k.eq_ignore_ascii_case(name)) so you only compare without creating
new Strings and keep the rest of header(&self, name: &str) unchanged.
In `@src-tauri/src/domain/model/meta.rs`:
- Around line 27-31: The SegmentMeta struct's byte-range fields are ambiguous
about whether end_byte is inclusive or exclusive; update the SegmentMeta
documentation to state the convention explicitly (e.g., that start_byte is
inclusive and end_byte is exclusive) and clarify how downloaded_bytes relates to
that range; add comments on the SegmentMeta struct and fields (start_byte,
end_byte, downloaded_bytes) so adapters and callers know the exact boundary
semantics and avoid off-by-one errors.
In `@src-tauri/src/domain/model/views.rs`:
- Line 133: The alias StateCountMap uses usize which is architecture-dependent;
change its value type to a fixed-width integer (u64) to make DTO/port contracts
portable—update the type alias declaration for StateCountMap (currently defined
as HashMap<DownloadState, usize>) to HashMap<DownloadState, u64> and adjust any
code that constructs or reads from StateCountMap (e.g., places that insert,
serialize, or sum counts) to use u64 accordingly.
In `@src-tauri/src/domain/ports/driven/download_read_repository.rs`:
- Around line 25-26: Change the platform-dependent pagination params from usize
to a fixed-width integer (use u64) for the limit and offset fields; update the
signature(s) where limit and offset are declared (e.g., the
DownloadReadRepository trait and any structs/impls that declare "limit:
Option<usize>" and "offset: Option<usize>") to "Option<u64>", then update all
implementations and call sites to accept/return u64 and perform safe conversions
where interacting with APIs or collections that require usize (use
TryFrom/try_into or checked casts rather than unchecked as when converting u64
-> usize). Ensure tests and function signatures that reference limit/offset are
adjusted accordingly.
In `@src-tauri/src/domain/ports/driven/file_storage.rs`:
- Around line 23-29: The three methods read_meta, write_meta, and delete_meta
currently take a Path parameter called path but the semantics are ambiguous;
update their documentation and parameter name to make it explicit that the
argument is the download file path (i.e., the file for which the `.vortex-meta`
resume metadata is derived/located), not the metadata file itself—for example,
rename path to download_path in the signatures (fn read_meta(&self,
download_path: &Path) ... etc.) and update the doc comments above each method to
state that the `.vortex-meta` is read/written/deleted alongside the provided
download file path so adapters implement consistent behavior.
In `@src-tauri/src/domain/ports/driven/http_client.rs`:
- Around line 15-23: Add a default implementation for supports_range in the
HttpClient trait that calls head(url) and parses the "accept-ranges" response
header (case-insensitive) to determine support (treat "bytes" as true, anything
else or missing as false), returning any DomainError from head() rather than
performing a second network request; use the existing head(&self, url: &str) ->
Result<HttpResponse, DomainError> and HttpResponse header accessors to implement
this logic and return Result<bool, DomainError>.
In `@src-tauri/src/domain/ports/driven/mod.rs`:
- Line 1: The tests module is declared with `mod tests;` but lacks a
`#[cfg(test)]` gate, so add the `#[cfg(test)]` attribute directly above the `mod
tests;` declaration to prevent the module from being parsed/compiled in non-test
builds; locate the `mod tests;` line in this file and prepend `#[cfg(test)]` so
the module is only included during `cargo test`.
In `@src-tauri/src/domain/ports/driven/stats_repository.rs`:
- Around line 10-18: The trait StatsRepository's record_completed(bytes,
avg_speed) signature doesn't cover fields exposed on StatsView (daily_volumes,
top_hosts), causing ambiguity for adapters; either document that StatsView's
daily_volumes and top_hosts are produced by separate history/aggregation queries
or extend the repository API so adapters can update all aggregates atomically.
Update the StatsRepository contract by adding clear docs on StatsView
completeness referencing StatsView, daily_volumes, and top_hosts (if you choose
the doc path), or modify/overload record_completed (or add a new method like
record_completed_with_aggregates) to accept the additional aggregate data
required to update daily_volumes and top_hosts so adapters (implementations of
StatsRepository) can update all aggregates consistently.
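If the API-extension path is chosen, it might look like the following sketch. The `CompletionAggregates` payload and the `record_completed_with_aggregates` name are hypothetical, not part of the implemented trait.

```rust
use std::collections::HashMap;
use std::sync::Mutex;

// Hypothetical payload carrying the data needed to update daily_volumes
// and top_hosts in the same call as the basic counters.
pub struct CompletionAggregates {
    pub day: String,  // e.g. "2024-01-15"
    pub host: String, // e.g. "example.com"
}

pub trait StatsRepository: Send + Sync {
    fn record_completed(&self, bytes: u64, avg_speed: f64);

    // Extended variant so adapters can update all aggregates consistently.
    fn record_completed_with_aggregates(&self, bytes: u64, avg_speed: f64, agg: CompletionAggregates);
}

#[derive(Default)]
struct MockStats {
    total_bytes: Mutex<u64>,
    daily_volumes: Mutex<HashMap<String, u64>>,
    top_hosts: Mutex<HashMap<String, u64>>,
}

impl StatsRepository for MockStats {
    fn record_completed(&self, bytes: u64, _avg_speed: f64) {
        *self.total_bytes.lock().unwrap() += bytes;
    }

    fn record_completed_with_aggregates(&self, bytes: u64, avg_speed: f64, agg: CompletionAggregates) {
        self.record_completed(bytes, avg_speed);
        *self.daily_volumes.lock().unwrap().entry(agg.day).or_insert(0) += bytes;
        *self.top_hosts.lock().unwrap().entry(agg.host).or_insert(0) += 1;
    }
}

fn main() {
    let stats = MockStats::default();
    stats.record_completed_with_aggregates(1024, 512.0, CompletionAggregates {
        day: "2024-01-15".into(),
        host: "example.com".into(),
    });
    assert_eq!(*stats.total_bytes.lock().unwrap(), 1024);
    assert_eq!(stats.daily_volumes.lock().unwrap()["2024-01-15"], 1024);
}
```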
In `@src-tauri/src/domain/ports/driven/tests.rs`:
- Around line 138-144: In create_file (fn create_file) avoid the unchecked cast
`size as usize` which can truncate on 32-bit targets; instead attempt a fallible
conversion (e.g., use `TryInto<usize>`) on the `size` parameter and handle the
Err by returning an appropriate DomainError, and only call vec![0u8;
converted_size] when the conversion succeeds; update the function to propagate a
clear error from the conversion so tests and mocks don't silently truncate on
32-bit platforms.
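A minimal sketch of the fallible conversion. The `InvalidArgument` variant is hypothetical; the fix should reuse whatever variant `DomainError` actually provides.

```rust
use std::convert::TryInto;

// Hypothetical error variant for illustration.
#[derive(Debug, PartialEq)]
pub enum DomainError {
    InvalidArgument(String),
}

// Fallible size conversion: returns an error instead of truncating
// the u64 size on 32-bit targets.
fn create_file(size: u64) -> Result<Vec<u8>, DomainError> {
    let len: usize = size
        .try_into()
        .map_err(|_| DomainError::InvalidArgument(format!("file size {size} exceeds usize::MAX")))?;
    Ok(vec![0u8; len])
}

fn main() {
    assert_eq!(create_file(8).unwrap().len(), 8);
}
```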
- Around line 146-157: The write_segment test helper currently silently ignores
writes that exceed the buffer (when end > file.len()) which can hide bugs;
update the write_segment implementation to return an Err(DomainError) for
out-of-bounds writes instead of silently doing nothing: locate the write_segment
function, compute start and end as done now, and if end > file.len() return a
suitable DomainError (create or reuse an appropriate variant such as
OutOfBounds/InvalidArgument) so callers receive a clear failure; keep the rest
of the locking/key lookup logic the same and only allow the copy and Ok(()) when
the bounds check passes.
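A sketch of the bounds-checked helper, assuming an in-memory mock shaped like the one described above; the `OutOfBounds` variant is hypothetical.

```rust
use std::collections::HashMap;
use std::sync::Mutex;

#[derive(Debug, PartialEq)]
pub enum DomainError {
    NotFound,
    // Hypothetical variant; an existing InvalidArgument would work too.
    OutOfBounds { offset: u64, len: usize, file_len: usize },
}

#[derive(Default)]
struct MockFileStorage {
    files: Mutex<HashMap<String, Vec<u8>>>,
}

impl MockFileStorage {
    fn write_segment(&self, path: &str, offset: u64, data: &[u8]) -> Result<(), DomainError> {
        let mut files = self.files.lock().unwrap();
        let file = files.get_mut(path).ok_or(DomainError::NotFound)?;
        let start = offset as usize;
        let end = start + data.len();
        // Fail loudly instead of silently dropping out-of-bounds writes.
        if end > file.len() {
            return Err(DomainError::OutOfBounds { offset, len: data.len(), file_len: file.len() });
        }
        file[start..end].copy_from_slice(data);
        Ok(())
    }
}

fn main() {
    let storage = MockFileStorage::default();
    storage.files.lock().unwrap().insert("a.bin".into(), vec![0u8; 4]);
    assert_eq!(storage.write_segment("a.bin", 1, &[7, 7]), Ok(()));
    assert!(storage.write_segment("a.bin", 3, &[9, 9]).is_err());
}
```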
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: c412d6a4-84c0-4fca-b2fe-69c3f67c2f6f
📒 Files selected for processing (27)
- specs/01-mvp/04-domain-ports.md
- specs/README.md
- src-tauri/src/domain/mod.rs
- src-tauri/src/domain/model/config.rs
- src-tauri/src/domain/model/credential.rs
- src-tauri/src/domain/model/http.rs
- src-tauri/src/domain/model/meta.rs
- src-tauri/src/domain/model/mod.rs
- src-tauri/src/domain/model/views.rs
- src-tauri/src/domain/ports/driven/clipboard_observer.rs
- src-tauri/src/domain/ports/driven/config_store.rs
- src-tauri/src/domain/ports/driven/credential_store.rs
- src-tauri/src/domain/ports/driven/download_engine.rs
- src-tauri/src/domain/ports/driven/download_read_repository.rs
- src-tauri/src/domain/ports/driven/download_repository.rs
- src-tauri/src/domain/ports/driven/event_bus.rs
- src-tauri/src/domain/ports/driven/file_storage.rs
- src-tauri/src/domain/ports/driven/history_repository.rs
- src-tauri/src/domain/ports/driven/http_client.rs
- src-tauri/src/domain/ports/driven/mod.rs
- src-tauri/src/domain/ports/driven/plugin_loader.rs
- src-tauri/src/domain/ports/driven/stats_repository.rs
- src-tauri/src/domain/ports/driven/tests.rs
- src-tauri/src/domain/ports/driving/command_handler.rs
- src-tauri/src/domain/ports/driving/mod.rs
- src-tauri/src/domain/ports/driving/query_handler.rs
- src-tauri/src/domain/ports/mod.rs
specs/01-mvp/04-domain-ports.md
Outdated
> # Task 04: Domain Ports
>
> ## Context
> Les ports sont les traits Rust que le domaine definit. Les adapters les implementent. C'est le contrat entre le coeur metier et le monde exterieur. Les driving ports definissent comment on entre dans l'application (commands/queries). Les driven ports definissent ce dont le domaine a besoin du monde exterieur (persistence, reseau, filesystem...).
Address flagged French typography/apostrophe issues in spec text.
There are minor but repeated text-quality issues (e.g., coeur and incorrect apostrophes) on Line 4 and Line 121–123. Worth cleaning up since this doc is a contract reference.
Also applies to: 121-123
🧰 Tools
🪛 LanguageTool
[typographical] ~4-~4: Variante typographique déconseillée de « cœur ».
Context: ...implementent. C'est le contrat entre le coeur metier et le monde exterieur. Les drivi...
(OE)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@specs/01-mvp/04-domain-ports.md` at line 4, Fix French typography and
spelling in the spec: replace "coeur" → "cœur", "metier" → "métier", "adapters"
→ "adaptateurs", "implementent" → "implémentent", "reseau" → "réseau" and
consider "filesystem" → "système de fichiers"; correct apostrophes to proper
French forms (e.g., use l'application with a straight apostrophe or typographic
apostrophe consistently) and audit the nearby sentences that mention
driving/driven ports to ensure accents and apostrophes are consistent across the
document (search for strings "coeur", "metier", "adapters", "implementent",
"reseau", "filesystem" to locate all occurrences).
specs/01-mvp/04-domain-ports.md
Outdated
trait DownloadReadRepository: Send + Sync {
    fn find_downloads_with_progress(...) -> Result<Vec<DownloadView>>;
    fn find_download_detail(id) -> Result<Option<DownloadDetailView>>;
    fn count_by_state() -> Result<HashMap<DownloadState, usize>>;
Spec drift with implemented trait contracts.
Line 45 still references find_downloads_with_progress(...), while the implemented port uses find_downloads(...).
Also, Line 99 and Line 104 specify AppError, but driving ports in this PR return DomainError. This mismatch can misguide follow-up tasks that implement against the spec.
Proposed spec alignment patch
trait DownloadReadRepository: Send + Sync {
- fn find_downloads_with_progress(...) -> Result<Vec<DownloadView>>;
+ fn find_downloads(...) -> Result<Vec<DownloadView>>;
fn find_download_detail(id) -> Result<Option<DownloadDetailView>>;
fn count_by_state() -> Result<HashMap<DownloadState, usize>>;
}
trait CommandHandler<C: Command> {
type Output;
- async fn handle(&self, cmd: C) -> Result<Self::Output, AppError>;
+ async fn handle(&self, cmd: C) -> Result<Self::Output, DomainError>;
}
trait QueryHandler<Q: Query> {
type Output;
- async fn handle(&self, query: Q) -> Result<Self::Output, AppError>;
+ async fn handle(&self, query: Q) -> Result<Self::Output, DomainError>;
}

Also applies to: 97-105
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@specs/01-mvp/04-domain-ports.md` around lines 44 - 47, Update the spec to
match the implemented driving-port trait: in the DownloadReadRepository trait
replace the outdated method name find_downloads_with_progress(...) with the
implemented find_downloads(...) signature and ensure the return/error types
match the implementation by replacing any occurrences of AppError with
DomainError (also update related references such as find_download_detail and
count_by_state if they mention AppError) so the spec and the implemented trait
(DownloadReadRepository, find_downloads, find_download_detail, count_by_state)
use the same method names and DomainError type.
5 issues found across 27 files
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="src-tauri/src/domain/model/http.rs">
<violation number="1" location="src-tauri/src/domain/model/http.rs:38">
P2: Trim the `content-length` header value before parsing so valid values with surrounding whitespace are accepted.</violation>
</file>
<file name="src-tauri/src/domain/model/meta.rs">
<violation number="1" location="src-tauri/src/domain/model/meta.rs:13">
P2: `DownloadMeta` derives `Debug` while storing `url` as raw `String`, which can leak full URL tokens/credentials in logs.</violation>
</file>
<file name="src-tauri/src/domain/ports/driven/tests.rs">
<violation number="1" location="src-tauri/src/domain/ports/driven/tests.rs:271">
P3: Apply all `ConfigPatch` fields in this mock `update_config`; handling only two fields can hide patching bugs in adapters and misses the tri-state semantics of `speed_limit_bytes_per_sec`.</violation>
</file>
<file name="specs/01-mvp/04-domain-ports.md">
<violation number="1" location="specs/01-mvp/04-domain-ports.md:45">
P3: Update the spec to use `find_downloads(...)` so it matches the implemented `DownloadReadRepository` contract.</violation>
<violation number="2" location="specs/01-mvp/04-domain-ports.md:99">
P2: The Task 04 spec references `AppError` before it is introduced (Task 05), which makes the task contract internally inconsistent.</violation>
</file>
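The `content-length` trimming fix flagged above can be sketched as follows. The free-function shape is an assumption for illustration; in the real code the parsing lives on `HttpResponse`.

```rust
// Accept header values with surrounding whitespace by trimming before parsing.
fn content_length(raw: Option<&str>) -> Option<u64> {
    raw.and_then(|v| v.trim().parse::<u64>().ok())
}

fn main() {
    assert_eq!(content_length(Some(" 1024 ")), Some(1024));
    assert_eq!(content_length(Some("abc")), None);
    assert_eq!(content_length(None), None);
}
```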
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review, or fix all with cubic.
specs/01-mvp/04-domain-ports.md
Outdated
trait CommandHandler<C: Command> {
    type Output;
    async fn handle(&self, cmd: C) -> Result<Self::Output, AppError>;
P2: The Task 04 spec references AppError before it is introduced (Task 05), which makes the task contract internally inconsistent.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At specs/01-mvp/04-domain-ports.md, line 99:
<comment>The Task 04 spec references `AppError` before it is introduced (Task 05), which makes the task contract internally inconsistent.</comment>
<file context>
@@ -0,0 +1,149 @@
+```rust
+trait CommandHandler<C: Command> {
+ type Output;
+ async fn handle(&self, cmd: C) -> Result<Self::Output, AppError>;
+}
+
</file context>
specs/01-mvp/04-domain-ports.md
Outdated
}

trait DownloadReadRepository: Send + Sync {
    fn find_downloads_with_progress(...) -> Result<Vec<DownloadView>>;
P3: Update the spec to use find_downloads(...) so it matches the implemented DownloadReadRepository contract.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At specs/01-mvp/04-domain-ports.md, line 45:
<comment>Update the spec to use `find_downloads(...)` so it matches the implemented `DownloadReadRepository` contract.</comment>
<file context>
@@ -0,0 +1,149 @@
+}
+
+trait DownloadReadRepository: Send + Sync {
+ fn find_downloads_with_progress(...) -> Result<Vec<DownloadView>>;
+ fn find_download_detail(id) -> Result<Option<DownloadDetailView>>;
+ fn count_by_state() -> Result<HashMap<DownloadState, usize>>;
</file context>
- Add SortDirection enum and SortOrder struct for sort direction support
- Change AppConfig.download_dir to Option<String> (None = use platform default)
- Use HashMap<String, Vec<String>> for HTTP headers (multi-value support)
- Add trim() to content_length() parsing for whitespace tolerance
- Clarify timestamp unit (Unix seconds) in HistoryRepository doc
- Complete all ConfigPatch fields in test mock update_config
Actionable comments posted: 1
🧹 Nitpick comments (2)
src-tauri/src/domain/model/views.rs (2)
141-145: `SortOrder` can derive `Default` to reduce call-site boilerplate. Since both fields already have defaults, deriving `Default` here keeps usage cleaner.

Proposed diff

-#[derive(Debug, Clone, Copy, PartialEq, Eq)]
+#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
 pub struct SortOrder {
     pub field: SortField,
     pub direction: SortDirection,
 }

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src-tauri/src/domain/model/views.rs` around lines 141 - 145, Add a Default impl for SortOrder by deriving Default on the struct (i.e., change #[derive(Debug, Clone, Copy, PartialEq, Eq)] to include Default) so callers can use SortOrder::default(); ensure the constituent types SortField and SortDirection already implement Default (if they do not, add Default impls for those enums/types) so the derived Default compiles.
31-31: Add explicit timestamp units on view fields. Please document units (e.g., Unix seconds vs milliseconds) for `created_at`, `updated_at`, and `completed_at` directly on these fields to avoid adapter/UI drift.

Also applies to: 57-58, 78-78
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src-tauri/src/domain/model/views.rs` at line 31, Add explicit unit documentation to the timestamp fields in the view structs: add Rust doc comments (///) directly above each field `created_at`, `updated_at`, and `completed_at` in src-tauri/src/domain/model/views.rs stating the unit used (e.g., "Unix timestamp in seconds" or "Unix timestamp in milliseconds") so adapters/UI consumers cannot drift; ensure you update all occurrences of these fields (the ones named `created_at`, `updated_at`, and `completed_at`) so the intent is clear in the model.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src-tauri/src/domain/model/views.rs`:
- Line 4: The module-level doc claims the view structs in views.rs "are returned
by read repositories and never mutated" but all fields are public, so either
update the docstring to remove the immutability guarantee or enforce
immutability: make each view struct's fields private, add a public constructor
(e.g., impl FooView::new(...) for each struct name found in views.rs) that sets
fields, and expose read-only accessors (e.g., FooView::id(&self) -> &Type) for
needed fields; ensure any existing code using direct field access is updated to
call the new accessors or to use the constructor, so the documentation and API
match.
---
Nitpick comments:
In `@src-tauri/src/domain/model/views.rs`:
- Around line 141-145: Add a Default impl for SortOrder by deriving Default on
the struct (i.e., change #[derive(Debug, Clone, Copy, PartialEq, Eq)] to include
Default) so callers can use SortOrder::default(); ensure the constituent types
SortField and SortDirection already implement Default (if they do not, add
Default impls for those enums/types) so the derived Default compiles.
- Line 31: Add explicit unit documentation to the timestamp fields in the view
structs: add Rust doc comments (///) directly above each field `created_at`,
`updated_at`, and `completed_at` in src-tauri/src/domain/model/views.rs stating
the unit used (e.g., "Unix timestamp in seconds" or "Unix timestamp in
milliseconds") so adapters/UI consumers cannot drift; ensure you update all
occurrences of these fields (the ones named `created_at`, `updated_at`, and
`completed_at`) so the intent is clear in the model.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: fc96738c-8f88-42ee-ac85-d7aa1fabe6e6
📒 Files selected for processing (7)
- src-tauri/src/domain/model/config.rs
- src-tauri/src/domain/model/http.rs
- src-tauri/src/domain/model/mod.rs
- src-tauri/src/domain/model/views.rs
- src-tauri/src/domain/ports/driven/download_read_repository.rs
- src-tauri/src/domain/ports/driven/history_repository.rs
- src-tauri/src/domain/ports/driven/tests.rs
✅ Files skipped from review due to trivial changes (4)
- src-tauri/src/domain/ports/driven/history_repository.rs
- src-tauri/src/domain/ports/driven/download_read_repository.rs
- src-tauri/src/domain/ports/driven/tests.rs
- src-tauri/src/domain/model/config.rs
🚧 Files skipped from review as they are similar to previous changes (1)
- src-tauri/src/domain/model/http.rs
1 issue found across 7 files (changes from recent commits).
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="src-tauri/src/domain/model/config.rs">
<violation number="1" location="src-tauri/src/domain/model/config.rs:14">
P2: Making `AppConfig.download_dir` nullable without updating `ConfigPatch` introduces a tri-state mismatch: patches can no longer represent "reset download dir to default (`None`)" once a custom path is set.</violation>
</file>
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review, or fix all with cubic.
- Custom Debug on DownloadMeta to redact URL (credential leak prevention)
- Relax views.rs doc wording: pub fields are intentional for DTOs
- ConfigPatch.download_dir is now Option<Option<String>> for tri-state semantics (None = no change, Some(None) = reset, Some(Some(path)) = set)
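The tri-state patch semantics can be sketched as below. The type and field names follow the commit description; the `apply` helper is hypothetical.

```rust
#[derive(Debug, Default, PartialEq)]
struct AppConfig {
    // None = use the platform default download directory.
    download_dir: Option<String>,
}

#[derive(Default)]
struct ConfigPatch {
    // None = leave unchanged, Some(None) = reset to default,
    // Some(Some(path)) = set a custom directory.
    download_dir: Option<Option<String>>,
}

// Hypothetical helper showing how an adapter would apply the patch.
fn apply(cfg: &mut AppConfig, patch: ConfigPatch) {
    if let Some(dir) = patch.download_dir {
        cfg.download_dir = dir;
    }
}

fn main() {
    let mut cfg = AppConfig::default();
    apply(&mut cfg, ConfigPatch { download_dir: Some(Some("/tmp/dl".into())) });
    assert_eq!(cfg.download_dir.as_deref(), Some("/tmp/dl")); // set
    apply(&mut cfg, ConfigPatch { download_dir: None });
    assert_eq!(cfg.download_dir.as_deref(), Some("/tmp/dl")); // unchanged
    apply(&mut cfg, ConfigPatch { download_dir: Some(None) });
    assert_eq!(cfg.download_dir, None); // reset to platform default
}
```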
Use obviously fake values (test-user/test-value) instead of pass123/super-secret to avoid GitGuardian false positive on generic password detection.
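The redacted-`Debug` pattern, combined with the obviously fake test values above, can be sketched as follows. The `new` constructor is an assumption for illustration.

```rust
use std::fmt;

pub struct Credential {
    pub username: String,
    #[allow(dead_code)] // only ever surfaced through the redacted Debug impl
    password: String,
}

impl Credential {
    // Assumed constructor shape.
    pub fn new(username: &str, password: &str) -> Self {
        Self { username: username.into(), password: password.into() }
    }
}

// Manual Debug so the password can never leak through {:?} logging.
impl fmt::Debug for Credential {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.debug_struct("Credential")
            .field("username", &self.username)
            .field("password", &"<redacted>")
            .finish()
    }
}

fn main() {
    let cred = Credential::new("test-user", "test-value");
    let dbg = format!("{:?}", cred);
    assert!(dbg.contains("<redacted>"));
    assert!(!dbg.contains("test-value"));
}
```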
Summary
- Ports live under `domain/ports/`: `DownloadRepository` (write, domain aggregates) vs `DownloadReadRepository` (read, flattened DTOs)
- Custom `Debug` on `Credential` to redact passwords (same pattern as `Url`)

Changes

New port traits (`domain/ports/driven/`): `DownloadRepository`, `DownloadReadRepository`, `HistoryRepository`, `StatsRepository`, `PluginLoader`, `CredentialStore`, `ConfigStore`, `FileStorage`, `HttpClient`, `DownloadEngine`, `EventBus`, `ClipboardObserver`

New driving traits (`domain/ports/driving/`): `CommandHandler<C: Command>`, `QueryHandler<Q: Query>`

New domain types (`domain/model/`): `DownloadView`, `DownloadDetailView`, `HistoryEntry`, `StatsView`, `SegmentView`, `DownloadFilter`, `SortField`, `AppConfig`, `ConfigPatch`, `Credential`, `HttpResponse`, `DownloadMeta`, `SegmentMeta`

Design decisions

- No async runtime in the domain (no `tokio::spawn`). Domain stays pure std.
- `DownloadEngine` is fire-and-forget: `start()` spawns async work, outcomes reported via `EventBus`.
- `DomainError` used as error type; `AppError` will be introduced in task 05.
- DTOs use `pub` fields (data-only, no invariants).

Test plan

- Tests pass (`cargo test`)
- `cargo clippy -- -D warnings` is clean
- Review feedback addressed in `domain/ports/` (`'static` on EventBus, PluginLoader fallibility, SegmentView typing)

Summary by cubic
Implements all domain hexagonal ports and DTOs, setting clear CQRS read/write boundaries in a std-only domain ready for adapters. Also adds sorting direction, tri-state config patches, safer HTTP headers/defaults, sensitive Debug redactions, removes `specs/`, and updates tests to use fake credentials to satisfy GitGuardian.

New Features
- Driven ports: `DownloadRepository`/`DownloadReadRepository`, `HistoryRepository`, `StatsRepository`, `PluginLoader`, `CredentialStore`, `ConfigStore`, `FileStorage`, `HttpClient`, `DownloadEngine`, `EventBus`, `ClipboardObserver`.
- Driving ports: `CommandHandler<C: Command>` and `QueryHandler<Q: Query>` using RPITIT async; all ports are `Send + Sync` and synchronous.
- Domain types: views (`DownloadView`, `DownloadDetailView`, `SegmentView`, `HistoryEntry`, `StatsView`, filters + `SortField` + `SortDirection` + `SortOrder`), config (`AppConfig`, `ConfigPatch`), auth (`Credential` with redacted `Debug`), HTTP (`HttpResponse` with multi-value headers), resume meta (`DownloadMeta` with redacted URL, `SegmentMeta`).
- `DownloadEngine` is fire-and-forget with outcomes via `EventBus`; errors use `DomainError`. Tests include in-memory mocks, `Send`/`Sync` checks, and basic CRUD/flow coverage. Removed `specs/`.

Bug Fixes
- Added `SortDirection` and `SortOrder`.
- `AppConfig.download_dir` is now `Option<String>`; `ConfigPatch.download_dir` supports tri-state updates.
- HTTP headers use `HashMap<String, Vec<String>>`; `content-length` parsing trims whitespace.
- Clarified `HistoryRepository` docs; `ConfigPatch` mock updates all fields; test credentials switched to obvious fakes to avoid GitGuardian false positives.

Written for commit 4bdb073. Summary will update on new commits.
Summary by CodeRabbit