fix: Don't try to un-archive ar files for hashing #1773
Conversation
Signed-off-by: Xuanwo <github@xuanwo.io>
Is it a problem to remove the
Based on my current understanding, the worst-case scenario is a cache miss when the timestamps are different, which I consider acceptable.
Codecov Report: patch coverage has no change; project coverage change: +0.06%.

Additional details and impacted files:

```
@@            Coverage Diff             @@
##             main    #1773      +/-   ##
==========================================
+ Coverage   29.37%   29.43%   +0.06%
==========================================
  Files          49       49
  Lines       17626    17616      -10
  Branches     8507     8501       -6
==========================================
+ Hits         5177     5185       +8
+ Misses       7357     7341      -16
+ Partials     5092     5090       -2
==========================================
```

☔ View full report in Codecov by Sentry.
```rust
let reader = File::open(&path)
    .with_context(|| format!("Failed to open file for hashing: {:?}", path))?;
let mut archive = Archive::new(reader);
while let Some(entry) = archive.next_entry() {
    let entry = entry?;
```
I'd prefer to keep the logic as is, but fall back to hashing in case the `?` takes effect, such that the hash of the entire file is used. That way we can preserve the existing behavior for anything non-macOS.
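The fallback idea above can be sketched as follows. This is illustrative, not sccache's actual code: `hash_bytes` is a toy stand-in for the real digest, and the ar-magic check stands in for the archive parser.

```rust
// Toy deterministic hash (FNV-1a), standing in for the real digest.
fn hash_bytes(data: &[u8]) -> u64 {
    data.iter()
        .fold(0xcbf29ce484222325u64, |h, &b| (h ^ b as u64).wrapping_mul(0x100000001b3))
}

// Pretend per-entry hashing: succeeds only for inputs that start with
// the ar magic; returns None for anything the parser rejects
// (e.g. a macOS fat archive).
fn hash_entries(data: &[u8]) -> Option<u64> {
    data.strip_prefix(b"!<arch>\n").map(hash_bytes)
}

// The fallback: use the per-entry hash when possible, otherwise the
// whole-file hash, so unparsable archives still hash instead of erroring.
fn hash_archive(data: &[u8]) -> u64 {
    hash_entries(data).unwrap_or_else(|| hash_bytes(data))
}

fn main() {
    println!("{:x}", hash_archive(b"!<arch>\nmember"));
    println!("{:x}", hash_archive(b"\xCA\xFE\xBA\xBEnot an ar file"));
}
```

With this shape, non-macOS platforms keep the per-entry behavior and macOS degrades to a whole-file (timestamp-sensitive) hash rather than an error.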
I believe this could lead to increasingly delicate and intricate behavior. I suggest removing this logic entirely unless someone wishes to provide a correct implementation.
What are the consequences of removing it entirely? The timestamps are still present, so on macOS we get (according to the comment) no cache hits for `ar` files at all. How many `ar`-format files are present in an average compilation run of C/C++/Rust? With that information we could make a more informed choice about what is worth it. We can merge the change in the meantime to avoid breakage, at the potential cost of efficiency, but the topic should be followed up and tracked in an issue. Just deleting it is not solving it.
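For context on what a "correct implementation" might look like: in the common ar(5) format each member header is 60 bytes, with the decimal mtime at byte offsets 16..28. Blanking that field before hashing would make the digest insensitive to the timestamps macOS tooling embeds. A minimal sketch, with illustrative names (not sccache code):

```rust
const HDR_LEN: usize = 60;
// Byte range of the decimal mtime field in an ar member header.
const MTIME_RANGE: std::ops::Range<usize> = 16..28;

// Overwrite the mtime field with a constant so two archives that
// differ only in member timestamps hash identically.
fn normalize_header(header: &mut [u8; HDR_LEN]) {
    for b in &mut header[MTIME_RANGE] {
        *b = b'0';
    }
}

// Build a toy header: 16-byte name, 12-byte mtime, then uid/gid/mode/
// size padding, ending with the "`\n" terminator.
fn make_header(mtime: &[u8; 12]) -> [u8; HDR_LEN] {
    let mut h = [b' '; HDR_LEN];
    h[0..5].copy_from_slice(b"foo.o");
    h[MTIME_RANGE].copy_from_slice(mtime);
    h[58..60].copy_from_slice(b"`\n");
    h
}

fn main() {
    let mut a = make_header(b"1600000000  ");
    let mut b = make_header(b"1700000000  ");
    assert_ne!(a, b); // differ only in mtime
    normalize_header(&mut a);
    normalize_header(&mut b);
    assert_eq!(a, b); // identical after normalization
    println!("headers equal after mtime normalization");
}
```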
Hi, I've added a TODO note here for `hash_all_archives` which explains what happened here. PTAL.
Signed-off-by: Xuanwo <github@xuanwo.io>
I've created #1776, which uses `object` to handle fat archives.
It needs rebasing, could you please have a look?
I think it is obsolete with the merge of #1776, IIUC.
Fix #1250

Thanks for @alex's repro, this PR fixes the issue by removing the un-archive logic for AR files.

I believe this issue could be reproduced by putting any ar file like the following for sccache (which `ar` can't handle):
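A sketch of why such a file trips the parser: a macOS universal ("fat") archive begins with the fat magic `0xCAFEBABE`, not the ar magic `!<arch>\n`, so a strict ar-format parser rejects it on the very first bytes. The names below are illustrative, not from sccache:

```rust
const AR_MAGIC: &[u8] = b"!<arch>\n";
const FAT_MAGIC: [u8; 4] = [0xCA, 0xFE, 0xBA, 0xBE];

// The kind of magic-number check an ar parser performs up front.
fn looks_like_ar(data: &[u8]) -> bool {
    data.starts_with(AR_MAGIC)
}

fn main() {
    println!("ar check accepts fat stub: {}", looks_like_ar(&FAT_MAGIC));
    println!("ar check accepts real ar:  {}", looks_like_ar(AR_MAGIC));
}
```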