
Publishing to multiple archives has races that may lead to corrupt archives #2190

Closed
MonsieurNicolas opened this issue Jul 10, 2019 · 0 comments · Fixed by #2211
@MonsieurNicolas
Contributor

Looking at GzipAndPutFilesWork, it appears that GzipFileWork is run in parallel on the same files: each archive tries to compress the same bucket file, for example.
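
For illustration, a simplified sketch of the problematic shape (hypothetical names and structure, not the actual stellar-core Work API): every archive independently schedules compression of the same shared bucket file, so several gzip processes write the same `.gz` output concurrently.

```cpp
// Hypothetical sketch: N archives each run their own "gzip work" over the
// identical source file, so the shared <file>.gz output is produced (and
// rewritten) concurrently.
#include <cstdlib>
#include <string>
#include <thread>
#include <vector>

// Stand-in for GzipFileWork: shells out to gzip, keeping the original file
// and producing <path>.gz next to it.
static void gzipFile(const std::string& path)
{
    std::string cmd = "gzip -k -f " + path;
    std::system(cmd.c_str());
}

int main()
{
    std::string bucketFile = "bucket-0123.xdr"; // shared input and .gz output
    int const numArchives = 3;

    // Each "archive" compresses the identical path in parallel; while one
    // gzip is still rewriting bucket-0123.xdr.gz, another archive's "put"
    // step may already be uploading it.
    std::vector<std::thread> workers;
    for (int i = 0; i < numArchives; ++i)
    {
        workers.emplace_back([&bucketFile] { gzipFile(bucketFile); });
    }
    for (auto& t : workers)
    {
        t.join();
    }
    return 0;
}
```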

I suspect this may lead to strange publish failures and even corruption: a gzip process may be spawned right before a different archive spawns a "put" of the same file, in which case an empty or partial file may get uploaded. (Also, in "process manager" we redirect stdout to the final file name rather than using the safer "redirect to a temp file, then rename" approach.)
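
As a concrete illustration of the safer pattern mentioned above, here is a minimal sketch (illustrative names, assuming a POSIX-style filesystem where a rename within a single filesystem is atomic; this is not the actual stellar-core code):

```cpp
// Write gzip output to a temporary name first, then atomically rename it into
// place, so a concurrent reader (e.g. a "put" of <src>.gz) sees either no file
// or the complete compressed file, never a truncated one.
#include <cstdlib>
#include <filesystem>
#include <stdexcept>
#include <string>

namespace fs = std::filesystem;

static void gzipViaTempAndRename(const std::string& src)
{
    fs::path finalOut = src + ".gz";
    fs::path tmpOut = src + ".gz.tmp";

    // Redirect gzip's stdout to the temp file, not to the final name.
    std::string cmd = "gzip -c " + src + " > " + tmpOut.string();
    if (std::system(cmd.c_str()) != 0)
    {
        fs::remove(tmpOut);
        throw std::runtime_error("gzip failed for " + src);
    }

    // rename within one filesystem is atomic, so the final name only ever
    // refers to a fully written .gz file.
    fs::rename(tmpOut, finalOut);
}

int main()
{
    gzipViaTempAndRename("bucket-0123.xdr");
    return 0;
}
```

Note that this only removes the partial-file window; the duplicated per-archive compression of the same file would still need to be handled separately.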

@MonsieurNicolas MonsieurNicolas added this to To do in v11.4.0 via automation Jul 10, 2019
@marta-lokhova marta-lokhova self-assigned this Jul 30, 2019
@marta-lokhova marta-lokhova moved this from To do to In progress in v11.4.0 Aug 13, 2019
v11.4.0 automation moved this from In progress to Done Aug 15, 2019
Projects
v11.4.0 (Done)