Improve buildcache sync and parallelize #39008

Draft
wants to merge 5 commits into develop

Conversation

scottwittenburg
Contributor

This is a prototype that builds on #38866 to parallelize the variant of spack buildcache sync that syncs all of an environment's specs from one mirror to another.

While working on this, I noticed that using Stage/FetchStrategy does not parallelize well and results in files ending up in the wrong stage directories. I'm not 100% sure, but I suspect the relationship between them is not 1-1 (many stages can share a single fetch strategy, and each stage sets fetcher.stage = self on that shared fetcher), which makes Stage not thread-safe, at least not in the way I was using it. So this PR removes the use of Stage for fetching buildcache files.
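The shape of the replacement can be sketched as follows: each download task writes only to its own destination path, so there is no shared mutable fetcher/stage state between threads. This is a minimal illustration, not the PR's actual implementation; the helper names (`fetch_url_to_path`, `sync_files`) are assumptions.

```python
# Hypothetical sketch: fetch buildcache files concurrently without Stage.
# Each task owns its destination path, so nothing mutable is shared
# (unlike a FetchStrategy shared by many Stage objects).
import concurrent.futures
import os
import urllib.request


def fetch_url_to_path(url, dest_path):
    """Download a single buildcache file to its own destination path."""
    os.makedirs(os.path.dirname(dest_path), exist_ok=True)
    with urllib.request.urlopen(url) as response, open(dest_path, "wb") as f:
        f.write(response.read())
    return dest_path


def sync_files(file_urls, dest_dir, max_workers=8):
    """Fetch many files in parallel; re-raise the first download error."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [
            pool.submit(
                fetch_url_to_path, url, os.path.join(dest_dir, os.path.basename(url))
            )
            for url in file_urls
        ]
        for future in concurrent.futures.as_completed(futures):
            future.result()  # surfaces any exception raised in a worker
```

Because every worker constructs its own target path up front, two concurrent fetches can never race on where a file lands, which is exactly the failure mode observed with shared Stage directories.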

Work still to do on this PR includes:

  • parallelize the --manifest-glob code path, since that is what the spack protected-publish jobs currently use
  • better error handling
  • allow thread concurrency to be specified on the command line
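For the last item, a command-line concurrency option could look roughly like this. The flag name (--jobs/-j) and its default are assumptions for illustration, not Spack's actual CLI (Spack subcommands define arguments via a setup_parser hook, but plain argparse shows the idea):

```python
# Hypothetical sketch of a thread-concurrency option for buildcache sync.
import argparse


def make_parser():
    parser = argparse.ArgumentParser(prog="spack buildcache sync")
    parser.add_argument(
        "--jobs", "-j",
        type=int,
        default=8,  # assumed default; a real patch might derive it from CPU count
        help="number of concurrent fetch/upload threads",
    )
    parser.add_argument(
        "--manifest-glob",
        help="glob pattern matching copy manifest files to process",
    )
    return parser
```

The parsed value would then be passed straight through as the max_workers argument of the thread pool doing the fetches.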

Provide an option to only sync buildcache entries that are properly
signed.  Also, gracefully ignore any entries from the manifest
that did not get created in the pipeline.  Add tests of
new and previously untested code paths.
@spackbot-app bot added the labels commands, core (PR affects Spack core functionality), gitlab (Issues related to gitlab integration), and tests (General test capability(ies)) on Jul 20, 2023