[Question] Metadata in Distributed Builds #5282
Comments
Are you talking about the … ?
You might be interested in this thread, but we really don't have a solution without using multiple repos: #871
I meant … And yes, you've understood my situation right. I build on different environments, possibly on different machines altogether. It's inconvenient to add an upload job for each build job instead of a single generic one. While that issue seems interesting, I have a different case: before I can upload a bunch of packages, I need to somehow merge them, which is not possible natively right now. Maybe through the …
Yes, sorry, it was a typo, I meant …
I don't see a better alternative to what you are doing... The thing is, the revision is "calculated" during the export, and the only place it is kept is the metadata. You cannot regenerate it safely from the files, because the revision could depend on the git commit of the recipe.
My concern is that the manual merge of metadata is not a supported feature and could break at any release (breaking the naive merge). I guess I need to explore the …
Yes... I understand your concern. We shouldn't change that so easily, but it could happen.
We're running into an identical situation. We need to run many builds on many different agents. We want all the builds to succeed, then collect the packages from all the agents onto a single publishing agent, and then do a single publish job up to Artifactory. It would be nice to get some support from Conan on this.

For example, a job kicks off to build MySpecialLib/1.0.0@owner/stream. On each agent I'd like to say … Then I'd be responsible for getting those to the publish agent, where I'd like to say … In my use case it'd be fine to assume that each zip contains builds for only one version of one package from one owner and one stream. (edits fixed the syntax of the full package identifier)
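This collect-then-publish flow was later made much easier by `conan cache save`/`conan cache restore` (Conan 2.0.14+, mentioned near the end of this thread). A minimal sketch, assuming a remote named `artifactory` and the reference from the comment above; the `archive_name` helper and all names are illustrative, and the conan calls are guarded so the script is a no-op where conan or a recipe is not present:

```shell
#!/bin/sh
# Helper (pure shell): archive name for a given CI job id.
archive_name() { echo "build-${1}.tgz"; }

if command -v conan >/dev/null 2>&1 && [ -f conanfile.py ]; then
    # On each build agent: build, then save this package's binaries
    # from the local cache. Requires Conan >= 2.0.14.
    conan create . --name=MySpecialLib --version=1.0.0 --user=owner --channel=stream
    conan cache save "MySpecialLib/1.0.0@owner/stream:*" \
        --file="$(archive_name "${CI_JOB_ID:-local}")"

    # On the single publish agent: restore every agent's archive,
    # then upload everything in one job.
    for f in build-*.tgz; do
        conan cache restore "$f"
    done
    conan upload "MySpecialLib/1.0.0@owner/stream" -r=artifactory --confirm
else
    echo "conan not available; sketch only"
fi
```

Each agent only needs to ship its `.tgz` to the publisher (as a CI artifact, for instance), which sidesteps the metadata-merge problem entirely: the cache archives carry revisions with them.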
Thanks for your feedback @ptc-ccrabb.
Let me know what you think. The idea is to use the devops flows that are already working for other languages and technologies, like Java/Maven: you "promote" packages between repositories in order to follow the dev cycle, "develop", "QA", "release"...
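With modern Conan 2 this promotion flow is done with package lists: `conan download` can emit a JSON list of what it fetched, and `conan upload --list` can push exactly that set to another remote. A hedged sketch where the remote names (`develop`, `release`) and the reference are placeholders, opt-in via `RUN_PROMOTION=1` so it is a no-op by default:

```shell
#!/bin/sh
# Promote a package from the "develop" remote to the "release" remote.
# Remote names and the reference are placeholders for illustration.
status="skipped"
if [ "${RUN_PROMOTION:-0}" = "1" ] && command -v conan >/dev/null 2>&1; then
    # Download into the local cache, recording what was fetched as a package list.
    conan download "mypkg/1.0.0@owner/stream" -r=develop --format=json > promote.json
    # Upload exactly that set to the release repository.
    conan upload --list=promote.json -r=release --confirm
    status="promoted"
fi
echo "promotion: $status"
```

Server-side promotion (copying between repositories on Artifactory itself) avoids the round-trip through the client cache, but the client-side flow above is the portable version.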
I've updated the initial message with comments on the usage of …
@lasote An update on the situation. I rewrote everything to use a separate Conan repository, like you suggested, with a …
The issue with …
I think this ticket can be closed now. Besides the above comments, there are new tools like the recently released cache save/restore of packages (https://blog.conan.io/2023/11/28/Conan-new-features-2-0-14.html), as well as learned good practices around server-side package promotions (supported by new tools like package lists, and new work-in-progress commands in conan-extensions).
The problem:
When performing multiple builds in parallel on different machines, it's impossible to properly merge them out of the box on the upload machine.
Question:
Would it be possible to somehow help with that scenario?
Right now I naively merge the metadata files (the recipe is guaranteed to be the same because a separate pre-build step exports it). Considering future improvements to the metadata format, this could break one day.
Would it be useful to have a command to regenerate the metadata? How would it work with revisions and such?
UPDATE:
I've ended up using the `export-pkg` command to merge build results from different workers on a single node. I'm using the development flow to achieve that result, but there are some caveats to be aware of, so watch out. Sadly, the development flow is broken for a lot of recipes; do not use it for building projects. Use `conan create` instead, and manually store the package id somewhere so you can `export-pkg` from the correct folder. So we build using that and store the results in `packages/${CI_JOB_ID}/install` for install files and `packages/${CI_JOB_ID}/package` for the resulting package. So far it works great without any hacks!
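The workflow above can be sketched as two CI steps. This assumes Conan 1.x syntax (`conan export-pkg <path> <ref> --package-folder`, which matches the era of this issue); the reference, remote name, and `job_package_dir` helper are all illustrative, and the conan calls are guarded so the script is a no-op without a recipe and conan installed:

```shell
#!/bin/sh
# Helper (pure shell): where a given CI job stores its built package.
job_package_dir() { echo "packages/${1}/package"; }

REF="MyLib/1.0.0@owner/stream"   # placeholder reference

if command -v conan >/dev/null 2>&1 && [ -f conanfile.py ]; then
    # On each worker: build with conan create, then copy the built
    # package folder out of the cache into packages/<job-id>/package
    # (the package id has to be recorded to find the right folder).
    conan create . owner/stream

    # On the merge node: re-import every worker's package folder into
    # the local cache with export-pkg, then do a single upload.
    for d in packages/*/package; do
        conan export-pkg . "$REF" --package-folder="$d" --force
    done
    conan upload "$REF" --all -r=myremote --confirm
fi
```

Note the earlier caveat in this thread still applies: revisions are computed at export time and live only in the metadata, so the export on the merge node must come from the same recipe state as the workers'.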