
[feature] Generate multiple packages from single compilation #3106

Closed
Ri0n opened this issue Jun 25, 2018 · 28 comments

@Ri0n

Ri0n commented Jun 25, 2018

It's quite a common task for distro package managers to compile once and split the binaries into multiple packages. Yet this common and extremely useful feature is, for some reason, not supported by Conan.

So when multiple packages have to be generated, real magic has to be applied: changing the upstream sources to be more Conan-friendly, maintaining multiple conanfile.py files in different directories that duplicate most of the information, and trickery to make all of this work on third-party CI like AppVeyor and others. It also means recompiling the common part of the sources for each subproject. It becomes even harder when a project depends on a set of other libraries, each with slightly different dependencies.

So I'm looking forward to you guys handling this somehow. I basically see two major tasks:

  1. A conanfile should be able to indicate that it generates multiple binary packages with specific names, so install and package_info should work per name.
  2. The build method should have a way to get a subset of the current project's dependencies, and the compiler flags, library lists, etc. generated from them, to be used per subpackage. (Example: a project compiles a core library and a set of plugins. Each plugin has its own dependencies, and each plugin will be packed into a separate Conan binary package. So a single list of all possible dependencies cannot always be used when we need to modify the project sources from the conanfile, for example when we patch Visual Studio project files on the fly.)
@memsharded
Member

Breaking the 1-1 relationship between recipe and reference would be extremely difficult.

There are still many unknowns in what you are suggesting. For example: what happens with dependencies? If you have a monolithic build and want to split it into different package builds, most of the time there will be relationships between those packages. How do you model that in a single conanfile? You suggest the build() method, but build() is only called when it is necessary to build from source; what happens when you want to install pre-built binaries? Also, if those packages do need to be built from sources and they are independent, how do you avoid CI rebuilding the same thing again and again when they are built in parallel?

The only possible solution to this problem is already implemented in Conan:

  • Have one package that wraps and packages the binaries of the whole build
  • Have one conanfile.py per desired package name that build_requires the package containing the build and re-packages the desired parts
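A minimal sketch of the second bullet, assuming a hypothetical monobuild/1.0@user/channel package that wraps the whole build (all package names are made up; Conan 1.x API):

```python
from conans import ConanFile

class SubPackageConan(ConanFile):
    # Hypothetical wrapper recipe: re-packages one part of a monolithic build
    name = "subpkg_a"
    version = "1.0"
    settings = "os", "compiler", "build_type", "arch"
    keep_imports = True  # keep imported files around for the package() step

    def build_requirements(self):
        # the package that contains the whole monolithic build
        self.build_requires("monobuild/1.0@user/channel")

    def imports(self):
        # pull only this sub-package's artifacts out of the monolithic build
        self.copy("subpkg_a/*")

    def package(self):
        self.copy("*", src="subpkg_a")
```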

@Ri0n
Author

Ri0n commented Jun 25, 2018

Why not just copy the model (or part of it) from some existing package manager, Debian's for example?
Then a conanfile might always mean repackaging of some binaries, and a few other files would be required for meta information (mostly version info) and the build script.

The hack with build_requires doesn't solve point 2, but in fact getting a subset of deps could be expressed in terms of another, simpler feature request. My goal is to carefully change/sed the sources before the build. For this I have to know the dependencies of each specific subproject.

@memsharded
Member

You mean https://wiki.debian.org/PkgSplit?

Extracted from there:

Splitting a Debian source package into several smaller binary packages is not trivial.
Before you start, you must read the essential guides

Debian policy , Debian developers reference , Debian new maintainers guide , Debian developers manuals , Create Debian Linux packages , Debian Binary Package Building HOWTO , How to backport packages to your version , Debian packaging tutorial , HowToPackageForDebian , Debian Packaging , debhelper(7) manpage , Autobook - GNU autoconf, automake, libtool book (online and paper) , Learning the GNU development tools ,

Now multiply that by having to support multiple build systems and compilers, conditional dependencies on different OSes, etc. It is simply not feasible without adding incredibly high complexity to the Conan codebase that would compromise the whole project's functioning and future maintenance.

The hack with build_requires doesn't solve point 2, but in fact getting a subset of deps could be expressed in terms of another, simpler feature request. My goal is to carefully change/sed the sources before the build. For this I have to know the dependencies of each specific subproject.

I don't fully understand why it doesn't solve it. In each package recipe you can specify the dependencies you want; an ordered set of conanfiles and some helper scripts might do the task decently. Which project do you want to do this for? What would the sources be, and which different packages would you create from them? Maybe the best approach would be to set up a git repo with a proof of concept of the suggested build_requires and several consumer recipes, and collaborate on it.

@Ri0n
Author

Ri0n commented Jun 26, 2018

Example

We compile ProjectX. ProjectX depends on Dep1 and Dep2. ProjectX has a single conanfile.py and is comprised of 2 subprojects (read: it has a single *.sln file and 2 *.vcxproj files). Subproj1 doesn't depend on Dep2, and Subproj2 doesn't depend on Dep1. So we have the dependencies Dep1 > Subproj1 and Dep2 > Subproj2. The combined package_info of Dep1 and Dep2 has some conflicting flags which, when applied to both Subproj1 and Subproj2, lead to a compilation failure.

The goal is to take the compilation flags/libs of Dep1 and patch only Subproj1 (generate a proper *.props file or use replace_in_file), then do the same for Subproj2, and only then can we compile ProjectX properly.
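The kind of per-subproject patching described above can be sketched in plain Python (so it runs without Conan); the XML element names and flag strings are hypothetical and simplified, not real MSBuild semantics:

```python
def patch_vcxproj(text, extra_flags, extra_libs):
    """Inject per-subproject compiler flags and libraries into a
    .vcxproj-style XML body (hypothetical, simplified patching)."""
    # append the flags right after the opening tags
    text = text.replace("<AdditionalOptions>",
                        "<AdditionalOptions>%s " % " ".join(extra_flags))
    text = text.replace("<AdditionalDependencies>",
                        "<AdditionalDependencies>%s;" % ";".join(extra_libs))
    return text

# Subproj1 gets only Dep1's flags and libs, never Dep2's
subproj1 = ("<AdditionalOptions></AdditionalOptions>"
            "<AdditionalDependencies></AdditionalDependencies>")
patched = patch_vcxproj(subproj1, ["/DDEP1"], ["dep1.lib"])
```

In a recipe, the same idea would use Conan's `tools.replace_in_file` on each *.vcxproj, feeding it only the flags of that subproject's dependencies.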

@Minimonium
Contributor

Minimonium commented Jun 26, 2018

@Ri0n

It's quite a common task for distros' package managers to compile once and split binaries into multiple packages. But such a common and extremely useful feature for some reason is not supported by Conan.

Isn't that what export-pkg is for?

The goal is to take compilation flags/libs of Dep1 and patch only Subproj1 (generate proper *.props file or use replace_in_file), then do the same for Subproj2 and only then we can compile ProjectX properly.

Isn't the problem that some flags are not propagated? Isn't that a different problem?

@Ri0n
Author

Ri0n commented Jun 26, 2018

Isn't that what export-pkg is for?

Are there any instructions on how to use this approach with AppVeyor and "conan create"?

Isn't the problem that some flags are not propagated? Isn't that a different problem?

No. Let's assume ProjectX is an OpenSSL wrapper and it has two plugins, one for OpenSSL 1.0 and another for OpenSSL 1.1. ProjectX will then have all the OpenSSL libs in its dependencies, and all the OpenSSL paths will be added to LIB by Conan. It's also possible that both versions of OpenSSL have the same library names (e.g. ssl.lib). Now we have no guarantee that Plugin1 will be linked with OpenSSL 1.0 and Plugin2 with OpenSSL 1.1. One of them will definitely link with the wrong version of OpenSSL, leading to a compilation failure.

@memsharded
Member

Are there any instructions on how to use this approach with AppVeyor and "conan create"?

No, conan export-pkg doesn't work with create. It packages from an existing build in user space, not in the local cache.

Let's assume ProjectX is an OpenSSL wrapper and it has two plugins, one for OpenSSL 1.0 and another for OpenSSL 1.1.

This looks like a problem: in the general case you can't have 2 versions of the same library in the same dependency graph.

@madpipeline

I'm going to hijack this feature request, because the title matches what I want to do but the implementation is a little bit different from what @Ri0n described.

I'm a member of the KDE community and I'm working on adopting Conan in KDE projects.

We have several (many) projects called frameworks.

  • Each framework is in its own repo
  • Each framework produces from one to multiple artifacts
  • An artifact can be: a library, an executable, documentation, etc.
  • The artifacts in the framework can sometimes be used on their own
  • Some artifacts depend on other artifacts in the framework

Example:

The KConfig framework produces 2 libraries: KConfigCore and KConfigGUI. The latter depends on the former, but also needs Qt-Gui, which KConfigCore of course doesn't.

Linux distributions build this framework and then package each library separately.

We want to do the same thing with Conan, and keep the conanfile.py in the same repo with the sources.

We've looked at having conanfiles in separate subfolders, but then the scm attribute doesn't work anymore. We're keeping track of our implementation experiments here, if you're interested.

@memsharded do you have any recommendations for us?
@solvingj also mentioned some interest in this topic.

@gladhorn

The same as @ovidiub13 says also applies to Qt. Currently there is a conan package by the community, but that's a monolithic big one, we should instead take advantage of clean dependencies of the individual libraries, so people can get the parts they need, especially for deployment that would be a lot easier. There are several libraries living in one git repository and we could even add conan files in the repos, but for that to be viable, it would be great if one conan file could produce several packages/artifacts.

@memsharded
Member

Hi, thanks @gladhorn and @ovidiub13 for the feedback.

There could be different approaches:

  1. Modularize the build & packaging. This is what the bincrafters have done with Boost, and I think it is the approach that really makes sense. Such huge monolithic projects and builds exist because of the historical lack of a package manager, but now Conan can really help with this modularization. I know this is a big investment and might not be feasible in the short term.

  2. One conanfile generates multiple packages (with different names: QtCore/1.2@..., QtUI/1.2@..., etc.). I find it very hard to figure out a reasonable syntax for the names, components, requirements, etc. How would inter-component dependencies be declared? How would options of different components be declared and used? I think the complexity of this approach would make it unusable from the user-experience point of view, and the Conan code complexity could be too much to maintain, but I am looking forward to your proposals.

  3. Different conanfiles reuse the same build of the project, not rebuilding the whole project. This can already be achieved by re-packaging: use one conanfile (let's call it QtMonoBuild, for example) to do the monolithic build, then have a different conanfile for each individual component that build_requires QtMonoBuild and re-packages the parts it needs. This could be a good balance: keeping the monolithic build package for simplicity, but taking advantage of Conan to modularize downstream. Is this what you are trying with KDE, @ovidiub13? If you are interested in this approach, I could try to set up a proof of concept.

@ovidiub13

We've looked over having conanfiles in separate subfolders, but the scm attribute doesn't work anymore.

The SCM is quite a new feature; the GitHub issue is already reported, so this might be solved when possible.

@Minimonium
Contributor

Minimonium commented Aug 24, 2018

I have some thoughts kinda related to the whole monolith/multiple packages thing.

@memsharded The solution of having one single monobuild to generate multiple subpackages is great, aside from one little thing that got me confused.

I have been thinking about the situation where I have one dependency on, for example, a single boost/... package and another on specific boost_.../... packages. But the problem is, they're not related to each other at all (or am I missing something?) while they probably should be.

There are multiple possible use cases for such a scheme (Boost, Qt, KDE, Bloomberg's package groups), and I think it would make sense to introduce a feature that lets producers hint to the Conan dependency tree that packages are related.

For example, in one big boost package recipe you would define package_group="boost" (I think one can't just leave it implicit because of potential name conflicts). In a small boost_xxx package recipe you would define package_group="boost" too, and it would raise a conflict when you use mismatched versions/(users/channels?) of subprojects or mix them with a mono build (something like name == package_group meaning the package is the mono one?).

But the trick is that something like this would require intrusiveness into the recipe to detect the conflict. And there was that idea about ABI compatibility being defined by the producer instead of the consumer as it is now (which is not really UX friendly), which requires the same intrusiveness, so I'm interested in your opinions on this whole thing.

@madpipeline

@memsharded I agree with your vision that monolithic repos exist currently due to the lack of a package manager, and keeping this in mind, I'm hoping that if we find the right solution for this and write a documentation page in this section, we will soon see monolithic projects get modularized, not just in concept but also in sources.

For KDE, and I guess also Qt, option 1 would not really work, because it requires having the Conan recipes in their own repos, or at least in a separate location from the actual sources, which would strongly discourage people from using or maintaining them, especially during the "trial" period.

Keeping in mind the first paragraph of this comment and @memsharded's options 2 and 3 , I'm imagining one solution could be like this:

/kconfig
    - conanfile.py (parent)
    kconfigcore/
        - conanfile.py (child)
    kconfiggui/
        - conanfile.py (child)

This is of course a pseudo arrangement, as the repo doesn't actually look like this currently.

The parent conanfile.py would do the monolithic build and create a package that would exist only on the machine that creates the packages, as we expect users to consume only the module packages.

The child conanfile.py would "depend" on the parent conanfile.py in some manner. Each child conanfile.py could also depend on one or more of its siblings.

The problem with this approach (having the parent conanfile.py) comes when people try to --build missing: they would be required to build the whole monolithic package, even though they don't require all of it.


Another option is to have a single conanfile.py in the project root and, based on some command-line argument, pass some info to CMake and only build the module/library/etc. that is specified.

This also brings the problem of how you specify the inner dependencies...


It's late for me... @memsharded can you draft the POC you've mentioned, so we have a base to discuss on?

@db4
Contributor

db4 commented Sep 5, 2018

I'm trying to generate multiple packages from a big CMake build tree according to approach 3 in @memsharded's comment.

So I created a conanfile.py for the master build:

from conans import ConanFile, CMake

class BuildConan(ConanFile):
    name = "Build"
    scm = {
        "type": "git",
        "url": "auto",
        "revision": "auto",
        "submodule": "recursive"
    }
    settings = "os", "compiler", "build_type", "arch"
    generators = "cmake"

    build_requires = ...  # packages needed for the build

    def build(self):
        cmake = CMake(self)
        cmake.configure()
        cmake.build()

    def package(self):
        # copy the build artifacts into <Artifact_name>/ folders
        pass

and a conanfile_module.py to create the individual artifact packages:

from conans import ConanFile

class ModuleConan(ConanFile):
    settings = "os", "compiler", "build_type", "arch"
    keep_imports = True

    def build_requirements(self):
        self.build_requires("Build/%s@%s/%s" % (self.version, self.user, self.channel))

    def imports(self):
        self.copy("*", src=self.name, dst=self.name)

    def package(self):
        self.copy("*", src=self.name)

This is almost universal, as the name property is not specified explicitly, so I can do

conan create conanfile_module.py Artifact1/1.0.1234@user/channel
conan create conanfile_module.py Artifact2/1.0.1234@user/channel

using the same conanfile_module.py.

But some artifacts require specific dependencies (the requires attribute), so one conanfile_module.py does not actually fit all, and some conanfile_moduleX.py, conanfile_moduleY.py, conanfile_moduleZ.py, ... are needed, sharing code with conanfile_module.py. At first glance the new python_requires feature looks like a solution, but it needs the full reference outside the ConanFile class:

from conans import python_requires

base = python_requires("MyBase/0.1@user/channel")

class PkgTest(base.MyBase):
    pass

I cannot provide one; the necessary version can only be obtained inside the class. Is there any way around that?

@lasote
Contributor

lasote commented Sep 6, 2018

You could write a full dict with the dependencies of every module and use it in the common recipe, without using python_requires, which in your case I don't think makes much sense.
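The suggested dict can be as simple as one shared table that the common recipe consults by module name (module names and references below are hypothetical):

```python
# Shared table of per-module dependencies, importable from the
# common conanfile_module.py (names and references are made up)
MODULE_REQUIRES = {
    "Artifact1": ["zlib/1.2.11@conan/stable"],
    "Artifact2": ["openssl/1.1.1@conan/stable", "zlib/1.2.11@conan/stable"],
    "Artifact3": [],  # no extra dependencies
}

def requires_for(module_name):
    """Return the requires for one module; empty if unknown."""
    return tuple(MODULE_REQUIRES.get(module_name, ()))

# Inside the shared recipe's requirements() one would then do something like:
#     for ref in requires_for(self.name):
#         self.requires(ref)
```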

@db4
Contributor

db4 commented Sep 6, 2018

@lasote, indeed. Thanks a lot for the hint!

@solvingj
Contributor

Just now reading this thread. I started out believing a new feature will be required if we're ever going to properly handle existing upstream superprojects that won't be changing their methodology any time soon. Unfortunately, this feature is likely to be really complex and will almost certainly not be here within the next year (even if everyone agreed on an implementation).

I briefly got pretty excited about @memsharded's clever suggestion regarding build_requires. It seems a good attempt to "make it work with what we've got". But thanks to others' points, I remembered how non-trivial the problems are, and it seems likely to lead to something with a bunch of new caveats. I will continue thinking about the problem.

@michaelmaguire

I think one desire behind this feature is to mirror the Debian dpkg concept of Source vs Binary packages:

https://www.debian.org/doc/debian-policy/ch-controlfields.html

@madpipeline

I haven't read the document you linked, but that's where my proposal comes from.

@jgsogo
Contributor

jgsogo commented Jun 17, 2019

Although it is not the same approach, IMO this issue is related to #5242: we are implementing a new feature to model the dependencies between libraries inside a single package, and we will also provide the ability to consume only one of those libraries. That PR is a draft right now, but I hope it will be ready soon.

The components feature is not about splitting a single compilation into several packages, but it will allow consuming a single library from a package that contains several. The resulting scenario could be equivalent from the consumer's point of view (although they will download several artifacts to use only one).

@solvingj
Contributor

solvingj commented Jun 17, 2019

On this topic, there's one point I think is valuable to recognize for somewhat novice users of Conan. It's something we realized while making the modular Boost recipes, and it is even more pronounced for the current monolithic Qt recipe, which we did not modularize.

There have always been Conan features which enabled package authors to expose a group of pre-compiled "monolithic" binary artifacts as a bunch of independently consumable binary packages. However, the thing that it's easy to forget about when trying to provide such binary packages is: "What happens when users pass --build=all?" This is actually a fundamental part of every Conan package. With the vast number of platforms Conan is used on, and the fact that most professional uses of Conan involve building all Conan binaries from source on their own CI anyway, this has to remain central to every design discussion. Compared to Linux users, the number of people who consume the precompiled binary packages in the "central repository" rather than building from source is FAR smaller, so it's not an apples-to-apples comparison.

With this in mind, consider Boost and Qt. One might get the idea that the best solution is to just have one recipe which "builds the whole thing" (because that is what their build systems want to do by default), and then a bunch of downstream wrapper packages which just expose the components by name. So, with some advanced logic in the requirements, if I request Boost ASIO, I can just get all the binaries that are required by that.

There are multiple major problems with this:

  • If the user passes --build=all because there are no pre-compiled binaries for their environment, their system still ends up building all of Boost or Qt just to get a single component... even if it's a top-level header-only library like Boost System.

  • You still end up with 1-recipe-per-component to manage (which is a challenge), and the hacky logic for "exposing" one package as many is no less complicated than what we ended up doing with modular Boost anyway. It's all a bunch of proprietary Python code.

  • The handling of options for the components becomes a very awkward challenge. However it's modeled, all the options of each downstream recipe somehow need to make it to the build system of the upstream "super-recipe". So you can expose the options on the downstreams, or the upstream, or both, but however you do it you have to write a LOT of complicated configure() or config_options() logic. It's almost certain that you'll be fighting the best practices and design of Conan, which encourage packages to encapsulate their internals from each other.

In summary, I am very glad the Conan team is going to move forward with a first-class feature to tackle this very real and practical challenge. I don't know how to solve the problem while avoiding the challenges above, but if they can figure out a way, it might transform how we think about many of the most popular OSS packages, and perhaps some enterprise ones as well.

@iiknd

iiknd commented Feb 4, 2021

Different conanfiles try to reuse the same build of the project, not rebuilding the whole project.
This can already be achieved, by using re-packaging: use a conanfile (lets call it QtMonoBuild, for example) to do the monolithic build, then have a different conanfile for each individual component that will build-require QtMonoBuild, and will re-package the parts they need. This could be a good balance, keeping the monolithic built package for simplicity, but taking advantage of conan to modularize downstream. Is this what you are trying with KDE @ovidiub13 ? If you are interested in this approach, I could try to setup a proof of concept.

Are there any examples of how the repackaging should happen in the recipe that build_requires the monobuild? How can the other recipes access the build contents and re-package them?

Say, the monobuild produces:

lib/libA.so
lib/libB.so
include/A.h
include/B.h

How the "package()" implementation of the individual components conanfile.py should look like? How it can access the build contents of the monobuild in another conan package?

@jgsogo
Contributor

jgsogo commented Feb 4, 2021

Hi, @unzap. This is still an open topic with no final answer regarding big packages like Qt or Boost. With the experience of ConanCenter, it looks like the community prefers the monolithic build over trying to modularize (and maintain) those big libraries. It is true that (Boost example) you would need to build the package from sources to consume even a header-only component, but thanks to the Conan workflow you can build it once and your whole company can reuse the generated binaries (you build something bigger, but you build it only once).

Still, I see the use case where a company wants to repackage a bigger thing into smaller packages. The most important thing here is not to propagate information from the big package to the smaller ones; each package should be able to choose the bits of information it wants to propagate. There have been different approaches:

  • build-requires for the host context:

    from conans import ConanFile
    
    class Recipe(ConanFile):
        name = 'consumer'
        version = '1.0'
    
        def build_requirements(self):
            self.build_requires('brhost/1.0', force_host_context=True)
    
        def build(self):
            self.output.write("Information available in build")
            self.output.write(self.deps_cpp_info['brhost'].includedirs)

    It needs to force the host context because you want the binaries built with the same configuration (check the docs about the two-profiles approach). You have access to deps_cpp_info['brhost'], and you can copy files/binaries and read any property already defined in brhost.

  • private requirements: even if the implementation were perfect, private requirements will always propagate some information downstream. If they provide a shared library, this library needs to be available to the consumers, so they can't be the answer to this question.

  • components: Conan provides a way to define components and to declare which components a recipe consumes from its requirements. Only the information related to those components will be propagated (it also depends on the generators and the capabilities of the underlying build system; CMake, at least, supports them). With this solution there is no real isolation: the big package is still needed, and actually everything is consumed from the big package, but only the required information is propagated.

Summing up: build-requires in the host context provide the best level of isolation, but they copy binaries and storage increases. The components approach might not be suitable for every generator/build system, and it will expose all the information from the big package (although it won't be propagated by Conan).
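For the components bullet, the declarations look roughly like this (Conan 1.x syntax; package, component, and library names invented for illustration). The big package declares its components and their internal dependencies; a consumer selects only the component it needs via cpp_info.requires:

```python
from conans import ConanFile

# --- bigpkg's conanfile.py (hypothetical) ---
class BigPkgConan(ConanFile):
    name = "bigpkg"
    version = "1.0"

    def package_info(self):
        self.cpp_info.components["core"].libs = ["bigpkg_core"]
        self.cpp_info.components["gui"].libs = ["bigpkg_gui"]
        # "gui" links against "core" inside the same package
        self.cpp_info.components["gui"].requires = ["core"]


# --- a consumer's conanfile.py (hypothetical, a separate file) ---
class ConsumerConan(ConanFile):
    name = "consumer"
    version = "1.0"
    requires = "bigpkg/1.0"

    def package_info(self):
        # propagate/link only bigpkg's "core" component downstream
        self.cpp_info.requires = ["bigpkg::core"]
```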

@iiknd

iiknd commented Feb 10, 2021

Hi @jgsogo and thanks for the tip above!

After some debugging I can see that build_requires pulls the whole dependency into the /build directory of the consuming/split package.

monobuild/1.0@user/channel

split_component_A/1.0@user/channel
conanfile.py:

    def build_requirements(self):
        self.build_requires('monobuild/1.0@user/channel', force_host_context=True)

    def build(self):
        # contents of the "monobuild/1.0" package are in the "/build" directory of this "split_component_A"
        # so I can just pass..
        pass
        
    def package(self):
        # here copy the needed files (from the build_require == monobuild/1.0)
        # out of the current build directory
        pass

    def package_info(self):
        self.cpp_info.libs = ["split_component_A"]

So consumers of split_component_A can just conan install split_component_A/1.0.
I wonder if there are any details in package_info that need to be taken care of for CMake consumers, i.e. -g cmake_paths?

@iiknd

iiknd commented Feb 16, 2021

Hmm, if I use --install-folder=/foo for a split component, then the whole build_requires ends up in foo/? Shouldn't it copy only the contents of the split component's package? Inside the Conan cache I can see that the contents of the split component's package/<package_id>/ directory are correct, i.e. it contains only the files belonging to the split component, not the whole build_requires (monobuild).

Maybe I have done something wrong in the recipes...?

@jgsogo
Contributor

jgsogo commented Feb 16, 2021

No, packages are unitary; there is no way to split them... but using components you can select which components to link from your consumers (though all the packages will still be downloaded, installed, ...). Try the cmake_find_package[_multi] generators and see how the CMake targets are generated in the dependencies and which ones are added to target_link_libraries in the consumers when using self.cpp_info.requires.

@iiknd

iiknd commented Feb 17, 2021

No, packages are unitary, there is no way to split them...

But you mentioned earlier:

I see the use-case where a company wants to repackage a bigger thing into smaller packages

And based on the given snippet I tested the approach where there is one conanfile_monobuild.py to do the actual build, then conanfile_module_a.py and conanfile_module_b.py.

The "package/" directory contents for module_a and module_b look correct, i.e. "package/" directory of the module_a contains only files that belong to module_a. Similarly for module_b.

I used the fnmatch filter possibility of "self.copy(...)":

class ModuleA(ConanFile):
    name = "module_a"
    ...
    
    def build_requirements(self):
        self.build_requires("monolitemodule/1.0@user/channel", force_host_context=True)

    def build(self):
        pass

    def package(self):
        root_path = self.deps_cpp_info["monolitemodule"].rootpath
        self.copy("module_a*", src=root_path, dst="")

So to summarize:

/home/user/.conan/data/monolitemodule/1.0/user/channel/package/58687..963299d/ # contains all files
/home/user/.conan/data/module_a/1.0/user/channel/package/23443.63234e/ # contains only files that belong to module_a
/home/user/.conan/data/module_b/1.0/user/channel/package/293846.73611a/ # contains only files that belong to module_b

All looks good above. But:

rm -rf build_dir && mkdir build_dir && cd build_dir
conan install module_a/1.0@user/channel --build

The build_dir then contains all the files from the monobuild? The same thing happens if I use --install-folder in conjunction with the above command. Is this because I've used this in the recipe as well:

    def deploy(self):
        self.copy("*")
        self.copy_deps("*")  # copy from dependencies

I've used copy_deps() in the recipes, but should that also pull in the build_requires (i.e. the monobuild in this case)?

@jgsogo
Contributor

jgsogo commented Feb 17, 2021

No, packages are unitary, there is no way to split them...

But you mentioned earlier:

I see the use-case where a company wants to repackage a bigger thing into smaller packages

And based on the given snippet I tested the approach where there is one conanfile_monobuild.py to do the actual build, then conanfile_module_a.py and conanfile_module_b.py.

Then you have the module_a package, the module_b package, and the monobuild one. From the Conan perspective a package is a monolith (all or nothing); from the user perspective each package can have different content. That's why we have packages for zlib, opencv, ... My point is that there is no way to tell Conan: "use the zlib package, but retrieve only the files zlib.h and helper.h from it".


When you run conan install module_a/1.0@user/channel --build you are telling Conan to build module_a from sources, and the recipe (conanfile_module_a.py) has a build_requires that will be retrieved... because you asked to build from sources. Why do those files appear in the build directory? Because of the deploy() you have in the recipes. It is usually not needed and, for this use case so far, you can get rid of that function.

@memsharded
Member

The build_id() method (https://docs.conan.io/2/reference/conanfile/methods/build_id.html) allows building once and then creating multiple binary packages from the same build folder.

Closing as solved; please create new tickets if necessary, thanks
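For reference, the linked build_id() mechanism looks roughly like this (Conan 2 API; the package name is hypothetical): the recipe makes several package_ids share one build folder, so build() runs once and package() is called once per binary package:

```python
from conan import ConanFile

class PkgConan(ConanFile):
    name = "pkg"
    version = "1.0"
    settings = "os", "compiler", "build_type", "arch"

    def build_id(self):
        # Make every build_type share one build folder: the build runs
        # once and produces both Debug and Release artifacts
        self.info_build.settings.build_type = "Any"

    def build(self):
        # build both configurations here, in a single invocation
        pass

    def package(self):
        # called once per package_id: pick the matching artifacts
        # (self.settings.build_type) out of the shared build folder
        pass
```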
