Target for build based on shared libs #11
When I made several modules out of the project instead of stuffing all of this into the plugin I was thinking of code reuse (if only by myself). Putting all of it in one repo will make the other parts invisible to other users. I think not calling sub-make would just make it more complicated because then the user would have to just do it themselves. But I do agree that currently it's not optimal - I could put all the generated object files into a static lib (.a) instead. Would that help? (I guess I just didn't think of this step after I got rid of libtool which did it automatically.)
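That archiving step is small; a minimal sketch of what it might look like in the sub-library's Makefile (object list and library name here are placeholders, not the project's actual files):

```make
# Hypothetical sketch: bundle the already-compiled objects into a
# static archive instead of leaving them as loose .o files.
OBJS = axc.o axc_crypto.o   # placeholder object list
LIB  = libaxc.a

$(LIB): $(OBJS)
	$(AR) rcs $@ $^
```

`ar rcs` replaces members as needed and writes an index, so the resulting `.a` can be handed directly to the linker of the consuming project.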
That will not be sufficient for packaging. We need full control over the build systems (e.g. exactly how cmake is invoked). We could descend into the library directories manually and build that stuff first, but the library rules do not refer to the library file names, which is sort of standard and avoids unconditional calling of the rules. That means we might end up with a rebuild (especially for cmake) with different parameters than we specified, because of the way the dependencies are specified. That means we lose control and it will produce unwanted results or just break the build.
As might have become obvious from the open issues, I really don't have much experience in making software build well on other systems than my own.
In the end I'm not sure why submodules produce this problem because it seems like a more general build procedure thing to me.
Yes, then we can ignore submake and build them with our tools and parameters before we build the lurch target. But it's still rather weird, because you don't support building against proper shared libraries, but build against objects of other libraries. That's not really standard in any way. The proper way is:
1. building against properly installed shared libraries, or
2. statically linking the libraries into the final plugin.
For the latter, you can keep the submodules, but they become optional once the first option exists. Everything else is really just too much foo to care about for packaging. Submodules are already not nice and, for example, break GitHub tarball generation. Also, relying on commit hashes of your submodules for your project not to break is non-standard as well. Use proper API/ABI checks or pkg-config etc. Use SONAMEs.
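For the shared-library route, the usual pattern looks roughly like this (sketched with hypothetical module and file names; the actual pkg-config module names for these libraries are an assumption):

```make
# Hypothetical sketch: consume an installed shared library via pkg-config
AXC_CFLAGS := $(shell pkg-config --cflags axc)
AXC_LIBS   := $(shell pkg-config --libs axc)

lurch.so: lurch.o
	$(CC) -shared -o $@ $^ $(AXC_LIBS)

# ...and on the library side, embed a SONAME so the ABI is versioned
# and the dynamic linker can find compatible upgrades:
libaxc.so.0.1.0: $(OBJS)
	$(CC) -shared -Wl,-soname,libaxc.so.0 -o $@ $^
```

With a SONAME in place, the distro can update `libaxc.so.0.x.y` in place and every dependent package picks up the fix without a rebuild, as long as the ABI stays compatible.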
Just to throw my 2c into the mix: It's really hard trying to fix things to compile for Windows in the submodules if an update to the parent module changes which revision it's looking at to a broken/non-compiling revision.
@hasufell As I said, I realize that it is weird. Before, I used libtool, so they were actually built into shared libraries. But the dependency on libtool made it even harder to get the project running on different systems, so I removed it and simply didn't do the last step of putting the object files into an archive (I did not see it as a high-priority task at the time). Note that libtool libraries don't need to be "installed" and can be linked into others directly (though installing them is possible).

As for broken tarball generation: if GitHub does not properly support a git feature, I'd say that's GitHub's fault.

@EionRobb Can you explain how that is different from a problem caused by pushing a broken version as part of the same repository? Does updating the submodule erase your local changes?

To summarize: I am not trying to argue against you, because as I said my knowledge of building stuff is limited. Some of the arguments just do not seem convincing at the moment, so I am trying to get to the core by putting forward my objections. I do agree about making proper static libs instead of just object files, and about making proper makefile targets, and I'll put that on my to-do list. I'm just not sure about making shared libs the standard way.
If it's growing along with the main project, then it's not really independent. Make it independent when it's actually independent (and possibly used by others). You can maintain modularity just fine, even if it's part of lurch. You're complicating things for no apparent reason, imo. |
You are right - and that is exactly what I did (but I worded my question wrong). One could argue that I should have separated the projects only after the initial bugfixing that is currently going on, but that would only blow up this repository's size, as all the code is contained in it forever (especially since I just put another lib into one of the subprojects because I patched its build to do what I need).
I'm not really trying to tell you how to structure your projects. I'm just saying in the current state, this project is almost unpackagable. And I think I've given sufficient reasoning what the problems are. Also: I can write Makefile patches, but I need to know what course you want to take. |
And I'm not trying to defend the way I do it just for the sake of it. But the arguments didn't seem entirely convincing to me - especially why I should assume my libs are installed as shared on the system, or why submodules are any worse for the build than putting all files in one repo (assuming static linking). In any case, I think this discussion is now at a more philosophical level. I will do it as you suggested - make one build that relies on shared libs and no submodules, and one that links everything into the final .so statically and therefore requires the submodules (so as it is right now, but make proper static libs and fix the makefiles). But I'd like to keep the latter the standard way (if I ever find out how to properly mock libpurple stuff for testing I think this will work better for CI). I just currently don't have that much time anymore.
I started in 7194438 by building proper static libs and fixed the makefiles to have the actual path of the result as target names. |
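Naming the target after the file it produces is the standard make idiom the earlier comments asked for: make can then compare timestamps and skip the rule when nothing changed, instead of descending unconditionally. Roughly like this (the paths and submodule layout here are assumptions for illustration):

```make
# Hypothetical sketch: the target is the actual artifact path, so make
# only re-runs the sub-build when the archive is missing or out of date.
LIBOMEMO_A = lib/libomemo/build/libomemo.a

$(LIBOMEMO_A):
	$(MAKE) -C lib/libomemo build/libomemo.a

lurch.so: lurch.o $(LIBOMEMO_A)
	$(CC) -shared -o $@ $^
```

This also gives packagers a hook: they can pre-build the archive themselves (or substitute their own), and the top-level rule will simply use it.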
@hasufell what distro are you packaging this for? I've seen Arch can handle submodules this way: https://wiki.archlinux.org/index.php/VCS_package_guidelines#Git_Submodules and most other distros that I'm familiar with never package git versions of things.
Exherbo. But gentoo will have the same requirements.
looks like it ignores the commit reference of the submodules and will just fetch latest HEAD?
Well, since the github tarball is broken anyway, what's the alternative?
Googled and found git-r3.eclass, used by this rust-9999.ebuild for example, which handles submodules and passes
No, the commit references aren't touched. It just does the repo fetch separately, then tells git submodule to look at the local repo instead of the remote one.
I meant cases like Debian, which just doesn't package things without a stable release.
Oh apparently exherbo doesn't use ebuilds. Sorry. https://exherbo.org/docs/exlibs/scm.html mentions EXTERNAL_REFS, does that help? |
I know. I've been a gentoo dev for 4 years.
I know how to package exheres. But I don't get why that's relevant here. Of course you can package scm things based on submodules. I didn't say you can't, but you still have those mentioned problems. |
If it is of any help, I can say that I do not intend to make the repositories' HEADs incompatible. (I still intend to offer a build with shared libraries, but it might take a bit. If your offer still stands, I'd welcome the help - it sounds like it would be a simple routine task to you :D)
Okay, I decided to reread the whole thing, so the problem here isn't actually about submodules at all. Sorry for the noise. FWIW, I appreciate the split between subprojects (as someone who is interested in doing a non-libpurple frontend to axc/libomemo, and possibly a non-omemo frontend to axc) and the static linking, because C doesn't have proper dependency management like every recent language out there, and it sucks to wait years for distros to decide to package shared libs. I think I still don't understand what the real problem is, though. Is it about having both cmake and makefiles in the same repo? Is it about not propagating cflags and such correctly to subdirectories?
The remaining problem now is that downloading tags is broken. As in: you need to create actual releases containing the full sources, not just tags. Custom source archives are possible for GitHub releases. See for example https://github.com/nicklan/pnmixer/releases
I stripped all the .git files and put all the code (including submodules) into an archive: https://github.com/gkdr/lurch/releases/tag/v0.6.1 Does that sound about right? Also, do you think it would be a bad idea to also add compiled binaries to the release? I could put the compiled .dll for Windows up there at least, and maybe the (x86_64) .so for completeness.
I've been a source distro user for half a decade, so I don't care that much. But yes, some people do that, and it would be the appropriate place.
Yeah, maybe you can automate that, but not sure how well |
Hi, I am willing to move lurch to the repos if the package can be built properly and this (and all dependencies) are GPG signed #54
I wouldn't say "of course" here considering that this issue isn't really about submodules. Other than gpg signing of the release tarballs, what is the issue? What "security reasons"? |
If there is a security issue in a library dependency, it will be fixed by that library's package maintainers. But this only works for shared libraries. With static linkage, the lurch project itself needs to track those issues, and the update might come later - or, if the project gets discontinued some day, never. By using shared libs we can at least protect the dependent packages from that risk.
I am still planning to provide a build based on shared libs, but I'm currently tied up trying to finish some other things and can only put out fires right now. |
@gkdr Yes, I know, and I've opened another issue about that here: signalapp/libsignal-protocol-c#68 As a security company, they cannot refuse to GPG-sign their releases.
That somehow complicates packaging/building this thing. I'd suggest:
a) either stuff everything into one repo
b) allow to build against the actual libraries instead of calling sub-make (so they can be packaged separately)
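Option (b) could be expressed in the top-level Makefile as a switch between bundled static archives and system-installed shared libraries (variable names, pkg-config module names, and paths here are assumptions, not the project's actual layout):

```make
# Hypothetical sketch: let packagers opt into system libraries,
# while defaulting to the bundled submodule builds.
USE_SYSTEM_LIBS ?= 0

ifeq ($(USE_SYSTEM_LIBS),1)
    DEP_CFLAGS := $(shell pkg-config --cflags axc libomemo)
    DEP_LIBS   := $(shell pkg-config --libs axc libomemo)
else
    DEP_CFLAGS := -Ilib/axc/src -Ilib/libomemo/src
    DEP_LIBS   := lib/axc/build/libaxc.a lib/libomemo/build/libomemo.a
endif
```

A distro package would then build with `make USE_SYSTEM_LIBS=1` and never touch the submodules, while end users building from a full checkout get the static setup by default.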