
Target for build based on shared libs #11

Open
hasufell opened this issue Feb 6, 2017 · 26 comments

hasufell (Contributor) commented Feb 6, 2017

The submodule setup complicates packaging/building this thing somewhat. I'd suggest:

a) either put everything into one repo, or
b) allow building against the actual libraries instead of calling sub-make (so they can be packaged separately)

gkdr (Owner) commented Feb 7, 2017

When I split the project into several modules instead of stuffing all of it into the plugin, I was thinking of code reuse (if only by myself). Putting all of it into one repo would make the other parts invisible to other users.
Keeping them in their own repositories also allows me to work on them independently without changing the build at this "top" level: git doesn't simply check out the newest commit, but the one I used at the time of the commit in the "parent" repository (no idea what git actually calls the repo containing the submodules...).
I hope this makes it clear why I think submodules are a good choice here.

I think not calling sub-make would just make things more complicated, because then the user would have to do it themselves. But I do agree that the current state isn't optimal - I could put all the generated object files into a static lib (.a) instead. Would that help? (I guess I just didn't think of this step after I got rid of libtool, which did it automatically.)
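For what it's worth, that archiving step is tiny. A minimal sketch, assuming the objects land in a build/ directory and the result is called libaxc.a (both are placeholder names, not the project's actual layout):

```make
# Hypothetical rule: bundle the already-compiled objects into a static archive.
AR   ?= ar
OBJS  = $(wildcard build/*.o)   # placeholder for the real object list

build/libaxc.a: $(OBJS)
	$(AR) rcs $@ $^
```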

hasufell (Contributor, Author) commented Feb 7, 2017

That will not be sufficient for packaging. We need full control over the build systems (e.g. exactly how cmake is invoked).

We could descend into the library directories manually and build that stuff first, but the library rules don't use the library file names as target names - which is more or less standard practice and avoids running the rules unconditionally. Because of the way the dependencies are written, we might end up with a rebuild (especially for cmake) using different parameters than the ones we specified.

That means we lose control, and it will produce unwanted results or simply break the build.
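To illustrate the complaint with a sketch (directory and file names are assumptions): a phony target re-runs the sub-build every time with whatever defaults the sub-project uses, while a target named after the library file is skipped once that file exists, e.g. after the packager has already built it with its own flags.

```make
# Phony-style rule: recurses unconditionally, possibly rebuilding with other parameters.
.PHONY: axc
axc:
	$(MAKE) -C lib/axc

# File-name rule: make sees the library already exists and leaves it alone.
lib/axc/build/libaxc.a:
	$(MAKE) -C lib/axc build/libaxc.a
```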

gkdr (Owner) commented Feb 7, 2017

As might have become obvious from the open issues, I really don't have much experience in making software build well on other systems than my own.
That being said, I'm not entirely sure how to distill a plan from what you said, so I need to ask for more explanation.

  1. Are you saying there is a way to rewrite the makefile to meet your requirements without further changes? More specifically, making the target name the exact lib path/filename? That is all I get from your suggestion, but I am not sure how this gives "full control".
  2. I read your first post again and I'm actually not sure what you mean by "build against the actual library", since you also mention packaging there. Does that mean you want a build that uses shared libraries instead of static libraries?

In the end I'm not sure why submodules cause this problem, because it seems like a more general build-procedure issue to me.

hasufell (Contributor, Author) commented Feb 7, 2017

> More specifically, making the target name the exact lib path/filename?

Yes, then we can ignore sub-make and build the libraries with our own tools and parameters before we build the lurch target. But it's still rather odd, because you don't support building against proper shared libraries; instead one builds against loose object files from the other libraries. That's not standard in any way.

The proper way is:

  1. have a default target which assumes shared versions of all your libraries are already installed on the user's system
  2. have an optional alternative target that builds against proper static versions of your libraries (not objects floating around) that can be placed at a specific path (target name = lib filename)

For the latter you can keep the submodules, but they become optional, since the first variant doesn't need them (a rough sketch of both targets follows below).

Everything else is really just too much hassle to care about for packaging. Submodules are already not nice and, for example, break GitHub tarball generation. Relying on the commit hashes of your submodules to keep your project from breaking is non-standard as well. Use proper API/ABI checks, pkg-config, etc. Use SONAMEs.
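Roughly what those two targets could look like; this is only a sketch, and the pkg-config module names, paths, and the -laxc/-lomemo library names are assumptions rather than the project's actual makefile:

```make
PURPLE_CFLAGS := $(shell pkg-config --cflags purple glib-2.0)
PURPLE_LIBS   := $(shell pkg-config --libs purple glib-2.0)

# 1. Default: link against shared libraries already installed on the system.
lurch.so: src/lurch.c
	$(CC) -shared -fPIC $(CFLAGS) $(PURPLE_CFLAGS) -o $@ $< $(PURPLE_LIBS) -laxc -lomemo

# 2. Optional: link against static libs built from the submodules; the prerequisites
#    are the actual library files, so a packager can build them beforehand with its own flags.
lurch-static.so: src/lurch.c lib/axc/build/libaxc.a lib/libomemo/build/libomemo.a
	$(CC) -shared -fPIC $(CFLAGS) $(PURPLE_CFLAGS) -o $@ $^ $(PURPLE_LIBS)
```

The static-lib prerequisites could be produced by file-name rules like the one sketched earlier, or built beforehand by the packager.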

EionRobb commented Feb 7, 2017

Just to throw my 2c into the mix:

It's really hard to fix things so that they compile for Windows in the submodules when an update to the parent repository changes the pinned revision to a broken/non-compiling one.

gkdr (Owner) commented Feb 7, 2017

@hasufell As I said, I realize that it is weird. Before, I used libtool, so they were actually built into shared libraries; but the dependency on libtool made it even harder to get things running on different systems, so I removed it and simply skipped the last step of putting the object files into an archive (I didn't see it as a high-priority task at the time). Libtool libraries also don't need to be "installed" and can just be linked into other things (though installing them is possible).
The only other library was libaxolotl, which I used as a shared library, but in #5 there was a valid complaint about having to install it outside the package manager. I guess the alternative is to add some path to ldconfig, but that seems weird.
So I don't think I should assume shared libraries if I can't assume they will end up as packages (which I can't - in the same thread there is a complaint because one of the libs I rely on is only available in an 8-year-old version on Debian stable, which doesn't include one of the functions I use).
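For reference, the "add some path to ldconfig" route mentioned above would look roughly like this on a glibc-based system (the /usr/local prefix is an assumption):

```sh
# Hypothetical: after installing libaxolotl manually under /usr/local,
# tell the dynamic linker where to find it.
echo /usr/local/lib | sudo tee /etc/ld.so.conf.d/local.conf
sudo ldconfig
```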

As for broken tarball generation - if GitHub does not properly support a git feature, I'd say it's GitHub's fault.

@EionRobb Can you explain how that is different than a problem caused by pushing a broken version as part of the same repository? Does updating the submodule erase your local changes?

To summarize: I am not trying to argue against you - as I said, my knowledge of building stuff is limited. Some of the arguments just don't seem convincing to me at this point, so I am trying to get to the core by putting forward my objections. I do agree about making proper static libs instead of loose object files and about adding proper makefile targets, and I'll put that on my to-do list. I'm just not sure about making shared libs the default.
Maybe you can tell me how this is usually handled in other projects - if I have code that is growing along with the main project, but is supposed to be independent, how do I do that? Keep it in one repo and then export (copy-paste...) it into another directory which is its own repository? Because if I just init a repository in a subfolder, git will stop tracking that folder.

hasufell (Contributor, Author) commented Feb 8, 2017

> if I have code that is growing along with the main project, but is supposed to be independent, how do I do that?

If it's growing along with the main project, then it's not really independent. Make it independent when it's actually independent (and possibly used by others). You can maintain modularity just fine, even if it's part of lurch. You're complicating things for no apparent reason, imo.

gkdr (Owner) commented Feb 8, 2017

You are right - and that is exactly what I did (but I worded my question badly).
I did develop them together as one project and, once I was done, separated them into their own repositories, exactly as you said. But there can (almost) never be a guarantee that no later changes will be needed - usually for bugs, in this case for some compatibility issues. The XEP is very young and will change often, but I don't think that's a reason to claim that the code handling only the OMEMO part isn't actually independent yet. My question was more about this phase: the parts are already reasonably independent, but as improvements are made they need to track each other's state (or, in this case, just the state of the project that combines the two independent subprojects into a plugin).

One could argue that I should have separated the projects only after the initial bugfixing that is currently going on, but that would only blow up this repository's size, since all that code would be contained in its history forever (especially as I just put another lib into one of the subprojects because I patched its build to do what I need).
I still believe this is the exact use case for submodules and fail to see how it differs from what other frameworks/languages do for dependency management (some kind of .modules file that just tracks the exact version), except that this is built into git.

hasufell (Contributor, Author) commented Feb 8, 2017

> I still believe this is the exact use case for submodules and fail to see how it differs from what other frameworks/languages do for dependency management

I'm not really trying to tell you how to structure your projects. I'm just saying that in its current state this project is almost unpackageable, and I think I've given sufficient reasoning about what the problems are.

Also: I can write Makefile patches, but I need to know what course you want to take.

gkdr (Owner) commented Feb 9, 2017

And I'm not trying to defend the way I do it just for the sake of it. But the arguments didn't seem entirely convincing to me - especially why I should assume my libs are installed as shared on the system, or why submodules are any worse for the build than putting all files in one repo (assuming static linking).

In any case, I think this discussion is now at a more philosophical level. I will do it as you suggested - make one build that relies on shared libs and no submodules, and one that links everything into the final .so statically and therefore requires the submodules (so, as it is right now, but with proper static libs and fixed makefiles). But I'd like to keep the latter as the default (if I ever figure out how to properly mock libpurple for testing, I think this will work better for CI). I just don't have that much time at the moment.

gkdr (Owner) commented Feb 11, 2017

I made a start in 7194438: the sub-builds now produce proper static libs, and the makefiles use the actual path of the resulting file as the target name.

dequis commented Feb 12, 2017

@hasufell what distro are you packaging this for?

I've seen that Arch can handle submodules this way: https://wiki.archlinux.org/index.php/VCS_package_guidelines#Git_Submodules - and most other distros that I'm familiar with never package git versions of things.
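Applied to this project, the pattern from that wiki page would look roughly like the PKGBUILD fragment below. The submodule paths lib/axc and lib/libomemo are assumptions; checksums and the rest of the PKGBUILD are omitted.

```bash
# Hypothetical PKGBUILD fragment: fetch the submodules as separate sources,
# then point the parent repository's submodule URLs at the local clones.
source=("git+https://github.com/gkdr/lurch.git"
        "git+https://github.com/gkdr/axc.git"
        "git+https://github.com/gkdr/libomemo.git")

prepare() {
  cd lurch
  git submodule init
  git config submodule.lib/axc.url "$srcdir/axc"
  git config submodule.lib/libomemo.url "$srcdir/libomemo"
  git submodule update
}
```

Note that this keeps the pinned commits: git submodule update still checks out exactly the commit recorded in the parent repository, just from the locally fetched copies.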

hasufell (Contributor, Author) commented Feb 12, 2017

> what distro are you packaging this for?

Exherbo. But Gentoo will have the same requirements.

> I've seen that Arch can handle submodules this way

Looks like it ignores the commit references of the submodules and will just fetch the latest HEAD?

> and most other distros that I'm familiar with never package git versions of things.

Well, since the GitHub tarball is broken anyway, what's the alternative?

dequis commented Feb 12, 2017

> Exherbo. But Gentoo will have the same requirements.

Googled and found git-r3.eclass, used by this rust-9999.ebuild for example, which handles submodules and passes --disable-manage-submodules to the configure script.

> Looks like it ignores the commit references of the submodules and will just fetch the latest HEAD?

No, the commit references aren't touched. It just does the repo fetch separately, then tells git submodule to look at the local repo instead of the remote one.

> Well, since the GitHub tarball is broken anyway, what's the alternative?

I meant cases like Debian, which just doesn't package things without a stable release.

dequis commented Feb 12, 2017

Oh, apparently Exherbo doesn't use ebuilds. Sorry. https://exherbo.org/docs/exlibs/scm.html mentions EXTERNAL_REFS, does that help?

hasufell (Contributor, Author) commented:

> Googled and found git-r3.eclass, used by this rust-9999.ebuild for example, which handles submodules and passes --disable-manage-submodules to the configure script

I know. I've been a Gentoo dev for four years.

> mentions EXTERNAL_REFS, does that help?

I know how to package exheres. But I don't see why that's relevant here. Of course you can package scm things based on submodules - I didn't say you can't - but you still have the problems I mentioned.

gkdr (Owner) commented Feb 13, 2017

If it is of any help, I can say that I do not intend to make the repositories' HEADs incompatible.
I use separate branches for changes, which I then rebase onto master once I've confirmed that the whole project still works. After that I only have to update the commit hashes of the submodules, which shouldn't take much more than a minute - the only window during which an incompatibility could exist if an interface changed.
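For illustration, bumping one of those pins after, say, axc's master has moved is just a couple of commands (the lib/axc path is an assumption):

```sh
git -C lib/axc fetch origin
git -C lib/axc checkout origin/master   # or a specific tested commit
git add lib/axc
git commit -m "update axc submodule"
```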

(I still intend to offer a build with shared libraries, but it might take a bit. If your offer still stands, I'd welcome the help - it sounds like it would be a simple routine task to you :D)

dequis commented Feb 13, 2017

Okay, I decided to reread the whole thing - the problem here isn't actually about submodules at all. Sorry for the noise.

FWIW, I appreciate the split between the subprojects (as someone interested in writing a non-libpurple frontend to axc/libomemo, and possibly a non-OMEMO frontend to axc) as well as the static linking. C doesn't have proper dependency management like every recent language out there, and it sucks to wait years for distros to decide to package shared libs. Doing a single make and getting everything in one .so is wonderful for end users.

I still don't think I understand what the real problem is, though. Is it about having both cmake and makefiles in the same repo? Is it about not propagating CFLAGS and such correctly to the subdirectories?

hasufell (Contributor, Author) commented Feb 24, 2017

The remaining problem now is that downloading tags is broken. As in: you need to create actual releases containing the full sources, not just tags. Custom source archives are possible for GitHub releases - see for example https://github.com/nicklan/pnmixer/releases

gkdr (Owner) commented Feb 24, 2017

I stripped all the .git files and put all the code (including submodules) into an archive: https://github.com/gkdr/lurch/releases/tag/v0.6.1

Does that sound about right? Also, do you think it would be a bad idea to also add compiled binaries to the release? I could put the compiled .dll for Windows up there at least, and maybe the (x86_64) .so for completeness.

hasufell (Contributor, Author) commented:

> Also, do you think it would be a bad idea to also add compiled binaries to the release?

I've been a source-distro user for half a decade, so I don't care that much. But yes, some people do that, and a release would be the appropriate place.

> Does that sound about right?

Yeah. Maybe you can automate that, but I'm not sure how well git archive works with submodules. Autotools-based build systems have a (rather sophisticated) make dist target; one could try to emulate something similar here.
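One possible way to automate it: git archive does not descend into submodules, but a clone-and-strip target comes close to a make dist. A sketch, with the version number and paths as placeholders:

```make
VERSION  = 0.6.1
DISTNAME = lurch-$(VERSION)

# Hypothetical "make dist": clone the repo with its pinned submodules,
# drop the .git metadata, and tar up the result.
dist:
	rm -rf $(DISTNAME)
	git clone --recursive . $(DISTNAME)
	find $(DISTNAME) -name .git -prune -exec rm -rf '{}' +
	tar czf $(DISTNAME).tar.gz $(DISTNAME)
	rm -rf $(DISTNAME)
```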

NicoHood commented May 6, 2017

Hi,
I am an Arch Linux TU and I am thinking of moving lurch into the official [community] repository. I am trying to build the 0.6.5 release from source, but of course I do not like the use of statically linked submodules, also for security reasons. It would be great if you could fix that.

I am willing to move lurch to the repos if the package can be built properly and if it (and all its dependencies) are GPG-signed (#54).

dequis commented May 6, 2017

> but of course I do not like the use of statically linked submodules

I wouldn't say "of course" here, considering that this issue isn't really about submodules. Other than GPG signing of the release tarballs, what is the issue? What "security reasons"?

NicoHood commented May 7, 2017

If there is a security issue in a library dependency, it will be fixed by that library's package maintainers - but this only works for shared libraries. With static linking, the lurch project itself needs to track those issues, and the fix might come later or, if the project is discontinued some day, never. By using shared libs we can at least protect the dependent libraries from such risks.

gkdr (Owner) commented May 7, 2017

I am still planning to provide a build based on shared libs, but I'm currently tied up trying to finish some other things and can only put out fires right now.
However, I want to note that OWS does not seem to plan to provide signed releases for now, so if that is a prerequisite for you it might not happen anytime soon anyway.

NicoHood commented May 7, 2017

@gkdr Yes, I know, and I've opened another issue about that here: signalapp/libsignal-protocol-c#68

As a security company, they can't refuse to provide GPG signatures.

gkdr changed the title from "Why submodules?" to "Target for build based on shared libs" on Mar 7, 2018