Package C libraries independently of core #7660
Comments
Wasn't the main motivation behind the bundling of those libraries to make sure the "right" version is being used, the one we actually test with, etc?
If it was, then can't you just hard-pin the deps on every release?
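If that hard-pin idea were adopted, it could be expressed as exact version pins in a release's requirements. The fragment below is purely hypothetical: the package names and versions are illustrative, not ones astropy actually publishes or depends on.

```
# hypothetical requirements fragment pinning split-out C wrappers
# to the exact versions this astropy release was tested against
pyerfa==1.7.0
wcslib-python==7.3    # illustrative name, not a real package
```

The trade-off the thread is circling is visible here: exact pins keep the tested combination, but they also mean every wrapper release forces a coordinated astropy point release.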
Doing this would certainly make building astropy for development quicker, as you would have to compile a lot less stuff.
Isn't this putting an unnecessary burden on the end user? In the case of two different astropy versions with different C libraries, how do we ensure linking to the right version? How is this going to work outside a conda environment?
@nden surely if you have two astropy versions installed, be it via pip in a venv or in conda envs, you can just pip/conda install the correct version of the external lib? Am I missing a different way of installing multiple astropys?
Perhaps I misunderstand something, but say you have 2 wcslib libraries installed on your system. When you pip install astropy, it links to the library it finds on your system or on your …
I'm sorry, but unless there is a very strong, compelling pro argument, I'm very much -1 on this, with the motto "don't fix what's not broken". There are quite a few possible ways for users to trip over this, and I'm not keen to explore those while also taking on more possible burdens ourselves with the extra dependencies and their packaging.
I agree with the "don't fix what's not broken" sentiment above, but I am curious about what other packages are doing about this (e.g. numpy, scipy, etc.). Is this going to be the future of packaging external dependencies? If so, maybe it's worth at least investigating further.
I still think this is worth doing on conda, even if not pip. Given that most system package managers already split out erfa etc., I can't see any issues with doing this on conda. pip I do not know the details of, so I can't be as sure it will work.
Let's see how other packages do it first then before we jump off this cliff. 😉
@Cadair's suggestion sounds especially worthwhile for those of us who are working on newer packages that might add C libraries in the future but haven't done so already. I'm going to try to remember it. The advantage of speeding up build times sounds really appealing.
I can see the benefits of having them as separate packages, but also the drawbacks for the end users (version mismatches) and for us, the developers (do we need to test against stable and dev versions of that package, and how should we spot and handle problems?). One alternative could be to make astropy-specific packages (like the test split-offs under …).

It's probably not really worth the trouble, but it's certainly possible to distribute them independently without bringing havoc to the end user. Depending on the setup, these are probably not even useful as independent packages (for example, we only distribute very specific fitsio files), except that we get shorter CI build times. And we would somehow need to solve the problem of how we can safely update these packages and make sure that astropy keeps working.

But in my opinion, if someone really wants to do this and we keep them astropy-centric (their primary purpose being that astropy keeps working correctly), then it would be fine by me.
Not sure this will help much: there are four packages in … Might it be possible to tweak the build system a bit so that rebuilds don't happen so frequently? (Clearly, rebuilds do not happen all the time, but I have not been able to figure out what exactly triggers them.) Similarly, can some form of caching help to reduce the impact on Travis?
@mhvk - I suspect we can still improve the Travis experience by using more of its caching infrastructure, as currently we don't really use it. We can also look more into doing nightly builds that can be used in place of building the latest dev version for the affiliated packages. So, surely there is lower-hanging fruit for CI that needs some brainstorming (e.g. at the coordination meeting) and then someone sitting down and implementing the decisions. I would be willing to work on this CI aspect, but realistically I don't have any time for it in the next few months.
We do have infrastructure to build astropy without the bundled libraries, using …

Anyway, I agree that we don't want to do this by default. The current setup has worked for years.
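For reference, the mechanism alluded to is (if I recall correctly) a set of `ASTROPY_USE_SYSTEM_*` environment variables honored by astropy's setup helpers at the time; treat the exact variable names as an assumption to be checked against the release in question. A sketch:

```shell
# Assumed ASTROPY_USE_SYSTEM_* hooks (verify names against the astropy release):
export ASTROPY_USE_SYSTEM_ERFA=1     # link against the system liberfa
export ASTROPY_USE_SYSTEM_WCSLIB=1   # link against the system wcslib
# a source build would then pick these up, e.g.:
#   pip install --no-binary astropy astropy
```

This is the opt-in version of what the issue proposes making the default: the bundled `cextern` copies stay in the sdist, but a packager can point the build at system libraries instead.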
There is already some caching on Travis, with …

For the cfitsio part, installing it with conda is already possible, but it will not change the compilation time: we are not building cfitsio, but a Python extension that reads compressed files with cfitsio. So here bundling cfitsio is mostly to simplify users' lives, avoiding issues with paths at compile time or runtime (and with compilation options, etc.).
Just to chime in and add to @nden's point about multiple versions of the same lib: it's a very bad idea. As far as I'm aware, this is not actually possible even with conda, unless there's a way to be specific, i.e. the versions get mangled into the lib names and they are installed as such, e.g. libsomething-4.3.0 vs libsomething. That would break things in itself, though; in fact, some libraries already do this, but they are then symbolically linked to generic forms that drop the version, so that other packages can link against them version-agnostically. To install multiple versions you'd have to use separate environments, as intended. Without this, every time one astropy was installed, its dependency would clobber the old versions unless you set …

Though, TBH, I'm not sure why this would be an argument against packaging the C libs independently. You shouldn't really have multiple versions of anything within the same env anyhow.
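The version-mangled names and generic symlinks described above follow the usual shared-library soname convention; here is a toy illustration with made-up file names (no real libraries are touched):

```shell
# Toy illustration of soname versioning (file names are made up):
# the fully versioned file is the real library, and the symlinks let
# other packages link version-agnostically with -lsomething.
mkdir -p libdemo
touch libdemo/libsomething.so.4.3.0                      # actual versioned library
ln -sf libsomething.so.4.3.0 libdemo/libsomething.so.4   # runtime soname link
ln -sf libsomething.so.4 libdemo/libsomething.so         # link-time generic name
ls -l libdemo
```

Because the generic `libsomething.so` can only point at one version at a time, two astropy builds wanting different wcslib versions inside one environment would indeed clobber each other, which is the conflict being described.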
BTW, if this is a question of build time, what's the situation with parallel builds? |
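On the parallel-builds question: setuptools' `build_ext` command does accept a `-j`/`--parallel` option (inherited from distutils, available since Python 3.5), so extension compilation can be parallelized roughly as sketched below; the invocation is an illustration, not astropy's actual build command.

```shell
# -j tells build_ext to compile extension modules in parallel, e.g.:
#   python setup.py build_ext -j "$(nproc)"
# (sketch only; modern builds usually go through pip instead)
# nproc reports the number of cores such a build could use:
nproc
```

Whether this helps depends on how many independent extension modules there are to compile; a single large extension gains little.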
Looks like Erfa is coming out. Is this resolved? |
@pllim - yes, …
Given the modern state of Python packaging, I think there is a strong argument to be made for packaging the contents of cextern (at least erfa) as separate packages (at least on conda). Conda is obviously well suited to having pure-C dependencies as packages, but it's also possible to do this with wheels on PyPI as well.