DM-23308: Add CMake support #34
Please add a new GitHub Action that builds the package and runs the tests using cmake.
(repost from the utils PR, where I accidentally posted this first) If this CMake build includes building the pybind11 modules, I think I'd be in favor of publishing sphgeom on conda-forge via that, and using conda to get it in (e.g.) daf_butler CI, and then retiring the

Now I'm probably getting way ahead of myself, but having the main stack use a separately-semver'd sphgeom from conda-forge and dropping the scons/eupspkg build could then seem pretty attractive, in this particular case, at least.
The image cutout service (and any other IVOA services SQuaRE writes) use pip for installation, which means they want to express a dependency on daf-butler via pip. I'm not sure how that would work if daf-butler were only available via conda. My guess is "not well." I'm also quite dubious about switching them to conda, although I guess I haven't checked to see how much is missing from conda that I rely on.

That said, I strongly suspect that the IVOA frontends are going to care about a very small subset of Butler functionality and, in a client/server Butler world, only the client. So maybe the solution is to split off the small bit of Butler that they care about, which presumably wouldn't need a dependency on sphgeom, and upload only that bit to PyPI.
@@ -70,10 +70,10 @@ class NormalizedAngle {
        NormalizedAngle const & b);

    /// This constructor creates a NormalizedAngle with a value of zero.
    NormalizedAngle() {}
We did a clang-tidy run on afw. Eventually this could be run globally on all C++ code.
What compiler/version warns about default ctors? Not sure I recall seeing these with the scons build, but I would need to check.
@mwittgen I encountered the warning under gcc 8.4.1 in the centos:8 Qserv build containers. The actual warning was that the compiler would/could not implicitly generate operator= since a non-default ctor had been declared. Moving to the defaulted ctor (which these were, effectively, anyway, and which would be less error-prone in the long term) seemed preferable to adding explicit `= default` versions of operator= to squelch the warning.
I'm not averse to moving first-party packages into rubin-env, but conda-only/non-pip packaging seems hostile. Seems like having cmake build wheels and publish to PyPI is what this scikit-build thing is about?
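For context, a scikit-build-core setup is roughly the following pyproject.toml sketch; the package metadata here is purely illustrative, not a proposal for sphgeom's actual configuration:

```toml
[build-system]
requires = ["scikit-build-core", "pybind11"]
build-backend = "scikit_build_core.build"

[project]
name = "sphgeom"
version = "0.1.0"

[tool.scikit-build]
# scikit-build-core drives the package's existing CMakeLists.txt and bundles
# the resulting pybind11 extension modules into a wheel that pip can install.
cmake.version = ">=3.15"
```

This is how CMake-built wheels end up on PyPI without duplicating the build logic into setuptools.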
That seems like a step in the right direction, in that we could have a real build system (CMake) in sphgeom without duplicating (as much) stuff into setuptools configuration. But does that actually give us the possibility of distributing a standalone C++ library (and headers?) via
The sphgeom usage is pretty central; it provides the data structures for all of our spatial geometry objects and the predicates we use to compare them. If we can't distribute a sphgeom that C++ code can build against via pip, and you can't use conda, then I think I agree that we're stuck, and maybe the best we can do is the renaming that I alluded to earlier. But I'm a bit worried that even if the current
I'm sure there's a lot missing from my mental model here. Could you fill me in on why the client portion of a client/server Butler would need spatial geometry objects? Naively, I would assume that it's making REST calls to a Butler server, so all of the objects have to be serialized as floats anyway and the comparisons would be done on the server, and hence I'm not sure why the client benefits from dedicated objects implemented in C++.
I would have thought it was the Butler server that didn't need much C++ (potentially just sphgeom) while the client effectively needs the full Science Pipelines stack to be useful, because the things one fetches with

In the more limited context of the butler client being used inside an IVOA service implementation that's really just getting butler metadata, not actually fetching datasets, you may well be right that you don't need any of the non-sphgeom C++, and the sphgeom dependency is just a refactoring problem. Data structures you would use often have a

I was imagining image-cutout as the kind of service we were talking about - that does need a lot of C++, including sphgeom - but maybe it's atypical. Happy to discuss on a live call or slack if I'm still not actually answering the question you asked.
Ah, thanks, I see. The image cutout frontend (the part installed with pip) probably does not care about any of these concepts, since it receives parameters as floats in ASCII and passes those parameters to its backend as JSON objects. The only thing it would need is enough machinery to do basic input validation, which probably doesn't require anything more than astropy. Only the cutout backend that performs the cutout requires the C++ data structures, and it's separate software running on a stack container since it's using a pipeline to do the work. However, the frontend does need to look up datarefs via Butler and get their URLs.

So far as I know, the pip-compatible way of doing what the project wants to do with C++ libraries is to build and ship the C++ library independent of the Python library that binds to it, and require that any user of the Python library have already obtained and installed the C++ library via some other mechanism. Then only the glue required to make a Python dynamic module linked against the C++ library is included in the library uploaded to PyPI.

I don't believe PyPI has a mechanism to provide C++ libraries as such because it's only a Python module distribution system, not a C++ library distribution system; it can distribute a Python dynamic module with C++ code built into it, but not a library that can be independently linked against. (Typically one would package the C++ library as an RPM or deb or both, since those packaging systems are designed to work with shared libraries.)

I realize this isn't at all how the project is currently architected, and it would pose a bunch of additional challenges.
@timj GHA added to do the cmake build and run the C++ tests -- please have a look?
.github/workflows/cmake.yaml (outdated)

    - name: CMake test
      working-directory: ${{github.workspace}}/build
      run: ctest --parallel `nproc`
This test doesn't seem to do anything to test the Python code. You are going to have to install pytest as I do in the other action and then run the tests to make sure it really does work from Python.
@timj Added python tests to the cmake GHA; please have another look?
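The added workflow steps presumably look something like the following; the step names and layout here are illustrative, not copied from the actual cmake.yaml:

```yaml
    - name: Install test dependencies
      run: pip install pytest

    - name: Python tests
      working-directory: ${{github.workspace}}
      run: pytest tests
```

Running pytest against the built tree is what confirms the pybind11 modules produced by the CMake build actually import and work from Python.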
Thanks for adding the Python tests -- this makes it less likely someone will break cmake. If you also install pytest-xdist you can add `-n 2` to the pytest invocation to make it run in parallel and so be a little faster.
Thanks @timj -- at less than 3s for the Python tests compared to the C++ build/link times, I'm inclined not to worry about it at this point. :-) Let me know when the version tool lands, and I'll happily help out with retrofitting that into the cmake here and over in
Adds the ability to build the package with CMake in addition to eupspkg (allows it to be included as a submodule in other CMake projects, e.g. Qserv).