WIP: Feature-test macros + config header! #740
Conversation
We're getting bug reports (e.g. #732) for situations where people have a libpqxx compiled as C++17 but an application compiled as C++20, or _vice versa._ Generally it's probably not wise to link C++17-compiled code to code compiled as C++20. But two things I did exacerbate the problem:

1. I moved a bunch of C++ feature checks to compile time, using C++20's feature test macros. It looked like a much cleaner, easier way forward with reduced maintenance overhead.
2. Use of C++20's `std::source_location` _when present_ affects the ABI. So effectively there's one ABI with it, and one without. I see that mostly as the price of doing libraries in C++ — it's generally dangerous to package library binaries, unless they've been designed to be compatible, or they come as a range of binaries for various compilers, versions, and configurations.

And the real problem is that _these two changes interacted._ The detection of support for `std::source_location` happened at compile time. And so if you compile libpqxx as C++17 (without this feature) and the application as C++20 (with this feature), the two will not be link-compatible!

In this first commit I'm prototyping a new approach that I hope will combine the ease of maintenance of feature test macros with the ABI stability of a configuration header. Configuration speed should lie somewhere in between: no more compiling separate little checks for every individual C++ feature.

But it's not easy. There are two orthogonal binary choices, leaving me with four scenarios to support:

* Autoconf vs. CMake
* Cross-compiled vs. native

How does cross compilation factor into it? It works like this: I need to check C++ feature test macros, and I check them in C++. But I don't want to keep editing that C++ code for every feature — there's a repetitive preprocessor dance that I don't think I could simplify to one simple line of code, because I'd need to pass macro names as parameters to macros.
So, I write a minimal config file and run it through a Python script that generates the C++ code for me. Then I have the build configuration machinery compile _and run_ that C++ code, and generate the configuration header.

Yes, alright, but how does cross compilation factor into it? First, if you're cross-compiling, it's not a given that you can _run_ the binary you produce on the same system! The whole point is that the two systems are different. And second, you'll have a native compilation environment, but there's no guarantee that it will resemble the cross-compilation environment at all. So if you compile a binary to run locally, you may get very different results.

So for cross-compilation, the Python script just generates a minimal configuration header that simply has all features disabled. And in this first commit I've got that working for autoconf. But I'm still struggling with CMake (thanks @KayEss for helping with this).

If it gets too difficult, I may go a different route: generate C++ code from Python, but only run it through the preprocessor. (I believe autoconf has a standard, portable way of running the preprocessor; let's hope CMake has one as well.) The output will still be C++ code, but now it's been preprocessed, so hopefully it'll be possible to tell portably which features are present. And ironically, I think I'd then have to have another Python script to _postprocess_ the preprocessed code and turn it into a ready-to-use config header.
You may want to look at this pure C++ solution for using source location when possible, without breaking builds when it's not available: https://godbolt.org/z/x81v7MzaM

The basic trick here seems to be to use your own data structure for the source location. I'm not convinced this wouldn't technically be an ODR violation, but I guess it isn't something that causes a problem in practice. This would likely be a simpler way to fix #743.
Given the wealth of platforms out there, I'm loath to cheat the ODR. The easiest thing to do may be to enable use of `std::source_location`.

However, I would very much like to resolve the configuration header problem before I cut a 7.9 release! Here's a trick I used in my autoconf experiments for this branch: would it be possible somehow to dress up the preprocessor dance as a compiler check for CMake purposes?
Possibly. I don't know enough of the details here to answer definitively, though. Maybe @tt4g knows?
As far as I know, CMake does not support complex compilation controls across multiple build tools on multiple platforms. I have seen many cases where people linking libraries into another project this way overwrite macros or otherwise cause link errors. I am sure that this problem will not be solved until there is a new tool that comprehensively manages multiple projects and all the libraries they depend on, rather than one tool that manages a single project.
That is disappointing. Perhaps I'll go with a dumber alternative, where for each C++ feature that I want to test, I generate a check program along the lines of...

```cpp
int main()
{
#if !defined(__cpp_lib_ssize) || !__cpp_lib_ssize
#error "Feature not supported."
#endif
}
```

And then I just write the classic "does this code compile" check for each. It just feels so massively inefficient!
I'm giving up on this approach, and going for a more conservative one. Apparently CMake just does not support what I wanted to do. :-(
This should help with #732 and #739. But it's still experimental.