MPI build appears successful but actually excludes MPI #973
Comments
I believe linking of the executable cannot succeed if any calls to MPI routines are made and the code is not linked against any library that provides them. Looking at Bad system #1, you have gfortran + OpenMPI, which is a supported configuration, so please help us understand if there is a bug:
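To make that point concrete, here is a minimal sketch (not code from this project): a program this small cannot pass the link step unless something, either a real MPI library or a stub, defines the mpi_init/mpi_finalize symbols.

```fortran
! Minimal illustration only: a bare MPI program with no "use mpi".
! If nothing on the link line defines mpi_init, the linker should fail
! with an undefined reference rather than produce a working executable.
program mpi_link_check
  implicit none
  integer :: ierr
  call mpi_init(ierr)      ! must be resolved by an MPI library or a stub
  call mpi_finalize(ierr)
end program mpi_link_check
```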
Sure thing. Here's
And
Here's the verbose build output, in a file because of length:
Thank you!
I think I figured it out. There's a file that defines dummy versions of all the MPI subroutines that are used, I think to accommodate running without MPI after building with MPI. (I inherited this codebase, so I'm not entirely sure of the intent here.) Deleting that file causes the linking to succeed on Bad system 1, and Good system continues to succeed. However, Bad system 2 now shows undefined references to all the MPI subroutines at the final linking step:
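For context, a dummy-MPI file like the one described above usually just provides no-op external subroutines with the same names as the MPI calls, so the linker resolves them without any MPI library being present. A hedged sketch of that pattern (assuming the code calls MPI through the old external/mpif.h-style interface; this is not the project's actual file):

```fortran
! Sketch of a typical "dummy MPI" stub file (illustrative, not the real one).
! Each routine pretends to succeed while doing nothing, so a build that
! links these instead of the real OpenMPI symbols will run as a serial job
! and mpi_init will silently be a no-op.
subroutine mpi_init(ierr)
  integer, intent(out) :: ierr
  ierr = 0
end subroutine mpi_init

subroutine mpi_comm_rank(comm, rank, ierr)
  integer, intent(in)  :: comm
  integer, intent(out) :: rank, ierr
  rank = 0                      ! always "rank 0", as in a serial run
  ierr = 0
end subroutine mpi_comm_rank

subroutine mpi_comm_size(comm, nprocs, ierr)
  integer, intent(in)  :: comm
  integer, intent(out) :: nprocs, ierr
  nprocs = 1                    ! always a single process
  ierr = 0
end subroutine mpi_comm_size

subroutine mpi_finalize(ierr)
  integer, intent(out) :: ierr
  ierr = 0
end subroutine mpi_finalize
```

If fpm picks up and links a file like this while the real MPI libraries are left off the link line, the build succeeds but behaves exactly as reported: mpi_init returns without initializing anything.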
This now falls under #974, so I will close this issue.
Description
I am building an MPI Fortran program using a pretty basic configuration. I have tried this build on three systems, and on two of them this problem occurs (system info and meta.yaml below). By all indications, the build succeeds. However, when you try to run the executable with MPI, mpi_init just does nothing and the MPI environment is never initialized. Looking at the library dependencies of the executable produced by fpm shows that the MPI libraries are missing:
Building the same code in the same environment with make produces an executable with the MPI libraries linked:
Expected Behaviour
The build should not appear successful in this scenario, where the MPI libraries are not linked (or whatever the true underlying problem is). Ideally, the MPI libraries should always link correctly, but if something is preventing that, it should be reported.
Version of fpm
0.9.0
Platform and Architecture
Ubuntu 22.04
Additional Information
Good system:
Bad system 1:
Bad system 2:
Bad system 2 is really old, so I was ready to blame that until the same thing happened on a separate, up-to-date system.
Here is the fpm.toml used in all cases:
fpm.toml
Just in case, I did also test without the preprocessing, which had no effect.
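As an additional data point for anyone reproducing this, a small runtime check (a sketch, assuming the mpi module shipped with the linked MPI implementation is available) makes the difference obvious: with a correctly linked build, running under mpirun -n 4 should report initialized=T and size=4, while a build that silently skips MPI initialization will not.

```fortran
! Hypothetical diagnostic program, not part of the original report.
! Prints whether MPI actually initialized and how many ranks are visible.
program check_mpi_runtime
  use mpi                      ! assumes the real MPI module is available
  implicit none
  integer :: ierr, nprocs, rank
  logical :: flag

  call MPI_Init(ierr)
  call MPI_Initialized(flag, ierr)
  call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  print '(a, l1, a, i0, a, i0)', 'initialized=', flag, ' size=', nprocs, ' rank=', rank
  call MPI_Finalize(ierr)
end program check_mpi_runtime
```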