Broken kwant builds because of MPI build #21
Comments
I think that this is the most sensible course of action. Debian has several different packages for MUMPS, each with a different build configuration (sequential/parallel, with/without Scotch), so this need seems to be common.
Despite maintaining the ipopt feedstock, I don't have a great idea of its particular MPI build requirements. Is this related to #6?
That sounds like splitting serial and parallel MUMPS into separate packages is the right thing to do. I haven't used parallel MUMPS yet (PETSc is where I want to use it next).
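A split like the one proposed above is typically expressed with multiple outputs in a single conda-build recipe. The sketch below is hypothetical: the package names, version, and build scripts are assumptions, not the actual feedstock recipe.

```yaml
# Hypothetical meta.yaml sketch: one recipe, two MUMPS packages.
package:
  name: mumps-split      # placeholder top-level name
  version: "5.1.2"       # assumed version

outputs:
  - name: mumps-seq      # sequential build (provides the mpiseq stub)
    script: build-seq.sh # assumed build script
  - name: mumps-mpi      # parallel build against real MPI + ScaLAPACK
    script: build-mpi.sh # assumed build script
    requirements:
      host:
        - openmpi
        - scalapack
```

Consumers such as the kwant feedstock could then depend on `mumps-seq` only, avoiding the MPI link requirements entirely.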
I'd like to take another stab at getting parallel MUMPS. Scotch is currently building.
That's probably a good idea, mostly for backward compatibility.
mumps-mpi will be created by merging #24
This should be fixed by #24 |
kwant currently has no MUMPS support (even though MUMPS should do nearly ALL the heavy lifting there); I thought I had fixed it in the latest build. We used to link against mpiseq, but that library isn't being built anymore, and linking against the MPI versions results in errors.

Before, we linked against:
zmumps mumps_common pord metis esmumps scotch scotcherr gfortran

Now I did, in addition to all the former libs:
ptesmumps ptscotch ptscotcherr mpi mpifort scalapack openblas pthread

When linking against the MPI libs, the kwant tests error.
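For reference, the two link configurations described above can be written out as flag lists. This is only a sketch: the library names come from the thread, but the `-l` flag form and the grouping into shell variables are assumptions.

```shell
# Sequential MUMPS link line used previously (library names from the thread).
SEQ_LIBS="-lzmumps -lmumps_common -lpord -lmetis -lesmumps -lscotch -lscotcherr -lgfortran"

# Current attempt: all the former libs plus the MPI/ScaLAPACK stack.
MPI_LIBS="$SEQ_LIBS -lptesmumps -lptscotch -lptscotcherr -lmpi -lmpifort -lscalapack -lopenblas -lpthread"

echo "sequential: $SEQ_LIBS"
echo "parallel:   $MPI_LIBS"
```

The parallel line pulls in MPI, ScaLAPACK, and the pt-prefixed (parallel) Scotch libraries, which is exactly what breaks consumers like kwant that expect a sequential MUMPS.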
@jbweston and @michaelwimmer do you think we should build the sequential version of MUMPS again?
Related: conda-forge/kwant-feedstock#34
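A quick way to check whether a given kwant build actually has working MUMPS support is to try importing kwant's MUMPS wrapper module; the import fails when kwant was built without MUMPS (or when kwant itself is not installed). A minimal sketch:

```python
# Sketch: detect at runtime whether this kwant installation has MUMPS support.
# kwant exposes its MUMPS wrapper as kwant.linalg.mumps; the import raises
# ImportError if the build lacks MUMPS (or kwant is absent entirely).
def has_mumps():
    try:
        from kwant.linalg import mumps  # noqa: F401
        return True
    except ImportError:
        return False

print("MUMPS support available:", has_mumps())
```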