
Build fails with MFEM_USE_MPI=on #622

Closed
yurivict opened this issue Oct 10, 2018 · 7 comments
@yurivict

In file included from /usr/ports/science/mfem/work/mfem-3.4/linalg/handle.cpp:12:
In file included from /usr/ports/science/mfem/work/mfem-3.4/linalg/handle.hpp:18:
In file included from /usr/ports/science/mfem/work/mfem-3.4/linalg/hypre.hpp:25:
In file included from /usr/local/include/seq_mv.h:21:
In file included from /usr/local/include/HYPRE_seq_mv.h:25:
/usr/local/include/HYPRE_utilities.h:100:19: error: typedef redefinition with different types ('HYPRE_Int' (aka 'int') vs 'struct ompi_communicator_t *')
typedef HYPRE_Int MPI_Comm;
                  ^
/usr/local/mpi/openmpi/include/mpi.h:326:37: note: previous definition is here
typedef struct ompi_communicator_t *MPI_Comm;
                                    ^
In file included from /usr/ports/science/mfem/work/mfem-3.4/linalg/handle.cpp:12:
In file included from /usr/ports/science/mfem/work/mfem-3.4/linalg/handle.hpp:18:
In file included from /usr/ports/science/mfem/work/mfem-3.4/linalg/hypre.hpp:25:
In file included from /usr/local/include/seq_mv.h:23:
/usr/local/include/_hypre_utilities.h:120:9: warning: 'MPI_COMM_WORLD' macro redefined [-Wmacro-redefined]
#define MPI_COMM_WORLD       hypre_MPI_COMM_WORLD
        ^
/usr/local/mpi/openmpi/include/mpi.h:1021:9: note: previous definition is here
#define MPI_COMM_WORLD OMPI_PREDEFINED_GLOBAL( MPI_Comm, ompi_mpi_comm_world)
        ^

OS: FreeBSD 11.2

@tzanio tzanio self-assigned this Oct 10, 2018
@tzanio
Member

tzanio commented Oct 10, 2018

Hi @yurivict,

It looks like your hypre was built with HYPRE_SEQUENTIAL. This is probably not what you want:

  • If you need a serial version of MFEM, just do make serial, or do not specify MFEM_USE_MPI=on
  • If you need a parallel version of MFEM, then hypre has to be built in parallel mode, which is its default

Hope this helps,
Tzanio
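
The two build paths above can be sketched roughly as follows (a sketch, not from the thread; directory names and job counts are illustrative, and it assumes hypre sits in a sibling directory where MFEM's makefile looks for it by default):

```shell
# Serial MFEM: no MPI, no parallel hypre needed.
cd mfem-3.4
make serial -j 4

# Parallel MFEM: first build hypre in its default (parallel/MPI) mode,
# i.e. do NOT configure it with HYPRE_SEQUENTIAL.
cd ../hypre/src
./configure      # the parallel (MPI) build is the default
make -j 4

# Then build MFEM against it.
cd ../../mfem-3.4
make parallel -j 4
```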

@yurivict
Author

Yes, hypre was sequential.

However, even with parallel hypre it still fails:

In file included from /usr/local/include/HYPRE_utilities.h:25:
/usr/local/include/mpi.h:98:13: error: typedef redefinition with different types ('int' vs 'struct ompi_datatype_t *')
typedef int MPI_Datatype;
            ^
/usr/local/mpi/openmpi/include/mpi.h:327:33: note: previous definition is here
typedef struct ompi_datatype_t *MPI_Datatype;
                                ^
In file included from /usr/ports/math/mfem/work/mfem-3.4/linalg/handle.cpp:12:
In file included from /usr/ports/math/mfem/work/mfem-3.4/linalg/handle.hpp:18:
In file included from /usr/ports/math/mfem/work/mfem-3.4/linalg/hypre.hpp:25:
In file included from /usr/local/include/seq_mv.h:21:
In file included from /usr/local/include/HYPRE_seq_mv.h:25:
In file included from /usr/local/include/HYPRE_utilities.h:25:
/usr/local/include/mpi.h:99:9: warning: 'MPI_CHAR' macro redefined [-Wmacro-redefined]
#define MPI_CHAR           ((MPI_Datatype)0x4c000101)
        ^
/usr/local/mpi/openmpi/include/mpi.h:1047:9: note: previous definition is here
#define MPI_CHAR OMPI_PREDEFINED_GLOBAL(MPI_Datatype, ompi_mpi_char)
        ^

@tzanio
Member

tzanio commented Oct 10, 2018

Can you check if the same MPI implementation is used to build hypre and mfem?

In particular, is /usr/local/include/mpi.h from OpenMPI?
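
One way to check this (the paths are taken from the error output above; the grep works because MPICH's mpi.h defines MPICH_VERSION while Open MPI's defines OMPI_MAJOR_VERSION):

```shell
# Ask the MPI compiler wrapper which headers it actually uses:
mpicc -show      # MPICH prints the underlying compile command
mpicc --showme   # Open MPI's equivalent flag

# Identify the implementation behind each mpi.h seen in the errors:
grep -E 'MPICH_VERSION|OMPI_MAJOR_VERSION' /usr/local/include/mpi.h
grep -E 'MPICH_VERSION|OMPI_MAJOR_VERSION' /usr/local/mpi/openmpi/include/mpi.h
```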

@tzanio
Member

tzanio commented Oct 12, 2018

Were you able to resolve this, @yurivict?

@yurivict
Author

yurivict commented Oct 13, 2018

hypre is built with mpich, and /usr/local/include/mpi.h is from mpich.
hypre allows building with either mpich or openmpi.
Does it have to use openmpi in order for mfem to work?

@v-dobrev
Member

Hi @yurivict,

Looking at the output you pasted above, I see two different versions of mpi.h. Maybe MFEM was configured with OpenMPI (/usr/local/mpi/openmpi/include/mpi.h) while hypre was configured with MPICH (/usr/local/include/mpi.h)?

@tzanio
Member

tzanio commented Oct 14, 2018

Does it have to use openmpi in order for mfem to work?

Yes, MFEM and hypre need to use the same MPI implementation. It could be either mpich or openmpi, but you can't mix the two.
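
A minimal sketch of pinning both builds to one implementation, assuming Open MPI under the /usr/local/mpi/openmpi prefix seen in the error output (directory layout is illustrative):

```shell
# Put one MPI's wrappers first on PATH so both builds see the same mpi.h.
export PATH=/usr/local/mpi/openmpi/bin:$PATH

# Build hypre with those wrappers.
cd hypre/src
CC=mpicc CXX=mpicxx ./configure
make -j 4

# Configure MFEM to use the same wrapper for its MPI build.
cd ../../mfem-3.4
make config MFEM_USE_MPI=YES MPICXX=mpicxx
make -j 4
```

Rebuilding both from clean trees after switching MPI avoids stale objects compiled against the other implementation's headers.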

@tzanio tzanio closed this as completed Oct 22, 2018