
Call MPI_Init from main rather than TPS::Tps #210

Closed

uvilla opened this issue Jun 30, 2023 · 1 comment

Comments
@uvilla
Member

uvilla commented Jun 30, 2023

It turns out that calling MPI_Init from main rather than from the TPS::Tps object is not as simple as it might seem.

A key issue is that mfem::MPI_Session is a singleton class hard-coded to call MPI_Init.

That is, if we want parla to handle the initialization of MPI, then we must remove mfem::MPI_Session from everywhere in the tps code (note: mfem has already deprecated this class).
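
For concreteness, a minimal sketch of the proposed ownership split (not actual tps code; the Tps(MPI_Comm) constructor here is hypothetical): main initializes and finalizes MPI, and the solver object only receives an already-initialized communicator, so no mfem::MPI_Session is ever constructed internally.

```cpp
// Hypothetical sketch: MPI lifetime is owned by the caller (main, or a
// Python driver via parla), not by the solver object. "Tps" below is a
// stand-in for TPS::Tps, assuming a constructor that takes a communicator.
#include <mpi.h>
#include <cstdio>

struct Tps {
   MPI_Comm comm;
   explicit Tps(MPI_Comm c) : comm(c) {}   // no MPI_Init inside
};

int main(int argc, char *argv[])
{
   MPI_Init(&argc, &argv);          // done by the caller, not by Tps

   {
      Tps tps(MPI_COMM_WORLD);      // solver just uses the given comm
      int rank;
      MPI_Comm_rank(tps.comm, &rank);
      if (rank == 0) { std::printf("solver attached to caller's comm\n"); }
   }                                 // solver destroyed before finalize

   MPI_Finalize();                   // also owned by the caller
   return 0;
}
```

(Newer MFEM versions deprecate MPI_Session in favor of the static mfem::Mpi class, whose Init can likewise be called from main.)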

Before we do that, I'd like to better understand the use of tps::MPI_Groups: are we actually solving different physics on different sets of processors, or are we simply using one single communicator for everything? A sketch of the distinction follows.
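
To illustrate the distinction being asked about (purely hypothetical, not tps code): "different physics on different sets of processors" would mean carving MPI_COMM_WORLD into per-physics sub-communicators, e.g. via MPI_Comm_split, whereas the alternative is every module working on the same world communicator. The flow/plasma split below is invented for illustration.

```cpp
// Hypothetical per-physics communicator split, the pattern tps::MPI_Groups
// would enable if different physics really run on disjoint rank sets.
#include <mpi.h>
#include <cstdio>

int main(int argc, char *argv[])
{
   MPI_Init(&argc, &argv);

   int rank, size;
   MPI_Comm_rank(MPI_COMM_WORLD, &rank);
   MPI_Comm_size(MPI_COMM_WORLD, &size);

   // Invented partition: first half of ranks runs "flow" (color 0),
   // second half runs "plasma" (color 1).
   int color = (rank < size / 2) ? 0 : 1;
   MPI_Comm physics_comm;
   MPI_Comm_split(MPI_COMM_WORLD, color, rank, &physics_comm);

   int prank;
   MPI_Comm_rank(physics_comm, &prank);
   std::printf("world rank %d -> physics %d, local rank %d\n",
               rank, color, prank);

   MPI_Comm_free(&physics_comm);
   MPI_Finalize();
   return 0;
}
```

If instead a single communicator suffices for everything, mfem::MPI_Session can be dropped without any of this machinery.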

@uvilla changed the title from "Call MPI_Init from main rather then TPS::Tps" to "Call MPI_Init from main rather than TPS::Tps" on Jun 30, 2023
@uvilla
Member Author

uvilla commented Jul 17, 2023

Solved by #211

@uvilla closed this as completed on Jul 17, 2023