Refactor MPI handling in TPS #211
Merged
Interfaces with Python/PARLA require that `MPI_Init` is called from the `main` executable and not by the `TPS::Tps` class. To this end, this PR introduces the following changes:

- Remove the `mfem::MPI_Session` object from the base code. Note that `mfem::MPI_Session` is deprecated in `mfem` and does not allow for MPI initialization outside this object.
- Bump the required MFEM version to 4.4.
- Introduce an `MPI_Comm TPSCommWorld` to be used inside the `tps` code as the world communicator. New tests were added that run a simulation on a subset of processes.
- Remove `mfem::MPI_Session` from the constructors of all `TPS::Solver` classes. A `TPS::Solver` can grab the `MPI_Comm TPSCommWorld` directly from the `TPS::Tps` object that is passed at construction time.
- All `tps` executables now call `mfem::Mpi::Init` from the `main` function.
- `tps.py` now uses `mpi4py` to initialize MPI. An MPI communicator can now be provided from the Python interface.

TODOs:

- `pip install mpi4py` in the `tps` Docker container
- `mpi4py` communicators
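The communicator-ownership change above can be sketched in plain Python. All class and attribute names below are hypothetical stand-ins for the C++ `TPS::Tps`/`TPS::Solver` pattern, and no real MPI is used; the point is only that MPI setup happens in `main` and the communicator flows down by injection:

```python
class Tps:
    """Holds the world communicator but never initializes MPI itself;
    MPI setup happens in main (or in tps.py via mpi4py)."""
    def __init__(self, comm):
        # comm would be MPI_COMM_WORLD, or a user-provided sub-communicator
        self.TPSCommWorld = comm


class Solver:
    """Solvers no longer receive an mfem::MPI_Session; they grab the
    communicator from the Tps object passed at construction time."""
    def __init__(self, tps):
        self.comm = tps.TPSCommWorld


# main-style usage with a stand-in communicator object
# (in the real code this would be mpi4py's MPI.COMM_WORLD):
world = object()
tps = Tps(world)
solver = Solver(tps)
assert solver.comm is world  # the solver sees exactly the communicator main created
```

Because the solver only reads the communicator off the `Tps` object, a caller (e.g. a Python driver) can substitute any communicator it likes, which is what makes the subset-of-processes tests possible.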
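The subset-of-processes tests rest on communicator splitting. Here is a pure-Python illustration of the `MPI_Comm_split` semantics involved (the function name and grouping are illustrative, not part of the TPS code; no real MPI is used):

```python
def comm_split(ranks, color_of):
    """Mimic MPI_Comm_split: ranks sharing a color form one new group,
    kept in original rank order (i.e. key == rank)."""
    groups = {}
    for r in ranks:
        groups.setdefault(color_of(r), []).append(r)
    return groups


# Four ranks; only the first two (color 0) would run the simulation,
# the rest (color 1) sit out:
groups = comm_split(range(4), lambda r: 0 if r < 2 else 1)
assert groups[0] == [0, 1]  # sub-communicator that runs the simulation
assert groups[1] == [2, 3]  # ranks excluded from the run
```

With a real `TPSCommWorld`, the same idea lets a test hand the solver a split communicator instead of the full world.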