Add MPI support for conforming P4estMeshes in 2D #977
Conversation
If the p4est library does not support MPI, it is not allowed to pass an MPI communicator when creating a new p4est object. If it does support MPI, an MPI communicator has to be passed even if only one process is used.
This enables adding methods specialized on the parallel type for MPI runs.
Adds a new container for the interfaces at the MPI domain boundaries, methods for initializing such a container, and specialized versions of some initialization methods for the existing containers for parallel runs (e.g., to distinguish between regular and MPI interfaces).
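The "methods specialized on the parallel type" idea can be sketched roughly as follows. All type and function names here are illustrative stand-ins, not the actual Trixi.jl implementation:

```julia
# Sketch only: a static "parallelism" type parameter on the mesh lets Julia's
# dispatch pick serial or MPI-aware initialization at compile time.
abstract type AbstractParallelism end
struct Serial <: AbstractParallelism end
struct Parallel <: AbstractParallelism end

struct ToyMesh{P <: AbstractParallelism}
    parallelism::P
end

# Serial runs only initialize the regular interface container ...
init_interfaces!(mesh::ToyMesh{Serial}) = ["regular interfaces"]

# ... while MPI runs additionally initialize the MPI-interface container
# for interfaces at the MPI domain boundaries.
init_interfaces!(mesh::ToyMesh{Parallel}) =
    ["regular interfaces", "MPI interfaces at domain boundaries"]
```

With this pattern, downstream code can call `init_interfaces!(mesh)` unconditionally, and the right variant is selected by the mesh type rather than by runtime `if` checks.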
Codecov Report
```diff
@@            Coverage Diff             @@
##             main     #977      +/-   ##
==========================================
- Coverage   93.68%   92.18%   -1.50%
==========================================
  Files         294      295       +1
  Lines       22606    22995     +389
==========================================
+ Hits        21177    21196      +19
- Misses       1429     1799     +370
```
Great work! I have left a few comments and questions, but nothing struck me as problematic!
This avoids a few unnecessary `where {IsParallel}` and makes the code more readable overall.
- New P4est.jl version supports MPI, hence enable `ParallelP4estMesh` tests
- `p4est_has_mpi()` function no longer required as it's not built into P4est.jl
LGTM! Are there any other changes required or can we remove the WIP and make this ready to be merged?
From my side this is ready once all checks have passed.
Nice work, @lchristm 👍 Out of curiosity: did you run some benchmarks of MPI vs. multithreading?
I haven't looked into everything in detail, but what I've seen looks very good. Well done!
I'm also curious about performance, although it's only 2D so far.
@ranocha, @efaulhaber I didn't do any extensive benchmarking, but I did save some numbers when I checked the correctness of my implementation. The following numbers were obtained by running a modified conforming version of the elixir.
Multithreading with 6 threads:
MPI with 6 processes:
Thanks! Did you use multi-threaded or serial RK methods?
I forgot: does this only time the MPI operation, or also the unpacking of buffers? Especially the
Serial. I think that's where the difference in runtime is coming from.
It includes unpacking the buffers.
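For context, "timing the MPI operation plus unpacking" could look roughly like the sketch below, using MPI.jl. The buffer sizes and names are assumptions for illustration, not the actual code being timed in this PR:

```julia
using MPI

MPI.Init()
comm = MPI.COMM_WORLD
rank = MPI.Comm_rank(comm)

send_buffer = rand(100)
recv_buffer = similar(send_buffer)
u_boundary  = similar(send_buffer)

# A self-send keeps the sketch runnable even with a single rank
# (e.g. `mpiexec -n 1 julia sketch.jl`).
sreq = MPI.Isend(send_buffer, comm; dest = rank, tag = 0)

elapsed = @elapsed begin
    rreq = MPI.Irecv!(recv_buffer, comm; source = rank, tag = 0)
    MPI.Waitall([sreq, rreq])
    u_boundary .= recv_buffer   # unpacking the buffer is inside the timed span
end
```

If the copy out of `recv_buffer` sits inside the timed region, as above, the reported number mixes communication latency with memory bandwidth, which is worth keeping in mind when comparing against the multithreaded runs.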
This PR adds containers and solver functions for running simulations on a conforming 2D `P4estMesh` in parallel. The `P4estMesh` has been slightly adapted so that parallel and serial `P4estMesh`es can be distinguished, which allows elixirs using the `P4estMesh` to "just work" when executed in parallel, similar to the `TreeMesh` implementation.
There is a new container for interfaces shared by two MPI domains (MPI interfaces). Furthermore, there are new functions for initializing the parallel and serial containers for parallel runs (see `containers_parallel.jl` and `containers_parallel_2d.jl`). Similar to the `TreeMesh` implementation, there is also a new cache for holding data structures related to MPI communication (see `dg_parallel.jl`).
Todo:
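One common way such an MPI cache is used, sketched schematically below, is to post communication early and overlap it with local work. This outline is an assumption about the general pattern, not the actual `dg_parallel.jl` code; all function names are illustrative stubs:

```julia
# Schematic outline: overlap MPI communication with local work during a
# right-hand-side evaluation. Stubs stand in for the real operations.
start_mpi_receive!(cache)                   = nothing  # post nonblocking receives
pack_and_send!(cache, u)                    = nothing  # pack boundary data, post sends
compute_volume_terms!(du, u)                = nothing  # purely local work
finish_mpi_exchange!(cache)                 = nothing  # wait, then unpack buffers
compute_mpi_interface_flux!(du, u, cache)   = nothing  # fluxes at MPI interfaces

function rhs_sketch!(du, u, cache)
    start_mpi_receive!(cache)            # 1. receives are posted first
    pack_and_send!(cache, u)             # 2. boundary data leaves early
    compute_volume_terms!(du, u)         # 3. local work hides the latency
    finish_mpi_exchange!(cache)          # 4. block only when data is needed
    compute_mpi_interface_flux!(du, u, cache)
    return du
end
```

Keeping all communication-related buffers and requests in one cache object makes it easy to pass this state through the solver without widening every function signature.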