
Reuse MPI contexts across different mpi calls #552

Merged 2 commits into master on Dec 23, 2021

Conversation

Collaborator

@csegarragonz commented Dec 17, 2021

At every MPI call we were creating a new ContextWrapper instance. Creating the wrapper itself, although unnecessary, was not hurting performance much. However, inside the constructor we were calling MpiWorldRegistry::getOrInitialiseWorld(int worldId), which acquires a lock to access the MPI world.

That registry method is meant to be called once, at world creation time, from each rank, not on every call. There is a lock-free method, MpiWorldRegistry::getWorld(int worldId), that can be used once we know the world has been created (i.e. after we have called MPI_Init). Given that all MPI calls, in every thread, must happen after MPI_Init, it is safe to assume the world already exists.

In this PR I change how the ContextWrapper queries for a world and, while at it, reuse context wrappers per MPI rank. Preliminary results in high-contention environments show a 50% reduction in execution time.
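The change can be sketched roughly as below. This is a minimal illustration, not the actual Faasm code: the class names (MpiWorld, MpiWorldRegistry, ContextWrapper) follow the PR description, but the bodies, members, and the getContext helper are assumptions made up for the example.

```cpp
#include <cassert>
#include <memory>
#include <mutex>
#include <unordered_map>

// Stand-in for the per-world MPI state (illustrative only).
class MpiWorld {
  public:
    explicit MpiWorld(int idIn) : id(idIn) {}
    int id;
};

class MpiWorldRegistry {
  public:
    // Meant to be called once per rank at world-creation time (MPI_Init);
    // takes a lock so concurrent first-time initialisation is safe.
    MpiWorld& getOrInitialiseWorld(int worldId) {
        std::unique_lock<std::mutex> lock(mx);
        auto it = worlds.find(worldId);
        if (it == worlds.end()) {
            it = worlds.emplace(worldId, std::make_unique<MpiWorld>(worldId))
                   .first;
        }
        return *it->second;
    }

    // Lock-free lookup: safe only once the world is known to exist, which
    // holds for all MPI calls made after MPI_Init has returned.
    MpiWorld& getWorld(int worldId) { return *worlds.at(worldId); }

  private:
    std::mutex mx;
    std::unordered_map<int, std::unique_ptr<MpiWorld>> worlds;
};

MpiWorldRegistry registry;

struct ContextWrapper {
    // After the change: query the registry without taking the lock.
    ContextWrapper(int worldId, int rankIn)
      : world(registry.getWorld(worldId))
      , rank(rankIn)
    {}
    MpiWorld& world;
    int rank;
};

// One cached wrapper per executing rank (thread), built lazily on first
// use instead of on every MPI call.
ContextWrapper& getContext(int worldId, int rank) {
    static thread_local std::unique_ptr<ContextWrapper> ctx;
    if (!ctx) {
        ctx = std::make_unique<ContextWrapper>(worldId, rank);
    }
    return *ctx;
}
```

Repeated calls to getContext from the same thread return the same wrapper, so the registry lookup (and, before this PR, the lock acquisition) happens once per rank rather than once per MPI call.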

@csegarragonz marked this pull request as ready for review December 17, 2021 13:19
@csegarragonz self-assigned this Dec 17, 2021
@csegarragonz marked this pull request as draft December 17, 2021 13:44
@csegarragonz marked this pull request as ready for review December 18, 2021 09:22
@csegarragonz merged commit bd3c712 into master Dec 23, 2021
@csegarragonz deleted the reuse-mpi-ctx branch December 23, 2021 14:42