
MPI wrapper class #1808

Merged: 7 commits merged into next on Dec 2, 2019

Conversation

cmacmackin
Collaborator

I created a separate pull request for the refactoring of the MPI wrapper requested in PR #1803. I originally added some MPI wrapper methods to BoutMesh; these could be overridden in subclasses used for unit testing, allowing parallel communication to be simulated on a single processor. @ZedThree suggested that they didn't really belong there and would be better off in their own class. As this requires a fairly significant refactor of large portions of the code, I decided to make it a separate PR. However, this PR still contains the PETSc interface submitted in PR #1805, as that is what the MPI wrapper was created for and it demonstrates its use for testing. As such, it would be best to merge this into next only after PR #1805.

I have created a new class called MpiWrapper and defined a global instance of it, bout::globals::mpi. Its member methods simply forward calls on to the corresponding MPI routines and have exactly the same interface. All MPI calls throughout the code now use the methods on the global object (or, in the case of meshes, a pointer to the global object, which I defined for convenience and brevity). The global MpiWrapper object is created and destroyed in BoutInitialise() and BoutFinalise(), respectively.
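For illustration, a minimal sketch of this pattern is shown below. The method names are assumed to mirror the MPI routines they forward to, and the global is shown as a bare pointer; the exact interface and lifetime management in the PR may differ.

```cpp
#include <mpi.h>

// Sketch of an MPI wrapper whose virtual members forward to the real MPI
// routines with identical signatures, so call sites only need to be
// prefixed with the global object.
class MpiWrapper {
public:
  virtual ~MpiWrapper() = default;

  virtual int MPI_Comm_size(MPI_Comm comm, int* size) {
    return ::MPI_Comm_size(comm, size);
  }
  virtual int MPI_Comm_rank(MPI_Comm comm, int* rank) {
    return ::MPI_Comm_rank(comm, rank);
  }
  virtual int MPI_Allreduce(const void* sendbuf, void* recvbuf, int count,
                            MPI_Datatype datatype, MPI_Op op, MPI_Comm comm) {
    return ::MPI_Allreduce(sendbuf, recvbuf, count, datatype, op, comm);
  }
};

namespace bout {
namespace globals {
// Global instance, created in BoutInitialise() and destroyed in
// BoutFinalise(); shown here as a plain pointer for brevity.
extern MpiWrapper* mpi;
} // namespace globals
} // namespace bout

// Example call site:
//   int nproc;
//   bout::globals::mpi->MPI_Comm_size(MPI_COMM_WORLD, &nproc);
```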

This involved defining a class MpiWrapper and creating a global instance of it. I changed all MPI calls to use the methods on this class rather than the direct MPI calls. Files which include the PETSc headers proved slightly difficult, as PETSc defines macros that overwrite MPI calls in a way which is incompatible with my wrapper. These macros appear to exist only to provide diagnostic information and are not strictly necessary, so my solution was to undef them where needed. In addition to this refactoring, I changed the FakeParallelMesh class, which is used when testing the PETSc indexers, to use a FakeMpiWrapper class rather than override the MPI methods itself.
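As an illustration of the single-process testing approach described above, a fake wrapper might override the forwarding methods along the following lines. This builds on the MpiWrapper sketch earlier in the thread; the FakeMpiWrapper in the PR itself may look quite different.

```cpp
#include <cstring>
#include <mpi.h>

// Sketch of a fake wrapper that simulates communication on one process,
// reusing the (assumed) virtual MpiWrapper interface sketched earlier.
class FakeMpiWrapper : public MpiWrapper {
public:
  int MPI_Comm_size(MPI_Comm /*comm*/, int* size) override {
    *size = 1;  // Pretend the communicator holds a single rank
    return MPI_SUCCESS;
  }

  int MPI_Allreduce(const void* sendbuf, void* recvbuf, int count,
                    MPI_Datatype /*datatype*/, MPI_Op /*op*/,
                    MPI_Comm /*comm*/) override {
    // With one rank, a reduction is just a copy of the local buffer.
    // For simplicity this sketch assumes contiguous doubles.
    if (sendbuf != MPI_IN_PLACE) {
      std::memcpy(recvbuf, sendbuf,
                  static_cast<std::size_t>(count) * sizeof(double));
    }
    return MPI_SUCCESS;
  }
};
```

For the PETSc clash, the fix described above is simply to #undef the offending logging macros before using the wrapper; which macros need this depends on the PETSc configuration.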
This was referenced Oct 18, 2019
@cmacmackin
Collaborator Author

@ZedThree These changes are ready for review. The CI fails on the CMake build, which now runs two tests that were not previously run routinely: test-petsc-laplace and test-petsc-laplace-MAST-grid. It appears that these tests always fail, including on the next branch itself, as reported in Issue #1850.

@ZedThree
Member

ZedThree commented Dec 2, 2019

Thanks @cmacmackin! This looks like the simplest way to get this done, and it now gives us the opportunity to mock out MPI calls in the tests, which I am unreasonably excited about! 🎉

I'm going to wait till #1859 goes in before merging this.
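(As a rough sketch of the kind of mocking this enables: assuming GoogleMock is available in the unit-test suite and the virtual MpiWrapper interface sketched earlier, a mock could look something like the following. All names here are illustrative rather than taken from the PR.)

```cpp
#include <gmock/gmock.h>

// Illustrative mock of the (assumed) MpiWrapper interface, allowing tests
// to set expectations on MPI traffic without a real communicator.
class MockMpiWrapper : public MpiWrapper {
public:
  MOCK_METHOD(int, MPI_Comm_size, (MPI_Comm comm, int* size), (override));
  MOCK_METHOD(int, MPI_Allreduce,
              (const void* sendbuf, void* recvbuf, int count,
               MPI_Datatype datatype, MPI_Op op, MPI_Comm comm),
              (override));
};

// In a test, expectations can then be placed on the calls, e.g.
//   MockMpiWrapper mock;
//   EXPECT_CALL(mock, MPI_Comm_size(::testing::_, ::testing::_))
//       .WillOnce(::testing::DoAll(::testing::SetArgPointee<1>(4),
//                                  ::testing::Return(MPI_SUCCESS)));
```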

@cmacmackin
Collaborator Author

@ZedThree Unfortunately, I've merged this branch into #1859. I will be the first to admit that was not the best idea (I did it without thinking), but at this point I think it would be quite difficult to reverse. So it's probably best if this does get merged in first.

@d7919
Member

d7919 commented Dec 2, 2019

@cmacmackin At least on GitHub it doesn't look like #1859 includes any of this, so perhaps you didn't push?

ZedThree merged commit 7c1f42c into next on Dec 2, 2019
ZedThree deleted the MPI_wrapper branch on December 2, 2019 at 15:12
@cmacmackin
Collaborator Author

@d7919 That's right, the changes are still local to my machine. Should be pushing soon, once I iron out a few more things with integration tests.
