MPI wrapper class #1808
Conversation
This involved defining a class MpiWrapper and creating a global instance of it. I changed all MPI calls to use the methods on this class rather than the direct MPI calls. Files which include the PETSc headers proved slightly difficult, as PETSc defines macros that overwrite MPI calls in a way which is incompatible with my wrapper. These macros seem to exist only to provide diagnostic information and are not strictly necessary, so my solution was to undef them where needed. In addition to this refactoring, I changed the FakeParallelMesh class, which is used when testing the PETSc indexers, to use a FakeMpiWrapper class rather than override the MPI methods itself.
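To make the pattern concrete, here is a minimal sketch, assuming simplified signatures, of a forwarding wrapper plus a test double along the lines described above. The actual MpiWrapper and FakeMpiWrapper in BOUT++ cover many more routines, and their exact interfaces may differ; the methods shown here are purely illustrative.

```cpp
// Sketch only: the real classes wrap far more of the MPI API.
#include <mpi.h>

class MpiWrapper {
public:
  virtual ~MpiWrapper() = default;

  // Each method keeps the standard MPI signature and simply forwards
  // to the real library call.
  virtual int MPI_Comm_rank(MPI_Comm comm, int* rank) {
    return ::MPI_Comm_rank(comm, rank);
  }
  virtual int MPI_Comm_size(MPI_Comm comm, int* size) {
    return ::MPI_Comm_size(comm, size);
  }
  virtual int MPI_Barrier(MPI_Comm comm) { return ::MPI_Barrier(comm); }
};

// Test double: simulates a "parallel" run on a single processor without
// touching the real MPI library, so unit tests need no MPI_Init at all.
class FakeMpiWrapper : public MpiWrapper {
public:
  int MPI_Comm_rank(MPI_Comm, int* rank) override {
    *rank = 0;           // the fake communicator always reports rank 0
    return MPI_SUCCESS;
  }
  int MPI_Comm_size(MPI_Comm, int* size) override {
    *size = 1;           // a one-process "communicator"
    return MPI_SUCCESS;
  }
  int MPI_Barrier(MPI_Comm) override {
    return MPI_SUCCESS;  // nothing to synchronise with
  }
};

// Note: in translation units that also include the PETSc headers, PETSc's
// logging layer can redefine some MPI_* names as macros, which clashes with
// method names like those above; undefining those macros, as described in
// this PR, avoids the conflict.
```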
Somehow I had forgotten to actually add this file to version control previously.
Thanks @cmacmackin ! Looks like the simplest way to get this done, and it now gives us the opportunity to mock out MPI calls in the tests, which I am unreasonably excited about! 🎉 I'm going to wait till #1859 goes in before merging this.
@cmacmackin at least on GitHub it doesn't look like #1859 includes any of this, so perhaps you didn't push?
@d7919 That's right, the changes are still local to my machine. I should be pushing soon, once I iron out a few more things with the integration tests.
I created a separate pull request for the refactoring of the MPI wrapper requested in PR #1803. I originally added some MPI wrapper methods to BoutMesh. These could be overridden in subclasses used for unit testing, allowing parallel communication to be simulated on a single processor. @ZedThree suggested that they didn't really belong there and would be better off in their own class. As this requires quite a significant refactor of large portions of the code, I decided to make it a separate PR. However, this PR still contains the PETSc interface which I've submitted in PR #1805, as this is what the MPI wrapper was created for and it demonstrates its use for testing. As such, it would be best that it is merged into next only after PR #1805.

I have created a new class called MpiWrapper and defined a global instance of it, bout::globals::mpi. Its member methods simply forward calls onto the corresponding MPI routines and have exactly the same interface. All MPI calls throughout the code now use the methods on the global object (or, in the case of meshes, a pointer to the global object which I defined for convenience and brevity). The global MpiWrapper object is created and destroyed in BoutInitialise() and BoutFinalise(), respectively.
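As a rough illustration of that lifecycle, the sketch below uses heavily simplified BoutInitialise()/BoutFinalise() signatures and bodies. Only the names MpiWrapper, bout::globals::mpi, BoutInitialise() and BoutFinalise() come from the description above; the argument lists, return values, and everything else here are assumptions, not the actual BOUT++ implementation.

```cpp
// Sketch only: the real initialisation/finalisation routines do far more.
#include <mpi.h>

class MpiWrapper {
public:
  virtual ~MpiWrapper() = default;
  virtual int MPI_Barrier(MPI_Comm comm) { return ::MPI_Barrier(comm); }
  // ... one forwarding method per MPI routine used in the code
};

namespace bout {
namespace globals {
MpiWrapper* mpi = nullptr;  // the single global wrapper instance
} // namespace globals
} // namespace bout

int BoutInitialise(int& argc, char**& argv) {
  MPI_Init(&argc, &argv);
  bout::globals::mpi = new MpiWrapper();  // created on initialisation...
  return 0;
}

int BoutFinalise() {
  delete bout::globals::mpi;              // ...destroyed on finalisation
  bout::globals::mpi = nullptr;
  MPI_Finalize();
  return 0;
}

int main(int argc, char** argv) {
  BoutInitialise(argc, argv);
  // Call sites go through the wrapper rather than the MPI library directly,
  // which is what lets tests substitute a fake implementation.
  bout::globals::mpi->MPI_Barrier(MPI_COMM_WORLD);
  BoutFinalise();
  return 0;
}
```

Routing every call through a virtual interface adds one virtual dispatch per MPI call, which is negligible next to the cost of the communication itself, while making it possible to swap in a mock such as the FakeMpiWrapper used with FakeParallelMesh in the unit tests.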