Implemented MPI-parallel multilevel diagonal SDC #427
Conversation
|
Yep. However, I'm not sure I fully understand why the base transfer class is not compatible with diagonal sweepers ... this should normally be totally decoupled (at least I thought so) |
The transfer classes need to build the tau correction, which in turn needs to calculate |
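For context, a hedged sketch of the usual FAS tau correction in MLSDC/PFASST, in generic notation (not a quote of the pySDC code): with fine-to-coarse restriction $T_f^c$ and collocation matrices $Q_f$, $Q_c$ on the two levels,

$$\tau_c = T_f^c \big( Q_f\, F_f(u_f) \big) - Q_c\, F_c\big( T_f^c u_f \big),$$

and with more than two levels the restricted tau correction of the finer level is typically added on the coarser one as well.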
|
I just noticed that for some reason this only works with two levels. I'll reopen this PR once I find and fix the bug. |
|
The bug was in the tau correction, which was only computed with two levels. The test for the MPI sweeper now goes up to 3 levels. |
|
What is the status here? |
|
Either somebody review it again or merge it, I guess :D |
|
I kept thinking "Didn't I do this at some point??" and yes, I did:
|
|
face palm... Looks rather similar :D But are you sure your |
|
As you know, I'm a very good programmer, so, sure, it works 😉 ... that said, can you please remove my code and replace it with yours, making it indeed accessible for "everyone"? |
|
Using the lovely diff feature of vim, I can see there are some differences after all. Why do you do stuff to the initial conditions in prolong? E.g. here. I cannot find anything similar in the non-MPI version. So is this needed or not? We should have it consistent anyway.
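For context, a hedged reading of the standard FAS prolongation step (not a quote of either class): the fine values at the nodes are corrected by the interpolated coarse increment relative to the restricted fine solution stored before the coarse sweep,

$$u_f \leftarrow u_f + T_c^f \big( u_c - T_f^c u_f \big),$$

so any additional update of the initial condition $u_0$ in the MPI version would be on top of that and should indeed be made consistent between the two classes.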
"it might have changed in PFASST" ... wait, what does the |
Codecov Report
Attention: Patch coverage is
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #427      +/-   ##
==========================================
+ Coverage   77.38%   78.26%   +0.87%
==========================================
  Files         327      330       +3
  Lines       26085    26201     +116
==========================================
+ Hits        20187    20507     +320
+ Misses       5898     5694     -204

☔ View full report in Codecov by Sentry. |
|
Relax. "might" means that these things are also called during MLSDC sweeps, but there the |
Ah, sorry! Was it removed here? |
|
Hm. Maybe so. In any case, it is currently not there. Tbh, this is all a bit blurry in my head; it was a long time ago. I would assume the base transfer class works as expected and we should go from there. |
|
I don't really know what's going on. I just took the current version and MPI-ied it. The tests check that MPI and non-MPI versions produce the same results. Let me know if more tests are needed. I don't know if I cover enough cases. |
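A minimal sketch of that kind of consistency check, under stated assumptions: run_mlsdc, its signature, and the tolerance are illustrative placeholders, not the actual pySDC test added here.

```python
# Hypothetical consistency check: the MPI run must reproduce the serial reference.
import numpy as np
from mpi4py import MPI


def run_mlsdc(num_levels, comm=None):
    """Placeholder for a routine that runs MLSDC and returns the final solution."""
    raise NotImplementedError


def test_mpi_matches_serial(num_levels=3):
    comm = MPI.COMM_WORLD
    u_mpi = run_mlsdc(num_levels, comm=comm)       # diagonal sweeper, one rank per collocation node
    if comm.rank == 0:
        u_ref = run_mlsdc(num_levels, comm=None)   # serial reference with the standard sweeper
        assert np.allclose(u_mpi, u_ref, atol=1e-14), 'MPI and serial results differ'
```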
|
Well, tests pass, must be correct, no? 😉 |
Turns out the base transfer class is not compatible with the diagonal sweepers. This is required for PFASSTER.
I don't really know the underlying mathematics, so I just test that the result is the same between the new base_transfer_MPI and the existing base_transfer classes. Let me know if you want anything else tested. There is also a test that a single sweep of multilevel SDC gives the same result with the diagonal sweeper and the regular sweeper.
I tried pytest-easyMPI and it didn't work for me. Instead, I use subprocess in combination with argparse. I hadn't used argparse so far, but will use it a lot from now on. I think it's a very useful library in general, including for tests with MPI. Have a look if you're interested in using command line arguments.
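A hedged sketch of that subprocess/argparse pattern; the file name, flags, and the check itself are illustrative assumptions, not the code added in this PR.

```python
# mpi_test_runner.py (hypothetical): pytest calls the test function, which re-launches
# this file under mpirun; the MPI processes then parse their parameters via argparse.
import argparse
import subprocess
import sys


def check_mlsdc(num_levels):
    # Placeholder for the actual MPI-parallel MLSDC check.
    assert num_levels >= 2


def test_mlsdc_mpi(num_procs=3, num_levels=3):
    # Forward the parameters to the MPI run as command line arguments.
    cmd = ['mpirun', '-np', str(num_procs), sys.executable, __file__, '--levels', str(num_levels)]
    subprocess.run(cmd, check=True)


if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('--levels', type=int, default=2)
    args = parser.parse_args()
    check_mlsdc(args.levels)
```

With this layout, pytest only collects test_mlsdc_mpi, while the MPI ranks enter through the `__main__` block and read their parameters from the command line.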