
Conversation

@brownbaerchen
Contributor

It turns out the base transfer class is not compatible with the diagonal sweepers, but this compatibility is required for PFASST-ER.
I don't really know the underlying mathematics, so I just test that the new base_transfer_MPI class gives the same result as the base_transfer class. Let me know if you want anything else tested. There is also a test that a single sweep of multilevel SDC gives the same result with the diagonal sweeper and the regular sweeper.

I tried pytest-easyMPI and it didn't work for me. Instead, I use subprocess in combination with argparse. I hadn't used argparse before, but I will use it a lot from now on. I think it's a very useful library in general, including for tests with MPI. Have a look if you're interested in using command line arguments; there is a small sketch of the pattern below.
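
A minimal sketch of that subprocess + argparse pattern, assuming a self-relaunching test module; the flag name, process count, and mpirun launcher are illustrative, not the actual test code in this PR:

```python
import argparse
import subprocess
import sys


def launch_under_mpi(num_procs=3):
    """Relaunch this file under mpirun so the test body runs on several ranks."""
    cmd = ["mpirun", "-np", str(num_procs), sys.executable, __file__, "--run_test"]
    subprocess.run(cmd, check=True)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--run_test", action="store_true", help="run the MPI part of the test")
    args = parser.parse_args()

    if args.run_test:
        # here one would compare base_transfer_MPI against base_transfer on each rank
        pass
    else:
        launch_under_mpi()
```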

@tlunet
Member

tlunet commented Apr 30, 2024

Yep, argparse is really nice indeed (been using it quite a lot already ...).

However, I'm not sure I fully understand why the base transfer class is not compatible with diagonal sweepers ... this should be totally decoupled normally (at least I thought so)

@pancetta
Member

pancetta commented May 1, 2024

> Yep, argparse is really nice indeed (been using it quite a lot already ...).
>
> However, I'm not sure I fully understand why the base transfer class is not compatible with diagonal sweepers ... this should be totally decoupled normally (at least I thought so)

The transfer classes need to build the tau correction, which in turn needs to calculate QF. And that is distributed in the case of diagonal/parallel sweepers. Yes, this is a design flaw, but hey, that was 10 years ago...
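
To make the coupling concrete, here is a hedged mpi4py sketch (not the pySDC implementation; the variable names and the stand-in quadrature weights are placeholders): with a diagonal sweeper each rank only holds F at its own collocation node, but every entry of QF mixes all nodes, so a transfer class that builds the tau correction has to communicate.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
num_nodes = comm.Get_size()  # one collocation node per rank

# with a diagonal sweeper, each rank evaluates F only at its own node
local_f = np.array([float(rank + 1)])  # stand-in for F(u_m) on this rank

# stand-in for this rank's row of the collocation matrix Q
q_row = np.full(num_nodes, 1.0 / num_nodes)

# every entry (QF)_m = sum_j Q_mj * F_j needs F at all nodes,
# so the node values have to be gathered before forming the tau correction
all_f = np.empty(num_nodes)
comm.Allgather(local_f, all_f)
qf_local = q_row @ all_f  # this rank's entry of QF
```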

@brownbaerchen
Contributor Author

I just noticed that for some reason this only works with two levels. I'll reopen this PR once I find and fix the bug.

@brownbaerchen
Contributor Author

The bug was in the tau correction, which is not computed with only two levels. The test for the MPI sweeper now goes up to three levels; a rough sketch of how the level count can be swept is below.
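
A hedged sketch of how such a sweep over level counts could look, combining pytest parametrization with a subprocess launch; the script name and rank count are made up for illustration and are not the actual test in this PR:

```python
import subprocess
import sys

import pytest


@pytest.mark.parametrize("num_levels", [2, 3])
def test_base_transfer_MPI(num_levels):
    # launch a hypothetical MPI test script on a few ranks, passing the level count
    cmd = ["mpirun", "-np", "3", sys.executable, "mpi_transfer_test.py", "--num_levels", str(num_levels)]
    subprocess.run(cmd, check=True)
```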

@brownbaerchen reopened this May 3, 2024
@pancetta
Member

What is the status here?

@brownbaerchen
Contributor Author

Either somebody reviews it again or merges it, I guess :D

@pancetta
Member

I kept thinking "Didn't I do this at some point??" and yes, I did:

`class base_transfer_MPI(object):`

@brownbaerchen
Contributor Author

Face palm... Looks rather similar :D But are you sure your prolong_f works? There's no communication at all in there...
How do you want to proceed? In the end, this feature should not be hidden away in a project, but we can use your version if you want.

@pancetta
Member

As you know, I'm a very good programmer, so, sure, it works 😉 ... that said, can you please remove my code and replace it with yours, making it indeed accessible for "everyone"?

@brownbaerchen
Contributor Author

Using the lovely diff feature of vim, I can see there are some differences after all. Why do you do stuff to the initial conditions in prolong? E.g. here. I cannot find anything similar in the non-MPI version. So is this needed or not? We should have it consistent either way.

@tlunet
Member

tlunet commented May 24, 2024

> Using the lovely diff feature of vim, I can see there are some differences after all. Why do you do stuff to the initial conditions in prolong? E.g. here. I cannot find anything similar in the non-MPI version. So is this needed or not? We should have it consistent either way.

"it might have changed in PFASST" ... wait, what does the might mean ? 😱

@brownbaerchen
Contributor Author

@pancetta, what the hell were you working from when you implemented this MPI version? The stuff in question was removed by you in the non-MPI version in 2018, see here :D

@codecov

codecov bot commented May 24, 2024

Codecov Report

Attention: Patch coverage is 91.76471%, with 7 lines in your changes missing coverage. Please review.

Project coverage is 78.26%. Comparing base (18989d3) to head (931e2e8).
Report is 14 commits behind head on master.

| Files | Patch % | Lines |
|---|---|---|
| ...mentations/sweeper_classes/generic_implicit_MPI.py | 25.00% | 6 Missing ⚠️ |
| ...lementations/sweeper_classes/imex_1st_order_MPI.py | 50.00% | 1 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #427      +/-   ##
==========================================
+ Coverage   77.38%   78.26%   +0.87%     
==========================================
  Files         327      330       +3     
  Lines       26085    26201     +116     
==========================================
+ Hits        20187    20507     +320     
+ Misses       5898     5694     -204     

☔ View full report in Codecov by Sentry.

@pancetta
Member

Relax. "might" means that these things are also called during MLSDC sweeps, but there the u0 is not changed. PFASST however changes the u0 during its iterations and that's why we need to take care of it here. It was not removed by the commit you mention, just done differently.

@brownbaerchen
Contributor Author

> Relax. "might" means that these things are also called during MLSDC sweeps, but there the u0 is not changed. PFASST however changes the u0 during its iterations and that's why we need to take care of it here. It was not removed by the commit you mention, just done differently.

Ah, sorry! Was it removed here?

@pancetta
Member

Hm. Maybe so. In any case, it is currently not there. Tbh, this is all a bit blurry in my head, long time ago. I would assume the base transfer class works as expected and we should go from there.

@brownbaerchen
Contributor Author

I don't really know what's going on. I just took the current version and MPI-fied it. The tests check that the MPI and non-MPI versions produce the same results (a rough sketch of that idea is below). Let me know if more tests are needed; I don't know if I cover enough cases.
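
For the record, a minimal sketch of what such an agreement check might look like; the two runner functions are placeholders for whatever setup builds the controllers, not the actual test code:

```python
import numpy as np


def assert_same_result(run_serial_mlsdc, run_parallel_mlsdc, comm):
    """Run one MLSDC sweep with the regular and the diagonal/MPI setup and compare."""
    u_end_serial = run_serial_mlsdc()
    u_end_parallel = run_parallel_mlsdc(comm)
    assert np.allclose(u_end_serial, u_end_parallel, atol=1e-13), (
        "MPI and non-MPI versions gave different results"
    )
```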

@pancetta
Member

Well, tests pass, must be correct, no? 😉

@pancetta merged commit 5491e18 into Parallel-in-Time:master May 24, 2024