Conversation

@lisawim
Contributor

@lisawim lisawim commented May 30, 2024

The SDC-DAE sweepers now have MPI versions: fully_implicit_DAE_MPI and SemiImplicitDAEMPI join the SDC-DAE family! Hooray! For this, I implemented a base class called SweeperMPIDAE that handles the MPI communication, analogous to the SweeperMPI base class. The SDC-DAE sweepers then use multiple inheritance again, and it seems to work, which is great.
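The multiple-inheritance pattern described above can be sketched roughly as follows. This is a simplified stand-in, not pySDC's actual API: the class names besides SweeperMPIDAE, the `communicate`/`update_nodes` methods, and the dummy sweep are all illustrative assumptions.

```python
# Hypothetical sketch of the multiple-inheritance pattern: one base class
# bundles the MPI logic, the serial sweeper supplies the DAE logic, and the
# MPI sweeper inherits from both. Names and methods are illustrative only.

class SweeperMPIDAE:
    """Base class bundling the MPI communication logic."""

    def __init__(self, comm=None):
        self.comm = comm  # an MPI communicator in the real sweeper

    def communicate(self, node_value):
        # The real class would use MPI calls here; this sketch just
        # passes the value through.
        return node_value


class FullyImplicitDAE:
    """Stand-in for the serial fully-implicit DAE sweeper."""

    def update_nodes(self, u):
        return [2 * v for v in u]  # dummy "sweep"


class FullyImplicitDAEMPI(SweeperMPIDAE, FullyImplicitDAE):
    """MPI sweeper: MPI logic from SweeperMPIDAE, DAE logic inherited."""

    def update_nodes(self, u):
        u_new = FullyImplicitDAE.update_nodes(self, u)
        return [self.communicate(v) for v in u_new]


sweeper = FullyImplicitDAEMPI()
print(sweeper.update_nodes([1.0, 2.0]))  # [2.0, 4.0]
```

The method resolution order puts SweeperMPIDAE first, so MPI behaviour can override the serial sweeper's where needed while everything else is inherited unchanged.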

To make sure the sweepers work as they should, I added a test that checks the order of accuracy of both sweepers for different cases of DAEs (an index-one and an index-two case). I struggled with how to implement the test: ideally I would write a test checking that the MPI and non-MPI versions match, as here, and I still want to do this. However, I was not able to understand how the argparse-based setup works, and my tests failed all the time.. Since I don't want to invest too much time (which I would do anyway, but then I can hear someone calling from far away..), I decided to skip that for now (it's still on my mental to-do list) and instead wrote a test checking the order of accuracy. Another point that is kind of embarrassing for me: running accuracy_check_MPI.py works with mpiexec -n 3 python3 accuracy_check_MPI.py, but it fails when I run the test file with pytest, returning "Can't open file [bla bla] - No such file or directory" - so maybe I made a mistake with the path? Let's see if the test here is successful. Setting paths is still a big mystery to me, although I thought I understood it.. :D
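A common reason a script runs fine with `mpiexec -n 3 python3 accuracy_check_MPI.py` but fails under pytest with "No such file or directory" is that relative paths resolve against pytest's working directory, not the test file's directory. A minimal sketch of one way to make the path robust (the helper names `script_path` and `run_accuracy_check` are made up for illustration; only the script name and the mpiexec command come from the comment above):

```python
import os
import subprocess
import sys


def script_path(test_file, name="accuracy_check_MPI.py"):
    """Build an absolute path to `name` located next to `test_file`."""
    return os.path.join(os.path.dirname(os.path.abspath(test_file)), name)


def run_accuracy_check(test_file, num_procs=3):
    """Launch the MPI script with mpiexec, as in the command quoted above."""
    cmd = ["mpiexec", "-n", str(num_procs), sys.executable, script_path(test_file)]
    return subprocess.run(cmd, capture_output=True, text=True)


# Inside the actual test file one would call:
#     result = run_accuracy_check(__file__)
#     assert result.returncode == 0
```

Because the path is anchored to `__file__` rather than the current working directory, it no longer matters from where pytest is launched.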

Since this PR is a request of @Ouardghi, I also added playground_MPI.py with some comments on how to use and run the files. Moreover, the sweepers also contain some documentation. I'm not sure whether this is enough, but if he has questions he should not hesitate to ask me!

I appreciate your review very much! Feel free to throw comments at me - I'm always happy about that!

@codecov

codecov bot commented May 31, 2024

Codecov Report

Attention: Patch coverage is 98.62069%, with 2 lines in your changes missing coverage. Please review.

Project coverage is 78.26%. Comparing base (18989d3) to head (63318e5).
Report is 16 commits behind head on master.

Files Patch % Lines
...DC/projects/DAE/sweepers/fully_implicit_DAE_MPI.py 96.55% 2 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #439      +/-   ##
==========================================
+ Coverage   77.38%   78.26%   +0.87%     
==========================================
  Files         327      332       +5     
  Lines       26085    26303     +218     
==========================================
+ Hits        20187    20585     +398     
+ Misses       5898     5718     -180     


Contributor

@brownbaerchen brownbaerchen left a comment


Looks great! I think the test that the MPI and non-MPI versions produce the same result is quite valuable, actually. But I don't know if it's better than testing the order.
@pancetta, what do you think?

@lisawim
Contributor Author

lisawim commented May 31, 2024

I totally agree with you, @brownbaerchen! I therefore also added a test checking that uend matches between the MPI and non-MPI versions.
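A comparison test like the one mentioned typically boils down to asserting that the two final values agree up to floating-point tolerance. A minimal hedged sketch (the function name, tolerances, and dummy data are illustrative, not the test that was actually added):

```python
import numpy as np


def check_uend_matches(uend_serial, uend_mpi, rtol=1e-12, atol=1e-14):
    """Assert that the serial and MPI sweepers produced the same final value."""
    diff = np.max(np.abs(np.asarray(uend_serial) - np.asarray(uend_mpi)))
    assert np.allclose(uend_serial, uend_mpi, rtol=rtol, atol=atol), (
        f"uend differs between serial and MPI runs, max difference {diff}"
    )


# Dummy stand-ins for the results of the two runs; within tolerance here.
check_uend_matches([1.0, 2.0, 3.0], [1.0, 2.0, 3.0 + 1e-15])
```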

@brownbaerchen
Contributor

LGTM 🚀

@pancetta
Member

pancetta commented Jun 2, 2024

Looks good indeed. Can you add this to the documentation of the project, or should this happen in #431?

@lisawim
Contributor Author

lisawim commented Jun 2, 2024

I can add it here, and then we can merge directly if you would like to.

@pancetta pancetta merged commit 45f5ca0 into Parallel-in-Time:master Jun 3, 2024
@pancetta
Member

pancetta commented Jun 3, 2024

Thanks, @lisawim. Well done, we appreciate your quick response to @Ouardghi's request.
