ParMETIS on simple domains correction #27691
base: next
Conversation
Job Precheck on 26257af wanted to post the following: Your code requires style changes. A patch was auto-generated and copied here.
Alternatively, with your repository up to date and in the top level of your repository:
```diff
@@ -1,6 +1,8 @@
 # PetscExternalPartitioner

 Allow users to use several external partitioning packages (parmetis, chaco, ptscotch and party) via PETSc.
+Note that partitioning, just as meshing, requires a level of user insight to assure the domain is appropriately distributed. For example, edge cases where one seeks to have very few elements per process
+may misbehave with certain partitioners. To avert such situations, we switch ParMETIS to PTScotch in cases with less than 28 elements per process, and notify the user with a warning.
```
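The fallback described in the doc text above can be sketched as a simple threshold check. This is a minimal illustrative sketch, not the actual MOOSE/PETSc implementation; the function name `choose_partitioner`, the constant name, and the warning text are assumptions for illustration, and the cutoff of 28 is taken from the documentation text (the review below notes the exact value was still inconsistent across doc, comment, and code at this point).

```python
# Hypothetical sketch of the threshold-based fallback: if the average
# number of mesh elements per MPI rank drops below a cutoff, switch the
# requested ParMETIS partitioner to PTScotch and warn the user.

MIN_ELEMS_PER_PROC = 28  # cutoff quoted in the documentation text above


def choose_partitioner(num_elems: int, num_procs: int,
                       requested: str = "parmetis") -> str:
    """Return the partitioner to actually use, falling back to PTScotch
    when the per-process element count is too small for ParMETIS."""
    if requested == "parmetis" and num_elems / num_procs < MIN_ELEMS_PER_PROC:
        print("Warning: fewer than "
              f"{MIN_ELEMS_PER_PROC} elements per process; "
              "switching ParMETIS to PTScotch")
        return "ptscotch"
    return requested
```

With, say, 1000 elements on 4 ranks (250 per rank) ParMETIS is kept, while 100 elements on 8 ranks (12.5 per rank) triggers the switch.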
I know it's hard to come up with a good answer to "at what point should we overrule the user and stop trusting Parmetis", but the answer in the docs is 28, the answer in the comment is 32, and the answer in the code is 20? :-D
I don't see any other problems here, but I'll wait until after your planned squash/rebase/etc before submitting an official review.
Good catch! With the going back and forth on the figs and doc I didn't get to make sure it was consistent everywhere. I would have noticed after the civet runs when giving it another look. It's annoying that one can't run all the civet tests before pushing; the local tests aren't always enough.
Job Documentation on ada3938 wanted to post the following: View the site here. This comment will be updated on new commits.
So it sounded like you guys were able to reproduce the poor performance of parmetis with small local element counts without PetscExternalPartitioner?
@lindsayad yeah, using the libMesh one for example. The behaviour is not identical and depends on how the partitioner is initialized. There is certainly a difference between libMesh and PETSc, but the partitioning becomes poor at low elements per process in either case and for almost any initialization. So it made most sense to do a switch for the edge cases.
Well, the trick was that in the libMesh case the partitioning becomes better at very low elements per process ... because it turns out that we specifically test for that and switch from Parmetis to serialization+Metis. We don't test robustly enough, so the problem is still reproducible in some cases, but according to the git logs I hit even worse cases (crashes, not just bad partitioning) back in 2012 and added this same sort of workaround then.
Job Coverage on 1a91538 wanted to post the following:
Framework coverage
Modules coverage: Coverage did not change.
Full coverage reports: Reports
This comment will be updated on new commits.
This PR closes #25691.