step-55: init sparsity pattern with relevant dofs #13363
Conversation
Yes, they are. The solver doesn't converge with locally owned dofs.

Shall we update the documentation then?
Yeah, relevant is correct and the documentation is incorrect. We need to store all entries that come from all owned cells so that we can distribute them to the right owner afterwards. Can you please fix this as well?
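For context, a minimal sketch of this mechanism in deal.II, following the step-40 idiom (names such as `dof_handler`, `constraints`, `locally_relevant_dofs`, and `mpi_communicator` are assumptions, not code from this PR):

```cpp
#include <deal.II/dofs/dof_tools.h>
#include <deal.II/lac/dynamic_sparsity_pattern.h>
#include <deal.II/lac/sparsity_tools.h>

using namespace dealii;

// The pattern stores rows for all locally relevant dofs: couplings generated
// on locally owned cells may touch rows whose dofs are owned by a neighbor.
DynamicSparsityPattern dsp(locally_relevant_dofs);
DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints, false);

// Ship every entry recorded for a non-owned row to the process that owns it,
// so each owner ends up with the complete sparsity of its rows.
SparsityTools::distribute_sparsity_pattern(dsp,
                                           dof_handler.locally_owned_dofs(),
                                           mpi_communicator,
                                           locally_relevant_dofs);
```

Dropping the non-owned rows here would lose exactly the entries the comment above refers to.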
/rebuild
I am surprised it doesn't crash without these entries. Can you check Trilinos and PETSc?
The patch works with both Trilinos and PETSc. It also worked with both without the patch. It's just that initializing the matrices in `setup_system()` consumes a lot more memory without it.
Will do in a separate PR. |
I am sorry, I didn't explain in enough detail: You mentioned that you get the wrong result if you use the locally owned range. I am wondering why this is not generating errors about missing sparsity pattern entries. Did you use PETSc or Trilinos for the test?
The calculated errors of the solution are the same for both backends. So I actually get errors, not when creating the sparsity pattern or initializing the matrix, but during the solution process.
Oh, actually, can you propose a changelog entry in a separate PR? |
When increasing the problem size and solving step-55 on plenty of MPI processes (~100), I noticed that `setup_system()` consumes a lot of memory. We initialize the `BlockDynamicSparsityPattern` with all dofs in the system. I cross-checked with step-40, where we initialize the corresponding `DynamicSparsityPattern` with locally relevant dofs, and proposed the same for step-55 with this PR.

Are locally relevant dofs the right choice? The documentation says locally owned should be favored in most cases.
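For reference, a hedged before/after sketch of what the proposed change amounts to (variable names such as `dofs_per_block` and `relevant_partitioning` are assumptions in the spirit of the tutorial, not the literal diff):

```cpp
#include <deal.II/lac/block_sparsity_pattern.h>

using namespace dealii;

// Before: the pattern is sized by the raw per-block dof counts, so each
// process reserves row storage for every dof in the system.
// BlockDynamicSparsityPattern dsp(dofs_per_block, dofs_per_block);

// After (this PR): restrict the locally stored rows to the locally relevant
// dofs, as step-40 does with its DynamicSparsityPattern. Here
// relevant_partitioning is a std::vector<IndexSet> with one IndexSet of
// locally relevant dofs per block.
BlockDynamicSparsityPattern dsp(relevant_partitioning);
```

The saving comes from the rowset: rows outside the relevant set are not stored at all, which matters once the global dof count far exceeds each process's relevant range.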