
step-55: init sparsity pattern with relevant dofs #13363

Merged
merged 1 commit into dealii:master from marcfehling:sparsity on Feb 14, 2022

Conversation

marcfehling
Member

When increasing the problem size and solving step-55 on many MPI processes (~100), I noticed that setup_system() consumes a lot of memory.

The reason is that we initialize the BlockDynamicSparsityPattern with all dofs in the system.

I cross-checked with step-40, where we initialize the corresponding DynamicSparsityPattern with only the locally relevant dofs, and this PR proposes the same for step-55.

Are locally relevant dofs the right choice? The documentation says locally owned should be favored in most cases.
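
For context, here is a minimal sketch of the kind of change proposed in setup_system(). The variable names (dofs_per_block, relevant_partitioning) are those used in the step-55 tutorial; this is an illustration, not the literal diff:

```cpp
// Sketch only: variable names follow the step-55 tutorial
// (dofs_per_block, relevant_partitioning); the actual diff may differ.

// Before: the pattern is sized for *all* DoFs in the system, so every
// MPI process allocates row bookkeeping for the full problem:
//
//   BlockDynamicSparsityPattern dsp(dofs_per_block, dofs_per_block);
//
// Proposed: analogous to step-40's
//
//   DynamicSparsityPattern dsp(locally_relevant_dofs);
//
// restrict the pattern to the locally relevant DoFs of each block:
BlockDynamicSparsityPattern dsp(relevant_partitioning);
```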

@marcfehling
Member Author

Are locally relevant dofs the right choice? The documentation says locally owned should be favored in most cases.

Yes, they are. Solver doesn't converge with owned_partitioning.

Shall we update the documentation then?

@tjhei
Member

tjhei commented Feb 12, 2022

Yeah, relevant is correct and the documentation is incorrect. We need to store all entries that come from all owned cells so that we can distribute them to the right owner afterwards.

Can you please fix this as well?
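
To illustrate the point about entries from owned cells being distributed to their owners, here is a sketch using step-40-style objects (again an illustration, not the literal code of either tutorial):

```cpp
// Sketch with step-40-style objects. Building the pattern over the locally
// owned cells also produces entries in rows of ghost DoFs, i.e. DoFs that
// are locally relevant here but owned by a neighboring process, so the
// pattern must be able to hold the locally *relevant* rows:
DynamicSparsityPattern dsp(locally_relevant_dofs);
DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints, false);

// The entries in non-owned rows are then shipped to the process that owns
// them; a pattern initialized with only the locally owned DoFs could not
// have stored them in the first place.
SparsityTools::distribute_sparsity_pattern(dsp,
                                           dof_handler.locally_owned_dofs(),
                                           mpi_communicator,
                                           locally_relevant_dofs);
```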

@tjhei
Member

tjhei commented Feb 12, 2022

/rebuild

@tjhei
Member

tjhei commented Feb 12, 2022

I am surprised it doesn't crash without these entries. Can you check Trilinos and PETSc?

@marcfehling
Member Author

Can you check Trilinos and PETSc?

The patch works with both Trilinos and PETSc. It also worked with both without the patch; it is just that initializing the matrices in setup_system() consumes a lot of memory when the problem size is large and you try to solve it with many MPI processes.

@marcfehling
Member Author

Can you please fix this as well?

Will do in a separate PR.

@tjhei
Member

tjhei commented Feb 12, 2022

I am sorry, I didn't explain in enough detail: You mentioned that you get the wrong result if you use the locally owned range. I am wondering why this is not generating errors about missing sparsity pattern entries. Did you use PETSc or Trilinos for the test?

@marcfehling
Member Author

marcfehling commented Feb 12, 2022

I am sorry, I didn't explain in enough detail: You mentioned that you get the wrong result if you use the locally owned range. I am wondering why this is not generating errors about missing sparsity pattern entries. Did you use PETSc or Trilinos for the test?

|                       | PETSc                   | Trilinos                |
|-----------------------|-------------------------|-------------------------|
| all dofs              | works                   | works                   |
| locally relevant dofs | works                   | works                   |
| locally owned dofs    | solver doesn't converge | solver doesn't converge |

The calculated errors of the solution are the same for all dofs and locally relevant dofs.

So I actually get errors, not in creating the sparsity pattern or initializing the matrix, but during the solution process.

An error occurred in line <347> of file </raid/fehling/bin/dealii-10.0.0-pre/include/deal.II/lac/solver_minres.h> in function
    void dealii::SolverMinRes<VectorType>::solve(const MatrixType&, VectorType&, const VectorType&, const PreconditionerType&) [with MatrixType = dealii::PETScWrappers::MPI::BlockSparseMatrix; PreconditionerType = Step55::LinearSolvers::BlockDiagonalPreconditioner<dealii::PETScWrappers::PreconditionBoomerAMG, Step55::LinearSolvers::InverseMatrix<dealii::PETScWrappers::MPI::SparseMatrix, dealii::PETScWrappers::PreconditionBoomerAMG> >; VectorType = dealii::PETScWrappers::MPI::BlockVector]
The violated condition was: 
    conv == SolverControl::success
Additional information: 
    Iterative method reported convergence failure in step 660. The
    residual in the last step was 0.278649.
    
    This error message can indicate that you have simply not allowed a
    sufficiently large number of iterations for your iterative solver to
    converge. This often happens when you increase the size of your
    problem. In such cases, the last residual will likely still be very
    small, and you can make the error go away by increasing the allowed
    number of iterations when setting up the SolverControl object that
    determines the maximal number of iterations you allow.
    
    The other situation where this error may occur is when your matrix is
    not invertible (e.g., your matrix has a null-space), or if you try to
    apply the wrong solver to a matrix (e.g., using CG for a matrix that
    is not symmetric or not positive definite). In these cases, the
    residual in the last iteration is likely going to be large.

Stacktrace:
-----------
#0  ./step-55: void dealii::SolverMinRes<dealii::PETScWrappers::MPI::BlockVector>::solve<dealii::PETScWrappers::MPI::BlockSparseMatrix, Step55::LinearSolvers::BlockDiagonalPreconditioner<dealii::PETScWrappers::PreconditionBoomerAMG, Step55::LinearSolvers::InverseMatrix<dealii::PETScWrappers::MPI::SparseMatrix, dealii::PETScWrappers::PreconditionBoomerAMG> > >(dealii::PETScWrappers::MPI::BlockSparseMatrix const&, dealii::PETScWrappers::MPI::BlockVector&, dealii::PETScWrappers::MPI::BlockVector const&, Step55::LinearSolvers::BlockDiagonalPreconditioner<dealii::PETScWrappers::PreconditionBoomerAMG, Step55::LinearSolvers::InverseMatrix<dealii::PETScWrappers::MPI::SparseMatrix, dealii::PETScWrappers::PreconditionBoomerAMG> > const&)
#1  ./step-55: Step55::StokesProblem<2>::solve()
#2  ./step-55: Step55::StokesProblem<2>::run()
#3  ./step-55: main

bangerth merged commit 2105cad into dealii:master on Feb 14, 2022
@bangerth
Member

Oh, actually, can you propose a changelog entry in a separate PR?

marcfehling deleted the sparsity branch on February 14, 2022, 05:41
marcfehling added a commit to marcfehling/dealii that referenced this pull request Feb 14, 2022
bangerth added a commit that referenced this pull request Feb 14, 2022
NiklasWik pushed a commit to NiklasWik/dealii that referenced this pull request Mar 17, 2022
ivweber pushed a commit to ivweber/dealii that referenced this pull request May 16, 2023