
update documentation for PETSc PreconditionLU #5453

Merged (1 commit) on Nov 13, 2017

Conversation

tjhei (Member) commented Nov 13, 2017

No description provided.

drwells (Member) commented Nov 13, 2017

I think that this also works with PETScWrappers::MPI::SparseMatrix as long as we run on a single processor. Let me check.

drwells (Member) commented Nov 13, 2017

Yes: this preconditioner works as long as the matrix lives on only a single processor. Since we will deprecate PETScWrappers::SparseMatrix at some point soon (as we did for the non-parallel vector classes), I would prefer that we mention this.
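The single-processor usage described above can be sketched as follows. This is an illustrative sketch only, assuming a deal.II build configured with PETSc; the function name `solve_direct` and the solver tolerances are made up for the example, not taken from the PR:

```cpp
// Sketch: applying PETScWrappers::PreconditionLU to a
// PETScWrappers::MPI::SparseMatrix that lives entirely on one processor.
// Assumes deal.II built with PETSc support.
#include <deal.II/lac/petsc_precondition.h>
#include <deal.II/lac/petsc_solver.h>
#include <deal.II/lac/petsc_sparse_matrix.h>
#include <deal.II/lac/petsc_vector.h>

using namespace dealii;

void solve_direct(const PETScWrappers::MPI::SparseMatrix &matrix,
                  PETScWrappers::MPI::Vector            &solution,
                  const PETScWrappers::MPI::Vector      &rhs)
{
  // PreconditionLU computes a complete LU factorization, so the whole
  // matrix must be stored on a single processor; with more than one
  // MPI rank owning rows, this setup would fail.
  PETScWrappers::PreconditionLU preconditioner(matrix);

  // SolverPreOnly applies the preconditioner exactly once, which for a
  // complete LU factorization amounts to a direct solve.
  SolverControl               solver_control(1, 1e-12);
  PETScWrappers::SolverPreOnly solver(solver_control);
  solver.solve(matrix, solution, rhs, preconditioner);
}
```

Used this way, the preconditioner behaves as a direct solver, which is why it also works with the parallel matrix class as long as no actual distribution across processors takes place.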

@bangerth bangerth merged commit fa3ee6a into dealii:master Nov 13, 2017
bangerth (Member) commented:

Oh, I guess I pre-empted a discussion here. My apologies.

drwells (Member) commented Nov 13, 2017

It's not a problem: this is a pretty obscure use case for the parallel matrix.

When I have a bit more time I'll clean up the PETSc wrappers some more, which will involve deprecating this class anyway.
