
Do more comprehensive setting of the hypre memory location#4035

Merged
lindsayad merged 1 commit into libMesh:devel from lindsayad:more-hypre-memory-manipulation on Dec 11, 2024

Conversation

@lindsayad
Member

It's not enough to check the top-level PC for PCHYPRE, because we may be doing a field split. We must also remember to control the memory location for linear (KSP) solves as well.
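The field-split point can be pictured as a recursive walk over the preconditioner tree. The sketch below is a hypothetical Python model of that logic, not PETSc code: `PCNode` and `uses_hypre` are invented names for illustration (the real implementation would go through PETSc's C API, e.g. type comparisons and querying a fieldsplit's sub-KSPs).

```python
# Hypothetical model of a nested preconditioner tree: a fieldsplit PC
# holds inner PCs, and hypre may appear only at a nested level.
class PCNode:
    def __init__(self, pc_type, sub_pcs=()):
        self.pc_type = pc_type        # e.g. "hypre", "fieldsplit", "ilu"
        self.sub_pcs = list(sub_pcs)  # inner PCs of a composite/fieldsplit PC

def uses_hypre(pc):
    """Return True if this PC or any nested sub-PC is of type 'hypre'.

    Checking only the top level (pc.pc_type == "hypre") misses the case
    where hypre sits inside a fieldsplit, which is the gap this PR closes.
    """
    if pc.pc_type == "hypre":
        return True
    return any(uses_hypre(sub) for sub in pc.sub_pcs)

# A fieldsplit whose second split is preconditioned with hypre:
tree = PCNode("fieldsplit", [PCNode("ilu"), PCNode("hypre")])
print(uses_hypre(tree))  # prints True even though the top-level type is "fieldsplit"
```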

@lindsayad marked this pull request as ready for review December 11, 2024 00:58
Member

@roystgnr roystgnr left a comment


I'll take your word on the answer but: why do we need to do all this ourselves? PETSc is kind enough to even initialize MPI by default; you'd think if they know a vector has mem type host and they know they're using HYPRE then they'd take charge of passing that info along.

@moosebuild

Job Coverage, step Generate coverage on 26ba782 wanted to post the following:

Coverage

|        | 201eb6 Total | #4035 26ba78 Total | +/-    | New |
|--------|--------------|--------------------|--------|-----|
| Rate   | 62.33%       | 62.33%             | -0.00% | -   |
| Hits   | 72660        | 72655              | -5     | 0   |
| Misses | 43916        | 43914              | -2     | 0   |

Diff coverage report

Full coverage report

This comment will be updated on new commits.

@lindsayad
Member Author

What I would say is that by the time you call SNESSolve or KSPSolve, passing in the solution and RHS vectors, all the PC data structures have already been created, so it's too late without doing a bunch of conversions. And the SetUp functions take no arguments. There is a SNESSetSolution function, which we do not use for recent versions of PETSc; that would seem like a logical spot to try this manipulation, but if we don't leverage that API then other PETSc users probably don't either. I agree that there is probably a way this should be handled at the PETSc level. Maybe I'll bring it up with them.
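The "too late" ordering argument can be illustrated with a toy model. This is not PETSc code; `ToySolver` and its fields are invented for illustration. The point is only that data structures allocated at setup time are bound to whatever memory location was known then, so changing the location afterwards would require conversions rather than taking effect silently.

```python
# Toy model of the setup-vs-solve ordering problem described above.
class ToySolver:
    def __init__(self):
        self.memory_location = "host"  # where data *will* be allocated
        self.pc_data = None            # PC data, allocated lazily at setup

    def setup(self):
        # PC data structures are created using the location known *now*.
        self.pc_data = {"allocated_on": self.memory_location}

    def solve(self):
        if self.pc_data is None:
            self.setup()
        # The solve runs against wherever the PC data already lives.
        return self.pc_data["allocated_on"]

s = ToySolver()
s.setup()                     # data structures created on the host
s.memory_location = "device"  # too late: setup has already run
print(s.solve())              # prints "host" -> moving it would need conversions
```

This is why the memory location has to be set before the PC is set up, rather than at SNESSolve/KSPSolve time when the vectors are finally handed over.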

@lindsayad merged commit 54ffeed into libMesh:devel Dec 11, 2024
