
Preconditioner reuse in petsc_nonlinear_solver #3208

Merged (6 commits) into libMesh:devel on Jul 26, 2022

Conversation

@reverendbedford (Contributor) commented Mar 25, 2022:

This alters the logic used in PetscNonlinearSolver to:

  1. Not free the SNES instance between solves
  2. Use SNESReset by default in between solves
  3. Add two new options "reuse preconditioner" and "reuse preconditioner maximum iterations" to support preconditioner reuse during and across nonlinear solves.

This PR would close issue #3209
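
For readers less familiar with the PETSc calls involved, here is a minimal sketch of the solve-cycle change in items 1 and 2; the surrounding struct and member names are illustrative only and do not reflect the actual PetscNonlinearSolver implementation.

```cpp
// Hedged sketch of the changed solve cycle (items 1 and 2 above):
// keep the SNES alive between solves and call SNESReset() instead of
// destroying and recreating it.  Names here are illustrative.
#include <petscsnes.h>

struct SolverSketch
{
  SNES _snes = nullptr;

  PetscErrorCode solve(MPI_Comm comm)
  {
    PetscErrorCode ierr;

    if (!_snes)
      {
        // Create the SNES once and keep it for the lifetime of the solver.
        ierr = SNESCreate(comm, &_snes);
        CHKERRQ(ierr);
      }
    else
      {
        // Between solves, reset internal data structures instead of
        // calling SNESDestroy() and SNESCreate() again.
        ierr = SNESReset(_snes);
        CHKERRQ(ierr);
      }

    // ... set residual/Jacobian callbacks, SNESSetFromOptions(_snes),
    //     and SNESSolve(_snes, ...) as usual ...
    return 0;
  }
};
```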

@lindsayad (Member) commented:

@fdkong

@roystgnr (Member) commented:

Looks reasonable to me. I'd love to get some test coverage on it, but I know it'd be a pain in the neck to get something that covers the PETSc case but doesn't give false positives on other solvers.

@fdkong (Contributor) left a comment:


Thanks for the contributions. Please correct me if I am wrong.

From my understanding, preconditioner reuse within a single nonlinear solve can be realized using "-snes_lag_preconditioner" without changing libmesh code.

Preconditioner reuse across multiple time steps can be implemented using PETSc options as well, such as the combination of -snes_lag_preconditioner_persists and -snes_lag_preconditioner. The only issue we have in libmesh right now is that we throw away the SNES after each nonlinear solve.

It is great that this PR attempts to keep the SNES after each nonlinear solve, which opens up more possibilities for PC reuse.

My concern here is that we would be introducing some PC-reuse parameters and a few related functions that are not necessary, since these capabilities are accessible via PETSc options. If we cannot do that, I am happy to enhance PETSc. My philosophy here is that we should maintain as little PETSc code as possible in libmesh/MOOSE.

Therefore, my suggestion would be to keep the capability of having the SNES live across several time steps, but remove everything else, such as reuse_preconditioner, reuse_preconditioner_max_its, and libmesh_petsc_recalculate_monitor.
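
For reference, a minimal sketch of the PETSc lagging controls mentioned above, written against the corresponding C API (SNESSetLagPreconditioner and SNESSetLagPreconditionerPersists); the lag value of 5 is only an example.

```cpp
// Hedged sketch of the preconditioner-lagging options discussed above.
#include <petscsnes.h>

PetscErrorCode configure_pc_lagging(SNES snes)
{
  PetscErrorCode ierr;

  // Rebuild the preconditioner only on every 5th Jacobian build
  // (equivalent to -snes_lag_preconditioner 5 on the command line).
  ierr = SNESSetLagPreconditioner(snes, 5);
  CHKERRQ(ierr);

  // Keep the lag counter across successive SNESSolve() calls
  // (equivalent to -snes_lag_preconditioner_persists), which only helps
  // if the SNES itself survives between solves.
  ierr = SNESSetLagPreconditionerPersists(snes, PETSC_TRUE);
  CHKERRQ(ierr);

  return 0;
}
```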

(Inline review comments on include/solvers/petsc_nonlinear_solver.h and src/solvers/petsc_nonlinear_solver.C, since resolved.)
@reverendbedford (Contributor, Author) commented:

@fdkong, no, PETSc does not do what I want right now. -snes_lag_preconditioner only has options for:

  • "recalculate the preconditioner next time you need it and never again"
  • "never recalculate"
  • "recalculate every X times we calculate the Jacobian", where X is a fixed number (so every time, every other time, etc.)

What I want is "recalculate whenever it took more than Y linear iterations to solve the system", which isn't provided by the current PETSc options, as best I can tell. Using the monitor here provides this functionality. If it's possible to implement this functionality in PETSc, that's fine, but it's not currently there.
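
A hedged sketch of that heuristic as a KSP monitor follows; the struct, the function name, and the use of SNESSetLagPreconditioner here are illustrative, and the PR's actual libmesh_petsc_recalculate_monitor may be implemented differently.

```cpp
// Hedged sketch: force a preconditioner rebuild whenever the linear solve
// exceeds a user-chosen iteration count.  Names are illustrative only.
#include <petscsnes.h>

struct RecalculateCtx
{
  SNES snes;        // nonlinear solver whose preconditioner is being lagged
  PetscInt max_its; // the "reuse preconditioner maximum iterations" threshold
};

PetscErrorCode recalculate_monitor(KSP /*ksp*/, PetscInt it, PetscReal /*rnorm*/, void * ctx)
{
  RecalculateCtx * c = static_cast<RecalculateCtx *>(ctx);

  // Once the iteration count exceeds the threshold, ask SNES to rebuild
  // the preconditioner at the next opportunity and then lag it again
  // (-2 means "rebuild once, then not again" in PETSc).
  if (it > c->max_its)
    {
      PetscErrorCode ierr = SNESSetLagPreconditioner(c->snes, -2);
      CHKERRQ(ierr);
    }

  return 0;
}

// Registration, e.g. after extracting the KSP from the SNES:
//   KSP ksp;
//   SNESGetKSP(snes, &ksp);
//   KSPMonitorSet(ksp, recalculate_monitor, &ctx, nullptr);
```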

@fdkong (Contributor) commented Mar 30, 2022:

It might be much clearer if we could implement this in PETSc, considering that all the information required is available at the algebraic level. But the current hook-based approach might be the easier one, so I am fine with this PR.

@reverendbedford (Contributor, Author) commented:

I'm not qualified to work on PETSc :) but if this type of heuristic ever gets implemented over there I'd be happy to then go back and update libmesh and MOOSE. So let me make your suggested changes/fixes here and then figure out why those 2 stupid tests are failing in MOOSE and I'll ping you for re-review.

Commit: This alters the logic used in PetscNonlinearSolver to:

1. Not free the SNES instance between solves
2. Use SNESReset by default in between solves
3. Add two new options "reuse preconditioner" and "reuse preconditioner maximum iterations" to support preconditioner reuse

Commit: Fix failing tests by changing SNESReset back to the old behavior of calling the custom _snes destructor.
@reverendbedford (Contributor, Author) commented:

@fdkong I'm sorry I dropped the ball on this. This commit ought to address your review plus fix the two failing MOOSE tests. Could you or someone else release the tests?

@moosebuild commented Jun 14, 2022:

Job Coverage on a292390: no coverage report(s) were found for the base commit b5e028e.

@jwpeterson (Member) commented:

@reverendbedford I sent you an invite to join the libMesh Associates group. Members of this group have their PRs tested automatically, so you won't need a developer to activate the tests every time.

@reverendbedford (Contributor, Author) commented:

@jwpeterson Thanks! I joined. But I think you need to activate the tests one last time for me :)

Commit: Use getter/setter to supply the required properties for the new reuse capability. Default to libmesh_not_implemented() in the superclass, with implementations in the PETSc solvers.
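
A hedged sketch of the getter/setter pattern described in that commit message; the class and method names here are illustrative and may not match the merged API exactly.

```cpp
// Hedged sketch of the "default to libmesh_not_implemented() in the
// superclass, implement in the PETSc solver" pattern.  Names are
// illustrative, not the merged API.  libmesh_not_implemented() is
// libMesh's standard "unsupported feature" error macro.
class NonlinearSolverSketch
{
public:
  virtual ~NonlinearSolverSketch() = default;

  // Solvers that do not support preconditioner reuse error out.
  virtual void set_reuse_preconditioner(bool) { libmesh_not_implemented(); }
  virtual void set_reuse_preconditioner_max_its(unsigned int) { libmesh_not_implemented(); }
};

class PetscNonlinearSolverSketch : public NonlinearSolverSketch
{
public:
  // PETSc solver: store the values and forward them to the SNES/monitor setup.
  void set_reuse_preconditioner(bool reuse) override { _reuse_preconditioner = reuse; }
  void set_reuse_preconditioner_max_its(unsigned int its) override { _reuse_preconditioner_max_its = its; }

private:
  bool _reuse_preconditioner = false;
  unsigned int _reuse_preconditioner_max_its = 0;
};
```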
@roystgnr (Member) left a comment:


I like this now except for the lack of testing, but I can't think of how to get a proper test case even in the enable-petsc configuration. Solve with one matrix, reuse preconditioner, change the matrix to something well-conditioned but horribly incompatible, assert that the next solve diverges??? Probably not robust enough to be worth the hassle.

@reverendbedford (Contributor, Author) commented:

I would hold off merging until I modify MOOSE to actually use this. I can at least link the MOOSE PR, which will have tests, here.

@jwpeterson (Member) commented:

> I can at least link the MOOSE PR, which will have tests, here.

Your MOOSE PR will need to use a libmesh that includes your branch from this PR... so we would need to push your branch to libmesh upstream and you would have to do a libmesh update on your MOOSE PR in order to test it.

@reverendbedford (Contributor, Author) commented:

Ah, right. Well at least let me run some tests locally. I'm having trouble finding a small problem that will actually trigger the behavior we want to test.

Commit: This fixes an error I introduced when passing the value of the maximum iterations to the monitor; the storage there needs to persist across multiple solves with the same solver.
@reverendbedford (Contributor, Author) commented:

@jwpeterson Yes, I screwed something up when I made the changes last night. This commit fixes it and I now have MOOSE tests ready to go. Those tests check that we can trigger the correct behavior, specifically that you reuse the same preconditioner until the linear iterations exceed the "maximum iterations" parameter.

Please merge when you have a chance; this should now be good to go.

@hugary1995 (Contributor) commented:
I can't wait to try this :-)

@jwpeterson (Member) left a comment:


@reverendbedford sorry for losing track of this PR... it looks like @roystgnr and @fdkong signed off on it already and were just waiting on me. I took another look and my only concern is that you un-defaulted the PetscNonlinearSolver destructor, which I don't think should be necessary. If you agree, I can go ahead and make that change and finally get this merged.
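
For reference, a brief hedged illustration of the destructor point; the actual class layout in the merged header may differ.

```cpp
// Hedged illustration: keep the compiler-generated destructor rather than
// providing an explicit body, as suggested in the comment above.
class PetscNonlinearSolverSketch
{
public:
  ~PetscNonlinearSolverSketch() = default;
};
```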

(Inline review comments on src/solvers/petsc_nonlinear_solver.C, since resolved.)
@reverendbedford (Contributor, Author) commented:

@jwpeterson should be good now, though for some reason the tests over here aren't running.

@jwpeterson (Member) commented:

> @jwpeterson should be good now, though for some reason the tests over here aren't running.

OK, yeah I think CIVET is down for scheduled maintenance for a few days right now.

@jwpeterson merged commit eee85bc into libMesh:devel on Jul 26, 2022.