Apply constraints via element nonlinear residuals #3519
roystgnr merged 9 commits into libMesh:devel
Conversation
With backwards compatibility shims, of course. We're not PETSc.
We want to be able to turn this off since the Jacobian edits end up being so slow.
The one in DiffSolver was necessary, but this might be useful.
This should let us play with the option easily in FEMSystem codes, which should be good for verification purposes.
This is a relatively clean way to handle drift away from constraints at an elemental rather than a global level. The distinction between how we want to handle heterogeneous constraints in linear vs. nonlinear systems gets a little tricky.
Job Coverage on 5d92fb0 wanted to post the following: Coverage, Warnings. This comment will be updated on new commits.
My MOOSE branch seems to be working - and if I disable global constraint handling while not using element residual constraints, it fails, so I think the test coverage is good.
No changes in a GRINS run either.
jwpeterson left a comment:
I think you can go ahead and merge. As discussed over IM, I don't think this change will affect our internal testing because we don't use the PetscNonlinearSolver.
```cpp
{
#ifdef LIBMESH_HAVE_PETSC
  PetscDiffSolver *solver = new PetscDiffSolver(system);
  system.time_solver->diff_solver().reset(solver);
```
Hmm... I don't like using raw `new` here, but I see it's just continuing an existing pattern. Could I convince you to change those 2 lines to:

```cpp
system.time_solver->diff_solver() = std::make_unique<PetscDiffSolver>(system);
auto solver = cast_ptr<PetscDiffSolver*>(system.time_solver->diff_solver().get());
```

?
Actually there's a lot of these so I can just update all of them in a separate PR.
```cpp
 * \p residual_constrain_element_vector processing option in
 * \p DofMap.
 */
virtual void set_exact_constraint_enforcement(bool enable) {
```
Open curly brace to next line.
```cpp
  _exact_constraint_enforcement = enable;
}

bool exact_constraint_enforcement() {
```
Same comment re: curly brace.
```cpp
 * \p residual_constrain_element_vector processing option in
 * \p DofMap.
 */
virtual void set_exact_constraint_enforcement(bool enable) {
```
Suggested change:

```diff
-virtual void set_exact_constraint_enforcement(bool enable) {
+virtual void set_exact_constraint_enforcement(bool enable)
+{
```
```cpp
  _exact_constraint_enforcement = enable;
}

bool exact_constraint_enforcement() {
```
Suggested change:

```diff
-bool exact_constraint_enforcement() {
+bool exact_constraint_enforcement()
+{
```
I'll put in two separate PRs for the braces and the `new`s.
Thanks @roystgnr!
This was the last thing I really wanted to get in before the next libMesh submodule update ... except that it feels like we're getting really close to figuring out our sawtooth MPI troubles, so waiting to see if we can get a fix/workaround into TIMPI is tempting. |
As promised in libMesh#3519 comments
This, combined with updates in app codes, ought to fix #3504.
I'm going to need to retest my corresponding MOOSE branch (it was working before, but I've done a lot of cleanup to the libMesh branch since then) before we merge this, but I'm confident: getting MOOSE problems to work was much easier than getting every combination of adjoints, heterogeneous constraints, and NewtonSolver-vs-PetscDiffSolver to work in libMesh examples.