Conversation

pjaap
Member

@pjaap pjaap commented Apr 8, 2025

No description provided.

@pjaap pjaap force-pushed the feature/add-iterative-example branch 2 times, most recently from 0fe15d4 to e11c430 Compare April 8, 2025 13:33
@pjaap pjaap requested a review from chmerdon April 8, 2025 13:33
@pjaap pjaap marked this pull request as draft April 8, 2025 15:06
@pjaap
Member Author

pjaap commented Apr 8, 2025

After a discussion with @chmerdon: This example will be incorporated into Example250 with method_linear and precon_linear.

@j-fu
Member

j-fu commented Apr 8, 2025

There is now the "precs" API in LinearSolve:

https://docs.sciml.ai/LinearSolve/stable/basics/Preconditioners/#Specifying-Preconditioners

This makes much more sense: it passes a constructor that builds a preconditioner (pair) from a given matrix, instead of a preconditioner for a fixed matrix, and it allows removing boilerplate in packages like VoronoiFVM (removing the "old" API there is on the list of breaking changes for 3.0) and ExtendableFEM. I considered the previous API flawed, see SciML/LinearSolve.jl#308. What I described there I started in ExtendableSparse, but then decided to contribute to LinearSolve instead after Chris started to push it.
The API there is still not optimal (IMHO), but it is much better now.
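For illustration, the precs keyword takes a function `(A, p) -> (Pl, Pr)` that builds a left/right preconditioner pair from the current matrix. A minimal sketch (toy system and names chosen here for illustration, not from this PR), using a Jacobi left preconditioner:

```julia
using LinearSolve, LinearAlgebra, SparseArrays

# A small diagonally dominant sparse test system (toy data)
A = sprand(100, 100, 0.05) + 10.0I
b = rand(100)

# precs builds (Pl, Pr) from the current matrix: here a Jacobi (diagonal)
# left preconditioner and the identity on the right
jacobi_precs = (A, p) -> (Diagonal(A), I)

prob = LinearProblem(A, b)
sol = solve(prob, KrylovJL_GMRES(precs = jacobi_precs))
```

Because the solver receives the builder rather than a fixed preconditioner object, it can rebuild the pair whenever the matrix changes, which is exactly what removes the boilerplate from downstream packages.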

I plan to do PRs to reasonable preconditioning packages like this one:
JuliaLinearAlgebra/AlgebraicMultigrid.jl#116

AMGCLWrap also has PreconBuilders now: https://j-fu.github.io/AMGCLWrap.jl/stable/preconditioners/

For the time being I keep PreconBuilders in ExtendableSparse:
https://github.com/WIAS-PDELib/ExtendableSparse.jl/blob/master/src/preconbuilders.jl

The idea is to remove them (and other preconditioning infrastructure) from ExtendableSparse in a breaking release, once enough of this stuff has been spread.

@chmerdon
Member

chmerdon commented Apr 9, 2025

Then for now maybe let's do it as suggested in the documentation of LinearSolve.jl and pass the preconditioner to the iterative solver directly:

```julia
method_linear = KrylovJL_GMRES(precs = (A, p) -> (IncompleteLU.ilu(A), I))
```

so that no precon_linear is needed.

The two HomogeneousBoundaryData operators on region 1 can be replaced by InterpolateBoundaryData(u, u_boundary!; regions = [1]).

@pjaap
Member Author

pjaap commented Apr 11, 2025

`method_linear = KrylovJL_GMRES(precs = (A, p) -> (IncompleteLU.ilu(A), I))` crashes the REPL:

```
[ Info: Step 1 : solving for μ=0.05
┌ Info: SOLVING My problem @ time = 0.0
│                               unknowns = ["velocity", "pressure"]
│                               fetypes = ["H1Pk{2,2,2}", "H1Pk{1,2,1}"]
└                               ndofs = [50134, 6364]
[ Info:  nonlinear = true
 #IT    ------- RESIDUALS -------       ---- DURATION (s) ----          ---- ALLOCATIONS (MiB) ----
        NONLINEAR       LINEAR          ASSEMB  SOLVE   TOTAL           ASSEMB  SOLVE   TOTAL
 INI                                                    1.16                            116.95
corrupted size vs. prev_size

[162142] signal 6 (-6): Aborted
in expression starting at REPL[4]:1
unknown function (ip: 0x7f332106495c) at /lib/x86_64-linux-gnu/libc.so.6
gsignal at /lib/x86_64-linux-gnu/libc.so.6 (unknown line)
abort at /lib/x86_64-linux-gnu/libc.so.6 (unknown line)
unknown function (ip: 0x7f3320ff9290) at /lib/x86_64-linux-gnu/libc.so.6
unknown function (ip: 0x7f332106e464) at /lib/x86_64-linux-gnu/libc.so.6
unknown function (ip: 0x7f332106ee7d) at /lib/x86_64-linux-gnu/libc.so.6
unknown function (ip: 0x7f332106efff) at /lib/x86_64-linux-gnu/libc.so.6
unknown function (ip: 0x7f3321071857) at /lib/x86_64-linux-gnu/libc.so.6
malloc at /lib/x86_64-linux-gnu/libc.so.6 (unknown line)
posix_memalign at /lib/x86_64-linux-gnu/libc.so.6 (unknown line)
```

No idea what causes the crash. It seems there are some open problems with IncompleteLU and the new LinearSolve interface: see haampie/IncompleteLU.jl#26.

Using

```julia
precs = (A, p) -> (AlgebraicMultigrid.aspreconditioner(ruge_stuben(A)), I)
```

seems to work in general, but causes convergence problems with GMRES.

Sigh. I will use the old preconditioner interface of ExtendableFEM with ILU; this worked for this problem. I'll add a comment that we should switch to the new LinearSolve interface once ILU works there.

@pjaap pjaap force-pushed the feature/add-iterative-example branch from e11c430 to 7281df1 Compare April 11, 2025 13:11
@pjaap pjaap marked this pull request as ready for review April 11, 2025 13:12
@chmerdon
Member

Hmm... well, the example works in principle, but it is a pity that it only works in this limited case and is not really faster than the direct solver (probably because it is only a medium-sized 2D saddle-point problem). Maybe we should have chosen another example for demonstrating that feature. Should we switch to Example301? There, other preconditioners should also work (I successfully tested `method_linear = KrylovJL_GMRES(precs = (A, p) -> (Diagonal(A), I))`), and IncompleteLU.ilu gives much better results than the direct solver (for nrefs = 4), but it still has to be configured via the precon_linear argument. What do you think?

@j-fu
Member

j-fu commented Apr 11, 2025

Well, essentially this is to be expected: in 2D, sparse direct solvers scale with problem size on par with iterative solvers. Not so in 3D: there, sparse direct solvers have a significantly worse complexity scaling than iterative ones. As for Pardiso, it just has the better constants...

See my two lectures on this:
https://www.wias-berlin.de/people/fuhrmann/AdSciComp-WS2425/week04/#week_04

The first refers to estimates of complexity scaling from
https://repositories.lib.utexas.edu/items/ba1e71ce-9c2a-4a0b-a736-510aaa7dda63

The second one provides some back-of-the-envelope calculations on the complexity of PCG, together with numerical experiments confirming this.

See also this benchmark, which I provided: https://docs.sciml.ai/SciMLBenchmarksOutput/stable/LinearSolve/SparsePDE/
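For quick reference, the standard nested-dissection estimates behind this argument (sketched from memory here; the linked lecture notes and thesis are the authoritative source). For a sparse matrix from a PDE discretization with $n$ unknowns:

- 2D direct factorization: $O(n^{3/2})$ work, $O(n \log n)$ fill-in
- 3D direct factorization: $O(n^{2})$ work, $O(n^{4/3})$ fill-in
- PCG with a good (e.g. multigrid) preconditioner: close to $O(n)$ work in both cases

which is why the crossover in favour of iterative solvers comes much earlier in 3D than in 2D.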

@jpthiele
Contributor

I previously had good results with the Schur complement block preconditioner described here:
https://link.springer.com/book/10.1007/978-3-642-58393-3

That would require constructing our own preconditioner, which could be an interesting showcase anyway.
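A rough sketch of how such a block preconditioner could be wired into the precs interface (all names here are hypothetical; for a Stokes-type saddle-point matrix `K = [A B'; B 0]` the Schur complement is commonly approximated by a pressure mass matrix, which would have to come from the discretization):

```julia
using LinearAlgebra, SparseArrays

# Block-diagonal preconditioner for a saddle-point matrix K = [A B'; B 0]:
# apply A^{-1} to the velocity block and an approximate Schur complement
# inverse (here: a factorized pressure mass matrix) to the pressure block.
struct SchurBlockPrecon{FA, FS}
    Afac::FA    # factorization of the velocity block A
    Sfac::FS    # factorization of the Schur complement approximation
    nu::Int     # number of velocity dofs
end

# LinearSolve applies left preconditioners via ldiv!
function LinearAlgebra.ldiv!(y::AbstractVector, P::SchurBlockPrecon, x::AbstractVector)
    y[1:P.nu] .= P.Afac \ x[1:P.nu]
    y[P.nu+1:end] .= P.Sfac \ x[P.nu+1:end]
    return y
end

# Builder in the precs shape (K, p) -> (Pl, Pr); `Mp` (pressure mass matrix)
# and `nu` are assumed inputs supplied by the discretization.
schur_precs(Mp, nu) = (K, p) -> (SchurBlockPrecon(lu(K[1:nu, 1:nu]), lu(Mp), nu), I)
```

Factorizing the full velocity block is only a placeholder; in practice one would substitute a multigrid or incomplete factorization for `Afac`.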

@chmerdon
Member

@jpthiele yes, that is a nice project for later

For now, I am in favour of some simple scalar elliptic 3D problem with inhomogeneous boundary data.

@pjaap pjaap force-pushed the feature/add-iterative-example branch from 9526610 to ef19e3c Compare April 14, 2025 14:49
@pjaap
Member Author

pjaap commented Apr 14, 2025

Iterative solvers are now out of Example250.

@pjaap pjaap force-pushed the feature/add-iterative-example branch from ef19e3c to 089c87f Compare April 14, 2025 15:02
@chmerdon chmerdon merged commit ec57ab3 into master Apr 14, 2025
8 checks passed