
coupled Cahn-Hilliard example doesn't run properly #378

Open
guyer opened this issue Sep 19, 2014 · 2 comments

guyer commented Sep 19, 2014

Although examples/cahnHilliard/mesh2DCoupled.py@fe00adaf9e passes its tests, any attempt to increase the size of the mesh causes problems:

  • At 100², PySparse needs over 350 MB, likely due to LinearLUSolver.
  • Trilinos does not give a sensible solution to the coupled problem, but does fine with the vector equation, even though the two formulations seem to be equivalent.

Although the tests, as written, pass (not that we check very much), we should document that PySparse is not suitable, and we need to figure out what's wrong with the coupled treatment.

Imported from trac ticket #582, created by guyer on 03-26-2013 at 13:45, last modified: 09-06-2013 at 10:04
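For reference, a minimal sketch of the coupled treatment on a larger mesh, roughly following examples/cahnHilliard/mesh2DCoupled.py; the mesh size, noise parameters, number of steps, and time step here are placeholders, not the values used in the example or its tests:

```python
from fipy import (CellVariable, Grid2D, GaussianNoiseVariable,
                  TransientTerm, DiffusionTerm, ImplicitSourceTerm)

nx = ny = 100                      # placeholder: larger than the mesh exercised by the tests
mesh = Grid2D(nx=nx, ny=ny, dx=0.25, dy=0.25)

phi = CellVariable(name="phi", mesh=mesh, hasOld=True)
psi = CellVariable(name="psi", mesh=mesh)
phi.setValue(GaussianNoiseVariable(mesh=mesh, mean=0.5, variance=0.01))

D = a = epsilon = 1.
dfdphi = a**2 * phi * (1 - phi) * (1 - 2 * phi)
d2fdphi2 = a**2 * (1 - 6 * phi * (1 - phi))

# coupled treatment: solve for phi and psi simultaneously
eq1 = TransientTerm(var=phi) == DiffusionTerm(coeff=D, var=psi)
eq2 = (ImplicitSourceTerm(coeff=1., var=psi)
       == ImplicitSourceTerm(coeff=d2fdphi2, var=phi) - d2fdphi2 * phi + dfdphi
       - DiffusionTerm(coeff=epsilon**2, var=phi))
eq = eq1 & eq2

dt = 0.25                          # placeholder time step
for step in range(10):
    phi.updateOld()
    eq.solve(dt=dt)                # per the report above, PySparse ends up in LinearLUSolver here
```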


guyer commented Jan 21, 2017

The changes in 8a1f81d and 1e56dc5 seem correct, but PySparse still uses a deranged amount of memory and Trilinos doesn't seem to evolve at all (with the default solver).
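A sketch of overriding the backend's default solver, in case that isolates the Trilinos behavior; LinearGMRESSolver is FiPy's generic GMRES wrapper, and eq, phi, and dt are assumed from a setup like the sketch above:

```python
from fipy import LinearGMRESSolver

# pick an iterative solver explicitly rather than relying on the backend default
solver = LinearGMRESSolver(tolerance=1e-10, iterations=1000)

phi.updateOld()
eq.solve(dt=dt, solver=solver)
```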

@guyer guyer reopened this Jan 21, 2017
@guyer guyer added this to Needs triage in Triage Apr 23, 2019

guyer commented Dec 13, 2019

PETSc (#659) also has problems. In contrast to Trilinos, PETSc can solve the coupled form, but not the vector form. The difference appears to be a number of explicitly stored zero entries in the matrix for the vector form, e.g.,

vector

0 convergence: KSP_CONVERGED_RTOL 
1 convergence: KSP_CONVERGED_RTOL 
0 iterations: 5 / 1000 
1 iterations: 5 / 1000 
Mat Object: 2 MPI processes
  type: mpiaij
row 0: (0, 9.27582)  (1, -0.)  (2, -0.)  (4, 2.)  (5, -1.)  (6, -1.) 
row 1: (0, -0.)  (1, 9.27582)  (3, -0.)  (4, -1.)  (5, 2.)  (7, -1.) 
row 2: (0, -0.)  (2, 9.27582)  (3, -0.)  (4, -1.)  (6, 2.)  (7, -1.) 
row 3: (1, -0.)  (2, -0.)  (3, 9.27582)  (5, -1.)  (6, -1.)  (7, 2.) 
row 4: (0, -2.)  (1, 1.)  (2, 1.)  (4, 0.0625)  (5, -0.)  (6, -0.) 
row 5: (0, 1.)  (1, -2.)  (3, 1.)  (4, -0.)  (5, 0.0625)  (7, -0.) 
row 6: (0, 1.)  (2, -2.)  (3, 1.)  (4, -0.)  (6, 0.0625)  (7, -0.) 
row 7: (1, 1.)  (2, 1.)  (3, -2.)  (5, -0.)  (6, -0.)  (7, 0.0625) 

coupled

0 convergence: KSP_CONVERGED_RTOL 
1 convergence: KSP_CONVERGED_RTOL 
0 iterations: 4 / 1000 
1 iterations: 4 / 1000 
Mat Object: 2 MPI processes
  type: mpiaij
row 0: (0, 9.27582)  (4, 2.)  (5, -1.)  (6, -1.) 
row 1: (1, 9.27582)  (4, -1.)  (5, 2.)  (7, -1.) 
row 2: (2, 9.27582)  (4, -1.)  (6, 2.)  (7, -1.) 
row 3: (3, 9.27582)  (5, -1.)  (6, -1.)  (7, 2.) 
row 4: (0, -2.)  (1, 1.)  (2, 1.)  (4, 0.0625) 
row 5: (0, 1.)  (1, -2.)  (3, 1.)  (5, 0.0625) 
row 6: (0, 1.)  (2, -2.)  (3, 1.)  (6, 0.0625) 
row 7: (1, 1.)  (2, 1.)  (3, -2.)  (7, 0.0625) 

For meshes 6x6 and smaller, PETSc gets the same answer from both forms, but not for anything larger.
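One way to compare what actually gets assembled is FiPy's cacheMatrix()/matrix hooks; vectorEq and coupledEq below are hypothetical names standing in for the two formulations in the example:

```python
# ask FiPy to keep the assembled matrices around for inspection
vectorEq.cacheMatrix()
coupledEq.cacheMatrix()

vectorEq.solve(dt=dt)
coupledEq.solve(dt=dt)

print(vectorEq.matrix)    # shows the explicitly stored -0. entries
print(coupledEq.matrix)   # same stencil, without the stored zeros
```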

@guyer guyer moved this from Needs triage to High priority in Triage Dec 13, 2019