
Added global damping based on density change in CompositionalFlow/WellSolvers #1065

Merged
merged 11 commits into from
Aug 8, 2020

Conversation

Contributor

@francoishamon francoishamon commented Jul 21, 2020

We currently check the validity of a solution in CheckSystemSolution by verifying that all component densities are positive; if they are not, we cut the time step. As pointed out by @mquanbui and @klevzoff, this is quite restrictive.

In this PR, I tried to address this problem with two changes:

  1. An option to perform global damping based on the component fraction change between two Newton iterations in ScalingForSystemSolution. This option is controlled by the flag maxCompFractionChange in the flow and well solver blocks. Note that by default this option is not used.

  2. An option to skip the check on the positivity of the component densities in CheckSystemSolution, and to instead chop the negative densities cell by cell in ApplySystemSolution. This option is controlled by the flag allowLocalCompDensityChopping in the flow and well solver blocks. By default, this option is activated.
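The two safeguards can be sketched as follows. This is an illustrative Python/NumPy sketch, not GEOSX source (which is C++): the function names, the `min_density` floor, and the flat `(n_cells, n_components)` array layout are all hypothetical.

```python
import numpy as np

def scaling_factor(rho, drho, max_comp_frac_change):
    """Global damping (option 1): largest lambda in (0, 1] such that the
    lagged component fraction change |lambda * drho_c| / rho_T stays
    below max_comp_frac_change in every cell."""
    rho_total = rho.sum(axis=1, keepdims=True)   # per-cell total density
    frac_change = np.abs(drho) / rho_total       # change at lambda = 1
    max_change = frac_change.max()
    if max_change <= max_comp_frac_change:
        return 1.0
    return max_comp_frac_change / max_change

def apply_solution(rho, drho, lam, allow_chopping, min_density=1e-10):
    """Apply the damped Newton update (option 2): either chop negative
    component densities cell by cell, or reject the whole update."""
    rho_new = rho + lam * drho
    if allow_chopping:
        return np.maximum(rho_new, min_density)  # local chopping
    if (rho_new < 0.0).any():
        raise RuntimeError("negative component density: cut the time step")
    return rho_new
```

Here `scaling_factor` corresponds to the global damping of option 1 and `apply_solution` to the local chopping of option 2; in GEOSX these live in separate solver member functions, as discussed below.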

I will do a cleanup of the multiphase tutorials in a separate PR.

Related to https://github.com/GEOSX/IntegratedTests/pull/79.
Related to GEOS-DEV/PVTPackage#23.

@francoishamon francoishamon added the flag: requires rebaseline Requires rebaseline branch in integratedTests label Jul 21, 2020
Contributor

@klevzoff klevzoff left a comment


Looks good!

@@ -26,6 +26,7 @@
solidNames="{ rock }"
relPermNames="{ relperm }"
temperature="297.15"
maxRelComponentDensityChange="0.9"
Contributor


Does the test actively make use of this feature now (i.e. actually changing the number of Newton iterations due to chopping)?

Contributor Author


@klevzoff yes this test actively uses some chopping. To summarize what I did in the integratedTests:

  • In the CompositionalMultiphaseFlow integratedTests, the default behavior is used everywhere (no chopping/damping based on change in component fractions, but chopping of the negative densities). But of course the chopping of the negative densities never kicks in, because no negative density is found.
  • In the CompositionalMultiphaseWell integratedTests, the default behavior is used for all tests except this one (which I converted to CO2, because I previously had three true compositional tests but none using CO2). For this one, I made sure that the chopping based on component fractions kicks in during the first time step. In the other tests, I modified dead_oil_wells_2d.xml to make sure that negative densities are found and chopped.

@klevzoff
Contributor

The current implementation is probably quite conservative (i.e. too much damping is done) and has some redundancies. I think it would help a lot if we could chop the update (cell by cell) directly in ScalingForSystemSolution or CheckSystemSolution, but I did not want to change the member functions of SolverBase (currently these functions take a const Newton update that cannot be modified).

I guess we could have a separate option flag (allowCellDamping or something), in which case ScalingForSystemSolution and CheckSystemSolution would accept negative densities (essentially ignoring the max relative change parameter) while ApplySystemSolution would do the cell-wise chopping.

Collaborator

@CusiniM CusiniM left a comment


It looks good to me. It would probably make sense to revisit the structure of the Newton solver at some point. For example, I think that using three different functions (which implies looping over all cells every time) for this is a bit of overkill.

I also find it a bit hard to come up with a good limit for a variable like density which does not really have physical bounds. However, doing the chopping using fractions (or saturations) might be hard since densities are our primary variables.

@francoishamon
Contributor Author

I guess we could have a separate option flag (allowCellDamping or something), in which case ScalingForSystemSolution and CheckSystemSolution would accept negative densities (essentially ignoring the max relative change parameter) while ApplySystemSolution would do the cell-wise chopping.

@klevzoff Yes I modified my initial implementation with the introduction of a flag allowLocalCompDensityChopping. When this flag is on, CheckSystemSolution accepts negative densities that are later chopped in ApplySystemSolution. When this flag is off, we revert to the current behavior, so the time step will fail if a negative density is found. By default, I set the flag allowLocalCompDensityChopping to be on, which will change the current behavior and make it more robust.

Regarding the computation of the scaling factor using the change in component fraction between Newton iterations, I decided to deactivate it by default. I find it useful in the CO2 simulations on HI24L, but I think this is going to be highly problem-dependent. To use it, one has to select a value of maxCompFracChange strictly smaller than 1, which will activate ScalingForSystemSolution.

To summarize, if allowLocalCompDensityChopping == 0 and maxCompFracChange == 1, there is no chopping/damping at all (current behavior). In the other cases, we turn on some local and/or global chopping/damping.
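In XML, this combination of options sits in the solver block. The fragment below is a hypothetical example: the two flags are the ones discussed in this thread (the attribute name changed during the PR, so check the merged schema), the attribute values are illustrative, and the remaining attributes are copied from the existing test deck shown earlier.

```xml
<CompositionalMultiphaseFlow
  solidNames="{ rock }"
  relPermNames="{ relperm }"
  temperature="297.15"
  maxCompFracChange="0.5"
  allowLocalCompDensityChopping="1"/>
```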

@francoishamon
Contributor Author

It would probably make sense to revisit the structure of the Newton solver at some point. For example, I think that using 3 different functions (which implies looping over all cells every time) for this is a bit of an overkill.

Yes I agree the current implementation involves a lot of redundancies, since, as you said, we loop once over all cells to compute the scaling factor in ScalingForSystemSolution, once again to check the solution (at least the pressures) in CheckSystemSolution, and then again to chop the negative densities in ApplySystemSolution. Everything could definitely be consolidated in one loop if we modified the functions of SolverBase.

As a side note, I think consolidating these functions would allow a more convenient form of local chopping (i.e. having a different scaling factor for each cell). This may not be a very rigorous approach, but it can work well in some cases.
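The per-cell variant mentioned above could look like this: a hypothetical sketch (not GEOSX code) that reuses the lagged component-fraction check, but computes one scaling factor per cell instead of a single global one.

```python
import numpy as np

def local_scaling_factors(rho, drho, max_comp_frac_change):
    """Per-cell damping: for each cell i, the largest lambda_i in (0, 1]
    such that |lambda_i * drho_c| / rho_T stays below the limit in that
    cell, independently of the other cells."""
    rho_total = rho.sum(axis=1, keepdims=True)       # per-cell total density
    worst = (np.abs(drho) / rho_total).max(axis=1)   # worst change per cell
    # lambda_i = min(1, limit / worst_i); tiny floor guards against worst == 0
    return np.minimum(1.0, max_comp_frac_change / np.maximum(worst, 1e-30))
```

Cells with a small update keep the full Newton step (factor 1), while only the cells that violate the limit are damped, which is the convenience the consolidation would buy.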

@francoishamon
Contributor Author

I also find it a bit hard to come up with a good limit for a variable like density which does not really have physical bounds. However, doing the chopping using fractions (or saturations) might be hard since densities are our primary variables.

@CusiniM I agree with your comment. To address it, I modified the computation of the scaling factor to try to enforce a check on the change in component fractions (that will always be between zero and one) instead of on the change in component densities.

Computing the scaling factor \lambda using component fractions directly would mean enforcing

| z_c(\lambda) - z_c | <= \Delta z_max for each component c in each cell (with \Delta z_max = maxCompFracChange),

where

z_c(\lambda) = (\rho_c + \lambda \Delta\rho_c) / (\rho_T + \lambda \sum_k \Delta\rho_k), z_c = \rho_c / \rho_T, \rho_T = \sum_k \rho_k.

You can then invert this expression to compute \lambda as a function of the component fractions and the component density updates. This approach is similar to what is done with saturations in schemes that have saturation as a primary variable (i.e., controlling the change of a quantity that varies between zero and one). I did not find this very robust: for the cases I tried, the denominator of the updated component fraction, \rho_T + \lambda \sum_k \Delta\rho_k, can vary wildly, some component densities may be negative, and so on, so I was not very happy with the results. Maybe, if we just skip the cells that have negative densities as suggested by @klevzoff, this has a chance to work, but I am not sure.

I ended up doing something simpler that we may want to revisit later. I compute the scaling factor \lambda that satisfies

\lambda | \Delta\rho_c | / \rho_T <= \Delta z_max for each component c in each cell.

So, essentially, I lag the total density in the computation of the updated component fraction. This can also be viewed as a check on the change in component density relative to the total density.

In the future, I think we will have more tests to validate/invalidate this approach, and then we can come back to this to modify it as necessary.

@klevzoff
Contributor

klevzoff commented Aug 3, 2020

While testing #1086 I've encountered a case (deadoil_3ph_staircase_gravity_segregation_3d_08.xml) where the new direct solver (KLU2) produces tiny negative entries in the solution vector, while the older one (KLU) placed zeros; this leads precisely to the scenario (small negative densities) that this PR resolves. Looking forward to this being merged!

@francoishamon
Contributor Author

This PR is ready to be merged once we merge the corresponding integratedTests and PVTPackage PRs.

@francoishamon francoishamon added the ci: run CUDA builds Allows to triggers (costly) CUDA jobs label Aug 5, 2020
Member

@rrsettgast rrsettgast left a comment


Just a couple of requests to consolidate some constants to the top of the file where they may be found easily.

@rrsettgast rrsettgast merged commit 9099e16 into develop Aug 8, 2020
@rrsettgast rrsettgast deleted the feature/hamon/chopping branch August 8, 2020 05:27
@klevzoff klevzoff mentioned this pull request Aug 8, 2020