
H(div)-conforming incompressible Navier-Stokes #233

Merged: 11 commits merged into exadg:master on Oct 13, 2022
Conversation

@NiklasWik (Contributor) commented Jun 4, 2022

Adds support for the H(div)-conforming finite element (Raviart-Thomas) in the incompressible Navier-Stokes setting.

This PR needs to wait for dealii/dealii#13907 to be merged. ✔️
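For readers unfamiliar with the discretization, here is a minimal sketch of the element pairing this enables, assuming a deal.II master branch with dealii/dealii#13907 merged. The polynomial degree is illustrative and the actual setup lives in the ExaDG operators; this only shows the H(div)-conforming velocity / L2-conforming pressure pair.

```cpp
#include <deal.II/fe/fe_dgq.h>
#include <deal.II/fe/fe_raviart_thomas.h>

// Illustrative polynomial degree; in ExaDG this is an application parameter.
constexpr int dim    = 3;
constexpr int degree = 2;

// Velocity: Raviart-Thomas space. Only the normal component is continuous
// across faces (H(div)-conforming rather than H1-conforming); tangential
// continuity is imposed weakly, DG-style.
dealii::FE_RaviartThomasNodal<dim> fe_velocity(degree);

// Pressure: discontinuous tensor-product space. With deal.II's indexing,
// RT(k) is paired with DGQ(k) (cf. step-20); the divergence of the velocity
// space lies in the pressure space, so the discrete velocity is exactly
// (pointwise) divergence-free.
dealii::FE_DGQ<dim> fe_pressure(degree);
```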

@peterrum (Member) left a comment


Looks very nice and clean 👍

@NiklasWik (Contributor, Author) commented

Is this not using the master branch of deal.II?

@peterrum (Member) commented Jun 7, 2022

It does. But it seems that there is an issue with the Docker deployment pipeline: https://github.com/dealii/dealii/actions/workflows/docker.yml

@nfehn (Member) commented Jun 10, 2022

I was on vacation this week, so I plan to have a detailed look at this PR next week.

@NiklasWik (Contributor, Author) commented

I pushed another commit so that H(div) also uses VectorTools::interpolate, since dealii/dealii#13976 fixes the underlying issue.
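For context, this is the interpolation path in question; a sketch, where the helper, the Function object, and the vector type are illustrative choices and only dealii::VectorTools::interpolate is the actual API referred to above:

```cpp
#include <deal.II/base/function.h>
#include <deal.II/dofs/dof_handler.h>
#include <deal.II/lac/la_parallel_vector.h>
#include <deal.II/numerics/vector_tools.h>

// Hypothetical helper: interpolate a dim-component analytical initial
// velocity onto the H(div)-conforming (Raviart-Thomas) space. Before
// dealii/dealii#13976, this path did not work for FE_RaviartThomasNodal,
// so the H(div) case had to take a different route.
template <int dim>
void
set_initial_velocity(
  const dealii::DoFHandler<dim>                      &dof_handler_velocity,
  const dealii::Function<dim>                        &analytical_velocity,
  dealii::LinearAlgebra::distributed::Vector<double> &velocity)
{
  dealii::VectorTools::interpolate(dof_handler_velocity,
                                   analytical_velocity,
                                   velocity);
}
```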

@nfehn added the labels "new feature" and "exactly divergence-free" on Jun 24, 2022
@nfehn (Member) commented Jun 27, 2022

As a general comment, it would be good if you included the periodic 2D vortex example in the present PR, so that we have a test case against which we can verify the results obtained with H(div).
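For reference, the periodic 2D vortex in its usual (Taylor-Green) form has the analytical solution below; the exact constants and scaling in the ExaDG application may differ, and ν denotes the kinematic viscosity:

```latex
u(x,y,t) = -\cos(\pi x)\,\sin(\pi y)\,e^{-2\pi^{2}\nu t} \\
v(x,y,t) = \phantom{-}\sin(\pi x)\,\cos(\pi y)\,e^{-2\pi^{2}\nu t} \\
p(x,y,t) = -\tfrac{1}{4}\left(\cos(2\pi x)+\cos(2\pi y)\right)e^{-4\pi^{2}\nu t}
```

Because the solution is known for all times, it allows checking both convergence rates and the pointwise divergence of the H(div) velocity.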

@nfehn (Member) commented Jun 27, 2022

At the very beginning, you mention a deal.II PR that needs to be merged. Could you update that information?

@nfehn (Member) commented Sep 1, 2022

@NiklasWik you could rebase and push again to see whether the checks are passing now.

@NiklasWik force-pushed the hdiv-RT branch 2 times, most recently from 720ba9c to ff21d4a on September 1, 2022
@nfehn (Member) commented Sep 6, 2022

@NiklasWik are you ready for merging?

@NiklasWik (Contributor, Author) commented

> @NiklasWik are you ready for merging?

Actually, no. The 2D problem this PR adds no longer works for H(div), and I can't figure out why. The 3D TGV is still correct, though. I'm not sure whether this is a deal.II or an ExaDG issue, since everything works fine for L2.

@peterrum (Member) commented

> Actually, no. The 2D problem this PR adds no longer works for H(div), and I can't figure out why. The 3D TGV is still correct, though. I'm not sure whether this is a deal.II or an ExaDG issue, since everything works fine for L2.

@NiklasWik Do you happen to have a combination of ExaDG/deal.II versions that used to work; have you stored the code from your Master's thesis somewhere? From that, it might be possible to determine the commit causing the issue. I would be surprised if the issue came from deal.II, since we - as far as I remember - didn't make any modifications there regarding RT. But identifying the root cause might be quite useful! Let's find the problem soon so that we can merge and finalize your Master's thesis!

@NiklasWik (Contributor, Author) commented

> @NiklasWik Do you happen to have a combination of ExaDG/deal.II versions that used to work; have you stored the code from your Master's thesis somewhere?

Unfortunately not! I'm also not sure when it stopped working, as I've only ever tested the three-dimensional TGV while making these small changes.

@NiklasWik (Contributor, Author) commented

It works using the dual splitting scheme, so that's something.

@NiklasWik (Contributor, Author) commented

It was just the CFL number! I'm pretty sure it was the first thing I tried last weekend... I must have been tired and done something weird. I also don't understand how I ended up pushing an unstable CFL, since I ran this problem quite a few times back in May/June.

@nfehn should be all good now!

@nfehn (Member) commented Oct 12, 2022

@NiklasWik Thanks for figuring this out! I am a bit puzzled, though, because you changed the CFL number to a larger value (0.1 -> 0.4). Normally, we would expect that a smaller CFL number cannot be unstable if CFL=0.4 is stable. So did you see instabilities/blowup for CFL=0.1?

@NiklasWik (Contributor, Author) commented

> @NiklasWik Thanks for figuring this out! I am a bit puzzled, though, because you changed the CFL number to a larger value (0.1 -> 0.4). Normally, we would expect that a smaller CFL number cannot be unstable if CFL=0.4 is stable. So did you see instabilities/blowup for CFL=0.1?

Ah, yes! This is why I was so confused: the first thing I tried was a smaller CFL, which didn't work. The week after, I just compared with the TGV settings, as I thought it might have been something else. But yes, I did see blowup for CFL=0.1.
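To make the puzzle above concrete: with CFL-based time step selection, the step size scales linearly in the CFL number. A common form (the exact formula and the degree exponent r used by ExaDG are application parameters; this form is an assumption for illustration) is:

```latex
\Delta t \;=\; \mathrm{CFL}\cdot\frac{h_{\min}}{k^{\,r}\,\left\lVert u\right\rVert_{\max}}
```

So reducing CFL from 0.4 to 0.1 shrinks the time step by a factor of 4, and a scheme that is stable with the larger step should not blow up with the smaller one as far as the convective stability limit is concerned, which is exactly why the observation is surprising.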

@nfehn (Member) commented Oct 12, 2022

I have an idea where this might come from, but I am not sure. Could you figure out for which solution strategies (coupled, dual splitting, pressure correction) you observe this problem? For all of them?
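For anyone trying to reproduce this: the strategy is chosen via the temporal discretization parameter of the incompressible Navier-Stokes module. Below is a self-contained stand-in, not the real ExaDG header; the enum and member names are quoted from memory and should be checked against exadg/incompressible_navier_stokes/user_interface/ before use.

```cpp
// Stand-in mirroring the three solution strategies discussed above.
enum class TemporalDiscretization
{
  BDFCoupledSolution,      // monolithic velocity-pressure (coupled) solver
  BDFDualSplittingScheme,  // dual splitting (velocity-correction) scheme
  BDFPressureCorrection    // pressure-correction scheme
};

struct Parameters
{
  TemporalDiscretization temporal_discretization =
    TemporalDiscretization::BDFCoupledSolution;
};

int main()
{
  // Run the same 2D vortex case once per strategy to localize the blowup.
  Parameters param;
  param.temporal_discretization =
    TemporalDiscretization::BDFDualSplittingScheme;
}
```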

@nfehn (Member) commented Oct 12, 2022

Another question: do you see numerical blowup in the sense that the simulation crashes, or "just" large errors (i.e. a seemingly wrong solution) in a simulation that completes until the final time? The reason behind this question is that, from a mathematical/physical perspective, the given problem with its analytical solution is not necessarily "stable": a "chaotic" solution might develop, e.g. triggered by small perturbations. Such perturbations might be damped more strongly by a (dissipative) time integration scheme with a larger time step size (and a larger time integration error).

@NiklasWik (Contributor, Author) commented

> Do you see numerical blowup in the sense that the simulation crashes, or "just" large errors

Blowup, as in the simulation crashing after a few time steps.

> Could you figure out for which solution strategies (coupled, dual splitting, pressure correction) you observe this problem? For all of them?

I think I tried both coupled and dual splitting, but I will have to double-check. I have not tested pressure correction, though. Unfortunately, I don't really have time this week, but hopefully on Sunday!

@nfehn (Member) commented Oct 12, 2022

I am just interested in whether you see this problem for the coupled solver as well. (I would hope not.)

@NiklasWik (Contributor, Author) commented

> I am just interested in whether you see this problem for the coupled solver as well. (I would hope not.)

Definitely for the coupled solver, I'm afraid...

@nfehn merged commit b0f0d54 into exadg:master on Oct 13, 2022