tpv5 slip artifacts #546

Open
NicoSchlw opened this issue May 10, 2022 · 16 comments

@NicoSchlw (Contributor)

Describe the bug
TPV5 produces elements with unreasonably high absolute slip.

Expected behavior
A reasonable and smooth absolute slip distribution.

To Reproduce
Steps to reproduce the behavior:

  1. Which version do you use? d5656cd; the same problem also occurred with 9895309.

  2. Which build settings do you use? Which compiler version do you use? Double precision, order 4

  3. On which machine does your problem occur? If on a cluster: Which modules are loaded? I'm running it on SuperMUC and compiled SeisSol following the documentation: https://seissol.readthedocs.io/en/latest/supermuc.html
    [Screenshot from 2022-05-10 12-13-57]

  4. Provide parameter/material files.

parameters.txt

2089910.tpv5.out.txt

Screenshots/Console output
[Screenshot from 2022-05-09 20-05-08]

If you encounter any errors/warnings/... during execution please provide the console output.
I got no errors.

Additional context
I used this setup: https://github.com/SeisSol/Examples/tree/master/tpv5
and increased the domain size and resolution (150m on-fault element size; order 4).

NicoSchlw added the bug label on May 10, 2022
@ravil-mobile (Collaborator)

Hi @NicoSchlw, can you please try the same setup with the SeisSol version from the dr/cpp branch? You will see two fault outputs, one of which comes with a -new- suffix. I just wonder whether our new DR implementation has the same issue.

@Thomas-Ulrich (Contributor) commented Jun 2, 2022

Hi,
I can confirm that it is not fixed by the new fault output.
It is also not fixed by the new quadrature rule (see the comparison of P_n: top Dunavant, bottom Stroud).
[image]

The only known ways to mitigate the problem are:

  • use a symmetrical mesh (no more problem)
  • modify an impedance, as in 181fc85
    (but this comes with a loss of accuracy); see the sketch below
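
To make the second point concrete, here is a rough sketch of what the impedance modification amounts to (all names are invented for illustration and do not mirror the actual source of 181fc85): the harmonic-mean impedance that converts the velocity jump across the fault into a traction is multiplied by a constant factor below one, which damps the spurious normal-stress response but degrades accuracy.

```cpp
// Rough sketch of the "eta-hack" idea (hypothetical names, not the actual
// SeisSol source): the interface impedance entering the fault traction is
// scaled by a constant factor < 1.
#include <cstdio>

double interfaceEtaP(double zPLocal, double zPNeighbor, double etaScale) {
  // Harmonic-mean P-wave impedance of the two elements sharing the fault face.
  const double eta = zPLocal * zPNeighbor / (zPLocal + zPNeighbor);
  // The hack: damp it, e.g. etaScale = 0.9 (1.0 recovers the standard scheme).
  return etaScale * eta;
}

int main() {
  const double zP = 2670.0 * 6000.0;  // rho * v_p of the tpv5 material
  std::printf("scaled eta_p = %g\n", interfaceEtaP(zP, zP, 0.9));
  return 0;
}
```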

@sebwolf-de (Contributor) commented Sep 7, 2022

I would like to add some details: if I use the "eta-hack" from 181fc85 (blue) and compare it to master (orange), I first see the initial rupture (until ~1 s), which then comes to a stop. On master, I see slow-slip rupture from ~2 s until ~8 s. The plateau after 10 s indicates that this is not a numerical instability (or at least not the kind of instability that blows up to infinity). I also checked until 40 s; the solution stays constant after ~10 s.
The state (P_n = 20 GPa, T_s = 0, |SR| ≈ 2.55 m/s) seems to be a stable solution of the friction problem.
[Receiver_1]
Unfortunately, there are several such solutions, see the second screenshot (only 150 m distance between the two receivers).
[Receiver_2]
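
To see why such a state can be self-consistent, here is a standalone back-of-the-envelope check (not SeisSol code; it assumes the usual linear slip-weakening convention that compression is negative, and tpv5's dynamic friction coefficient of 0.525): once the normal stress is tensile, the fault strength is zero, so zero shear traction at a nonzero slip rate satisfies the friction law.

```cpp
// Standalone sanity check (illustration, not SeisSol code): under linear
// slip weakening the fault strength is tau_c = -mu * min(sigma_n, 0)
// (compression negative). A tensile sigma_n yields zero strength, so
// T_s = 0 together with an arbitrary slip rate is an admissible state.
#include <algorithm>
#include <cstdio>

int main() {
  const double muD = 0.525;      // tpv5 dynamic friction coefficient
  const double sigmaN = 20.0e9;  // Pa; the tensile P_n reported above
  const double strength = -muD * std::min(sigmaN, 0.0);
  std::printf("fault strength = %g Pa -> T_s = 0 at |SR| = 2.55 m/s"
              " is admissible\n", strength);
  return 0;
}
```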

@Thomas-Ulrich (Contributor)

I guess the question is what generates the local normal stress on the master branch.

@sebwolf-de (Contributor)

Interestingly, if we increase the output subsampling, we see that only a few elements (Gauss points) are affected.
Top: master; bottom: "eta-hack".
[Screenshot from 2022-09-07 13-49-29]

@sebwolf-de (Contributor)

I also put 64 fault receivers into one single fault element and found qualitatively completely different solutions within that one element.

@Thomas-Ulrich (Contributor)

What may complicate the investigation is the resampling of the state variable, which blurs the pattern within each cell.
If you turn that off, you will probably see that the anomalies are at totally isolated Gauss points.
I would rather look at Pn0 (which is unaffected by the friction law) than at Ts0 (which is updated by the friction law).

@sebwolf-de (Contributor)

Unfortunately, omitting the filter does not help. But it was a good idea, thanks!
Without the filtering, I still see different variants within this one cell: for one Gauss point, Pn0 drops from -1.2e8 to -1.8e8, and for another one, Pn0 rises from -1.2e8 to -0.6e8.

@sebwolf-de (Contributor)

The more I think about it, the more inclined I am to believe it is some kind of numerical artifact, e.g. cancellation in https://github.com/SeisSol/SeisSol/blob/master/src/DynamicRupture/FrictionLaws/LinearSlipWeakening.h#L74-L76
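
As a generic illustration of that effect (a toy float example, not the SeisSol code; the reporter ran in double precision, where the same mechanism acts at a much smaller scale), subtracting two nearly equal stresses strips the shared leading digits and leaves mostly rounding error:

```cpp
// Toy demonstration of catastrophic cancellation (illustration only): two
// stresses that agree in their leading digits lose relative precision when
// subtracted, because only the rounding error of the operands survives.
#include <cstdio>

int main() {
  const float initial = 1.2e8f;        // Pa, e.g. an initial fault stress
  const float total = 1.2e8f + 36.0f;  // nearly equal updated stress
  // float spacing (ULP) near 1.2e8 is 8 Pa, so the 36 Pa signal is
  // quantized: the computed difference comes out as 32 Pa, ~11% off.
  std::printf("computed difference = %g Pa (exact: 36 Pa)\n", total - initial);
  return 0;
}
```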

@sebwolf-de (Contributor)

I created a coarse symmetric mesh and then moved one node by 20 m (element size ~200 m in that region), and I could reproduce the faulty behaviour :/

@sebwolf-de (Contributor) commented Sep 9, 2022

Top: almost symmetric mesh with one node moved; bottom: symmetric mesh. Note the different scales for top and bottom.
[Screenshot from 2022-09-09 15-09-28]
Edit: t = 0.5 s

@daisy20170101 (Contributor)

Hi, I recently ran into this issue with my application. I am wondering where I could add this 'eta-hack' to fix it temporarily. Thanks!

@Thomas-Ulrich (Contributor) commented May 9, 2023

@francescomosconi

Hello,

I'm encountering the same slip artifacts on a 70° dipping fault. I also observe strong oscillations of the normal stress at the receivers (see attached image). Could you provide more details on how to temporarily fix this behavior with the above-mentioned "eta-hack"? I am currently working on GPUs; will it work correctly there as well?

Many thanks in advance!

[fault_rec_51]

@davschneller (Contributor)

Apologies for the long delay: we've now added the "eta hack" as a parameter on the main branch (cf. #1087). If you specify etahack=0.9 in the DynamicRupture section of the parameter file, the hack is enabled (the default value is etahack=1, which disables it). It also works on GPUs.
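
For reference, the change is just one extra line in the dynamic-rupture namelist (a sketch assuming the layout of the tpv5 example's parameter file; everything else stays unchanged):

```
&DynamicRupture
! ... existing tpv5 dynamic-rupture settings ...
etahack = 0.9   ! impedance scaling; values < 1 enable the hack, 1 (default) disables it
/
```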
