List of currently failing tests #13703
I will have a look at the
EDIT: My guess is that we should pick a partitioner here, rather than using the
@drwells how should we handle the
rather than the one you added:
Since the exact error message is not critical, and it seems to be a lot of manual work to make this robust across all the different compilers and compiler versions, I suggest simply removing the following lines:
dealii/tests/dofs/nodal_renumbering_01.cc Lines 313 to 314 in 1d883b6
dealii/tests/dofs/nodal_renumbering_01.cc Lines 342 to 343 in 1d883b6
@peterrum
Yes, I think the nodal renumbering test fails because the assertions try to be too clever. I'll fix it.
For some reason, CDash does not show the error messages. What happens? Which assert is triggered?
In addition, the following tests fail on our machine:
EDIT: The
I assume this is because your MPI version is very, very old. Not sure what to do about it.
Also see #13638
What is the output here?
Also see the failing tests in #13458
It looks like these tests are flagged. I will check the output on our machines in a moment and provide it here.
We have OpenMPI 1.10.7 on our machines. I believe this is the reason why these tests fail. Do we have an elegant way to disable tests if the MPI library is too old? We could add an
You can find the ctest output for these tests here:
That would hide the fact that your system is broken, so I am not sure this is a good idea. One option would be to check "MPI VERSION <= 3.0" (see https://github.com/tjhei/BigMPICompat#test-results).
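A sketch of how such a version guard could look at the CMake level (the `MPI_C_VERSION` variable is set by CMake's FindMPI module since CMake 3.10; the exact threshold and how it hooks into the test suite's own configuration machinery are assumptions here):

```cmake
# Sketch only: guard tests that require a recent MPI standard.
find_package(MPI)
if(MPI_C_VERSION VERSION_LESS "3.0")
  message(STATUS "MPI standard < 3.0 detected - disabling affected tests")
else()
  # declare the tests that rely on newer MPI features here
endif()
```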
I also see test failures on the test
@kronbichler I am able to reproduce the error locally when compiling with
@kronbichler I am pretty sure that the problem is in the following lines:
dealii/include/deal.II/matrix_free/mapping_info.templates.h Lines 3036 to 3046 in a56c2ee
The reduced mapping of the neighbors is set up based on the current cell. In the test, we have a hyperball (for some reason I had disabled the manifolds in the test), so we have an affine cell in the center while all the other cells are general. When the manifolds are enabled, so that the inner cells are not affine, the results are identical independently of the vectorization. Do you have an idea how to address this? I would say that we need
I had a look at the failing test multigrid-global-coarsening/interpolate_01.mpirun=1.release, which only fails in release mode and only recently, at some point during the last week. I see with valgrind:
While I can't see line numbers (and I can't run valgrind on the debug version right now; it seems our library is too big for valgrind), I believe the most likely cause is some work related to the consensus algorithms or, as a second possibility, my work on the number cache or DoF indices inside the triangulation. I can try to investigate in more detail later, but I just wanted to hear whether anyone sees it immediately.
For the
...and the following backtrace
This looks like a bug in hypre. We should come up with a minimal testcase and report it to them.
Suggestions on how I should fix #13638?
On my jenkins we are down to 3 failing tests:
See https://jenkins.tjhei.info/blue/organizations/jenkins/dealii/detail/PR-13458/10/pipeline/
(Matthias) Failing tests with RC1: