Magma dense mat #1
… Cholesky factorization.
…for dense matrices.
I commented on the original patch submission and was expecting this PR to be updated, but I see it hasn't been. http://mail-archive.com/search?l=mid&q=87ob6zlap8.fsf@mcs.anl.gov Please address those concerns: put the code in a "magma" subdirectory, add your test as an automated test, and fix the commit messages.
Hi Jed, Sorry, I didn't get your earlier message! I will start working on it. Harshad
Closing this obsolete PR. Pull requests are now handled on Bitbucket.
…tlas blas - so workaround with --download-openblas

0a1,22
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR:
> [0]PETSC ERROR: level: -1, err: 2.90982e-07
>
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.7.4-2087-g11119d7 GIT Date: 2016-11-30 14:06:34 -0600
> [0]PETSC ERROR: ./ex76 on a arch-cuda-single named es by petsc Wed Nov 30 15:56:26 2016
> [0]PETSC ERROR: Configure options --with-cuda=1 --with-cusp=1 -with-cusp-dir=/home/balay/soft/cusplibrary-0.4.0 --with-thrust=1 --with-precision=single --with-clanguage=c --with-cuda-arch=sm_20 --with-no-output -PETSC_ARCH=arch-cuda-single -PETSC_DIR=/sandbox/petsc/petsc.clone-2
> [0]PETSC ERROR: #1 main() line 294 in /sandbox/petsc/petsc.clone-2/src/mat/examples/tests/ex76.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -bs 8
> [0]PETSC ERROR: -display 140.221.10.20:0.0
> [0]PETSC ERROR: -malloc_dump
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 83) - process 0
>
> ===================================================================================
> = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> = EXIT CODE: 83
> = CLEANING UP REMAINING PROCESSES
> = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> ===================================================================================
/sandbox/petsc/petsc.clone-2/src/mat/examples/tests
Possible problem with ex76, diffs above
=========================================
…string and then exits the program.

This was introduced because Ed Bueler observed that the standard way to get this information,

$ ./ex2 -help | head

produces a confusing error message for users:

Newton method to solve u'' + u^{2} = f, sequentially. This example employs a user-defined monitoring routine.
--------------------------------------------------------------------------
Petsc Development GIT revision: v3.7.6-3506-g5282eff101 GIT Date: 2017-04-28 13:22:25 -0500
The PETSc Team
petsc-maint@mcs.anl.gov
http://www.mcs.anl.gov/petsc/
See docs/changes/index.html for recent updates.
See docs/faq.html for problems.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 13 Broken Pipe: Likely while reading or writing to a socket
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: or try option -log_stack
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.7.6-3506-g5282eff101 GIT Date: 2017-04-28 13:22:25 -0500
[0]PETSC ERROR: ./ex2 on a arch-basic named anlextwls003-074.wl.anl-external.org by barrysmith Fri Apr 28 14:09:32 2017
[0]PETSC ERROR: Configure options
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0

Commit-type: feature
Time: .2 hours
Thanks-to: Ed Bueler <elbueler@alaska.edu>
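The confusing message comes from `head` closing its end of the pipe: once the reader exits, further writes fail with EPIPE, which PETSc's signal handler reports as "signal number 13 Broken Pipe". A minimal sketch of that mechanism, independent of PETSc:

```python
import os

def write_to_closed_pipe() -> bool:
    """Return True if writing to a pipe whose reader has gone away fails.

    This mimics `./ex2 -help | head`: head exits after a few lines, and
    the program's later writes to stdout hit a closed pipe. Python
    surfaces this as BrokenPipeError (EPIPE); a C program with the
    default signal disposition gets SIGPIPE instead.
    """
    r, w = os.pipe()
    os.close(r)  # the reader (like `head`) has exited
    try:
        os.write(w, b"help text\n")
        return False
    except BrokenPipeError:
        return True
    finally:
        os.close(w)
```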
balay@ps4 ~/petsc/src/sys/examples/tests
$ ./ex1f
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Error message
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1788-g3483735f6a GIT Date: 2018-02-25 19:19:36 -0600
[0]PETSC ERROR: ./ex1f on a arch-mswin-c-opt named PS4. by balay Sun Feb 25 21:32:54 2018
[0]PETSC ERROR: Configure options --with-shared-libraries=0 --with-debugging=0 --with-visibility=0 --with-mpi=0
[0]PETSC ERROR: #1 User provided function() line 0 in User file
My error handler Error message
[0]PETSC ERROR: User provided function() line 0 in User file Error message
Program received signal SIGABRT: Process abort signal.
Backtrace for this error:
<hang>
[0]PETSC ERROR: #1 User provided function() line 0 in User file
Operating system error: Cannot allocate memory
Memory allocation failure in xrealloc
Since the trigger is the flipping of the imax,imin vars, try fixing it by applying this change to the whole function.

not ok snes_tutorials-ex77_2_par
# [2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
# [2]PETSC ERROR: Null argument, when expecting valid pointer
# [2]PETSC ERROR: Null Object: Parameter # 2
# [2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
# [2]PETSC ERROR: Petsc Development GIT revision: v3.9.3-847-gbe59aeb GIT Date: 2018-07-06 13:22:44 -0500
# [2]PETSC ERROR: ../ex77 on a arch-c-exodus-dbg-builder named frog by petsc Sat Jul 7 05:30:50 2018
# [2]PETSC ERROR: Configure options --download-suitesparse --download-mumps --download-scalapack --download-chaco --download-ctetgen --download-exodusii --download-cmake --download-pnetcdf --download-generator --download-hdf5 --download-zlib=1 --download-metis --download-ml --download-netcdf --download-parmetis --download-triangle --with-cuda --with-shared-libraries PETSC_ARCH=arch-c-exodus-dbg-builder PETSC_DIR=/sandbox/petsc/petsc.next-3
# [2]PETSC ERROR: #1 DMFieldGetFEInvariance() line 383 in /sandbox/petsc/petsc.next-3/src/dm/field/interface/dmfield.c
# [2]PETSC ERROR: #2 DMFieldComputeFaceData_DS() line 751 in /sandbox/petsc/petsc.next-3/src/dm/field/impls/ds/dmfieldds.c
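The referenced patch is not shown here, but the flipped-extrema bug class the message describes is easy to illustrate (the function names below are hypothetical, not the actual PETSc code):

```python
def minmax_flipped(xs):
    # BUG: the initial values are swapped. imin starts at -inf and
    # imax at +inf, so min()/max() never pick up any element and the
    # loop returns the sentinels unchanged.
    imin, imax = float("-inf"), float("inf")
    for x in xs:
        imin = min(imin, x)
        imax = max(imax, x)
    return imin, imax

def minmax_fixed(xs):
    # Correct initialization: start imin high and imax low so the
    # first element replaces both sentinels.
    imin, imax = float("inf"), float("-inf")
    for x in xs:
        imin = min(imin, x)
        imax = max(imax, x)
    return imin, imax
```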
…nse_MPIDense()

arch-linux-uni: not ok mat_tests-ex94_matmatmult_2
# [0]PETSC ERROR: #1 MatTransposeMatMultNumeric_MPIDense_MPIDense() line 1966 in /sandbox/petsc/petsc.next-2/src/mat/impls/dense/mpi/mpidense.c
# [0]PETSC ERROR: #2 MatTransposeMatMult_MPIDense_MPIDense() line 2017 in /sandbox/petsc/petsc.next-2/src/mat/impls/dense/mpi/mpidense.c
# [0]PETSC ERROR: #3 MatTransposeMatMult() line 10114 in /sandbox/petsc/petsc.next-2/src/mat/interface/matrix.c
# [0]PETSC ERROR: #4 MatPtAP_fallback() line 9393 in /sandbox/petsc/petsc.next-2/src/mat/interface/matrix.c
# [0]PETSC ERROR: #5 MatPtAP() line 9485 in /sandbox/petsc/petsc.next-2/src/mat/interface/matrix.c
# [0]PETSC ERROR: #6 main() line 290 in /sandbox/petsc/petsc.next-2/src/mat/examples/tests/ex94.c
arch-freebsd-cxx-cmplx-pkgs-dbg
not ok dm_impls_forest_tests-ex2_p4est_2d_deg3_steps3_L2_periodic
# [0]PETSC ERROR: #1 DMPforestGetTransferSF_Point() line 2536 in /usr/home/balay/petsc.next/src/dm/impls/forest/p4est/pforest.c
* jolivet/redundant-Cholesky-sbaij: PCCHOLESKY in PCREDUNDANT with SBAIJ matrices.

PCREDUNDANT does not currently play nicely with SBAIJ matrices out-of-the-box, and you end up with something like the following.

[0]PETSC ERROR: #1 MatGetFactor() line 4411 in src/mat/interface/matrix.c
[0]PETSC ERROR: #2 PCSetUp_LU() line 88 in src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: #3 PCSetUp() line 894 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #4 KSPSetUp() line 377 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #5 PCSetUp_Redundant() line 179 in src/ksp/pc/impls/redundant/redundant.c

This is an attempt to fix this by switching to PCCHOLESKY instead of PCLU for symmetric matrices.
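The shape of the fix can be sketched as a dispatch on the matrix type (illustrative Python, not the actual PETSc C change, which lives inside PCREDUNDANT's setup; the function name is hypothetical):

```python
def pick_sub_pc(mat_type: str) -> str:
    """Choose the inner factorization for the redundant sub-solve.

    SBAIJ stores only one triangle of a symmetric matrix, so asking
    MatGetFactor() for an LU factorization fails as in the trace above;
    Cholesky is the factorization actually available for these types.
    """
    symmetric_types = {"sbaij", "seqsbaij", "mpisbaij"}
    return "cholesky" if mat_type in symmetric_types else "lu"
```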
Tool for querying the tests.

Which tests to query? Two options:
1. Query only the tests that are run for a given configuration.
2. Query all of the test files in the source directory.

For #1: Use dataDict as written out by gmakegentest.py in $PETSC_ARCH/$TESTBASE.
For #2: Walk the entire tree, parsing the files as we go along using testparse. The tree walker is simpler than what is in gmakegentest.py because it can avoid all of the logic regarding configuration.

See the documentation at the top of query_tests.py for further details.

The dataDict is further transformed to allow fast searching. By default, the query_tests.py script outputs the info in a form that can be used by gmakefile.test to enable this functionality:

make -f ${makefile} test query='requires' queryval='*MPI_PROCESS_SHARED_MEMORY*'

Richer querying can be done from within ipython; this is documented in the developer's documentation.
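The "transformed to allow fast searching" step can be sketched as inverting the per-test dictionary into a field/value-to-tests index, so a glob query never needs a full scan. This is an illustrative reconstruction, not the actual structure built by query_tests.py:

```python
import fnmatch

def build_index(data_dict):
    """Invert {test_name: {field: space-separated values}} into
    {field: {value: [test_names]}} for fast lookups."""
    index = {}
    for test, fields in data_dict.items():
        for field, value in fields.items():
            for token in value.split():
                index.setdefault(field, {}).setdefault(token, []).append(test)
    return index

def query(index, field, pattern):
    """Glob-match indexed values, mirroring the makefile usage
    query='requires' queryval='*MPI_PROCESS_SHARED_MEMORY*'."""
    hits = []
    for value, tests in index.get(field, {}).items():
        if fnmatch.fnmatch(value, pattern):
            hits.extend(tests)
    return sorted(set(hits))
```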
not ok vec_is_tests-ex2_mpiio_3 # Error code: 246
# [0]PETSC ERROR: Invalid argument
# [0]PETSC ERROR: Not an IS next in file
# [0]PETSC ERROR: #1 ISLoad_Binary() line 53 in C:\cygwin64\home\petsc\builds\LjS-jb-N\0\petsc\petsc\src\vec\is\utils\isio.c
…sed with a different type of pmat compared to pmat[i]. Always destroy the previously created pmat[i] and use MAT_INITIAL_MATRIX.

issue #788

[0]PETSC ERROR: #1 PetscMemcmp() line 40 in /Users/vtdb72/Documents/work/src/deps/petsc/src/sys/utils/memc.c
[0]PETSC ERROR: #2 MatCreateSubMatrix_SeqAIJ() line 2635 in /Users/vtdb72/Documents/work/src/deps/petsc/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: #3 MatCreateSubMatrices_SeqAIJ() line 2881 in /Users/vtdb72/Documents/work/src/deps/petsc/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: #4 MatCreateSubMatrices() line 6857 in /Users/vtdb72/Documents/work/src/deps/petsc/src/mat/interface/matrix.c
...

Commit-type: bug-fix
/spend 45m
Reported-by: Lawrence Mitchell https://gitlab.com/wence
…gnedAddress error with cuda-12 on frog

The error is not reproducible on frog with cuda-11.7/11.8, nor on other machines with cuda-12.0/12.1. On frog, we get the following error trace:

[balay@frog tests]$ /software/mpich-4.1.1/bin/mpiexec -n 2 ./ex69 -A_mat_type aijcusparse -test 0 -k 6 -l 0 -use_shell 1 -use_gpu_aware_mpi 0
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: GPU error
[0]PETSC ERROR: cuda error 716 (cudaErrorMisalignedAddress) : misaligned address
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.5-1103-gedeb0b75eb7 GIT Date: 2023-03-24 13:29:04 -0500
[0]PETSC ERROR: ./ex69 on a arch-cuda121 named frog.mcs.anl.gov by balay Sat Mar 25 13:18:58 2023
[0]PETSC ERROR: Configure options --with-mpi-dir=/software/mpich-4.1.1 --with-cuda-dir=/usr/local/cuda-12.1 PETSC_ARCH=arch-cuda121
[0]PETSC ERROR: #1 PetscSFLinkMemcpy_CUDA() at /home/balay/petsc/src/vec/is/sf/impls/basic/cuda/sfcuda.cu:1075
[0]PETSC ERROR: #2 PetscSFLinkCopyRootBufferInCaseNotUseGpuAwareMPI() at /home/balay/petsc/include/../src/vec/is/sf/impls/basic/sfpack.h:297
[0]PETSC ERROR: #3 PetscSFLinkStartRequests_MPI() at /home/balay/petsc/src/vec/is/sf/impls/basic/sfmpi.c:32
[0]PETSC ERROR: #4 PetscSFLinkStartCommunication() at /home/balay/petsc/include/../src/vec/is/sf/impls/basic/sfpack.h:270
[0]PETSC ERROR: #5 PetscSFBcastBegin_Basic() at /home/balay/petsc/src/vec/is/sf/impls/basic/sfbasic.c:191
[0]PETSC ERROR: #6 PetscSFBcastWithMemTypeBegin() at /home/balay/petsc/src/vec/is/sf/interface/sf.c:1454
[0]PETSC ERROR: #7 VecScatterBegin_Internal() at /home/balay/petsc/src/vec/is/sf/interface/vscat.c:73
[0]PETSC ERROR: #8 VecScatterBegin() at /home/balay/petsc/src/vec/is/sf/interface/vscat.c:1319
[0]PETSC ERROR: #9 MatMult_MPIAIJCUSPARSE() at /home/balay/petsc/src/mat/impls/aij/mpi/mpicusparse/mpiaijcusparse.cu:451
[0]PETSC ERROR: #10 MatMult() at /home/balay/petsc/src/mat/interface/matrix.c:2573
[0]PETSC ERROR: #11 MatMult_S() at ex69.c:11
[0]PETSC ERROR: #12 MatMult_Shell() at /home/balay/petsc/src/mat/impls/shell/shell.c:1014
[0]PETSC ERROR: #13 MatMult() at /home/balay/petsc/src/mat/interface/matrix.c:2573
[0]PETSC ERROR: #14 MatProductNumeric_X_Dense() at /home/balay/petsc/src/mat/interface/matproduct.c:329
[0]PETSC ERROR: #15 MatProductNumeric() at /home/balay/petsc/src/mat/interface/matproduct.c:685
[0]PETSC ERROR: #16 main() at ex69.c:127
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -A_mat_type aijcusparse (source: command line)
[0]PETSC ERROR: -k 6 (source: command line)
[0]PETSC ERROR: -l 0 (source: command line)
[0]PETSC ERROR: -test 0 (source: command line)
[0]PETSC ERROR: -use_gpu_aware_mpi 0 (source: command line)
[0]PETSC ERROR: -use_shell 1 (source: command line)
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_SELF, 97) - process 0

[balay@frog tests]$ /software/mpich-4.1.1/bin/mpiexec -n 2 compute-sanitizer ./ex69 -A_mat_type aijcusparse -test 0 -k 6 -l 0 -use_shell 1 -use_gpu_aware_mpi 0
========= COMPUTE-SANITIZER
========= COMPUTE-SANITIZER
========= Invalid __global__ write of size 16 bytes
=========     at 0x470 in void cusparse::csrmv_v3_partition_kernel<(int)256, cusparse::VectorScalarMultiplyPolicy, int, int, double, double, double>(const T4 *, T3, T4, int, T3 *, cusparse::KernelCoeff<T7>, T3, T6 *)
=========     by thread (0,0,0) in block (0,0,0)
=========     Address 0x7f0dee800678 is misaligned
=========     and is inside the nearest allocation at 0x7f0dee800600 of size 720 bytes
=========     Saved host backtrace up to driver entry point at kernel launch time
=========     Host Frame: [0x30b492]
=========                in /lib64/libcuda.so.1
=========     Host Frame: [0x8ba1eb]
=========                in /usr/local/cuda-12.1/lib64/libcusparse.so.12
=========     Host Frame: [0x9161eb]
=========                in /usr/local/cuda-12.1/lib64/libcusparse.so.12
=========     Host Frame: [0x193817]
=========                in /usr/local/cuda-12.1/lib64/libcusparse.so.12
=========     Host Frame: [0x19c258]
=========                in /usr/local/cuda-12.1/lib64/libcusparse.so.12
=========     Host Frame: [0x1c2fc5]
=========                in /usr/local/cuda-12.1/lib64/libcusparse.so.12
=========     Host Frame:cusparseSpMV [0xeb15d]
=========                in /usr/local/cuda-12.1/lib64/libcusparse.so.12
=========     Host Frame:/home/balay/petsc/src/mat/impls/aij/seq/seqcusparse/aijcusparse.cu:3627:MatMultAddKernel_SeqAIJCUSPARSE(_p_Mat*, _p_Vec*, _p_Vec*, _p_Vec*, PetscBool, PetscBool) [0x1e8917b]
=========                in /home/balay/petsc/arch-cuda121/lib/libpetsc.so.3.018
=========     Host Frame:/home/balay/petsc/src/mat/impls/aij/seq/seqcusparse/aijcusparse.cu:3483:MatMult_SeqAIJCUSPARSE(_p_Mat*, _p_Vec*, _p_Vec*) [0x1e85e9c]
=========                in /home/balay/petsc/arch-cuda121/lib/libpetsc.so.3.018
=========     Host Frame:/home/balay/petsc/src/mat/impls/aij/mpi/mpicusparse/mpiaijcusparse.cu:452:MatMult_MPIAIJCUSPARSE(_p_Mat*, _p_Vec*, _p_Vec*) [0x1e088c7]
=========                in /home/balay/petsc/arch-cuda121/lib/libpetsc.so.3.018
=========     Host Frame:/home/balay/petsc/src/mat/interface/matrix.c:2573:MatMult [0x1cdaf43]
=========                in /home/balay/petsc/arch-cuda121/lib/libpetsc.so.3.018
=========     Host Frame:/home/balay/petsc/src/mat/tests/ex69.c:11:MatMult_S [0x26fb]
=========                in /home/balay/petsc/src/mat/tests/./ex69
=========     Host Frame:/home/balay/petsc/src/mat/impls/shell/shell.c:1014:MatMult_Shell [0x1b354b7]
=========                in /home/balay/petsc/arch-cuda121/lib/libpetsc.so.3.018
=========     Host Frame:/home/balay/petsc/src/mat/interface/matrix.c:2573:MatMult [0x1cdaf43]
=========                in /home/balay/petsc/arch-cuda121/lib/libpetsc.so.3.018
=========     Host Frame:/home/balay/petsc/src/mat/interface/matproduct.c:329:MatProductNumeric_X_Dense [0x1c87033]
=========                in /home/balay/petsc/arch-cuda121/lib/libpetsc.so.3.018
=========     Host Frame:/home/balay/petsc/src/mat/interface/matproduct.c:685:MatProductNumeric [0x1c8f75f]
=========                in /home/balay/petsc/arch-cuda121/lib/libpetsc.so.3.018
=========     Host Frame:/home/balay/petsc/src/mat/tests/ex69.c:127:main [0x5d7e]
=========                in /home/balay/petsc/src/mat/tests/./ex69
=========     Host Frame:__libc_start_call_main [0x3feb0]
=========                in /lib64/libc.so.6
=========     Host Frame:__libc_start_main [0x3ff60]
=========                in /lib64/libc.so.6
=========     Host Frame:_start [0x2395]
=========                in /home/balay/petsc/src/mat/tests/./ex69
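The sanitizer output pins down the fault: the kernel issues a 16-byte vector write, which requires a 16-byte-aligned address, but the reported address is only 8-byte aligned. Checking the numbers from the report:

```python
# Values taken directly from the compute-sanitizer report above.
addr = 0x7F0DEE800678   # faulting address
alloc = 0x7F0DEE800600  # start of the nearest (720-byte) allocation

offset = addr - alloc   # how far into the allocation the write landed
misalignment = addr % 16  # a 16-byte vector access needs addr % 16 == 0

print(hex(offset), misalignment)  # 0x78 8 -> only 8-byte aligned
```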
…ug for unknown reasons.

Even though some cuSPARSE APIs were introduced in cuda-11.3.0, we now use them only from cuda-11.4.0, which was released two months after 11.3.0. We hit an error (see below) with cuda-11.3.x, but not with cuda versions lower or higher than 11.3. This might be a PETSc bug: with the ifdefs, the aijcusparse code takes different paths, making the issue difficult to investigate, so we simply raise the required cuda version to get around it.

-------
snes_tests-ex13_cuda
mpirun -n 4 ./ex13 -petsc_ci -dm_plex_dim 2 -benchmark_it 10 -dm_plex_box_faces 4,4 -dm_refine 2 -petscpartitioner_simple_process_grid 2,2 -petscpartitioner_simple_node_grid 1,1 -potential_petscspace_degree 2 -petscpartitioner_type simple -dm_plex_simplex 0 -snes_type ksponly -dm_view -ksp_type cg -pc_type gamg -pc_gamg_process_eq_limit 400 -ksp_norm_type unpreconditioned -ksp_converged_reason -dm_mat_type aijcusparse -dm_vec_type cuda
[3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3]PETSC ERROR: GPU error
[3]PETSC ERROR: cuSPARSE errorcode 7 (CUSPARSE_STATUS_INTERNAL_ERROR) : internal error
[3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[3]PETSC ERROR: Petsc Development GIT revision: v3.18.5-1152-g648263c9 GIT Date: 2023-03-26 20:01:43 +0000
[3]PETSC ERROR: ./ex13 on a arch-kokkos-dbg named hong-gce-workstation by jczhang Mon Mar 27 15:55:44 2023
[3]PETSC ERROR: Configure options --PETSC_ARCH=arch-kokkos-dbg --with-debugging=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --COPTFLAGS="-g -O0" --FOPTFLAGS="-g -O0" --CXXOPTFLAGS="-g -O0" --with-cuda --with-cudac=nvcc --with-strict-petscerrorcode
[3]PETSC ERROR: #1 MatProductNumeric_SeqAIJCUSPARSE_SeqAIJCUSPARSE() at /scratch/jczhang/petsc/src/mat/impls/aij/seq/seqcusparse/aijcusparse.cu:2968
[3]PETSC ERROR: #2 MatProductNumeric_ABC_Basic() at /scratch/jczhang/petsc/src/mat/interface/matproduct.c:1129
[3]PETSC ERROR: #3 MatProductNumeric_MPIAIJBACKEND() at /scratch/jczhang/petsc/src/mat/impls/aij/mpi/mpiaij.c:7024
[3]PETSC ERROR: #4 MatProductNumeric() at /scratch/jczhang/petsc/src/mat/interface/matproduct.c:685
[3]PETSC ERROR: #5 MatPtAP() at /scratch/jczhang/petsc/src/mat/interface/matrix.c:9907
[3]PETSC ERROR: #6 PCSetUp_GAMG() at /scratch/jczhang/petsc/src/ksp/pc/impls/gamg/gamg.c:558
[3]PETSC ERROR: #7 PCSetUp() at /scratch/jczhang/petsc/src/ksp/pc/interface/precon.c:994
[3]PETSC ERROR: #8 KSPSetUp() at /scratch/jczhang/petsc/src/ksp/ksp/interface/itfunc.c:405
[3]PETSC ERROR: #9 KSPSolve_Private() at /scratch/jczhang/petsc/src/ksp/ksp/interface/itfunc.c:823
[3]PETSC ERROR: #10 KSPSolve() at /scratch/jczhang/petsc/src/ksp/ksp/interface/itfunc.c:1069
[3]PETSC ERROR: #11 SNESSolve_KSPONLY() at /scratch/jczhang/petsc/src/snes/impls/ksponly/ksponly.c:48
[3]PETSC ERROR: #12 SNESSolve() at /scratch/jczhang/petsc/src/snes/interface/snes.c:4666
[3]PETSC ERROR: #13 main() at ex13.c:193
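The change amounts to raising the version threshold behind the ifdef. A sketch of that gating logic in Python (the real code is a preprocessor check on CUDA_VERSION in the aijcusparse sources; the function name here is illustrative):

```python
def use_new_cusparse_api(cuda_version: tuple) -> bool:
    """Gate the newer cuSPARSE entry points.

    They appeared in cuda-11.3.0, but 11.3.x triggers
    CUSPARSE_STATUS_INTERNAL_ERROR in this code path, so only enable
    them from cuda-11.4.0 onward.
    """
    return cuda_version >= (11, 4, 0)
```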
Close #1281

[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Depth 2 > maxdepth+1 1
[0]PETSC ERROR: #1 PetscLogNestedTreeCreate() at src/sys/logging/xmllogevent.c
Reported-by: Chris Douglas <douglas@ladhyx.polytechnique.fr>

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Mat object's type is not set: Argument # 1
[0]PETSC ERROR: #1 MatGetFactorAvailable() at src/mat/interface/matrix.c:4820
[0]PETSC ERROR: #2 PCGetDefaultType_Private() at src/ksp/pc/interface/precon.c:25
[0]PETSC ERROR: #3 PCSetUp() at src/ksp/pc/interface/precon.c:1109
[0]PETSC ERROR: #4 KSPSetUp() at src/ksp/ksp/interface/itfunc.c:406
[0]PETSC ERROR: #5 KSPSolve_Private() at src/ksp/ksp/interface/itfunc.c:826
[0]PETSC ERROR: #6 KSPSolveTranspose() at src/ksp/ksp/interface/itfunc.c:1122
[0]PETSC ERROR: #7 PCApplyTranspose_FieldSplit() at src/ksp/pc/impls/fieldsplit/fieldsplit.c:1487
[0]PETSC ERROR: #8 PCApplyTranspose() at src/ksp/pc/interface/precon.c:715
[0]PETSC ERROR: #9 KSP_PCApply() at include/petsc/private/kspimpl.h:380
[0]PETSC ERROR: #10 KSPInitialResidual() at src/ksp/ksp/interface/itres.c:63
[0]PETSC ERROR: #11 KSPSolve_GMRES() at src/ksp/ksp/impls/gmres/gmres.c:226
[0]PETSC ERROR: #12 KSPSolve_Private() at src/ksp/ksp/interface/itfunc.c:900
[0]PETSC ERROR: #13 KSPSolveTranspose() at src/ksp/ksp/interface/itfunc.c:1122
[0]PETSC ERROR: #14 main() at ex9.c:80
[0]PETSC ERROR: Reached the main program with an out-of-range error code 1. This should never happen
Pull request for MAGMA dense mat solver changes.