
Storage crash when importing data #3625

Closed · Nicole00 opened this issue Jan 4, 2022 · 2 comments · Fixed by #3553
Labels: type/bug (Type: something is unexpected)

Nicole00 (Contributor) commented Jan 4, 2022

Describe the bug (required)

When importing data into the newly installed nightly build (2022.01.03), the storaged process crashed with a segmentation fault.

Your Environments (required)
Nebula: nebula-graph-2022.01.03-nightly.el7.x86_64.rpm
env: 1 metad, 1 graphd, 1 storaged

How To Reproduce (required)

  1. `CREATE SPACE test (vid_type = FIXED_STRING(20));`
  2. `USE test;`
  3. `CREATE EDGE IF NOT EXISTS HAS_MEMBER(joinDate string);`
  4. Import the LDBC `forum_hasMember_person` data set into Nebula.

Additional context

Core dump:

[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
Core was generated by `/home/darion/nicole/nebula_nightly_0103/bin/nebula-storaged --flagfile /home/da'.
Program terminated with signal 11, Segmentation fault.
#0  0x000000000149424a in folly::SingletonThreadLocal<nebula::meta::MetaClient::ThreadLocalInfo, folly::detail::DefaultTag, folly::detail::DefaultMake<nebula::meta::MetaClient::ThreadLocalInfo>, void>::getWrapper() ()
Missing separate debuginfos, use: debuginfo-install glibc-2.17-325.el7_9.x86_64
(gdb) bt
#0  0x000000000149424a in folly::SingletonThreadLocal<nebula::meta::MetaClient::ThreadLocalInfo, folly::detail::DefaultTag, folly::detail::DefaultMake<nebula::meta::MetaClient::ThreadLocalInfo>, void>::getWrapper() ()
#1  0x00000000014942c4 in folly::SingletonThreadLocal<nebula::meta::MetaClient::ThreadLocalInfo, folly::detail::DefaultTag, folly::detail::DefaultMake<nebula::meta::MetaClient::ThreadLocalInfo>, void>::LocalLifetime::~LocalLifetime() ()
#2  0x00000000029db796 in (anonymous namespace)::run(void*) ()
#3  0x00007fe0f8791ca2 in __nptl_deallocate_tsd () from /lib64/libpthread.so.0
#4  0x00007fe0f8791eb3 in start_thread () from /lib64/libpthread.so.0
#5  0x00007fe0f84bab0d in clone () from /lib64/libc.so.6
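
For context on the backtrace: frames #0 and #1 show `LocalLifetime::~LocalLifetime()` re-entering `SingletonThreadLocal::getWrapper()` while pthread is already tearing down thread-specific data (`__nptl_deallocate_tsd`), which is the classic thread-local destruction-order hazard. Below is a minimal, self-contained C++ sketch of that hazard; the names `Wrapper`, `getWrapper`, and `LocalLifetime` only mirror the symbols in the trace, and the code is illustrative rather than folly's or Nebula's actual implementation.

```cpp
#include <iostream>
#include <thread>

struct Wrapper {
  int value = 42;
  ~Wrapper() { value = -1; }  // clobber on destruction so the bug is visible
};

Wrapper& getWrapper() {
  thread_local Wrapper w;  // destroyed at thread exit
  return w;
}

struct LocalLifetime {
  // Also runs at thread exit. Thread-locals are destroyed in reverse order
  // of construction, so if `w` finished constructing after this guard, it
  // is already dead when this destructor calls getWrapper() -- the same
  // shape as frames #0/#1 above. Touching the dead object is undefined
  // behavior; here it typically prints the clobbered value.
  ~LocalLifetime() {
    std::cout << "guard teardown sees " << getWrapper().value << "\n";
  }
};

int main() {
  std::thread t([] {
    thread_local LocalLifetime guard;  // constructed first
    (void)guard;
    getWrapper();  // Wrapper constructed second, so it is destroyed first
  });
  t.join();
  return 0;
}
```

In this toy version the stale read merely yields a wrong value; in the real process the equivalent access walks freed per-thread storage, hence the SIGSEGV in frame #0.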

nebula-storaged.INFO

I20220104 16:50:27.275475  6645 TransactionManager.cpp:184] leader get do scanPrimes space=1, part=28, term=1
I20220104 16:50:27.275493  6645 TransactionManager.cpp:190] scanPrimes(), spaceId=1, partId=28
I20220104 16:50:27.275508  6645 TransactionManager.cpp:241] insert space=1, part=28, into white list suc=true
I20220104 16:50:27.299592  6645 RaftPart.cpp:1013] [Port: 1780, Space: 1, Part: 33] Start leader election, reason: lastMsgDur 995, term 0
I20220104 16:50:27.299628  6645 RaftPart.cpp:1179] [Port: 1780, Space: 1, Part: 33] Sending out an election request (space = 1, part = 33, term = 1, lastLogId = 0, lastLogTerm = 0, candidateIP = 192.168.8.171, candidatePort = 1780), isPreVote = 1
I20220104 16:50:27.299633  6645 RaftPart.cpp:1134] [Port: 1780, Space: 1, Part: 33] Partition win prevote of term 1
I20220104 16:50:27.299638  6645 RaftPart.cpp:1179] [Port: 1780, Space: 1, Part: 33] Sending out an election request (space = 1, part = 33, term = 1, lastLogId = 0, lastLogTerm = 0, candidateIP = 192.168.8.171, candidatePort = 1780), isPreVote = 0
I20220104 16:50:27.299641  6645 RaftPart.cpp:1136] [Port: 1780, Space: 1, Part: 33] Partition is elected as the new leader for term 1
I20220104 16:50:27.300009  6648 TransactionManager.cpp:184] leader get do scanPrimes space=1, part=33, term=1
I20220104 16:50:27.300029  6648 TransactionManager.cpp:190] scanPrimes(), spaceId=1, partId=33
I20220104 16:50:27.300045  6648 TransactionManager.cpp:241] insert space=1, part=33, into white list suc=true
I20220104 16:50:36.321823  6644 MetaClient.cpp:3202] Load leader of "192.168.8.171":1779 in 1 space
I20220104 16:50:36.321857  6644 MetaClient.cpp:3208] Load leader ok
I20220104 16:50:56.363323  6644 MetaClient.cpp:3202] Load leader of "192.168.8.171":1779 in 1 space
I20220104 16:50:56.363356  6644 MetaClient.cpp:3208] Load leader ok
I20220104 16:54:13.985711  6662 EventListener.h:18] Rocksdb start compaction column family: default because of LevelL0FilesNum, status: OK, compacted 4 files into 0, base level is 0, output level is 1
I20220104 16:54:13.985965  6662 CompactionFilter.h:54] Do full/manual compaction!
I20220104 16:54:16.246752  6662 EventListener.h:28] Rocksdb compaction completed column family: default because of LevelL0FilesNum, status: OK, compacted 4 files into 2, base level is 0, output level is 1
I20220104 16:54:16.300030  6663 EventListener.h:18] Rocksdb start compaction column family: default because of LevelL0FilesNum, status: OK, compacted 6 files into 0, base level is 0, output level is 1
I20220104 16:54:16.300343  6663 CompactionFilter.h:54] Do full/manual compaction!
I20220104 16:54:16.300405  7646 CompactionFilter.h:54] Do full/manual compaction!
I20220104 16:54:17.670609  6035 SlowOpTracker.h:31] [Port: 1780, Space: 1, Part: 31] total time:463ms, Total commit: 1
I20220104 16:54:17.670670  6028 SlowOpTracker.h:31] [Port: 1780, Space: 1, Part: 47] total time:463ms, Total commit: 1
I20220104 16:54:17.671099  6034 SlowOpTracker.h:31] [Port: 1780, Space: 1, Part: 84] total time:463ms, Total commit: 2
Nicole00 added the type/bug (Type: something is unexpected) label on Jan 4, 2022

Nicole00 commented Jan 4, 2022

It also happened on my 2021.12.30 nightly build (nebula-graph-2021.12.30-nightly.el7.x86_64.rpm).

Duplicate of issue #3623.

cangfengzhs (Contributor) commented

The core dump stack is the same as the one in #3553. I have created a pull request to fix it, but ran into some merge conflicts. I will merge it soon.
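
One common way to sidestep this class of crash is to make the thread-exit destructor self-contained: capture whatever it needs while the other thread-local is still guaranteed alive, instead of re-entering its accessor during teardown. The sketch below is a hypothetical illustration of that pattern, not a description of what the linked pull request actually changes.

```cpp
#include <iostream>
#include <thread>

struct Wrapper {
  int value = 42;
};

Wrapper& getWrapper() {
  thread_local Wrapper w;
  return w;
}

struct LocalLifetime {
  int cached;  // state copied while the Wrapper is guaranteed alive
  // getWrapper() runs inside this constructor, so `w` finishes
  // constructing before the guard does and is destroyed after it.
  LocalLifetime() : cached(getWrapper().value) {}
  // No re-entry into getWrapper() at thread exit, so no ordering hazard.
  ~LocalLifetime() { std::cout << "guard teardown sees " << cached << "\n"; }
};

int main() {
  std::thread t([] {
    thread_local LocalLifetime guard;
    (void)guard;
  });
  t.join();
  return 0;
}
```

Other standard mitigations include leaking the per-thread state outright or pinning the construction order explicitly; which approach the fix takes is not stated in this thread.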
