
[YSQL] GetPGVariable core dump while running benchbase voter on 2024.1 build #21947

Closed

qvad opened this issue Apr 12, 2024 · 1 comment
Assignees
Labels
2024.1_blocker area/ysql Yugabyte SQL (YSQL) kind/bug This issue is a bug priority/medium Medium priority issue

Comments

qvad (Contributor) commented Apr 12, 2024

Jira Link: DB-10864

Description

Attempting to run the benchbase voter benchmark crashes the PostgreSQL backend with a core dump.

Launch command

java -Xmx8g -jar benchbase.jar -b voter -c config/yugabyte/sample_voter_config.xml \
  --create=true --load=true --execute=true -im 5000 2>&1

Core dump:

(lldb) target create "/home/yugabyte/yb-software/yugabyte-2024.1.0.0-b66-centos-x86_64/postgres/bin/postgres" --core "/home/yugabyte/cores/core_42540_1712917276_!home!yugabyte!yb-software!yugabyte-2024.1.0.0-b66-centos-x86_64!postgres!bin!postgres"
Core file '/home/yugabyte/cores/core_42540_1712917276_!home!yugabyte!yb-software!yugabyte-2024.1.0.0-b66-centos-x86_64!postgres!bin!postgres' (x86_64) was loaded.
(lldb) bt all
error: postgres GetDIE for DIE 0x34 is outside of its CU 0x8c4d04
error: postgres GetDIE for DIE 0x34 is outside of its CU 0x8c4d04
* thread #1, name = 'postgres', stop reason = signal SIGSEGV
  * frame #0: 0x00007f27214e2e25 libc.so.6`__strlen_avx2 + 21
    frame #1: 0x000055f2b0eb3ef9 postgres`GetPGVariable [inlined] cstring_to_text(s=0x0000000000000000) at varlena.c:151:37
    frame #2: 0x000055f2b0eb3ef1 postgres`GetPGVariable at guc.c:9941:15
    frame #3: 0x000055f2b0eb3c0a postgres`GetPGVariable(name=<unavailable>, dest=<unavailable>) at guc.c:9827:3
    frame #4: 0x000055f2b0ce28b4 postgres`standard_ProcessUtility(pstmt=0x00002736bf9b3988, queryString="SHOW ALL", context=PROCESS_UTILITY_TOPLEVEL, params=0x0000000000000000, queryEnv=0x0000000000000000, dest=0x00002736bfc31b60, completionTag="") at utility.c:719:5
    frame #5: 0x000055f2b0ce522c postgres`YBProcessUtilityDefaultHook(pstmt=0x00002736bf9b3988, queryString="SHOW ALL", context=PROCESS_UTILITY_TOPLEVEL, params=0x0000000000000000, queryEnv=0x0000000000000000, dest=0x00002736bfc31b60, completionTag="") at utility.c:3612:3
    frame #6: 0x00007f271cca66c9 pg_stat_statements.so`pgss_ProcessUtility(pstmt=0x00002736bf9b3988, queryString="SHOW ALL", context=PROCESS_UTILITY_TOPLEVEL, params=0x0000000000000000, queryEnv=0x0000000000000000, dest=0x00002736bfc31b60, completionTag="") at pg_stat_statements.c:1420:5
    frame #7: 0x00007f271cb7d4da yb_pg_metrics.so`ybpgm_ProcessUtility(pstmt=0x00002736bf9b3988, queryString="SHOW ALL", context=PROCESS_UTILITY_TOPLEVEL, params=0x0000000000000000, queryEnv=0x0000000000000000, dest=0x00002736bfc31b60, completionTag="") at yb_pg_metrics.c:772:9
    frame #8: 0x00007f271cb6105e pg_hint_plan.so`pg_hint_plan_ProcessUtility(pstmt=<unavailable>, queryString=<unavailable>, context=<unavailable>, params=<unavailable>, queryEnv=<unavailable>, dest=<unavailable>, completionTag="") at pg_hint_plan.c:3056:3
    frame #9: 0x000055f2b0ebded5 postgres`YBTxnDdlProcessUtility(pstmt=0x00002736bf9b3988, queryString="SHOW ALL", context=PROCESS_UTILITY_TOPLEVEL, params=0x0000000000000000, queryEnv=0x0000000000000000, dest=0x00002736bfc31b60, completionTag="") at pg_yb_utils.c:2140:4
    frame #10: 0x000055f2b0ce1907 postgres`PortalRunUtility [inlined] ProcessUtility(pstmt=0x00002736bf9b3988, queryString=<unavailable>, context=<unavailable>, params=<unavailable>, queryEnv=<unavailable>, dest=<unavailable>, completionTag="") at utility.c:377:3
    frame #11: 0x000055f2b0ce18f2 postgres`PortalRunUtility(portal=0x00002736bf88a118, pstmt=0x00002736bf9b3988, isTopLevel=<unavailable>, setHoldSnapshot=<unavailable>, dest=<unavailable>, completionTag="") at pquery.c:1202:2
    frame #12: 0x000055f2b0ce0f7b postgres`FillPortalStore(portal=0x00002736bf88a118, isTopLevel=<unavailable>) at pquery.c:1062:4
    frame #13: 0x000055f2b0ce0a99 postgres`PortalRun(portal=0x00002736bf88a118, count=9223372036854775807, isTopLevel=<unavailable>, run_once=<unavailable>, dest=0x00002736bfd1c558, altdest=0x00002736bfd1c558, completionTag="") at pquery.c:780:6
    frame #14: 0x000055f2b0cdf2ed postgres`yb_exec_execute_message_impl [inlined] exec_execute_message(portal_name="", max_rows=<unavailable>) at postgres.c:2116:14
    frame #15: 0x000055f2b0cdedab postgres`yb_exec_execute_message_impl(raw_ctx=<unavailable>) at postgres.c:4795:2
    frame #16: 0x000055f2b0cdc37d postgres`yb_exec_query_wrapper_one_attempt(exec_context=0x00002736bfd1c000, restart_data=0x00002736bfd1c528, functor=(postgres`yb_exec_execute_message_impl at postgres.c:4793), functor_context=0x00007ffc9b17b920, attempt=0, retry=0x00007ffc9b17bd20) at postgres.c:4727:3
    frame #17: 0x000055f2b0cd414b postgres`PostgresMain at postgres.c:4751:3
    frame #18: 0x000055f2b0cd4106 postgres`PostgresMain [inlined] yb_exec_execute_message(portal_name="", max_rows=<unavailable>, restart_data=0x00002736bfd1c528, exec_context=0x00002736bfd1c000) at postgres.c:4812:2
    frame #19: 0x000055f2b0cd40f8 postgres`PostgresMain(argc=<unavailable>, argv=<unavailable>, dbname=<unavailable>, username=<unavailable>) at postgres.c:5537:7
    frame #20: 0x000055f2b0c10b50 postgres`BackendRun(port=0x00002736bfc085a0) at postmaster.c:4736:2
    frame #21: 0x000055f2b0c0fd2d postgres`ServerLoop [inlined] BackendStartup(port=0x00002736bfc085a0) at postmaster.c:4400:3
    frame #22: 0x000055f2b0c0fc8e postgres`ServerLoop at postmaster.c:1778:7
    frame #23: 0x000055f2b0c0ae36 postgres`PostmasterMain(argc=25, argv=0x00002736bfd0e1a0) at postmaster.c:1434:11
    frame #24: 0x000055f2b0b09cca postgres`PostgresServerProcessMain(argc=25, argv=0x00002736bfd0e1a0) at main.c:234:3
    frame #25: 0x000055f2b07bc5d2 postgres`main + 34
    frame #26: 0x00007f2721451d85 libc.so.6`__libc_start_main + 229
    frame #27: 0x000055f2b07bc4ee postgres`_start + 46
  thread #2, stop reason = signal 0
    frame #0: 0x00007f27217ea45c libpthread.so.0`pthread_cond_wait@@GLIBC_2.3.2 + 508
    frame #1: 0x00007f271de6f957 libyrpc.so`boost::asio::detail::scheduler::run(boost::system::error_code&) [inlined] void boost::asio::detail::posix_event::wait<boost::asio::detail::conditionally_enabled_mutex::scoped_lock>(this=0x00002736bfddcb70, lock=0x00007f2712168198) at posix_event.hpp:119:7
    frame #2: 0x00007f271de6f93a libyrpc.so`boost::asio::detail::scheduler::run(boost::system::error_code&) [inlined] boost::asio::detail::conditionally_enabled_event::wait(this=0x00002736bfddcb68, lock=0x00007f2712168198) at conditionally_enabled_event.hpp:97:14
    frame #3: 0x00007f271de6f930 libyrpc.so`boost::asio::detail::scheduler::run(boost::system::error_code&) [inlined] boost::asio::detail::scheduler::do_run_one(this=0x00002736bfddcb00, lock=0x00007f2712168198, this_thread=0x00007f27121680c0, ec=0x00007f27121681f0) at scheduler.ipp:501:21
    frame #4: 0x00007f271de6f831 libyrpc.so`boost::asio::detail::scheduler::run(this=0x00002736bfddcb00, ec=0x00007f27121681f0) at scheduler.ipp:210:10
    frame #5: 0x00007f271de6f087 libyrpc.so`yb::rpc::IoThreadPool::Impl::Execute() [inlined] boost::asio::io_context::run(this=<unavailable>, ec=0x00007f27121681f0) at io_context.ipp:71:16
    frame #6: 0x00007f271de6f07a libyrpc.so`yb::rpc::IoThreadPool::Impl::Execute(this=<unavailable>) at io_thread_pool.cc:76:17
    frame #7: 0x00007f271d90b680 libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::__function::__value_func<void ()>::operator(this=0x00002736bfd86720)[abi:v170002]() const at function.h:517:16
    frame #8: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::function<void ()>::operator(this=0x00002736bfd86720)() const at function.h:1168:12
    frame #9: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(arg=0x00002736bfd866c0) at thread.cc:853:3
    frame #10: 0x00007f27217e41ca libpthread.so.0`start_thread + 234
    frame #11: 0x00007f2721450e73 libc.so.6`__clone + 67
  thread #3, stop reason = signal 0
    frame #0: 0x00007f2721546247 libc.so.6`epoll_wait + 87
    frame #1: 0x00007f271de7265f libyrpc.so`boost::asio::detail::epoll_reactor::run(this=0x00002736bfc4c700, usec=<unavailable>, ops=0x00007f2712969120) at epoll_reactor.ipp:501:20
    frame #2: 0x00007f271de6f9d6 libyrpc.so`boost::asio::detail::scheduler::run(boost::system::error_code&) at scheduler.ipp:476:16
    frame #3: 0x00007f271de6f831 libyrpc.so`boost::asio::detail::scheduler::run(this=0x00002736bfddcb00, ec=0x00007f27129691f0) at scheduler.ipp:210:10
    frame #4: 0x00007f271de6f087 libyrpc.so`yb::rpc::IoThreadPool::Impl::Execute() [inlined] boost::asio::io_context::run(this=<unavailable>, ec=0x00007f27129691f0) at io_context.ipp:71:16
    frame #5: 0x00007f271de6f07a libyrpc.so`yb::rpc::IoThreadPool::Impl::Execute(this=<unavailable>) at io_thread_pool.cc:76:17
    frame #6: 0x00007f271d90b680 libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::__function::__value_func<void ()>::operator(this=0x00002736bfd86600)[abi:v170002]() const at function.h:517:16
    frame #7: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::function<void ()>::operator(this=0x00002736bfd86600)() const at function.h:1168:12
    frame #8: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(arg=0x00002736bfd865a0) at thread.cc:853:3
    frame #9: 0x00007f27217e41ca libpthread.so.0`start_thread + 234
    frame #10: 0x00007f2721450e73 libc.so.6`__clone + 67
  thread #4, stop reason = signal 0
    frame #0: 0x00007f27217ea45c libpthread.so.0`pthread_cond_wait@@GLIBC_2.3.2 + 508
    frame #1: 0x00007f271de6f957 libyrpc.so`boost::asio::detail::scheduler::run(boost::system::error_code&) [inlined] void boost::asio::detail::posix_event::wait<boost::asio::detail::conditionally_enabled_mutex::scoped_lock>(this=0x00002736bfddcb70, lock=0x00007f2711967198) at posix_event.hpp:119:7
    frame #2: 0x00007f271de6f93a libyrpc.so`boost::asio::detail::scheduler::run(boost::system::error_code&) [inlined] boost::asio::detail::conditionally_enabled_event::wait(this=0x00002736bfddcb68, lock=0x00007f2711967198) at conditionally_enabled_event.hpp:97:14
    frame #3: 0x00007f271de6f930 libyrpc.so`boost::asio::detail::scheduler::run(boost::system::error_code&) [inlined] boost::asio::detail::scheduler::do_run_one(this=0x00002736bfddcb00, lock=0x00007f2711967198, this_thread=0x00007f27119670c0, ec=0x00007f27119671f0) at scheduler.ipp:501:21
    frame #4: 0x00007f271de6f831 libyrpc.so`boost::asio::detail::scheduler::run(this=0x00002736bfddcb00, ec=0x00007f27119671f0) at scheduler.ipp:210:10
    frame #5: 0x00007f271de6f087 libyrpc.so`yb::rpc::IoThreadPool::Impl::Execute() [inlined] boost::asio::io_context::run(this=<unavailable>, ec=0x00007f27119671f0) at io_context.ipp:71:16
    frame #6: 0x00007f271de6f07a libyrpc.so`yb::rpc::IoThreadPool::Impl::Execute(this=<unavailable>) at io_thread_pool.cc:76:17
    frame #7: 0x00007f271d90b680 libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::__function::__value_func<void ()>::operator(this=0x00002736bfd86840)[abi:v170002]() const at function.h:517:16
    frame #8: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::function<void ()>::operator(this=0x00002736bfd86840)() const at function.h:1168:12
    frame #9: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(arg=0x00002736bfd867e0) at thread.cc:853:3
    frame #10: 0x00007f27217e41ca libpthread.so.0`start_thread + 234
    frame #11: 0x00007f2721450e73 libc.so.6`__clone + 67
  thread #5, stop reason = signal 0
    frame #0: 0x00007f27217ea45c libpthread.so.0`pthread_cond_wait@@GLIBC_2.3.2 + 508
    frame #1: 0x00007f271de6f957 libyrpc.so`boost::asio::detail::scheduler::run(boost::system::error_code&) [inlined] void boost::asio::detail::posix_event::wait<boost::asio::detail::conditionally_enabled_mutex::scoped_lock>(this=0x00002736bfddcb70, lock=0x00007f2711166198) at posix_event.hpp:119:7
    frame #2: 0x00007f271de6f93a libyrpc.so`boost::asio::detail::scheduler::run(boost::system::error_code&) [inlined] boost::asio::detail::conditionally_enabled_event::wait(this=0x00002736bfddcb68, lock=0x00007f2711166198) at conditionally_enabled_event.hpp:97:14
    frame #3: 0x00007f271de6f930 libyrpc.so`boost::asio::detail::scheduler::run(boost::system::error_code&) [inlined] boost::asio::detail::scheduler::do_run_one(this=0x00002736bfddcb00, lock=0x00007f2711166198, this_thread=0x00007f27111660c0, ec=0x00007f27111661f0) at scheduler.ipp:501:21
    frame #4: 0x00007f271de6f831 libyrpc.so`boost::asio::detail::scheduler::run(this=0x00002736bfddcb00, ec=0x00007f27111661f0) at scheduler.ipp:210:10
    frame #5: 0x00007f271de6f087 libyrpc.so`yb::rpc::IoThreadPool::Impl::Execute() [inlined] boost::asio::io_context::run(this=<unavailable>, ec=0x00007f27111661f0) at io_context.ipp:71:16
    frame #6: 0x00007f271de6f07a libyrpc.so`yb::rpc::IoThreadPool::Impl::Execute(this=<unavailable>) at io_thread_pool.cc:76:17
    frame #7: 0x00007f271d90b680 libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::__function::__value_func<void ()>::operator(this=0x00002736bfd864e0)[abi:v170002]() const at function.h:517:16
    frame #8: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::function<void ()>::operator(this=0x00002736bfd864e0)() const at function.h:1168:12
    frame #9: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(arg=0x00002736bfd86480) at thread.cc:853:3
    frame #10: 0x00007f27217e41ca libpthread.so.0`start_thread + 234
    frame #11: 0x00007f2721450e73 libc.so.6`__clone + 67
  thread #6, stop reason = signal 0
    frame #0: 0x00007f27217ea45c libpthread.so.0`pthread_cond_wait@@GLIBC_2.3.2 + 508
    frame #1: 0x00007f2721a80622 libc++.so.1`std::__1::condition_variable::wait(std::__1::unique_lock<std::__1::mutex>&) + 18
    frame #2: 0x00007f271ded91e5 libyrpc.so`yb::rpc::(anonymous namespace)::Worker::Execute() [inlined] yb::rpc::(anonymous namespace)::Worker::PopTask(this=0x00002736bf88f960, task=0x00007f270f14e1b8) at thread_pool.cc:144:13
    frame #3: 0x00007f271ded8d68 libyrpc.so`yb::rpc::(anonymous namespace)::Worker::Execute(this=0x00002736bf88f960) at thread_pool.cc:114:11
    frame #4: 0x00007f271d90b680 libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::__function::__value_func<void ()>::operator(this=0x00002736bfd87020)[abi:v170002]() const at function.h:517:16
    frame #5: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::function<void ()>::operator(this= Function = yb::rpc::(anonymous namespace)::Worker::Execute() )() const at function.h:1168:12
    frame #6: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(arg=0x00002736bfd86fc0) at thread.cc:853:3
    frame #7: 0x00007f27217e41ca libpthread.so.0`start_thread + 234
    frame #8: 0x00007f2721450e73 libc.so.6`__clone + 67
  thread #7, stop reason = signal 0
    frame #0: 0x00007f2721546247 libc.so.6`epoll_wait + 87
    frame #1: 0x00007f271d583eb2 libev.so.4`epoll_poll + 82
    frame #2: 0x00007f271d587144 libev.so.4`ev_run + 1956
    frame #3: 0x00007f271de96d99 libyrpc.so`yb::rpc::Reactor::RunThread() [inlined] ev::loop_ref::run(this=0x00002736bf893438, flags=0) at ev++.h:211:7
    frame #4: 0x00007f271de96d8e libyrpc.so`yb::rpc::Reactor::RunThread(this=0x00002736bf893400) at reactor.cc:728:9
    frame #5: 0x00007f271d90b680 libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::__function::__value_func<void ()>::operator(this=0x00002736bfd86960)[abi:v170002]() const at function.h:517:16
    frame #6: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::function<void ()>::operator(this= Function = yb::rpc::Reactor::RunThread() )() const at function.h:1168:12
    frame #7: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(arg=0x00002736bfd86900) at thread.cc:853:3
    frame #8: 0x00007f27217e41ca libpthread.so.0`start_thread + 234
    frame #9: 0x00007f2721450e73 libc.so.6`__clone + 67
  thread #8, stop reason = signal 0
    frame #0: 0x00007f2721546247 libc.so.6`epoll_wait + 87
    frame #1: 0x00007f271d583eb2 libev.so.4`epoll_poll + 82
    frame #2: 0x00007f271d587144 libev.so.4`ev_run + 1956
    frame #3: 0x00007f271d90b680 libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::__function::__value_func<void ()>::operator(this=0x00002736bfd86cc0)[abi:v170002]() const at function.h:517:16
    frame #4: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::function<void ()>::operator(this= Function = yb::pggate::PgApiImpl::Interrupter::RunThread() )() const at function.h:1168:12
    frame #5: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(arg=0x00002736bfd86c60) at thread.cc:853:3
    frame #6: 0x00007f27217e41ca libpthread.so.0`start_thread + 234
    frame #7: 0x00007f2721450e73 libc.so.6`__clone + 67
  thread #9, stop reason = signal 0
    frame #0: 0x00007f2721546247 libc.so.6`epoll_wait + 87
    frame #1: 0x00007f271d583eb2 libev.so.4`epoll_poll + 82
    frame #2: 0x00007f271d587144 libev.so.4`ev_run + 1956
    frame #3: 0x00007f271de96d99 libyrpc.so`yb::rpc::Reactor::RunThread() [inlined] ev::loop_ref::run(this=0x00002736bf892f38, flags=0) at ev++.h:211:7
    frame #4: 0x00007f271de96d8e libyrpc.so`yb::rpc::Reactor::RunThread(this=0x00002736bf892f00) at reactor.cc:728:9
    frame #5: 0x00007f271d90b680 libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::__function::__value_func<void ()>::operator(this=0x00002736bfd863c0)[abi:v170002]() const at function.h:517:16
    frame #6: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(void*) [inlined] std::__1::function<void ()>::operator(this= Function = yb::rpc::Reactor::RunThread() )() const at function.h:1168:12
    frame #7: 0x00007f271d90b66a libyb_util.so`yb::Thread::SuperviseThread(arg=0x00002736bfd86360) at thread.cc:853:3
    frame #8: 0x00007f27217e41ca libpthread.so.0`start_thread + 234
    frame #9: 0x00007f2721450e73 libc.so.6`__clone + 67

Issue Type

kind/bug

Warning: Please confirm that this issue does not contain any sensitive information

  • I confirm this issue does not contain any sensitive information.
@qvad qvad added area/ysql Yugabyte SQL (YSQL) status/awaiting-triage Issue awaiting triage 2024.1_blocker labels Apr 12, 2024
@yugabyte-ci yugabyte-ci added kind/bug This issue is a bug priority/medium Medium priority issue labels Apr 12, 2024
@sushantrmishra sushantrmishra removed the status/awaiting-triage Issue awaiting triage label Apr 12, 2024
@sushantrmishra sushantrmishra self-assigned this Apr 12, 2024
myang2021 (Contributor) commented:
There is an easier way to reproduce this bug locally:

```
yugabyte=# show all;
server closed the connection unexpectedly
        This probably means the server terminated abnormally
        before or while processing the request.
```

myang2021 added a commit that referenced this issue Apr 16, 2024
Summary:
To reproduce this bug:

```
./bin/ysqlsh -c "show all"
server closed the connection unexpectedly
        This probably means the server terminated abnormally
        before or while processing the request.
connection to server was lost
```

The reason is that every GUC variable must have a non-NULL short_desc, which the "show
all" command reads. When I added the new GUC
`yb_enable_ddl_atomicity_infra` in commit
6091cc8, I put the text description in long_desc when it should have gone in
short_desc. Fixed by swapping them.
Jira: DB-10864

Test Plan:

(1) Manual test, `show all` now works:

```
./bin/ysqlsh -c "show all"
```

(2) Run the original benchbase test from the issue and verify it now passes:

1. Start a local RF-3 cluster:

```
./bin/yb-ctl create --rf 3
```

2. Build benchbase.jar:

```
export JAVA_HOME=/opt/jdk-17
export PATH=$PATH:$JAVA_HOME/bin
cd $HOME/code
rm -rf benchbase
git clone https://github.com/yugabyte/benchbase.git
cd benchbase
./mvnw clean package -P yugabyte -DskipTests
cd target
tar xvzf benchbase-yugabyte.tgz
cd benchbase-yugabyte/
```

3. Run the test command and confirm no PG process crashes:

```
/opt/jdk-17/bin/java -Xmx8g -jar benchbase.jar -b voter -c config/yugabyte/sample_voter_config.xml --create=true --load=true --execute=true --params endpoint=127.0.0.1 -im 5000 2>&1
```

Reviewers: fizaa

Reviewed By: fizaa

Subscribers: yql

Differential Revision: https://phorge.dev.yugabyte.com/D34147
myang2021 added a commit that referenced this issue Apr 16, 2024
…ll" command

Summary:
To reproduce this bug:

```
./bin/ysqlsh -c "show all"
server closed the connection unexpectedly
        This probably means the server terminated abnormally
        before or while processing the request.
connection to server was lost
```

The reason is that every GUC variable must have a non-NULL short_desc, which the "show
all" command reads. When I added the new GUC
`yb_enable_ddl_atomicity_infra` in commit
6091cc8, I put the text description in long_desc when it should have gone in
short_desc. Fixed by swapping them.
Jira: DB-10864

Original commit: 7036256 / D34147

Test Plan:

(1) Manual test, `show all` now works:

```
./bin/ysqlsh -c "show all"
```

(2) Run the original benchbase test from the issue and verify it now passes:

1. Start a local RF-3 cluster:

```
./bin/yb-ctl create --rf 3
```

2. Build benchbase.jar:

```
export JAVA_HOME=/opt/jdk-17
export PATH=$PATH:$JAVA_HOME/bin
cd $HOME/code
rm -rf benchbase
git clone https://github.com/yugabyte/benchbase.git
cd benchbase
./mvnw clean package -P yugabyte -DskipTests
cd target
tar xvzf benchbase-yugabyte.tgz
cd benchbase-yugabyte/
```

3. Run the test command and confirm no PG process crashes:

```
/opt/jdk-17/bin/java -Xmx8g -jar benchbase.jar -b voter -c config/yugabyte/sample_voter_config.xml --create=true --load=true --execute=true --params endpoint=127.0.0.1 -im 5000 2>&1
```

Reviewers: fizaa, smishra

Reviewed By: smishra

Subscribers: yql

Tags: #jenkins-ready

Differential Revision: https://phorge.dev.yugabyte.com/D34185
myang2021 added a commit that referenced this issue Apr 17, 2024
Summary:
This is a follow-up diff that addresses a previous review comment on
7036256 by adding a new unit test to ensure the "show all" command works without
crashing the PG backend. Earlier the unit test could not run successfully on my
local dev VM because /bin/java was too old (openjdk-8); after upgrading to
openjdk-11 the unit test passes.

Jira: DB-10864

Test Plan: ./yb_build.sh release --java-test org.yb.pgsql.TestPgConfiguration

Reviewers: fizaa

Reviewed By: fizaa

Subscribers: yql

Differential Revision: https://phorge.dev.yugabyte.com/D34196