
pcrf crash with malloc(): invalid size (unsorted) #1313

Closed
WingPig99 opened this issue Jan 11, 2022 · 13 comments

@WingPig99

Hi, the following is the crash log. Could you please give me some ideas?

pcrf         | Open5GS daemon v2.4.1-4-g2ed35f2
pcrf         | 
pcrf         | 01/11 00:58:01.592: [app] INFO: Configuration: '/open5gs/install/etc/open5gs/pcrf.yaml' (../lib/app/ogs-init.c:129)
pcrf         | 01/11 00:58:01.592: [app] INFO: File Logging: '/open5gs/install/var/log/open5gs/pcrf.log' (../lib/app/ogs-init.c:132)
pcrf         | 01/11 00:58:01.594: [dbi] INFO: MongoDB URI: 'mongodb://172.22.0.2/open5gs' (../lib/dbi/ogs-mongoc.c:130)
pcrf         | 01/11 00:58:01.636: [app] INFO: PCRF initialize...done (../src/pcrf/app-init.c:31)
pcrf         | 01/11 00:58:01.639: [diam] INFO: CONNECTED TO 'pcscf.ims.mnc092.mcc466.3gppnetwork.org' (TCP,soc#16): (../lib/diameter/common/logger.c:108)
pcrf         | 01/11 00:58:01.639: [diam] INFO: CONNECTED TO 'smf.epc.mnc092.mcc466.3gppnetwork.org' (SCTP,soc#15): (../lib/diameter/common/logger.c:108)
pcrf         | 01/11 00:58:33.823: [pcrf] WARNING: Not supported(260) (../src/pcrf/pcrf-rx-path.c:430)
pcrf         | malloc(): invalid size (unsorted)
pcrf         | /open5gs_init.sh: line 96:    35 Aborted                 (core dumped) ./open5gs-pcrfd
pcrf exited with code 134
@acetcom
Member

acetcom commented Jan 11, 2022

@WingPig99

Could you send me a gdb log as shown below?

$ gdb ./install/bin/open5gs-pcrfd
gdb> run
..(Aborted)
gdb> bt full

Thanks a lot!
Sukchan

@s5uishida
Contributor

Hi @acetcom and @WingPig99

I got the same error here as well, so I went back and checked the commits:

  1. commit b988e7e - Use talloc for all memory pool (#1263)

The error still appeared as far back as the above commit. With the commit before it,

  1. commit 49d9ed0 - [MME] fix the crash (#1263)

this error did not occur.

@acetcom
Member

acetcom commented Jan 11, 2022

@s5uishida

This was fine with the old memory pool, but this is what happened when I changed to talloc.

Can you run gdb ./install/bin/open5gs-pcrfd and get the 'bt full' log?

Thanks a lot!
Sukchan

@WingPig99
Author

@acetcom Yes, sure thing. The following is the log.

01/11 07:52:55.608: [app] INFO: Signal-NUM[28] received (Window changed) (../src/main.c:63)
01/11 07:53:16.829: [app] INFO: Signal-NUM[28] received (Window changed) (../src/main.c:63)
^[^[01/11 07:54:24.477: [pcrf] WARNING: Not supported(260) (../src/pcrf/pcrf-rx-path.c:430)
malloc(): invalid size (unsorted)

Thread 9 "fd-dispatch" received signal SIGABRT, Aborted.
[Switching to Thread 0x7f1e91ffb700 (LWP 141)]
__GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:50
50	../sysdeps/unix/sysv/linux/raise.c: No such file or directory.
(gdb) bt full
#0  __GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:50
        set = {__val = {18446744066193091079, 139769634556974, 139769249058956, 139766691533056, 4, 139766691533056, 139769249058956, 94809793886000, 139768973230304, 0, 0, 0, 0, 281470681751456, 0, 0}}
        pid = <optimized out>
        tid = <optimized out>
        ret = <optimized out>
#1  0x00007f1ea72f5859 in __GI_abort () at abort.c:79
        save_stage = 1
        act = {__sigaction_handler = {sa_handler = 0x0, sa_sigaction = 0x0}, sa_mask = {__val = {0, 0, 0, 0, 0, 0, 0, 0, 0, 140720555039552, 139769275198848, 139769275176912, 139769632995892, 139769275177096, 
              94809826727424, 98784247808}}, sa_flags = -1576698176, sa_restorer = 0x80000ee0}
        sigs = {__val = {32, 0 <repeats 15 times>}}
#2  0x00007f1ea73603ee in __libc_message (action=action@entry=do_abort, fmt=fmt@entry=0x7f1ea748a285 "%s\n") at ../sysdeps/posix/libc_fatal.c:155
        ap = {{gp_offset = 24, fp_offset = 0, overflow_arg_area = 0x7f1e91ff5890, reg_save_area = 0x7f1e91ff5820}}
        fd = <optimized out>
        list = <optimized out>
        nlist = <optimized out>
        cp = <optimized out>
#3  0x00007f1ea736847c in malloc_printerr (str=str@entry=0x7f1ea748ca50 "malloc(): invalid size (unsorted)") at malloc.c:5347
No locals.
#4  0x00007f1ea736b234 in _int_malloc (av=av@entry=0x7f1e80000020, bytes=bytes@entry=143) at malloc.c:3736
        next = <optimized out>
        iters = <optimized out>
        nb = <optimized out>
        idx = 10
        bin = <optimized out>
        victim = <optimized out>
        size = <optimized out>
        victim_index = <optimized out>
        remainder = <optimized out>
        remainder_size = <optimized out>
        block = <optimized out>
        bit = <optimized out>
        map = <optimized out>
        fwd = <optimized out>
        bck = <optimized out>
        tcache_unsorted_count = 0
        tcache_nb = 160
        tc_idx = 8
        return_cached = <optimized out>
        __PRETTY_FUNCTION__ = "_int_malloc"
#5  0x00007f1ea736d419 in __GI___libc_malloc (bytes=143) at malloc.c:3066
        ar_ptr = 0x7f1e80000020
        victim = <optimized out>
        hook = <optimized out>
        tbytes = <optimized out>
        tc_idx = <optimized out>
        __PRETTY_FUNCTION__ = "__libc_malloc"
#6  0x00007f1ea750d9e7 in talloc_named_const () from /lib/x86_64-linux-gnu/libtalloc.so.2
No symbol table info available.
#7  0x00007f1ea7664255 in ogs_talloc_size (ctx=0x563aa1ff5b80, size=47, name=0x563aa01604f0 "../src/pcrf/pcrf-rx-path.c:368") at ../lib/core/ogs-memory.c:64
        ptr = 0x0
        __func__ = "ogs_talloc_size"
#8  0x0000563aa015af4e in pcrf_rx_aar_cb (msg=0x7f1e91ffaaf0, avp=0x7f1e84002f10, sess=0x7f1e80000cb0, opaque=0x0, act=0x7f1e91ffab08) at ../src/pcrf/pcrf-rx-path.c:368
        rv = 0
        ret = 0
        len = 56
        from_str = 0x7f1e840039ed "from 192.168.101.2 5060 to 172.22.0.21 5060"
        to_str = 0x7f1e84003a05 "to 172.22.0.21 5060"
        rx_flow = 0x7f1e840039e0 "permit in ip from 192.168.101.2 5060 to 172.22.0.21 5060"
        to_port = 0x0
        to_ip = 0x0
        from_ip = 0x7f1e840039f2 "192.168.101.2 5060 to 172.22.0.21 5060"
        from_port = 0x7f1e840039ff " 5060 to 172.22.0.21 5060"
        ans = 0x7f1e80000ef0
        qry = 0x7f1e84001b30
        avpch1 = 0x7f1e84003310
        avpch2 = 0x7f1e84001640
        avpch3 = 0x7f1e840036d0
        hdr = 0x7f1e84003730
        val = {os = {data = 0x7f1e00000001 "", len = 0}, i32 = 1, i64 = 139766825746433, u32 = 1, u64 = 139766825746433, f32 = 1.40129846e-45, f64 = 6.9053987029592386e-310}
        sess_data = 0x7f1e9002b010
        sidlen = 52
        rx_message = {cmd_code = 265, result_code = 0, ims_data = {num_of_msisdn = 0, msisdn = {{buf = "\000\000\000\000\000\000\000", len = 0, bcd = '\000' <repeats 15 times>}, {buf = "\000\000\000\000\000\000\000", len = 0, bcd = '\000' <repeats 15 times>}}, media_component = {{media_component_number = 1, media_type = 0, max_requested_bandwidth_dl = 0, max_requested_bandwidth_ul = 0, min_requested_bandwidth_dl = 0, min_requested_bandwidth_ul = 0, rr_bandwidth = 0, rs_bandwidth = 0, flow_status = 0, sub = {{flow_number = 1, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x7f1e800031d0 "pcscf.ims.mnc092.mcc466.3gppnetwork.org;3207804565;1permit out ip from 172.22.0.21 5060 to any 5060"}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 1}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', 
description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}}, num_of_sub = 0}, {media_component_number = 0, media_type = 0, max_requested_bandwidth_dl = 0, max_requested_bandwidth_ul = 
0, min_requested_bandwidth_dl = 0, min_requested_bandwidth_ul = 0, rr_bandwidth = 0, rs_bandwidth = 0, flow_status = 0, sub = {{flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 
0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}, {flow_number = 0, flow_usage = 0, flow = {{direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}, {direction = 0 '\000', description = 0x0}}, num_of_flow = 0}}, num_of_sub = 0} <repeats 15 times>}, num_of_media_component = 0}}
        media_component = 0x7f1e91ff5b48
        sub = 0x7f1e91ff5b88
        flow = 0x7f1e91ff5ba0
        buf = "\200\250\377\221\036\177\000\000\000\000\000\000\000\000\000\000>\033\262\016\374\177\000\000?\033\262\016\374\177\000\000@\033\262\016\374\177\000\000\240\266\005\242:V"
        gx_sid = 0x7f1e7c001160 "smf.epc.mnc092.mcc466.3gppnetwork.org;1641887364;2;app_gx"
        result_code = 5065
        __FUNCTION__ = "pcrf_rx_aar_cb"
#9  0x00007f1ea75424b0 in fd_disp_call_cb_int (cb_list=0x563aa20baca0, msg=0x7f1e91ffaaf0, avp=0x0, sess=0x7f1e80000cb0, action=0x7f1e91ffab08, obj_app=0x563aa21a3430, obj_cmd=0x563aa20babe0, obj_avp=0x0, obj_enu=0x0, drop_reason=0x7f1e91ffaae8, drop_msg=0x7f1e91ffaaf8) at ../subprojects/freeDiameter/libfdproto/dispatch.c:98
        __ret__ = 0
        hdl = 0x563aa208c3f0
        senti = 0x563aa20baca0
        li = 0x563aa208c418
        r = 22074
        __FUNCTION__ = "fd_disp_call_cb_int"
#10 0x00007f1ea755ba1a in fd_msg_dispatch (msg=0x7f1e91ffaaf0, session=0x7f1e80000cb0, action=0x7f1e91ffab08, error_code=0x7f1e91ffaae0, drop_reason=0x7f1e91ffaae8, drop_msg=0x7f1e91ffaaf8) at ../subprojects/freeDiameter/libfdproto/messages.c:2841
        __ret__ = 32764
        __cancel_buf = {__cancel_jmp_buf = {{__cancel_jmp_buf = {0, -3956954507765948669, 140720555039550, 140720555039551, 140720555039552, 139769275198848, -3956954507782725885, -3956835021907239165}, __mask_was_saved = 0}}, __pad = {0x7f1e91ffabd0, 0x0, 0x0, 0x7f1e840012d0}}
        __cancel_routine = 0x7f1ea7549c0f <fd_cleanup_rwlock>
        __cancel_arg = 0x7f1ea75797c0 <fd_disp_lock>
        __not_first_call = 0
        dict = 0x563aa2057ec0
        app = 0x563aa21a3430
        cmd = 0x563aa20babe0
        avp = 0x0
        cb_list = 0x563aa20baca0
        ret = 0
        r2 = 0
        __FUNCTION__ = "fd_msg_dispatch"
#11 0x00007f1ea75e6493 in msg_dispatch (msg=0x7f1e84001b30) at ../subprojects/freeDiameter/libfdcore/routing_dispatch.c:510
        __ret__ = -921300265
        hdr = 0x7f1e84001b90
        is_req = 128
        sess = 0x7f1e80000cb0
        action = DISP_ACT_CONT
        ec = 0x0
        em = 0x0
        msgptr = 0x7f1e80000ef0
        error = 0x0
        __FUNCTION__ = "msg_dispatch"
#12 0x00007f1ea75eb284 in process_thr (arg=0x563aa207bf68, action_cb=0x7f1ea75e5d22 <msg_dispatch>, queue=0x563aa206fd10, action_name=0x7f1ea761855e "Dispatch") at ../subprojects/freeDiameter/libfdcore/routing_dispatch.c:1119
        __ret__ = 0
        msg = 0x7f1e84001b30
        __cancel_buf = {__cancel_jmp_buf = {{__cancel_jmp_buf = {0, -3956954507969372413, 140720555039550, 140720555039551, 140720555039552, 139769275198848, -3956954507709325565, -3956834944604774653}, __mask_was_saved = 0}}, __pad = {0x7f1e91ffacd0, 0x0, 0x0, 0x0}}
        __cancel_routine = 0x7f1ea75eaac6 <cleanup_state>
        __cancel_arg = 0x563aa207bf68
        __not_first_call = 0
        __FUNCTION__ = "process_thr"
#13 0x00007f1ea75eb4ef in dispatch_thr (arg=0x563aa207bf68) at ../subprojects/freeDiameter/libfdcore/routing_dispatch.c:1139
No locals.
#14 0x00007f1ea74cb609 in start_thread (arg=<optimized out>) at pthread_create.c:477
        ret = <optimized out>
        pd = <optimized out>
        unwind_buf = {cancel_jmp_buf = {{jmp_buf = {139769275201280, 3975325145450077955, 140720555039550, 140720555039551, 140720555039552, 139769275198848, -3956954507958886653, -3956834823567778045}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}}
        not_first_call = 0
#15 0x00007f1ea73f2293 in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:95
No locals.

@acetcom
Member

acetcom commented Jan 11, 2022

Hi @WingPig99

I don't know why talloc caused this problem. I would like to simulate the above scenario.

Could you share the related pcap? I need the S1AP and Diameter traffic.

Thanks a lot!
Sukchan

@s5uishida
Contributor

Hi @acetcom

I first got this error when I updated Open5GS this year, using herlesupreeth/docker_open5gs.

I'm not sure how useful it is, since it's a case where the PCRF doesn't crash, but there are a few log files here:

herlesupreeth/docker_open5gs#58

For your reference.

@s5uishida
Contributor

Hi @acetcom

I got a log from the following version, in which the PCRF crashes.

  • commit b988e7e - Use talloc for all memory pool (#1263)

My environment is as follows.


     eNodeB                Docker Host
    ---------             -------------
   |         |           |             |
   |         |           |             |
    ---------             -------------
        |                       |
-------------------------------------------------------
       .52                     .150    192.168.66.0/24

In addition, I set a static route on the eNodeB (Baicells) as follows.

ip r add 172.22.0.0/24 via 192.168.66.150

Phone1) 441900000000511 (phone number: 1511)

  • internet APN: 192.168.100.2
  • ims APN: 192.168.101.2

The smartphone is the following model.

  • FCNT arrows BZ02

For each NF of Open5GS on the Docker host, the IP addresses described in docker_open5gs/.env are used as-is.
The differences are as follows.

--- .env.orig   2021-11-26 04:07:10.061207367 +0000
+++ .env        2021-11-18 02:59:18.461660018 +0000
@@ -1,11 +1,11 @@
 # Set proper timezone to sync times between docker host and containers
 #TZ=Europe/Berlin
 
-MCC=001
-MNC=01
+MCC=441
+MNC=90
 
 TEST_NETWORK=172.22.0.0/24
-DOCKER_HOST_IP=192.168.1.223
+DOCKER_HOST_IP=192.168.66.150
 
 # MONGODB
 MONGO_IP=172.22.0.2
@@ -19,7 +19,7 @@
 # SGW
 SGWC_IP=172.22.0.5
 SGWU_IP=172.22.0.6
-SGWU_ADVERTISE_IP=172.22.0.6
+SGWU_ADVERTISE_IP=192.168.66.150
 
 # SMF
 SMF_IP=172.22.0.7

The time range of the log is eNodeB startup --> UE startup --> PCRF crash.

You can view the *.log files in color with the less -R command.

For your reference.

@acetcom
Member

acetcom commented Jan 11, 2022

Hi @WingPig99 and @s5uishida

I've fixed the bug and pushed the modification to the issues1313 branch.

Please let me know your test results.

Thanks a lot!
Sukchan

@s5uishida
Contributor

Hi @acetcom

It worked fine. As I wrote here, VoLTE was also successful.

Many thanks!

@acetcom
Member

acetcom commented Jan 12, 2022

Hi @s5uishida

You reported the situation so well that I was able to fix this bug.

Thank you very much.
Sukchan

@s5uishida
Contributor

Hi @acetcom

In addition, I would like to point out one small thing.

As pointed out here1 and here2, I think the log level should be WARN instead of ERROR.

Thanks.

@acetcom
Member

acetcom commented Jan 12, 2022

@s5uishida

You're right. I've changed the log level from ERROR to WARN.

Thanks a lot!
Sukchan

@WingPig99
Author

Thank you for your quick fix.
