Current Behavior
After upgrading to APISIX 3.16.0 in our staging environment and enabling the new comprehensive tracing feature by setting apisix.tracing: true in config.yaml, APISIX returns 500 errors on HTTPS connections instead of serving the correct pages.
We run APISIX in API-driven mode on Kubernetes, with TLS termination inside APISIX behind our cloud provider's load balancer using the PROXY protocol; hence the HTTPS connections are kept open. The bug, however, is easier to reproduce without any plugins, as described below.
Claude Code helped me reproduce the bug locally and already produced a root-cause analysis and a proposed fix that I'm not able to verify on my own. Here is its output; use it at your own discretion:
tracer.release() in apisix/tracer.lua returns the tracing table to a memory pool (which zeroes all of its fields, including spans) but never sets ctx.tracing = nil:
function _M.release(ctx)
    local tracing = ctx.tracing
    if not tracing then
        return
    end
    for _, sp in ipairs(tracing.spans) do
        sp:release()
    end
    tablepool.release("tracing_spans", tracing.spans) -- zeroes tracing.spans
    tablepool.release("tracing", tracing)             -- zeroes the entire table
    -- BUG: ctx.tracing is never set to nil
end
tracer.start() guards initialisation with if not tracing then — which evaluates to false for a stale non-nil reference — so it skips setup and immediately crashes at table.insert(tracing.spans, self) (span.lua:62), because spans is nil.
The trigger is that ngx.ctx is shared across all HTTP requests on the same TLS keepalive connection in OpenResty. APISIX calls tracer.start(ngx_ctx) in ssl_client_hello_phase() (confirmed by ngx.ctx.matched_ssl and ngx.ctx.client_hello_sni being written there and read later in http_access_phase without repopulation). After the first HTTP request's http_log_phase calls tracer.release(), ngx_ctx.tracing is left stale. The second HTTP request on the same connection inherits it and crashes at init.lua:693.
Plain HTTP is not affected because there is no SSL handshake phase — tracing is initialised fresh inside http_access_phase on every request and the stale-pointer path is never reached.
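The mechanism can be shown outside APISIX with a small stand-alone sketch. The pool stub below only mimics lua-tablepool's clear-on-release behaviour; none of the names are actual APISIX code:

```lua
-- Stub of tablepool's clear-on-release behaviour (illustrative only).
local function pool_release(tab)
    for k in pairs(tab) do
        tab[k] = nil -- the pool zeroes released tables for reuse
    end
end

local ctx = {} -- stands in for ngx.ctx, shared across one keepalive connection

local function start(c)
    if not c.tracing then -- guard: skipped for a stale non-nil reference
        c.tracing = { spans = {} }
    end
    table.insert(c.tracing.spans, "span") -- fails when spans is nil
end

local function release(c)
    pool_release(c.tracing) -- zeroes the table ...
    -- ... but c.tracing = nil is missing, so the stale reference survives
end

start(ctx)   -- request 1: fresh init, works
release(ctx) -- request 1: log phase
local ok, err = pcall(start, ctx) -- request 2 on the same connection
print(ok, err) -- false, plus the "table expected, got nil" error
```

Running this reproduces the same "bad argument #1 to 'insert' (table expected, got nil)" failure seen in the logs below.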
Fix — one line in tracer.release():
tablepool.release("tracing", tracing)
ctx.tracing = nil -- add this
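Applying that line to the same stand-alone sketch (again a stub, not APISIX code) lets the second start() take the fresh-init path:

```lua
-- Same illustrative stub as above, with the proposed fix applied.
local function pool_release(tab)
    for k in pairs(tab) do tab[k] = nil end
end

local ctx = {}

local function start(c)
    if not c.tracing then
        c.tracing = { spans = {} }
    end
    table.insert(c.tracing.spans, "span")
end

local function release(c)
    pool_release(c.tracing)
    c.tracing = nil -- the proposed one-line fix
end

start(ctx)
release(ctx)
assert(pcall(start, ctx)) -- request 2 now re-initialises cleanly
```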
Workaround:
remove apisix.tracing: true from config.yaml (or set it to false).
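For reference, the workaround as a config.yaml fragment (assuming the flag sits under the top-level apisix section, as in the report above):

```yaml
apisix:
  tracing: false   # keep the new tracing feature off until the fix lands
```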
Expected Behavior
The distributed tracing feature introduced with APISIX 3.16.0 does not break HTTPS connections.
Error Logs
The original masked logs from our staging k8s cluster:
2026/04/09 12:40:04 [error] 51#51: *2136 lua entry thread aborted: runtime error: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
coroutine 0:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:693: in function 'http_access_phase'
access_by_lua(nginx.conf:396):2: in main chunk, client: w.x.y.z, server: _, request: "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0", host: "xxx.yyy.test.de", request_id: "414cfde9f1276f6a277cf9aadf8f611b"
2026/04/09 12:40:04 [error] 51#51: *2136 failed to run header_filter_by_lua*: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:899: in function 'http_header_filter_phase'
header_filter_by_lua(nginx.conf:440):2: in main chunk, client: w.x.y.z, server: _, request: "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0", host: "xxx.yyy.test.de", request_id: "414cfde9f1276f6a277cf9aadf8f611b"
2026/04/09 12:40:04 [error] 51#51: *2136 failed to run log_by_lua*: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/plugin.lua:1333: in function 'run_global_rules'
/usr/local/apisix/apisix/init.lua:482: in function 'common_phase'
/usr/local/apisix/apisix/init.lua:1086: in function 'http_log_phase'
log_by_lua(nginx.conf:448):2: in main chunk while logging request, client: w.x.y.z, server: _, request: "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0", host: "xxx.yyy.test.de", request_id: "414cfde9f1276f6a277cf9aadf8f611b"
2026/04/09 12:40:05 [error] 51#51: *2136 lua entry thread aborted: runtime error: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
coroutine 0:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:693: in function 'http_access_phase'
access_by_lua(nginx.conf:396):2: in main chunk, client: w.x.y.z, server: _, request: "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0", host: "xxx.yyy.test.de", referrer: "https://yyy.test.de/", request_id: "3947105387852a1c5b34673beaeef84d"
2026/04/09 12:40:05 [error] 51#51: *2136 failed to run header_filter_by_lua*: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:899: in function 'http_header_filter_phase'
header_filter_by_lua(nginx.conf:440):2: in main chunk, client: w.x.y.z, server: _, request: "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0", host: "xxx.yyy.test.de", referrer: "https://yyy.test.de/", request_id: "3947105387852a1c5b34673beaeef84d"
2026/04/09 12:40:05 [error] 51#51: *2136 failed to run log_by_lua*: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/plugin.lua:1333: in function 'run_global_rules'
/usr/local/apisix/apisix/init.lua:482: in function 'common_phase'
/usr/local/apisix/apisix/init.lua:1086: in function 'http_log_phase'
log_by_lua(nginx.conf:448):2: in main chunk while logging request, client: w.x.y.z, server: _, request: "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0", host: "xxx.yyy.test.de", referrer: "https://yyy.test.de/", request_id: "3947105387852a1c5b34673beaeef84d"
104.28.237.3 - - [09/Apr/2026:12:40:04 +0000] xxx.yyy.test.de "GET /login?return_url=https://yyy.test.de HTTP/2.0" 302 217 0.347 "https://yyy.test.de/" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/147.0.0.0 Safari/537.36" - - - "http://xxx.yyy.test.de"
w.x.y.z - - [09/Apr/2026:12:40:04 +0000] xxx.yyy.test.de "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0" 500 0 0.000 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/147.0.0.0 Safari/537.36" - - - "http://xxx.yyy.test.de"
w.x.y.z - - [09/Apr/2026:12:40:05 +0000] xxx.yyy.test.de "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0" 500 0 0.000 "https://yyy.test.de/" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/147.0.0.0 Safari/537.36" - - - "http://xxx.yyy.test.de"
2026/04/09 12:40:10 [error] 51#51: *2136 lua entry thread aborted: runtime error: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
coroutine 0:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:693: in function 'http_access_phase'
access_by_lua(nginx.conf:396):2: in main chunk, client: w.x.y.z, server: _, request: "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0", host: "xxx.yyy.test.de", referrer: "https://yyy.test.de/", request_id: "42a1ff9cff0f1e665c31bba2243cd59a"
2026/04/09 12:40:10 [error] 51#51: *2136 failed to run header_filter_by_lua*: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:899: in function 'http_header_filter_phase'
header_filter_by_lua(nginx.conf:440):2: in main chunk, client: w.x.y.z, server: _, request: "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0", host: "xxx.yyy.test.de", referrer: "https://yyy.test.de/", request_id: "42a1ff9cff0f1e665c31bba2243cd59a"
2026/04/09 12:40:10 [error] 51#51: *2136 failed to run log_by_lua*: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/plugin.lua:1333: in function 'run_global_rules'
/usr/local/apisix/apisix/init.lua:482: in function 'common_phase'
/usr/local/apisix/apisix/init.lua:1086: in function 'http_log_phase'
log_by_lua(nginx.conf:448):2: in main chunk while logging request, client: w.x.y.z, server: _, request: "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0", host: "xxx.yyy.test.de", referrer: "https://yyy.test.de/", request_id: "42a1ff9cff0f1e665c31bba2243cd59a"
w.x.y.z - - [09/Apr/2026:12:40:10 +0000] xxx.yyy.test.de "GET /callback?state=7e1ef2007f378c7483c018fd706e27c7&session_state=6be03944-8f05-4e43-a4f5-e4d082cf14c4&iss=https%3A%2F%2Fkc.test.de%2Fauth%2Frealms%2Ftesting&code=42d3f6c3-5164-4e51-8749-813ed8810d34.6be03944-8f05-4e43-a4f5-e4d082cf14c4.d249ae51-c1d2-4f02-890e-0383d9ba42b3 HTTP/2.0" 500 0 0.000 "https://yyy.test.de/" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/147.0.0.0 Safari/537.36" - - - "http://xxx.yyy.test.de"
The logs when reproducing the bug locally:
/usr/local/openresty//luajit/bin/luajit ./apisix/cli/apisix.lua init
/usr/local/openresty//luajit/bin/luajit ./apisix/cli/apisix.lua init_etcd
2026/04/10 06:56:18 [warn] 1#1: [lua] config_yaml.lua:198: read_apisix_config(): config file /usr/local/apisix/conf/apisix.yaml reloaded.
nginx: [warn] [lua] config_yaml.lua:198: read_apisix_config(): config file /usr/local/apisix/conf/apisix.yaml reloaded.
2026/04/10 06:56:18 [warn] 70#70: *13 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 70#70: *13 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 47#47: *7 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 46#46: *6 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 46#46: *6 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 47#47: *7 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 52#52: *2 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 52#52: *2 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 51#51: *5 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 51#51: *5 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 49#49: *8 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 49#49: *8 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 60#60: *11 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 60#60: *11 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 44#44: *4 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 44#44: *4 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 50#50: *9 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 50#50: *9 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 45#45: *3 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 45#45: *3 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 48#48: *1 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 48#48: *1 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 65#65: *15 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 54#54: *10 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 54#54: *10 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 65#65: *15 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 53#53: *12 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 53#53: *12 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 66#66: *14 [lua] plugin.lua:237: load(): new plugins: {}, context: init_worker_by_lua*
2026/04/10 06:56:18 [warn] 66#66: *14 [lua] plugin.lua:287: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"traffic-split":true,"mqtt-proxy":true,"syslog":true}, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 70#70: *13 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 51#51: *5 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 46#46: *6 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 47#47: *7 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 52#52: *2 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 50#50: *9 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 66#66: *14 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 49#49: *8 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 54#54: *10 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 48#48: *1 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 65#65: *15 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 45#45: *3 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 44#44: *4 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 53#53: *12 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:18 [error] 60#60: *11 [lua] plugin.lua:159: load_plugin(): failed to load plugin [syslog] err: /usr/local/apisix/apisix/plugins/prometheus/exporter.lua:54: lua_shared_dict "prometheus-cache" not configured, context: init_worker_by_lua*
2026/04/10 06:56:22 [error] 46#46: *518 lua entry thread aborted: runtime error: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
coroutine 0:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:693: in function 'http_access_phase'
access_by_lua(nginx.conf:282):2: in main chunk, client: 10.89.0.3, server: _, request: "GET /get HTTP/2.0", host: "localhost:9443", request_id: "19d856ac5aec32b28b24237ece910294"
2026/04/10 06:56:22 [error] 46#46: *518 failed to run header_filter_by_lua*: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:899: in function 'http_header_filter_phase'
header_filter_by_lua(nginx.conf:308):2: in main chunk, client: 10.89.0.3, server: _, request: "GET /get HTTP/2.0", host: "localhost:9443", request_id: "19d856ac5aec32b28b24237ece910294"
2026/04/10 06:56:22 [error] 46#46: *518 failed to run log_by_lua*: /usr/local/apisix/apisix/tracer.lua:80: bad argument #1 to 'ipairs' (table expected, got nil)
stack traceback:
[C]: in function 'ipairs'
/usr/local/apisix/apisix/tracer.lua:80: in function 'release'
/usr/local/apisix/apisix/init.lua:1106: in function 'http_log_phase'
log_by_lua(nginx.conf:316):2: in main chunk while logging request, client: 10.89.0.3, server: _, request: "GET /get HTTP/2.0", host: "localhost:9443", request_id: "19d856ac5aec32b28b24237ece910294"
10.89.0.3 - - [10/Apr/2026:06:56:19 +0000] localhost:9080 "GET / HTTP/1.1" 200 9593 0.036 "-" "curl/8.14.1" 10.89.0.2:80 200 0.027 "http://localhost:9080" "9a3b42b85727a63ea30e6c58b47b57af"
10.89.0.3 - - [10/Apr/2026:06:56:19 +0000] localhost:9443 "GET / HTTP/2.0" 200 9593 0.019 "-" "curl/8.14.1" 10.89.0.2:80 200 0.004 "http://localhost:9443" "615eccdffa91ed129ba17a6ffe325cb3"
10.89.0.3 - - [10/Apr/2026:06:56:22 +0000] localhost:9443 "GET /get HTTP/2.0" 200 222 0.034 "-" "curl/8.14.1" 10.89.0.2:80 200 0.017 "http://localhost:9443" "364e2ce25b63ad2ed2ca41a1bda2873e"
10.89.0.3 - - [10/Apr/2026:06:56:22 +0000] localhost:9443 "GET /get HTTP/2.0" 500 0 0.000 "-" "curl/8.14.1" - - - "http://localhost:9443" "19d856ac5aec32b28b24237ece910294
Steps to Reproduce
Unzip attached reproduce.zip, and run ./reproduce.sh.
jens@[de-stage]:~/reproduce$ ./reproduce.sh
==> Using: podman compose
==> Starting services...
>>>> Executing external compose provider "/usr/bin/podman-compose". Please see podman-compose(1) for how to disable this message. <<<<
Error: creating container storage: the container name "reproduce_upstream_1" is already in use by badffba2ace7ae731b452d84df9c8b5dee1fb1aae6656c0b0ebcdae2ca2360f3. You have to remove that container to be able to reuse that name: that name is already in use, or use --replace to instruct Podman to do so.
reproduce_upstream_1
Error: creating container storage: the container name "reproduce_apisix_1" is already in use by 99aa23e4f03d830eb172f0f126394589e3d9e79a8505cc39c4d46403c617b651. You have to remove that container to be able to reuse that name: that name is already in use, or use --replace to instruct Podman to do so.
reproduce_apisix_1
==> Waiting for APISIX HTTP (9080)...
==> Waiting for APISIX HTTPS (9443)...
════════════════════════════════════════════════════════════
Two requests on the same HTTPS keepalive connection
════════════════════════════════════════════════════════════
Sending two requests in one curl invocation (connection reuse)...
HTTP statuses: 200000
CRASH CONFIRMED
════════════════════════════════════════════════════════════
LOG ANALYSIS
════════════════════════════════════════════════════════════
span.lua errors:
2026/04/10 06:56:22 [error] 46#46: *518 lua entry thread aborted: runtime error: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
coroutine 0:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:693: in function 'http_access_phase'
access_by_lua(nginx.conf:282):2: in main chunk, client: 10.89.0.3, server: _, request: "GET /get HTTP/2.0", host: "localhost:9443", request_id: "19d856ac5aec32b28b24237ece910294"
2026/04/10 06:56:22 [error] 46#46: *518 failed to run header_filter_by_lua*: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:899: in function 'http_header_filter_phase'
header_filter_by_lua(nginx.conf:308):2: in main chunk, client: 10.89.0.3, server: _, request: "GET /get HTTP/2.0", host: "localhost:9443", request_id: "19d856ac5aec32b28b24237ece910294"
2026/04/10 06:56:22 [error] 46#46: *518 failed to run log_by_lua*: /usr/local/apisix/apisix/tracer.lua:80: bad argument #1 to 'ipairs' (table expected, got nil)
stack traceback:
--
2026/04/10 07:04:33 [error] 48#48: *68818 lua entry thread aborted: runtime error: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
coroutine 0:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:693: in function 'http_access_phase'
access_by_lua(nginx.conf:282):2: in main chunk, client: 10.89.0.3, server: _, request: "GET /get HTTP/2.0", host: "localhost:9443", request_id: "1ac4dd450ec2fc15a291d741865dafe4"
2026/04/10 07:04:33 [error] 48#48: *68818 failed to run header_filter_by_lua*: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
stack traceback:
[C]: in function 'insert'
/usr/local/apisix/apisix/utils/span.lua:62: in function 'new'
/usr/local/apisix/apisix/tracer.lua:53: in function 'start'
/usr/local/apisix/apisix/init.lua:899: in function 'http_header_filter_phase'
header_filter_by_lua(nginx.conf:308):2: in main chunk, client: 10.89.0.3, server: _, request: "GET /get HTTP/2.0", host: "localhost:9443", request_id: "1ac4dd450ec2fc15a291d741865dafe4"
2026/04/10 07:04:33 [error] 48#48: *68818 failed to run log_by_lua*: /usr/local/apisix/apisix/tracer.lua:80: bad argument #1 to 'ipairs' (table expected, got nil)
stack traceback:
lua runtime errors:
2026/04/10 06:56:22 [error] 46#46: *518 lua entry thread aborted: runtime error: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
2026/04/10 07:04:33 [error] 48#48: *68818 lua entry thread aborted: runtime error: /usr/local/apisix/apisix/utils/span.lua:62: bad argument #1 to 'insert' (table expected, got nil)
==> To clean up: podman compose down
Environment
- APISIX version (run apisix version):
/usr/local/openresty//luajit/bin/luajit ./apisix/cli/apisix.lua version
3.16.0
- Operating system (run uname -a):
Linux 99aa23e4f03d 6.17.0-20-generic #20-Ubuntu SMP PREEMPT_DYNAMIC Fri Mar 13 20:07:29 UTC 2026 x86_64 x86_64 x86_64 GNU/Linux
- OpenResty / Nginx version (run openresty -V or nginx -V):
nginx version: openresty/1.27.1.2
built by gcc 13.3.0 (Ubuntu 13.3.0-6ubuntu2~24.04.1)
built with OpenSSL 3.4.1 11 Feb 2025
TLS SNI support enabled
configure arguments: --prefix=/usr/local/openresty/nginx --with-cc-opt='-O2 -DAPISIX_RUNTIME_VER=1.3.3 -DNGX_LUA_ABORT_AT_PANIC -I/usr/local/openresty/zlib/include -I/usr/local/openresty/pcre/include -I/usr/local/openresty/openssl3/include' --add-module=../ngx_devel_kit-0.3.3 --add-module=../echo-nginx-module-0.63 --add-module=../xss-nginx-module-0.06 --add-module=../ngx_coolkit-0.2 --add-module=../set-misc-nginx-module-0.33 --add-module=../form-input-nginx-module-0.12 --add-module=../encrypted-session-nginx-module-0.09 --add-module=../srcache-nginx-module-0.33 --add-module=../ngx_lua-0.10.28 --add-module=../ngx_lua_upstream-0.07 --add-module=../headers-more-nginx-module-0.37 --add-module=../array-var-nginx-module-0.06 --add-module=../memc-nginx-module-0.20 --add-module=../redis2-nginx-module-0.15 --add-module=../redis-nginx-module-0.3.9 --add-module=../ngx_stream_lua-0.0.16 --with-ld-opt='-Wl,-rpath,/usr/local/openresty/luajit/lib -Wl,-rpath,/usr/local/openresty/wasmtime-c-api/lib -L/usr/local/openresty/zlib/lib -L/usr/local/openresty/pcre/lib -L/usr/local/openresty/openssl3/lib -Wl,-rpath,/usr/local/openresty/zlib/lib:/usr/local/openresty/pcre/lib:/usr/local/openresty/openssl3/lib' --add-module=/tmp/tmp.y0ZMBsRAqW/openresty-1.27.1.2/../mod_dubbo-1.0.2 --add-module=/tmp/tmp.y0ZMBsRAqW/openresty-1.27.1.2/../ngx_multi_upstream_module-1.3.2 --add-module=/tmp/tmp.y0ZMBsRAqW/openresty-1.27.1.2/../apisix-nginx-module-1.19.3 --add-module=/tmp/tmp.y0ZMBsRAqW/openresty-1.27.1.2/../apisix-nginx-module-1.19.3/src/stream --add-module=/tmp/tmp.y0ZMBsRAqW/openresty-1.27.1.2/../apisix-nginx-module-1.19.3/src/meta --add-module=/tmp/tmp.y0ZMBsRAqW/openresty-1.27.1.2/../wasm-nginx-module-0.7.0 --add-module=/tmp/tmp.y0ZMBsRAqW/openresty-1.27.1.2/../lua-var-nginx-module-v0.5.3 --add-module=/tmp/tmp.y0ZMBsRAqW/openresty-1.27.1.2/../lua-resty-events-0.2.0 --with-poll_module --with-pcre-jit --with-stream --with-stream_ssl_module --with-stream_ssl_preread_module --with-http_v2_module 
--with-http_v3_module --without-mail_pop3_module --without-mail_imap_module --without-mail_smtp_module --with-http_stub_status_module --with-http_realip_module --with-http_addition_module --with-http_auth_request_module --with-http_secure_link_module --with-http_random_index_module --with-http_gzip_static_module --with-http_sub_module --with-http_dav_module --with-http_flv_module --with-http_mp4_module --with-http_gunzip_module --with-threads --with-compat --with-stream --with-http_ssl_module
- etcd version, if relevant (run curl http://127.0.0.1:9090/v1/server_info): n/a
- APISIX Dashboard version, if relevant: n/a
- Plugin runner version, for issues related to plugin runners: n/a
- LuaRocks version, for installation issues (run luarocks --version): n/a