
bug: http-logger Request Body is Empty #12781

@hlg212

Description

Current Behavior

Version: apache/apisix:3.14.0-debian

I added a global http-logger plugin to collect req_body and resp_body. However, in the log entries received by the log-collecting backend, req_body is always empty, while resp_body is captured correctly.

I then switched the logging plugin to kafka-logger, and testing showed that req_body was captured as expected.

Here is how I verified the issue:
I added debug log statements to log-util.lua and mounted the modified file over the copy inside the image (see the volume mount in docker-compose.yaml below). Specifically, inside the get_request_body function I added core.log.warn("get_request_body,req_body: ", req_body). After making a request, I checked the APISIX error log: with http-logger the printed req_body was nil, whereas with kafka-logger it printed the request body as expected.

docker-compose.yaml

version: "3"

services:
  apisix:
    image: apache/apisix:${APISIX_IMAGE_TAG:-3.14.0-debian}
    restart: always
    volumes:
      - ./apisix_conf/config.yaml:/usr/local/apisix/conf/config.yaml:ro
      - ./apisix-master/apisix/utils/log-util.lua:/usr/local/apisix/apisix/utils/log-util.lua:ro
    depends_on:
      - etcd
    ##network_mode: host
    ports:
      - "9180:9180/tcp"
      - "9080:9080/tcp"
      - "9091:9091/tcp"
      - "9443:9443/tcp"
    networks:
      apisix:

  etcd:
    image: artifact.srdcloud.cn/public-docker-virtual/bitnami/etcd:3.5.11
    restart: always
    volumes:
      - etcd_data:/bitnami/etcd
    environment:
      ETCD_ENABLE_V2: "true"
      ALLOW_NONE_AUTHENTICATION: "yes"
      ETCD_ADVERTISE_CLIENT_URLS: "http://etcd:2379"
      ETCD_LISTEN_CLIENT_URLS: "http://0.0.0.0:2379"
    ports:
      - "2379:2379/tcp"
    networks:
      apisix:
  apisix_dashboard:
    image: apache/apisix-dashboard:latest
    restart: always
    ports:
      - "9000:9000"
    volumes:
      - "./dashboard_conf/conf.yaml:/usr/local/apisix-dashboard/conf/conf.yaml"
    networks:
      apisix:
networks:
  apisix:
    driver: bridge

volumes:
  etcd_data:
    driver: local

http-logger configuration:

{
  "_meta": {
    "disable": false
  },
  "include_req_body": true,
  "include_resp_body": true,
  "uri": "http://mockbin:8580/bin/b3feb263-77e0-468e-bbb2-71a7900c93f8"
}

kafka-logger configuration:

{
  "_meta": {
    "disable": false
  },
  "brokers": [
    {
      "host": "kafka",
      "port": 31423
    }
  ],
  "include_req_body": true,
  "include_resp_body": true,
  "kafka_topic": "ability-logs",
  "key": "key1"
}
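
For reference, here is a sketch of applying the same logger as a global rule through the Admin API instead of the Dashboard. The admin port 9180 matches the docker-compose.yaml above; the X-API-KEY value shown is APISIX's default admin key and should be replaced with the one from your config.yaml:

curl -X PUT 'http://127.0.0.1:9180/apisix/admin/global_rules/1' \
  -H 'X-API-KEY: edd1c9f034335f136f87ad84b625c8f1' \
  -d '{
    "plugins": {
      "http-logger": {
        "uri": "http://mockbin:8580/bin/b3feb263-77e0-468e-bbb2-71a7900c93f8",
        "include_req_body": true,
        "include_resp_body": true
      }
    }
  }'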

log-util.lua:

--
-- Licensed to the Apache Software Foundation (ASF) under one or more
-- contributor license agreements.  See the NOTICE file distributed with
-- this work for additional information regarding copyright ownership.
-- The ASF licenses this file to You under the Apache License, Version 2.0
-- (the "License"); you may not use this file except in compliance with
-- the License.  You may obtain a copy of the License at
--
--     http://www.apache.org/licenses/LICENSE-2.0
--
-- Unless required by applicable law or agreed to in writing, software
-- distributed under the License is distributed on an "AS IS" BASIS,
-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-- See the License for the specific language governing permissions and
-- limitations under the License.
--
local core = require("apisix.core")
local plugin = require("apisix.plugin")
local expr = require("resty.expr.v1")
local content_decode = require("apisix.utils.content-decode")
local ngx = ngx
local pairs = pairs
local ngx_now = ngx.now
local ngx_header = ngx.header
local os_date = os.date
local str_byte = string.byte
local str_sub  = string.sub
local math_floor = math.floor
local ngx_update_time = ngx.update_time
local req_get_body_data = ngx.req.get_body_data
local is_http = ngx.config.subsystem == "http"
local req_get_body_file = ngx.req.get_body_file
local MAX_REQ_BODY      = 524288      -- 512 KiB
local MAX_RESP_BODY     = 524288      -- 512 KiB
local io                = io

local lru_log_format = core.lrucache.new({
    ttl = 300, count = 512
})

local _M = {}


local function get_request_body(max_bytes)
    local req_body = req_get_body_data()
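    -- debug log added for this issue report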
    core.log.warn("get_request_body,req_body: ", req_body)

    if req_body then
        if max_bytes and #req_body >= max_bytes then
            req_body = str_sub(req_body, 1, max_bytes)
        end
        return req_body
    end

    local file_name = req_get_body_file()
    if not file_name then
        return nil
    end

    core.log.info("attempt to read body from file: ", file_name)

    local f, err = io.open(file_name, 'r')
    if not f then
        return nil, "fail to open file " .. err
    end

    req_body = f:read(max_bytes)
    f:close()

    return req_body
end


local function gen_log_format(format)
    local log_format = {}
    for k, var_name in pairs(format) do
        if var_name:byte(1, 1) == str_byte("$") then
            log_format[k] = {true, var_name:sub(2)}
        else
            log_format[k] = {false, var_name}
        end
    end
    core.log.info("log_format: ", core.json.delay_encode(log_format))
    return log_format
end


local function get_custom_format_log(ctx, format, max_req_body_bytes)
    local log_format = lru_log_format(format or "", nil, gen_log_format, format)
    local entry = core.table.new(0, core.table.nkeys(log_format))
    for k, var_attr in pairs(log_format) do
        if var_attr[1] then
            local key = var_attr[2]
            if key == "request_body" then
                local max_req_body_bytes = max_req_body_bytes or MAX_REQ_BODY
                local req_body, err = get_request_body(max_req_body_bytes)
                if err then
                    core.log.error("fail to get request body: ", err)
                else
                    entry[k] = req_body
                end
            else
                entry[k] = ctx.var[var_attr[2]]
            end
        else
            entry[k] = var_attr[2]
        end
    end

    local matched_route = ctx.matched_route and ctx.matched_route.value
    if matched_route then
        entry.service_id = matched_route.service_id
        entry.route_id = matched_route.id
    end
    return entry
end
-- export the log getter so we can mock in tests
_M.get_custom_format_log = get_custom_format_log


-- for test
function _M.inject_get_custom_format_log(f)
    get_custom_format_log = f
    _M.get_custom_format_log = f
end


local function latency_details_in_ms(ctx)
    local latency = (ngx_now() - ngx.req.start_time()) * 1000
    local upstream_latency, apisix_latency = nil, latency

    if ctx.var.upstream_response_time then
        upstream_latency = ctx.var.upstream_response_time * 1000
        apisix_latency = apisix_latency - upstream_latency

        -- The latency might be negative, as Nginx uses different time measurements in
        -- different metrics.
        -- See https://github.com/apache/apisix/issues/5146#issuecomment-928919399
        if apisix_latency < 0 then
            apisix_latency = 0
        end
    end

    return latency, upstream_latency, apisix_latency
end
_M.latency_details_in_ms = latency_details_in_ms


local function get_full_log(ngx, conf)
    local ctx = ngx.ctx.api_ctx
    local var = ctx.var
    local service_id
    local route_id
    local url = var.scheme .. "://" .. var.host .. ":" .. var.server_port
                .. var.request_uri
    local matched_route = ctx.matched_route and ctx.matched_route.value

    if matched_route then
        service_id = matched_route.service_id or ""
        route_id = matched_route.id
    else
        service_id = var.host
    end

    local consumer
    if ctx.consumer then
        consumer = {
            username = ctx.consumer.username
        }
    end

    local latency, upstream_latency, apisix_latency = latency_details_in_ms(ctx)

    local log =  {
        request = {
            url = url,
            uri = var.request_uri,
            method = ngx.req.get_method(),
            headers = ngx.req.get_headers(),
            querystring = ngx.req.get_uri_args(),
            size = var.request_length
        },
        response = {
            status = ngx.status,
            headers = ngx.resp.get_headers(),
            size = var.bytes_sent
        },
        server = {
            hostname = core.utils.gethostname(),
            version = core.version.VERSION
        },
        upstream = var.upstream_addr,
        service_id = service_id,
        route_id = route_id,
        consumer = consumer,
        client_ip = core.request.get_remote_client_ip(ngx.ctx.api_ctx),
        start_time = ngx.req.start_time() * 1000,
        latency = latency,
        upstream_latency = upstream_latency,
        apisix_latency = apisix_latency
    }

    if conf.include_resp_body then
        log.response.body = ctx.resp_body
    end
    core.log.warn("include_req_body: ",include_req_body)
    if conf.include_req_body then

        local log_request_body = true
         core.log.warn("log_request_body: true")
        if conf.include_req_body_expr then

            if not conf.request_expr then
                local request_expr, err = expr.new(conf.include_req_body_expr)
                if not request_expr then
                    core.log.error('generate request expr err ' .. err)
                    return log
                end
                conf.request_expr = request_expr
            end

            local result = conf.request_expr:eval(ctx.var)

            if not result then
                log_request_body = false
            end
        end
        core.log.warn("alter log_format: ", log_request_body)
        if log_request_body then
            local max_req_body_bytes = conf.max_req_body_bytes or MAX_REQ_BODY
            local body, err = get_request_body(max_req_body_bytes)
            if err then
                core.log.error("fail to get request body: ", err)
                return
            end
            log.request.body = body
            core.log.warn("body : ", body)
        end
    end

    return log
end
_M.get_full_log = get_full_log


-- for test
function _M.inject_get_full_log(f)
    get_full_log = f
    _M.get_full_log = f
end


local function is_match(match, ctx)
    local match_result
    for _, m in pairs(match) do
        local expr, _ = expr.new(m)
        match_result = expr:eval(ctx.var)
        if match_result then
            break
        end
    end

    return match_result
end


function _M.get_log_entry(plugin_name, conf, ctx)
    -- If the "match" configuration is set and the matching conditions are not met,
    -- then do not log the message.
    if conf.match and not is_match(conf.match, ctx) then
        return
    end

    local metadata = plugin.plugin_metadata(plugin_name)
    core.log.info("metadata: ", core.json.delay_encode(metadata))

    local entry
    local customized = false

    local has_meta_log_format = metadata and metadata.value.log_format
        and core.table.nkeys(metadata.value.log_format) > 0

    if conf.log_format or has_meta_log_format then
        customized = true
        entry = get_custom_format_log(ctx, conf.log_format or metadata.value.log_format,
                                      conf.max_req_body_bytes)
    else
        if is_http then
            entry = get_full_log(ngx, conf)
        else
            -- get_full_log doesn't work in stream
            core.log.error(plugin_name, "'s log_format is not set")
        end
    end

    if ctx.llm_summary then
        entry.llm_summary = ctx.llm_summary
    end
    if ctx.llm_request then
        entry.llm_request = ctx.llm_request
    end
    if ctx.llm_response_text then
        entry.llm_response_text = ctx.llm_response_text
    end
    return entry, customized
end


function _M.get_req_original(ctx, conf)
    local data = {
        ctx.var.request, "\r\n"
    }
    for k, v in pairs(ngx.req.get_headers()) do
        core.table.insert_tail(data, k, ": ", v, "\r\n")
    end
    core.table.insert(data, "\r\n")

    if conf.include_req_body then
        local max_req_body_bytes = conf.max_req_body_bytes or MAX_REQ_BODY
        local req_body = get_request_body(max_req_body_bytes)
        core.table.insert(data, req_body)
    end

    return core.table.concat(data, "")
end


function _M.check_log_schema(conf)
    if conf.include_req_body_expr then
        local ok, err = expr.new(conf.include_req_body_expr)
        if not ok then
            return nil, "failed to validate the 'include_req_body_expr' expression: " .. err
        end
    end
    if conf.include_resp_body_expr then
        local ok, err = expr.new(conf.include_resp_body_expr)
        if not ok then
            return nil, "failed to validate the 'include_resp_body_expr' expression: " .. err
        end
    end
    return true, nil
end


function _M.collect_body(conf, ctx)
    if conf.include_resp_body then
        local log_response_body = true

        if conf.include_resp_body_expr then
            if not conf.response_expr then
                local response_expr, err = expr.new(conf.include_resp_body_expr)
                if not response_expr then
                    core.log.error('generate response expr err ' .. err)
                    return
                end
                conf.response_expr = response_expr
            end

            if ctx.res_expr_eval_result == nil then
                ctx.res_expr_eval_result = conf.response_expr:eval(ctx.var)
            end

            if not ctx.res_expr_eval_result then
                log_response_body = false
            end
        end

        if log_response_body then
            local max_resp_body_bytes = conf.max_resp_body_bytes or MAX_RESP_BODY

            if ctx._resp_body_bytes and ctx._resp_body_bytes >= max_resp_body_bytes then
                return
            end
            local final_body = core.response.hold_body_chunk(ctx, true, max_resp_body_bytes)
            if not final_body then
                return
            end

            local response_encoding = ngx_header["Content-Encoding"]
            if not response_encoding then
                ctx.resp_body = final_body
                return
            end

            local decoder = content_decode.dispatch_decoder(response_encoding)
            if not decoder then
                core.log.warn("unsupported compression encoding type: ",
                              response_encoding)
                ctx.resp_body = final_body
                return
            end

            local decoded_body, err = decoder(final_body)
            if err ~= nil then
                core.log.warn("try decode compressed data err: ", err)
                ctx.resp_body = final_body
                return
            end

            ctx.resp_body = decoded_body
        end
    end
end


function _M.get_rfc3339_zulu_timestamp(timestamp)
    ngx_update_time()
    local now = timestamp or ngx_now()
    local second = math_floor(now)
    local millisecond = math_floor((now - second) * 1000)
    return os_date("!%Y-%m-%dT%T.", second) .. core.string.format("%03dZ", millisecond)
end


return _M

Expected Behavior

No response

Error Logs

http-logger log:

2025/11/27 09:34:50 [warn] 54#54: *498217 [lua] log-util.lua:206: get_full_log(): include_req_body: nil while logging request, client: 172.31.0.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9080"

2025/11/27 09:34:50 [warn] 54#54: *498217 [lua] log-util.lua:210: get_full_log(): log_request_body: true while logging request, client: 172.31.0.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9080"

2025/11/27 09:34:50 [warn] 54#54: *498217 [lua] log-util.lua:228: get_full_log(): alter log_format: true while logging request, client: 172.31.0.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9080"

2025/11/27 09:34:50 [warn] 54#54: *498217 [lua] log-util.lua:46: get_request_body(): get_request_body,req_body: nil while logging request, client: 172.31.0.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9080"

2025/11/27 09:34:50 [warn] 54#54: *498217 [lua] log-util.lua:237: get_full_log(): body : nil while logging request, client: 172.31.0.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9080"

kafka-logger log:

2025/11/27 09:36:11 [warn] 63#63: *506672 [lua] log-util.lua:206: get_full_log(): include_req_body: nil while logging request, client: 172.31.0.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9080"

2025/11/27 09:36:11 [warn] 63#63: *506672 [lua] log-util.lua:210: get_full_log(): log_request_body: true while logging request, client: 172.31.0.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9080"

2025/11/27 09:36:11 [warn] 63#63: *506672 [lua] log-util.lua:228: get_full_log(): alter log_format: true while logging request, client: 172.31.0.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9080"

2025/11/27 09:36:11 [warn] 63#63: *506672 [lua] log-util.lua:46: get_request_body(): get_request_body,req_body: {
"a111":"c222"
} while logging request, client: 172.31.0.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9080"

2025/11/27 09:36:11 [warn] 63#63: *506672 [lua] log-util.lua:237: get_full_log(): body : {
"a111":"c222"
} while logging request, client: 172.31.0.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9080"

172.31.0.1 - - [27/Nov/2025:09:36:11 +0000] localhost:9080 "POST /header/test HTTP/1.1" 200 14 0.000 "-" "PostmanRuntime/7.29.2" - - - "http://localhost:9080"

Steps to Reproduce

1. Run docker-compose up -d
2. Access the APISIX Dashboard at http://localhost:9000/
3. Configure a global plugin in the Plugin section on the left
4. Make a request to your route and observe the logs printed by APISIX (see the example command below)
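
To filter the APISIX logs for the debug output added to log-util.lua, something like the following can be used (assuming the compose service is named apisix, as in the docker-compose.yaml above):

docker-compose logs apisix | grep "get_request_body"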

Environment

  • APISIX version (run apisix version):
    apisix:3.14.0-debian
  • Operating system (run uname -a):
    Windows 11 (Docker Desktop)
  • OpenResty / Nginx version (run openresty -V or nginx -V):
  • etcd version, if relevant (run curl http://127.0.0.1:9090/v1/server_info):
    bitnami/etcd:3.5.11
  • APISIX Dashboard version, if relevant:
    apache/apisix-dashboard:latest

Example

After unzipping apisix-example-log.zip, navigate to the directory and execute docker-compose up -d.
Note: The bitnami/etcd:3.5.11 image might need to be replaced with a mirror from an internal repository.
apisix-example-log.zip
Access the APISIX Dashboard (admin/admin) at http://localhost:9200/plugin/list. Here, I have configured two loggers: http-logger and kafka-logger.

Test request:

curl --location --request POST 'http://localhost:9280/header/test' \
--header 'log-save-req-body: true' \
--header 'log-save-resp-body: true' \
--header 'Content-Type: application/json' \
--data-raw '{
"a111":"c222"
}'

Then check the APISIX logs, and you'll see content similar to the following: body: nil

2025/12/01 02:02:07 [warn] 60#60: *44057 [lua] log-util.lua:237: get_full_log(): body : nil while logging request, client: 192.168.160.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9280"

Disable the http-logger and switch to the kafka-logger, and you'll find that the body contains data:

2025/12/01 02:02:54 [warn] 60#60: *44057 [lua] log-util.lua:237: get_full_log(): body : {
"a111":"c222"
} while logging request, client: 192.168.160.1, server: _, request: "POST /header/test HTTP/1.1", host: "localhost:9280"

Labels: question (label for questions asked by users)
Status: ✅ Done