I am currently experiencing an error with curl when running the haproxy compression reg-test suite. We use curl as an HTTP client with the options --compressed --limit-rate 300K. Curl cannot decompress the output and returns various errors such as "Error while processing content unencoding: invalid block type". I do not encounter the issue with wget.
To reproduce the issue, I'm using haproxy compiled with Lua support, and curl.
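As an optional sanity check, haproxy -vv lists the build options, so you can confirm Lua support in the binary before running the reproducer:

$ haproxy -vv | grep -i lua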
Here is the haproxy config file, which I named compression.conf. The directives relevant to the issue are:
compression algo gzip
compression type text/plain
server big_payload_hap 127.0.0.1:20081
http-request use-service lua.fileloader-http01
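For context, a complete configuration built around these directives could look roughly like the sketch below. The section names, timeouts, and bind addresses (20080 for the compressing frontend, 20081 for the Lua service) are assumptions inferred from the server line and from the curl command further down, not taken verbatim from the original setup.

global
    lua-load lua_validation.lua

defaults
    mode http
    timeout connect 5s
    timeout client 30s
    timeout server 30s

# section names, bind ports and timeouts below are assumptions, not verbatim from the report
frontend main-fe
    bind 127.0.0.1:20080
    compression algo gzip
    compression type text/plain
    default_backend default-be

backend default-be
    server big_payload_hap 127.0.0.1:20081

frontend fileloader
    bind 127.0.0.1:20081
    http-request use-service lua.fileloader-http01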
I use a Lua script to generate a big payload. Place it in the same directory as the haproxy config, with the name 'lua_validation.lua'.
local data = "abcdefghijklmnopqrstuvwxyz"
local responseblob = ""
for i = 1,10000 do
  responseblob = responseblob .. "\r\n" .. i .. data:sub(1, math.floor(i % 27))
end
http01applet = function(applet)
  local response = responseblob
  applet:set_status(200)
  applet:add_header("content-type", "text/plain") -- must match "compression type" in the config so haproxy compresses the response
  applet:add_header("content-length", string.len(response) * 10) -- the body is sent 10 times below
  applet:start_response()
  for i = 1,10 do
    applet:send(response)
  end
end
core.register_service("fileloader-http01", "http", http01applet)
Run haproxy:
$ haproxy -db -f compression.conf
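Before running the reproducer, you can sanity-check the Lua service on its own, bypassing compression. This assumes the service is exposed directly on 127.0.0.1:20081, as in the config sketch above:

$ curl -s http://127.0.0.1:20081/ | head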
Then run the curl client to reproduce the error:
$ curl -o /dev/null --compressed --limit-rate 300K http://127.0.0.1:20080/
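For comparison, the same transfer through wget completes without a decoding error. One way to run an equivalent compressed, rate-limited download (a sketch; it assumes a wget recent enough to support --compression, roughly 1.19.2 or newer, built with zlib):

$ wget --compression=gzip --limit-rate=300k -O /dev/null http://127.0.0.1:20080/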
For what it's worth, I have bisected the curl repository (I cannot reproduce the issue with 7.74). The issue seems to be introduced by the following commit:
commit b68dc34af341805aeb7b371541a2b4074da76217 (HEAD, refs/bisect/bad)
multi: set the PRETRANSFER time-stamp when we switch to PERFORM
For reference, the commit message of the eventual fix describes the underlying problem:

... since the state machine might go to RATELIMITING and then back to
PERFORMING doing once-per-transfer inits in that function is wrong and
it caused problems with receiving chunked HTTP and it set the
PRETRANSFER time much too often...
Regression from b68dc34 (shipped in 7.75.0)
Reported-by: Amaury Denoyelle
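The bisect can be reproduced roughly as follows (a sketch, not the exact commands used; it assumes the curl release tags curl-7_74_0 and curl-7_75_0, an autotools build without TLS, which is enough for this plain-HTTP reproducer, and zlib development headers installed so that --compressed is supported):

$ git clone https://github.com/curl/curl.git && cd curl
$ git bisect start
$ git bisect bad curl-7_75_0
$ git bisect good curl-7_74_0
# at each bisect step, rebuild and run the reproducer against the haproxy instance above
$ ./buildconf && ./configure --without-ssl && make -j
$ ./src/curl -o /dev/null --compressed --limit-rate 300K http://127.0.0.1:20080/
# then mark the result accordingly
$ git bisect good   # or: git bisect bad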