TODO: Support Multiple Content-Encodings #2002

Closed
Contributor

danielbankhead commented Oct 20, 2017

RFC 7231 Section 3.1.2.2 allows multiple encodings for a single request. Adding this feature may reduce bandwidth usage and would promote a more resource-friendly web. Currently, Chrome and Firefox support multiple encodings for requests.

For reference, here are a few examples:

Content-Encoding: gzip, identity
Content-Encoding: deflate, gzip
Content-Encoding: gzip, gzip

While I'm not well-versed in C, I wrote an implementation in JavaScript for a Node.js module, hopefully it helps!

@bagder bagder added the HTTP label Oct 25, 2017

Owner

bagder commented Oct 25, 2017

Adding this feature may result in lower bandwidth

Can you show me a single resource anywhere that gets smaller by using multiple encodings? Multiple ones don't help, they just complicate matters. But since the browsers support this, I think curl should too. I don't consider it terribly important though as I've never seen such a resource and it can still be fixed after the fact on downloaded content.

Contributor

danielbankhead commented Oct 25, 2017

Can you show me a single resource anywhere that gets smaller by using multiple encodings?

The home pages of Facebook, Yahoo, Twitter, and YouTube would all take less bandwidth when compressed with gzip, gzip than with gzip or deflate alone (calculated using the body of the response). Generally, larger text-based resources are candidates for multiple content-encodings.

they just complicate matters

I totally agree here; it does make requests and debugging a bit more complicated.

I don't consider it terribly important though as I've never seen such a resource

I agree here as well; it's not at all a pressing issue. I believe we don't see multiple encodings in the wild for the following reasons:

  • very few request libraries currently support them
  • implementation can be tricky for both web servers and request libraries

If we fix the first issue, more web servers will be able to take advantage. One way web servers can take advantage is by determining the right amount of compression beforehand, then sending the response that would provide the best results.

Owner

bagder commented Oct 28, 2017

The home pages of Facebook, Yahoo, Twitter, and YouTube would all take up less bandwidth when compressed with gzip, gzip

How on earth does that make sense? Why would compressing gzip again - with gzip - make anything smaller?

Besides, I wasn't asking for that. I was asking for existing web resources that use double-encoding.

Personally, I think adding support for brotli is a way better investment in time and energy and will make even better compression utilized...

@bagder bagder closed this in 1d0c8de Oct 28, 2017

Owner

bagder commented Oct 28, 2017

Merged, thanks!

monnerat added a commit that referenced this pull request Nov 5, 2017

HTTP: support multiple Content-Encodings
This is implemented as an output streaming stack of unencoders, the last
calling the client write procedure.

New test 230 checks this feature.

Bug: #2002
Reported-By: Daniel Bankhead