Hi Connect Creators,
Right now, compress.js doesn't seem to adhere to the HTTP spec when reading the 'Accept-Encoding' header (unless I'm completely mistaken). For example, if a request carries an Accept-Encoding header of "gzip;q=0", the browser is telling the server that it does not want any gzipped responses, yet this server still returns a gzipped file.
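To illustrate the behavior I'd expect, here's a minimal sketch (not connect's actual code) of parsing the quality values in Accept-Encoding and deciding whether gzip is acceptable; the function name is made up for illustration:

```javascript
// Sketch: parse "gzip;q=0, deflate;q=0.5"-style Accept-Encoding values
// and report whether the client will accept a gzipped response.
function gzipAccepted(acceptEncoding) {
  if (acceptEncoding == null) return false; // header absent: don't gzip
  var q = {};
  acceptEncoding.split(',').forEach(function (part) {
    var bits = part.trim().split(';');
    var coding = bits[0].trim().toLowerCase();
    var quality = 1; // default quality when no q= parameter is given
    for (var i = 1; i < bits.length; i++) {
      var m = /^\s*q\s*=\s*([0-9.]+)\s*$/.exec(bits[i]);
      if (m) quality = parseFloat(m[1]);
    }
    q[coding] = quality;
  });
  // q=0 means "explicitly not acceptable" per the spec
  if ('gzip' in q) return q.gzip > 0;
  if ('*' in q) return q['*'] > 0;
  return false;
}

console.log(gzipAccepted('gzip;q=0'));      // false
console.log(gzipAccepted('gzip, deflate')); // true
```

Under these rules, "gzip;q=0" would make the middleware fall back to an uncompressed response.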
Is compress.js planned to be made spec-compliant at some point, or will we need to modify it / write our own zlib-based connect middleware to make this happen?
IMO quality values are flawed (same goes for Accept etc) but I'm fine with supporting them, let me know if you want to submit a patch or I can whip up a utility when I have time
I guess I don't know too much about how browsers that can't handle gzip behave. I'm just trying to cover all my bases: if the app I'm working on becomes popular and a browser specifically requests non-gzipped files because it can't gunzip them, I wouldn't want that user to be unable to view the page.
I realize that it isn't that large an issue anymore, but apparently around 10% of the internet's users are still on really bad browsers... do you think it's something that's necessary?
not that it's not a concern, but I've never seen a client send q=0, they just don't set it at all
ah ok. i'll trust that you know more about this than i do. if my site picks up traffic and i end up noticing issues of this sort, i'd be happy to write a patch for connect that addresses the problem.
a spec is a spec, but in practice i don't think you'll run into it much, but certainly let me know if you do come across any cases
Changed accept-encoding header to identity
The compress module on the node connect server doesn't seem to honor q=0 on gzip, so to make the proxy work with connect, I had to change the Accept-Encoding header to "identity". See senchalabs/connect#414