Handling of a-gzip compressed content #77
Comments
What do the transfer encoding headers look like then?
The relevant headers look like these:
I have a working solution where before piping the response to …
oh wow, this is... odd. Apple is not doing HTTP correctly unless this is double-compressed, which I doubt. Typically, … As far as … This is a strange one.
I fed it to the same function that handles gzip and it works just fine, but I was also puzzled to see that it does not match any well-known encoding. That is why, rather than adding agzip to the list of encodings known to bent, I'd rather have bent return the raw buffer to the client and let the client deal with it. FWIW, it is not the first time Apple has blatantly misused standard protocols... nor, alas, will it be the last...
wow, strange. closing for now.
For anyone else who comes here -- Apple is, indeed, double-compressing the data.
I am trying to retrieve data using Apple's App Store Connect API, and it returns data using a-gzip compression. zlib is capable of handling it, but since the encoding name is not exactly gzip, bent throws an error (which, by the way, is not caught despite the call being inside a try block in my code).
I see two paths to solving this problem: one is to add a key 'agzip' to the compression object defined in nodejs.js, pointing to the same function as 'gzip'. Alternatively, change the behaviour so that the getResponse method does not try to decompress the content when the encoding/compression is not known (my favourite option).
I can provide a pull request for either solution, but I would leave the choice of which path to follow to @mikeal