Zlib error for some requests using fetch #46359
I don't understand why you expect that. The error clearly indicates the response is truncated.
Sure, the error "clearly indicates the response is truncated". Is it true, though? See the evidence: if you open that URL in any browser, it displays the entire file, 100%, every time, no truncation. If you do that in Node.js:
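The original Node snippet isn't reproduced above. As a stand-in, here is a minimal sketch of such a call; the URL below is a made-up placeholder, not the real Firebase Auth address from the report:

```javascript
// Placeholder URL: the report fetches a gzipped JS file served by Firebase Auth.
const url = 'https://example.firebaseapp.com/some-script.js';

fetch(url)
  .then((res) => res.text()) // decompression happens while the body is consumed
  .then((text) => console.log('got', text.length, 'characters'))
  .catch((err) => console.error('failed:', err.cause ?? err));
```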
If all browsers and curl and Python are able to read the file, maybe there is a bug in Node.js somewhere...? I don't doubt that the server response for this file is kind of weird, but Firebase themselves ask you to configure a proxy redirect in nginx or HAProxy or some other way, so I would assume nginx and HAProxy and any other load balancer also understand this response and handle it properly. Yes, Firebase can be buggy sometimes (I am having my fair share of weird bugs with them), and I am in fact trying to do something weird (in an attempt to solve the aforementioned bugs), but at least give the Firebase devs a little bit of credit :)
@bnoordhuis to add to the previous comment, it seems like it's a
I think node is right to complain. You can test it yourself with curl:
That's exactly 256k, almost 12k short. I rest my case.
@bnoordhuis I now think this is not a problem exclusive to Node, but rather a problem in zlib that affects Node. Please wake your case back up. And by the way, I'm not doing this to irritate you: this issue demands time from me (and from you) to investigate or work around. I've definitely learned things here with you, so I'm up for further collaboration. As a Node.js developer, would you collaborate with me on this? Can we start fresh, without the skeptical/competitive language we've been using since the beginning? I'm also sorry for the language in my second response. I really think this is at least a curious problem, and it goes deeper than just "test it with curl". Let's see if you reach the same conclusion as me.

First, I want to take a different approach to at least convince you that something is wrong, because that has been the main challenge in this thread. You can skip this paragraph and go directly to the next ones if you just want the technical details, but there's an interesting implication here. Looking at the wider context, this issue is more serious than it looks if the Firebase Auth server's gzipped response really is wrong. Mind you, this is a widely used service, and this request in particular is for a JS file that runs in the browser. If the response were genuinely truncated, that would be a massive bug in Firebase Auth, one that engineers would already have been working on since before I opened this issue, and browsers would report that the server sent an incorrect response. But that isn't happening. Firebase Auth works (once you work around the SDK bugs). This weakens the hypothesis that the Firebase Auth server response is actually truncated, and as you'll see, browsers download the file just fine, gzipped.
And yes, the same argument can be made for Node: Node is perhaps even more widely used than Firebase, and if Node were doing the wrong thing, the impact would be equally massive. Here's where the rest of this issue comes in, and where I think Node is actually doing the correct thing, but zlib is not. Back to the evidence: yes, I can definitely see the same issue with curl. I also tried with HTTP/2, because I disabled HTTP/3 in Firefox and it's using HTTP/2 now, so this should be an apples-to-apples comparison:
OK, curl and Node might have the same problem. Also, the HTTP version does not seem to influence the issue: it fails on both. But again, let's compare with other implementations. This is Python, with the accept-encoding: gzip header that I missed previously.
Interesting. The content-length is 85989 just like curl. And it works. Is it the same actual contents, though? Maybe there's an invalid character in this response? Maybe curl and node are downloading like, 1 corrupted byte for some reason? Maybe there's an extra
OK, the compressed responses are the same: no weird corrupted download. And as you could see in the first Python requests example, it decompresses all 274237 bytes instead of hitting a truncation error. Just for the sake of more evidence: if browsers couldn't decompress this file correctly, it would be a massive issue for the Firebase engineers. But here's Firefox (I disabled HTTP/3, just in case). Other browsers also work just fine. After all of this back-and-forth with you, plus gathering more evidence, I conclude that the HTTP layer of Node is probably working fine, just like curl's. But for some reason, some implementations cannot decompress the response while others can. I also tested the https://extract.me/ service, which returns a truncated result. Interesting, no? So it really seems to me like an issue in zlib rather than in undici, as I first reported. Regardless, I think it's in the best interest of Node.js to correctly decompress these contents.
For what it's worth, Java's implementation also does not work, but in a weird way.
This prints the entire file instead of truncating it. A similar example in C# works without errors. It seems unpredictable which implementations work and which don't, so maybe there's something Firebase can do on their side to slightly change the byte stream so that it works more generally. They tested it in browsers, and by some miracle, all browser implementations accept that stream. I'll open an issue with the Firebase team when I have time, but this still doesn't mean Node or Deno or whoever shouldn't take a look; there might be acceptable payloads that don't work.
The zlib developer confirms that this stream is not entirely correct. It's correct enough for browsers to interpret it (i.e. the data is there; there's no 12kb missing anywhere), but the ending of the stream has some issues. I opened a bug report with the Firebase team. The fact is that some libraries just roll with it instead of throwing an error, including the ones used in browsers. Sadly (or maybe thankfully?) zlib is not one of those. Since Node.js uses zlib and this is how zlib rolls, and I have a somewhat effective workaround, I'm closing the issue, with some disappointment in how this was handled.
Can you elaborate? I'm not sure I understand what your complaint is. |
Version
v18.13.0
Platform
Darwin MacBook-Pro-de-Ricardo.local 22.2.0 Darwin Kernel Version 22.2.0: Fri Nov 11 02:03:51 PST 2022; root:xnu-8792.61.2~4/RELEASE_ARM64_T6000 arm64
Subsystem
Undici
What steps will reproduce the bug?
Put this in a file and just run it.
The bug also happens if you do it in the node REPL console.
How often does it reproduce? Is there a required condition?
Always
What is the expected behavior?
That it prints the entire script downloaded from the URL
What do you see instead?
Additional information
To be completely honest, I am indeed trying to do something a little bit cheeky. In an attempt to solve a bunch of issues I'm having with Firebase, I found that I could proxy the Firebase Auth request through my domain and make the request myself. Since I'll be deploying on Vercel, I don't have nginx or anything like that, so I'm doing it in application code.
One of the proxied requests is for a javascript file, but all requests in this proxying mechanism are failing.
I even tried using
response.body.getReader()
but that results in the same error when you keep reading the file. If I just read once, I can print a small slice of the file, but not the entire file. I also tried previous Node versions, and even newer ones (Node 19), without success.
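A sketch of the getReader() loop described above, run against a locally constructed Response so it works without network access (with the real broken stream, a later read() in this loop is where the error surfaces):

```javascript
// Response is global in Node 18+; this body is a stand-in for the real file.
const res = new Response('not the real firebase file, just sample bytes');

(async () => {
  const reader = res.body.getReader();
  let received = 0;
  for (;;) {
    // With the broken stream from the report, a read() mid-file rejects here.
    const { done, value } = await reader.read();
    if (done) break;
    received += value.length;
  }
  console.log('read', received, 'bytes');
})();
```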