Decompressing gzip stream from node-fetch response leads to different output in Deno vs Node #20456
Interestingly, if you remove the …
It's also interesting that this does not happen on Linux (Ubuntu 22).
Aha, so the problem is in … Removing …
It looks like the broken bytes are only at the beginning of the data. In the case of …
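One way to confirm that only the leading bytes differ is to find the first index where the good and bad outputs diverge. A minimal sketch (the `firstDiff` helper is illustrative, not from the thread):

```js
// Returns the first byte offset where two buffers differ, or -1 if equal.
// If only the beginning of the data is corrupted, the good and bad outputs
// should diverge at a small offset and match from some point onward.
function firstDiff(a, b) {
  const n = Math.min(a.length, b.length);
  for (let i = 0; i < n; i++) {
    if (a[i] !== b[i]) return i;
  }
  return a.length === b.length ? -1 : n;
}
```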
If I concat all chunks from the gunzip stream myself and hash those as well:

```js
import crypto from "node:crypto";
import fs from "node:fs";
import zlib from "node:zlib";
import { pipeline } from "node:stream";
import { promisify } from "node:util";
import fetch from "node-fetch";
import { Buffer } from "node:buffer";

const streamPipeline = promisify(pipeline);

try {
  fs.unlinkSync("./schema-engine.exe");
} catch {}

const response = await fetch("https://binaries.prisma.sh/all_commits/2804dc98259d2ea960602aca6b8e7fdc03c1758f/windows/schema-engine.exe.gz");
const gunzip = zlib.createGunzip();
streamPipeline(response.body, gunzip);

const output = fs.createWriteStream("./schema-engine.exe");
const writeStream = gunzip.pipe(output);
writeStream.on("close", () => {
  const writtenFile = fs.readFileSync("./schema-engine.exe");
  console.log("Expected: d71565ea5e98b3cbbced66f4220d62bd221fa5cebd038d2875d66c83b29643c6");
  console.log("Actual: " + sha256(writtenFile));
  console.log("sha256(unzipped data)", sha256(gunzippedBuf));
});

let gunzippedBuf = Buffer.alloc(0);
gunzip.on("data", (d) => {
  gunzippedBuf = Buffer.concat([gunzippedBuf, d]);
});

function sha256(b) {
  return crypto.createHash("sha256").update(b).digest("hex");
}
```

This prints: …
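For comparison, the more idiomatic way to wire this up would be a single pipeline call (a sketch that replaces the split pipe above; the repro splits it deliberately so it can hash the gunzip chunks separately):

```js
// Single pipeline: fetch body -> gunzip -> file, with backpressure and
// error propagation handled end to end.
await streamPipeline(
  response.body,
  zlib.createGunzip(),
  fs.createWriteStream("./schema-engine.exe"),
);
```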
I think there's something wrong with …
I found a smaller reproduction, which doesn't depend on either the gunzip stream or node-fetch:

```js
import fs from "node:fs";

const writeStream = fs.createWriteStream("out.txt");
for (const i of Array(20).keys()) {
  writeStream.write(i + "\n");
  await new Promise((resolve) => setTimeout(resolve, 0));
}
```

```
$ deno run -A x.mjs && cat out.txt
1
0
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
```

(Note: the first line could be other numbers.)

It seems that if …
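Until the underlying bug is fixed, one workaround (a sketch, not from the thread) is to wait for the write stream's 'ready' event, so that no write() races the stream's construct() phase:

```js
import fs from "node:fs";
import { once } from "node:events";

const writeStream = fs.createWriteStream("out.txt");
// 'ready' fires once the underlying file descriptor is open, i.e. after
// the stream's construct() phase has completed, so every write() below
// goes through the normal (ordered) path.
await once(writeStream, "ready");

for (const i of Array(20).keys()) {
  writeStream.write(i + "\n");
  await new Promise((resolve) => setTimeout(resolve, 0));
}
```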
Simpler repro (doesn't depend even on `node:fs`):

```js
import { Writable } from "node:stream";

async function test() {
  const chunks = [];
  const writable = new Writable({
    construct(cb) {
      setTimeout(cb, 10);
    },
    write(chunk, _, cb) {
      chunks.push(chunk);
      cb();
    },
  });
  for (const i of Array(20).keys()) {
    writable.write(Uint8Array.from([i]));
    await new Promise((resolve) => setTimeout(resolve, 1));
  }
  if (chunks[0][0] === 0) {
    console.log("success");
  } else {
    console.log("fail, first chunk is ", chunks[0]);
  }
}

for (const _ of Array(10)) {
  await test();
}
```

If I replace …

nodejs/node@355bcbc seems like the fix we need. (Node.js had a very similar issue: nodejs/node#46765.)
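The invariant that the linked fix restores: every write() issued while construct() is still pending must be queued and then flushed in FIFO order once construction completes. A minimal sketch of that invariant (illustrative only; `TinyWritable` is a hypothetical name, and this is not the actual node:stream or Deno internals):

```js
// Toy writable that preserves write order across an async construct() phase.
class TinyWritable {
  #constructed = false;
  #queue = [];
  #write;

  constructor(construct, write) {
    this.#write = write;
    construct(() => {
      this.#constructed = true;
      // Flush queued chunks in FIFO order; the bug reported here let a
      // later write() jump ahead of the first queued chunk.
      for (const chunk of this.#queue.splice(0)) this.#write(chunk);
    });
  }

  write(chunk) {
    if (!this.#constructed) {
      this.#queue.push(chunk);
    } else {
      this.#write(chunk);
    }
  }
}

// Usage: writes issued before construct() finishes still come out in order.
const chunks = [];
const w = new TinyWritable(
  (cb) => setTimeout(cb, 10),     // async construct, like the repro above
  (chunk) => chunks.push(chunk),  // underlying sink
);
w.write(0);
w.write(1);
setTimeout(() => console.log(chunks), 20); // always [ 0, 1 ]
```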
This change applies the same fix as nodejs/node#46818, and the original example given in #20456 works as expected. Closes #20456. (cherry picked from commit bf42467)
Repro files: package.json, main.mjs
Causes #20098
Causes #19544