Zip streaming on Cloudflare Workers - WritableStream has been closed #513
Thank you for the report, I will try to reproduce the issue. Could you meanwhile tell me if it works when replacing …?
Thank you! To reproduce, just clone my repo, run `npm install` and `npm start`, and open the browser at localhost:8787; file.zip is downloaded and the errors appear in the npm console.
I've had a look at the issue and I think it's specific to the Cloudflare Workers execution environment. As it happens, I can get around it by passing …. For the record, here is the stack trace of the error below (commit: https://github.com/gildas-lormeau/zip.js/tree/9c06f4f6c95acc77e49da5400aac59c6d8bbe984).
I confirm there is something strange with the native `CompressionStream` implementation. On paper, this code should not be able to fix any bug, but it does:

```js
// Replace the global CompressionStream with a wrapper around the native one.
globalThis.CompressionStream = class {
  constructor(format) {
    const compressionStream = new CompressionStream(format);
    const writer = compressionStream.writable.getWriter();
    return new TransformStream({
      async transform(chunk) {
        await writer.write(chunk);
      },
      async flush() {
        await writer.close();
      }
    });
  }
};

const { TextReader, ZipWriter } = await import("@zip.js/zip.js");

export default {
  async fetch(request, env, ctx) {
    const { readable, writable } = new TransformStream();
    const zipWriter = new ZipWriter(writable);
    ctx.waitUntil(
      (async () => {
        await zipWriter.add("hello.txt", new TextReader("Hello world!"));
        await zipWriter.close();
      })()
    );
    return new Response(readable, {
      headers: {
        "Content-Disposition": 'attachment; filename="file.zip"',
        "Content-Type": "application/zip",
        "Cache-Control": "no-cache",
      },
    });
  },
};
```

A way to circumvent this bug in a "cleaner" way would be to use `configure`:

```js
import { TextReader, ZipWriter, configure } from "@zip.js/zip.js";

configure({
  useCompressionStream: false,
  CompressionStream: class {
    constructor(format) {
      const compressionStream = new CompressionStream(format);
      const writer = compressionStream.writable.getWriter();
      return new TransformStream({
        async transform(chunk) {
          await writer.write(chunk);
        },
        async flush() {
          await writer.close();
        }
      });
    }
  }
});

export default {
  async fetch(request, env, ctx) {
    const { readable, writable } = new TransformStream();
    const zipWriter = new ZipWriter(writable);
    ctx.waitUntil(
      (async () => {
        await zipWriter.add("hello.txt", new TextReader("Hello world!"));
        await zipWriter.close();
      })()
    );
    return new Response(readable, {
      headers: {
        "Content-Disposition": 'attachment; filename="file.zip"',
        "Content-Type": "application/zip",
        "Cache-Control": "no-cache",
      },
    });
  },
};
```
Finally, I can confirm this is not a bug in zip.js. I can reproduce it with the code below, which does not depend on zip.js. I don't know if it's a bug in Node.js or in the Worker environment.

```js
export default {
  async fetch() {
    const { readable, writable } = new CompressionStream("deflate-raw");
    const helloWorldStream = new ReadableStream({
      start(controller) {
        controller.enqueue(new TextEncoder().encode("Hello, World!"));
        controller.close();
      }
    });
    helloWorldStream.pipeTo(writable);
    return new Response(readable, {
      headers: {
        "Content-Disposition": 'attachment; filename="file.raw"',
        "Content-Type": "application/octet-stream",
        "Cache-Control": "no-cache",
      },
    });
  }
};
```
It's probably a bug in the Worker environment, since the code below runs fine in Node.js.

```js
const { readable, writable } = new CompressionStream("deflate-raw");
const helloWorldStream = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("Hello, World!"));
    controller.close();
  }
});
helloWorldStream.pipeTo(writable);
const result = new Response(readable, {
  headers: {
    "Content-Disposition": 'attachment; filename="file.raw"',
    "Content-Type": "application/octet-stream",
    "Cache-Control": "no-cache",
  },
});
result.blob().then(console.log);
```
I've found the bug, see cloudflare/workerd#992.
This issue was moved to a discussion.
You can continue the conversation there.
I'm trying to set up downloading and zipping multiple files on-the-fly with Cloudflare Workers.
This library seems to be the only one out there that supports Web standards, does not depend on Node.js libraries, and supports zip streaming, which is awesome, thank you!
With my code, the files are zipped and downloaded correctly, but for every file added to the archive, these errors show up in the console:
They are generated by Cloudflare's own implementation of Streams, see below for more details:
https://github.com/cloudflare/workerd/blob/main/src/workerd/api/streams/writable.c%2B%2B
I haven't been able to identify the issue, and I don't know whether the problem is on zip.js's side or Cloudflare's, but I thought I'd give it a try and ask here if you can take a look.
Here is a simple example of my code: https://github.com/petrzelka/zip-cloudflare-repro
I was following the Cloudflare example of streaming with a request/response:
https://developers.cloudflare.com/workers/examples/openai-sdk-streaming/
Whether or not I await add() and close(), the zip always downloads correctly and the error still appears in the console.
Without waitUntil(), it works without awaiting; that seems to me like the proper way, but I was just following the example (and trying everything).
I tried to debug it, but I haven't found any suspicious place where a TransformStream would be created and closed twice. I'm not sure it's caused by writing to a closed stream, because the file itself looks OK.
According to the workerd code, writing to or closing an already-closed stream are the only ways to trigger this error.
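That failure mode can be reproduced directly, independently of zip.js and workerd: per the Streams standard, a write issued after the writer has been closed rejects with a `TypeError`. A minimal sketch (`writeAfterClose` is a name made up for illustration):

```javascript
// Minimal sketch: a write() issued after close() rejects with a
// TypeError -- the same condition the workerd source reports.
async function writeAfterClose() {
  const writable = new WritableStream({ write() {} });
  const writer = writable.getWriter();
  await writer.write("ok"); // succeeds while the stream is open
  await writer.close();     // the stream is now closed
  try {
    await writer.write("too late"); // must reject
    return "no error";
  } catch (err) {
    return err.constructor.name;
  }
}

writeAfterClose().then(console.log); // prints "TypeError"
```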
Anyway, it's quite surprising to me that I couldn't find any other attempts to zip files on-the-fly within Cloudflare Workers, and that no one has reported the error I'm seeing.
Thanks for any help!
Honza