Fetch leaks resources if body is not resolved. #4735
Comments
@olaven You should close the response body:

```ts
import { assertEquals } from "https://deno.land/std/testing/asserts.ts";

Deno.test("fetching TODOs", async () => {
  const response = await fetch("https://jsonplaceholder.typicode.com/todos/");
  await response.body.close();
  assertEquals(200, response.status);
});
```
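In newer Deno versions (and in browsers), the response body is a standard `ReadableStream`, so the equivalent of the `close()` call above is `ReadableStream.cancel()`. A minimal sketch of discarding an unread body; the constructed `Response` is a stand-in for a real `fetch()` result:

```typescript
// Sketch: discard an unread body via the web-standard
// ReadableStream.cancel(). A constructed Response stands in for a
// fetch() result so the example is self-contained.
async function main() {
  const response = new Response("payload");
  // Cancelling the body releases the underlying resource without
  // reading it; the optional chain covers bodyless responses.
  await response.body?.cancel();
  console.log("body released");
}
main();
```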
Thanks! I don't believe I have come across this before.
This is not the browser's behavior; it's Deno's (it is not in the Fetch standard). Compare Go:

```go
resp, _ := http.Get("http://example.com/")
defer resp.Body.Close() // Body must be closed explicitly
```

However, I also think this could confuse JS-land programmers 😅
Well, it's a side effect of how
I've recently removed type declarations for `await response.arrayBuffer();` to fix the above test failure.
As a follow-up question to this issue, wouldn't it be beneficial to close the resource after iterating over the body? In other words, the following code fails as well:

```ts
Deno.test({
  name: "test",
  fn: async () => {
    const response = await fetch("https://api.github.com/users/http4ts/repos");
    await readToEnd(response.body!);
  },
});

async function readToEnd(it: AsyncIterable<Uint8Array>) {
  let chunkCount = 0;
  for await (const chunk of it) {
    chunkCount += chunk.length;
  }
  console.log(`Finished reading ${chunkCount} bytes.`);
}
```
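A defensive variant of `readToEnd` can release the stream even when iteration stops early, by cancelling in a `finally` block. This is a sketch of a general pattern against the standard `ReadableStream` API, not Deno's internals; `readToEndSafe` is a hypothetical helper:

```typescript
// Hypothetical helper: read a stream to the end, but always release it,
// even if iteration is interrupted by an error.
async function readToEndSafe(
  body: ReadableStream<Uint8Array>,
): Promise<number> {
  let total = 0;
  const reader = body.getReader();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      total += value.length;
    }
  } finally {
    reader.releaseLock();
    // cancel() is a no-op on an already-closed stream, so this is safe
    // whether we finished normally or bailed out early.
    await body.cancel().catch(() => {});
  }
  return total;
}

// Usage with a synthetic stream standing in for response.body:
const stream = new ReadableStream<Uint8Array>({
  start(controller) {
    controller.enqueue(new Uint8Array(3));
    controller.enqueue(new Uint8Array(4));
    controller.close();
  },
});
readToEndSafe(stream).then((n) => console.log(`read ${n} bytes`)); // read 7 bytes
```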
Deno bug: denoland/deno#4735

```ts
const response = await fetch("https://tc39.github.io/ecma262");
for await (const value of response.body) {
  console.log("Received %O bytes", value.length);
}
console.log("Response fully received");
```

Very similar to @alisabzevari's code.
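For completeness: in runtimes that implement async iteration of `ReadableStream` per the Streams standard, breaking out of a `for await` loop calls the iterator's `return()`, which cancels the stream unless `preventCancel` is set. A self-contained sketch with a synthetic stream:

```typescript
// Sketch: abandoning a for-await loop over a ReadableStream triggers
// iterator.return(), which cancels the underlying source.
async function main() {
  let cancelled = false;
  const stream = new ReadableStream<number>({
    start(controller) {
      controller.enqueue(1);
      controller.enqueue(2);
    },
    cancel() {
      cancelled = true; // invoked when the consumer abandons the stream
    },
  });

  for await (const _chunk of stream) {
    break; // abandon after the first chunk
  }
  console.log("source cancelled:", cancelled); // true per the Streams spec
}
main();
```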
This is likely due to the inefficient mechanism used in
This issue no longer exists with (cc @bartlomieju)
@satyarohith That's good to hear, but has the primary issue here been fixed? I.e., are `Response` bodies correctly tracked and GCed?
@CrimsonCodes0 No. This is a larger undertaking that requires some more core infrastructure to be built.
I am proposing to add an unused-fetch-bodies limit alongside GC finalizers (based on the Cloudflare Workers design). This would allow the runtime to cancel old, unread response bodies when a user-set (or default) threshold is exceeded.

benefits:
cons:

example (6 leaking bodies allowed):

```ts
const f1 = await fetch('https://example.com').then(res => res.status);
const f2 = await fetch('https://example.com').then(res => res.status);
const f3 = await fetch('https://example.com').then(res => res.body.getReader()); // or res.text()/json()/arrayBuffer(); body is used
const f4 = await fetch('https://example.com').then(res => res.status);
const f5 = await fetch('https://example.com').then(res => res.status);
const f6 = await fetch('https://example.com').then(res => res.status);
const f7 = await fetch('https://example.com').then(res => res.status);
const f8 = await fetch('https://example.com').then(res => res.status); // cancels f1 body
const f9 = await fetch('https://example.com').then(res => res.status); // cancels f2 body
const f10 = await fetch('https://example.com').then(res => res.status); // cancels f4 body
```
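The threshold mechanic could be sketched roughly as below. This is purely illustrative; `UnreadBodyTracker` is a hypothetical name, and a real implementation would live inside the runtime and hook into body consumption:

```typescript
// Hypothetical sketch of the proposed limit: keep unread bodies in
// insertion order and cancel the oldest once the limit is exceeded.
class UnreadBodyTracker {
  private pending: ReadableStream<Uint8Array>[] = [];

  constructor(private readonly limit: number = 6) {}

  // Called when a fetch() completes and its body has not been read yet.
  track(body: ReadableStream<Uint8Array>): void {
    this.pending.push(body);
    while (this.pending.length > this.limit) {
      const oldest = this.pending.shift()!;
      // Cancel the stale body so the underlying connection is released.
      oldest.cancel().catch(() => {});
    }
  }

  // Called when user code starts consuming a body (text(), getReader(), ...).
  markUsed(body: ReadableStream<Uint8Array>): void {
    this.pending = this.pending.filter((b) => b !== body);
  }
}
```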
This is still an issue:

```ts
const res = await fetch(decodeURIComponent(url), {
  method: 'PUT',
  body: file,
  headers: { 'x-amz-meta-import': 'car' },
})
const buf = await res.arrayBuffer()
```
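Per the Fetch standard, awaiting `arrayBuffer()` fully consumes the body and marks it used, so a snippet like the one above should not leave an unread body behind; if it still leaks, that points at the runtime. A quick self-contained check of the consumption semantics (a constructed `Response` stands in for the fetch result):

```typescript
async function main() {
  // Stand-in for a fetch() result.
  const res = new Response(new Uint8Array([1, 2, 3]));
  const buf = await res.arrayBuffer(); // fully consumes the stream
  console.log(buf.byteLength, res.bodyUsed); // 3 true
}
main();
```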
Hi 👋

I believe I have found an issue where the fetch API leaks resources if the body is not resolved, e.g. with `.text()`/`.json()`. Given the following test:

Produces the following output:

The issue is only present from v0.37.0.