
Fetch leaks resources if body is not resolved. #4735

Closed
olaven opened this issue Apr 13, 2020 · 13 comments · Fixed by #11467
Labels: bug (Something isn't working), ext/fetch (related to the ext/fetch)

Comments

olaven commented Apr 13, 2020

Hi 👋
I believe I have found an issue where the fetch API leaks resources if the body is not resolved, e.g. with .text()/.json().

Given the following test:

import { assertEquals } from "https://deno.land/std/testing/asserts.ts";

Deno.test("fetching TODOs", async () => {
  const response = await fetch("https://jsonplaceholder.typicode.com/todos/");
  // await response.json(); // NOTE: adding this line removes the issue (or .text()/similar)
  assertEquals(200, response.status);
});

Produces the following output:

running 1 tests
test fetching TODOs ... FAILED (94ms)

failures:

fetching TODOs
Error: Test case is leaking resources.
Before: {
  "0": "stdin",
  "1": "stdout",
  "2": "stderr"
}
After: {
  "0": "stdin",
  "1": "stdout",
  "2": "stderr",
  "3": "httpBody"
}
    at Object.assert ($deno$/util.ts:25:11)
    at Object.resourceSanitizer [as fn] ($deno$/testing.ts:75:5)
    at async TestApi.[Symbol.asyncIterator] ($deno$/testing.ts:259:11)
    at async Object.runTests ($deno$/testing.ts:341:20)

failures:

	fetching TODOs

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out (95ms)

The issue is only present from v0.37.0.

syumai (Contributor) commented Apr 13, 2020

@olaven You should close the response body.
Deno's fetch returns a Response whose Body implements Deno.ReadCloser.
text() and json() call Body.#bodyBuffer(), which closes the body implicitly.
Here's an example that resolves the issue:

import { assertEquals } from "https://deno.land/std/testing/asserts.ts";

Deno.test("fetching TODOs", async () => {
  const response = await fetch("https://jsonplaceholder.typicode.com/todos/");
  await response.body.close();
  assertEquals(200, response.status);
});

olaven (Author) commented Apr 13, 2020

Thanks! I don't believe I have come across this before.
Does this apply to browser implementations as well?
Is this behaviour intended?

syumai (Contributor) commented Apr 13, 2020

It's not the browser's behavior; it's Deno's behavior (not part of the fetch standard).
Deno's fetch is more controllable than the browser's fetch: we can control the response body's resource explicitly.
Maybe Deno follows Go's HTTP implementation, like this:

resp, _ := http.Get("http://example.com/")
defer resp.Body.Close() // Body must be closed explicitly

However, I also think this could confuse programmers coming from JS land 😅

lucacasonato (Member) commented:

Well, it's a side effect of how fetch is implemented in Deno, but I think it's a bug nonetheless: our fetch implementation should follow the spec, and this is not part of the spec.

ry (Member) commented Apr 13, 2020

I've recently removed the type declarations for response.body.close() as it's non-standard. I'm about to undertake a rewrite of fetch which will address these API issues. That said, even after this refactor, you will still need to consume the body in tests, because our tests check that resources are cleaned up. I suggest adding something like

await response.arrayBuffer();

to fix the above test failure.
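
Applied to the original test, a minimal sketch of that suggestion (same endpoint as above; draining the body is what satisfies the resource sanitizer):

import { assertEquals } from "https://deno.land/std/testing/asserts.ts";

Deno.test("fetching TODOs", async () => {
  const response = await fetch("https://jsonplaceholder.typicode.com/todos/");
  assertEquals(200, response.status);
  // Consume the body so the underlying resource is released before the test ends.
  await response.arrayBuffer();
});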

alisabzevari commented:

As a follow-up question to this issue, wouldn't it be beneficial to close the resource after fully iterating over the body?

In other words, the following code fails as well:

Deno.test({
  name: "test",
  fn: async () => {
    const response = await fetch("https://api.github.com/users/http4ts/repos");

    await readToEnd(response.body!);
  }
});

async function readToEnd(it: AsyncIterable<Uint8Array>) {
  let chunkCount = 0;
  for await (const chunk of it) {
    chunkCount += chunk.length;
  }
  console.log(`Finished reading ${chunkCount} bytes.`);
}
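
For reference, a sketch of the same pattern under current Deno, where response.body is a standard, async-iterable ReadableStream (this reflects today's behaviour, not necessarily the behaviour at the time of this comment): fully consuming the stream, or cancelling it, is what releases the underlying resource.

Deno.test("read body to end", async () => {
  const response = await fetch("https://api.github.com/users/http4ts/repos");
  let byteCount = 0;
  // Iterating the ReadableStream to completion drains and closes it.
  for await (const chunk of response.body!) {
    byteCount += chunk.length;
  }
  console.log(`Finished reading ${byteCount} bytes.`);
});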

ghost commented Dec 29, 2020

const response = await fetch("https://tc39.github.io/ecma262");

for await (const value of response.body) {
	console.log(
		"Received %O bytes",
		value.length
	);
}

console.log("Response fully received");

Very similar to @alisabzevari's code.
I'm getting an OOM here, can't tell if it's because TC39's site is too large (too much memory allocated), or because of Deno's memory leakage, but it causes an OOM before finishing the loop, so I presume the former is the cause?

bartlomieju (Member) commented:

This is likely due to an inefficient mechanism in the fetch API that JSON-stringifies the payload; it should be fixed by #8826.

@kitsonk added the bug and ext/fetch labels Dec 29, 2020
satyarohith (Member) commented:

> const response = await fetch("https://tc39.github.io/ecma262");
>
> for await (const value of response.body) {
> 	console.log(
> 		"Received %O bytes",
> 		value.length
> 	);
> }
>
> console.log("Response fully received");
>
> Very similar to @alisabzevari's code.
> I'm getting an OOM here, can't tell if it's because TC39's site is too large (too much memory allocated), or because of Deno's memory leakage, but it causes an OOM before finishing the loop, so I presume the former is the cause?

This issue no longer exists with 1.9.0 (Fixed by #9036).

(cc @bartlomieju)

ghost commented Apr 18, 2021

@satyarohith That's good to hear, but has the primary issue here been fixed? I.e., are Response bodies correctly tracked and GCed?

lucacasonato (Member) commented:

@CrimsonCodes0 No. This is a larger undertaking that requires some more core infrastructure to be built.

evanwashere (Contributor) commented Jun 20, 2021

I am proposing to add an unused fetch bodies limit alongside GC finalizers (based on the Cloudflare Workers design).

This would allow the runtime to cancel old unread response bodies once a user-set (or default) threshold is exceeded.

Benefits:

  • predictable memory usage
  • less reliance on the garbage collector
  • a bounded number of leaked unread bodies (those not yet garbage collected)

Cons:

  • the runtime can cancel bodies that are going to be used later (rare/uncommon according to the Cloudflare docs)

example (6 leaking bodies allowed):

const f1 = await fetch('https://example.com').then(res => res.status);
const f2 = await fetch('https://example.com').then(res => res.status);
const f3 = await fetch('https://example.com').then(res => res.body.getReader()); // or res.text/json/arrayBuffer() // body is used
const f4 = await fetch('https://example.com').then(res => res.status);
const f5 = await fetch('https://example.com').then(res => res.status);
const f6 = await fetch('https://example.com').then(res => res.status);
const f7 = await fetch('https://example.com').then(res => res.status);
const f8 = await fetch('https://example.com').then(res => res.status); // cancels f1 body
const f9 = await fetch('https://example.com').then(res => res.status); // cancels f2 body
const f10 = await fetch('https://example.com').then(res => res.status); // cancels f4 body

[chart visualizing the benefit]
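
Purely as an illustration of the bookkeeping such a limit implies (not an actual or proposed Deno implementation; every name here is hypothetical), a sketch in TypeScript:

// Hypothetical sketch: a bounded registry of unread response bodies.
// Once the limit is exceeded, the oldest unread body is cancelled,
// mirroring the f1/f2/f4 cancellations in the example above.
class UnreadBodyRegistry {
  #unread: ReadableStream<Uint8Array>[] = [];

  constructor(readonly limit = 6) {}

  track(body: ReadableStream<Uint8Array>): void {
    this.#unread.push(body);
    while (this.#unread.length > this.limit) {
      // Discard the oldest unread body so its resource is released.
      void this.#unread.shift()!.cancel();
    }
  }

  markConsumed(body: ReadableStream<Uint8Array>): void {
    // A body that the user starts reading no longer counts against the limit.
    this.#unread = this.#unread.filter((b) => b !== body);
  }
}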

talentlessguy commented:

This is still an issue:

const res = await fetch(decodeURIComponent(url), {
  method: 'PUT',
  body: file,
  headers: { 'x-amz-meta-import': 'car' },
})

const buf = await res.arrayBuffer()

error: Leaking resources:
  - A fetch response body (rid 18) was created during the test, but not consumed during the test. Consume or close the response body `ReadableStream`, e.g `await resp.text()` or `await resp.body.cancel()`.
  -
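
For the case where the body is intentionally left unread, the remedy suggested by that error message looks roughly like this (a minimal sketch using a placeholder URL):

Deno.test("status only", async () => {
  const resp = await fetch("https://example.com/");
  console.log(resp.status);
  // Explicitly discard the unread body so its resource is released.
  await resp.body?.cancel();
});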
