
"MaxListenersExceededWarning: Possible EventTarget memory leak detected" from fetch with AbortSignal #46525

Closed
joshkel opened this issue Feb 6, 2023 · 3 comments · Fixed by nodejs/undici#1910

Comments

joshkel commented Feb 6, 2023

Version

v19.6.0

Platform

Darwin 22.2.0 Darwin Kernel Version 22.2.0: Fri Nov 11 02:03:51 PST 2022; root:xnu-8792.61.2~4/RELEASE_ARM64_T6000 arm64 arm Darwin

Subsystem

fetch

What steps will reproduce the bug?

async function main() {
  const controller = new AbortController();
  for (let i = 0; i < 15; i++) {
    try {
      await fetch('http://example.com', { signal: controller.signal });
    } catch (e) {}
  }
}

main();

How often does it reproduce? Is there a required condition?

Every time

What is the expected behavior?

No errors or warnings

What do you see instead?

(node:57538) MaxListenersExceededWarning: Possible EventTarget memory leak detected. 11 abort listeners added to [AbortSignal]. Use events.setMaxListeners() to increase limit

Additional information

It looks like Node's built-in fetch fails to properly clean up its abort listener when used with an AbortSignal.


joshkel commented Feb 6, 2023

See nodejs/undici#939.

If I use undici directly, I get the MaxListenersExceededWarning, but there does not appear to be a memory leak.

If I use Node.js's built-in fetch (which I understand uses undici), I get the MaxListenersExceededWarning, and memory usage steadily increases.

// Uncomment to compare implementations
//const {fetch} = require('undici');

async function main() {
  const controller = new AbortController();
  while (true) {
    try {
      await fetch('http://example.com', { signal: controller.signal });
    } catch (e) {
    }
  }
}

main();

@KhafraDev (Member) commented

undici cleans up the listeners properly. The issue is that the signal is never aborted, and the GC won't run during the loop, so the listeners aren't removed instantly.
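The mechanics described here can be seen without involving fetch at all. A minimal sketch (the loop simulates what each request does internally, as an assumption about undici's behavior rather than its actual code): each call registers an abort listener on the caller's signal, and if the signal never fires, the listeners stay attached until something removes them or their owners become collectable.

```javascript
const controller = new AbortController();
const signal = controller.signal;

// Simulate 15 requests each registering an abort listener on a
// shared, never-aborted signal. Each closure is a distinct listener,
// so they accumulate past the default limit of 10.
const listeners = [];
for (let i = 0; i < 15; i++) {
  const onAbort = () => {};
  signal.addEventListener('abort', onAbort);
  listeners.push(onAbort);
}

// Explicit cleanup when a "request" settles: detach the listener so
// nothing keeps accumulating on the shared signal.
for (const onAbort of listeners) {
  signal.removeEventListener('abort', onAbort);
}
```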


joshkel commented Feb 6, 2023

Thanks for the reply, @KhafraDev. I'm not sure that's what's going on, or maybe I'm not understanding things:

  • Since the for loop uses await and has to wait for lots of HTTP requests, why wouldn't the GC run?

  • If I run the script indefinitely (see comment), then using Node's fetch causes an OOM after 70 minutes. This makes it look like the GC is running but isn't able to reclaim memory?

    [83018:0x118040000]  4300762 ms: Mark-sweep 3966.5 (4137.6) -> 3957.5 (4142.3) MB, 2935.9 / 1.9 ms  (average mu = 0.923, current mu = 0.716) task; scavenge might not succeed
    [83018:0x118040000]  4313057 ms: Mark-sweep 3973.0 (4143.8) -> 3964.0 (4148.8) MB, 4739.6 / 1.5 ms  (average mu = 0.846, current mu = 0.615) task; scavenge might not succeed
    
    
    <--- JS stacktrace --->
    
    FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
    
  • If I run the script indefinitely using Undici directly, I still get the MaxListenersExceededWarning message, but memory usage appears fairly stable (100MB after 15 minutes), so it appears that Undici 5.16 behaves differently than Undici via Node.js's built-in fetch.
