Memory Leak on ridiculously simple repo #7874
I found this out recently: you can use the Chrome DevTools console to debug Node scripts! You could try profiling Jest while it's running to dig into the issue. I believe the command is:
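A sketch of that invocation, based on the standard approach from Jest's troubleshooting guide (the exact flags here are an assumption, not the original commenter's snippet):

```shell
# Assumed invocation (per Jest's troubleshooting guide):
# --inspect-brk makes Node pause until a debugger attaches (open chrome://inspect),
# --runInBand runs all tests in the main process so breakpoints and heap
# profiles apply to the code actually executing the tests.
node --inspect-brk ./node_modules/.bin/jest --runInBand
```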
Do I understand correctly that using the workaround to force GC runs makes the heap size remain constant? In that case it's not really a memory leak, just V8 deciding not to run the GC because there is enough memory available. If I try running the repro with a 50 MB heap size (node --max_old_space_size=50 node_modules/.bin/jest --logHeapUsage --runInBand --config=jest.config.js), the tests still complete successfully, supporting this assumption.
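For reference, the force-GC workaround discussed above can be sketched like this (my sketch, not the repro's exact code; global.gc only exists when Node is started with --expose-gc):

```javascript
// Sketch of the forced-GC workaround: when Jest is launched as
//   node --expose-gc ./node_modules/.bin/jest --logHeapUsage --runInBand
// global.gc is defined, and triggering a collection after each test file
// makes the reported heap reflect real retention rather than lazy GC
// scheduling.
function forceGc() {
  if (typeof global.gc === 'function') {
    global.gc();
    return true; // a collection was triggered
  }
  return false; // running without --expose-gc; nothing to do
}

// In a test file the hook would be wired up as: afterAll(forceGc);
module.exports = { forceGc };
```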
@milesj I ran through some memory dumps but couldn't make much sense of them; I'm not too experienced with chasing leaks and I didn't want to point in the wrong direction without something solid to count on. @jeysal you are right of course! The thing is, our tests freeze in the middle of a run since (I assume, and could be wrong) we run out of memory. After spending a lot of time trying to figure this out, I found #7274. It seemed to me from the discussion that the behaviour I encountered here is not intended. wdyt @SimenB?
Bueller?
My tests are also leaking massively on CI, but the exact same setup locally doesn't really leak (much, at least). It's so bad I'm considering disabling tests on CI until I can make sense of what the difference is besides the OS. ):
Hey guys! I simplified the memory leak case to a single file which runs tautological tests and eventually throws an exception due to a memory leak. I'm not sure how to move forward with this... help? @SimenB @jeysal @milesj
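For context, a minimal sketch of such a single-file repro (my illustration, not the exact file; in a real Jest run `test` is the global provided by the runner, so the stub below exists only to make the sketch self-contained):

```javascript
// Single-file repro sketch: register thousands of tautological tests.
// Every registered test closure stays referenced until the whole file
// finishes, so heap usage grows with the test count even though each
// individual test is trivial.
const registered = [];
// Outside Jest, stand in for the runner's global `test` so this runs anywhere.
const test = global.test || ((name, fn) => registered.push({ name, fn }));

const NUM_TESTS = 5000; // hypothetical count; the larger it is, the sooner the crash

for (let i = 0; i < NUM_TESTS; i++) {
  test(`tautology ${i}`, () => {
    // In the real repro this would be: expect(true).toBe(true)
    return true;
  });
}
```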
Similar here (jest + ts-jest): simple tests climb over 1 GB of memory and eventually crash.
Crashes for us too.
@javinor For a test file containing a ridiculous number of tests, I'm not sure there's much we can do; we have to keep the test objects around until the test file is finished. This is the heap while the tests are running:
FYI @scotthovestadt is currently working on holistically improving memory efficiency of Jest, so improvements are coming (some of them in the next minor version).
I wonder, why isn't it possible for Jest to spawn a process for each test file, which would guarantee that memory is freed? It can be slower, of course, but in my case it's much better to be slower than to crash from out-of-memory and be blocked from using Jest altogether... Maybe an option? Or a separate "runner" (not sure I understand the architecture and terminology right)? Is it architecturally possible? Or will Node's experimental workers solve it?
I've made a few improvements to memory in the next release. I also have a future plan to improve memory in a couple of ways:
The problem with your suggestion of spawning a new worker for each test is that it would be very slow. A better suggestion along the same lines would be to monitor the memory usage of the processes and auto-restart them at some threshold. I have some concerns about that in general; I'd rather always fix memory leaks than paper over them, but if a PR did that I would accept it. Let me know if next week's release helps with the problems you've been experiencing.
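The threshold idea could be sketched like this (my illustration, not Jest's actual implementation; RESTART_THRESHOLD_BYTES is a made-up knob):

```javascript
// Sketch of threshold-based worker recycling: between test files, compare
// the worker's current heap usage against a limit and signal that the pool
// should replace the process instead of reusing it.
const RESTART_THRESHOLD_BYTES = 512 * 1024 * 1024; // hypothetical 512 MB cap

function shouldRecycleWorker(threshold = RESTART_THRESHOLD_BYTES) {
  const { heapUsed } = process.memoryUsage();
  return heapUsed > threshold;
}

// A worker pool could call shouldRecycleWorker() after each test file and
// respawn the child process whenever it returns true.
module.exports = { shouldRecycleWorker };
```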
@scotthovestadt thanks for the info! I'll definitely check with the next release. My actual issue is reported here: #8247
Thanks for the responses, guys! I think I can break this down into two different problems:
We're running thousands of tests, each creating a relatively big setup, so we get bitten twice. The original screenshot shows consumption growing from test file to test file, hinting at a leak between tests; I have a few guesses as to why this happens, but nothing solid yet. The exception I referred to later, as far as I can tell, really has to do with what @jeysal pointed out: having a large number of tests in the file. In our case we have only hundreds of tests, but with a very large setup. I'll try to provide a better reproduction of this. I'll update after the next release, when I get to poke around a bit more and see the additional fixes in action. Thanks guys!
There must be something else wrong, because I'm currently using Jest v23.6 and everything works fine: no memory leaks, no anything. If I upgrade to the latest Jest, the memory leaks start to happen, but only on the GitLab CI runner. Works fine locally.
New release is out: https://github.com/facebook/jest/releases/tag/v24.6.0
Meh, it's still leaking in my setup ):
After updating to 24.6.0, we are seeing a similar issue running our CI tests. When logging heap usage, we see an increase in memory usage after each test file.
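That per-file growth can also be observed without --logHeapUsage by sampling the heap directly (a sketch; the helper name and formatting are mine):

```javascript
// Sample the current heap size in whole megabytes, mirroring the figure
// that --logHeapUsage prints per test file. Calling this from an afterAll
// hook (or between files) makes the growth visible in plain CI logs.
function heapUsedMb() {
  return Math.round(process.memoryUsage().heapUsed / (1024 * 1024));
}

// e.g. afterAll(() => console.log(`heap: ${heapUsedMb()} MB`));
module.exports = { heapUsedMb };
```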
This should help: #8282. Will be released soon.
How soon? )':
For those reading along at home, this went out in 24.8.0.
This would also be a huge breaking change.
Also encountering this. Originally I thought it was some circular dependency in my source code, but it may be something else.
@unional if you're on Circle, make sure EDIT: To be clear, you should proactively specify |
@Supernats thanks. I think I did have that set during the failure; currently I'm running it with that. But it still fails once in a while:
I have Jest 24.8.0, and #8282 doesn't seem to help. Also: pleaaaaaaase fix this ...
Yes, I've been following this thread for a long time, since it still fails for us; in ~10% of cases we run out of memory on CircleCI 2 GB RAM instances.
Same issue here. It's better with Node 16.10, but it still reaches an 841 MB heap size (580 tests).
Same issue here too. Downgrading to node 16.10 fixed the memory issue with Jest. |
Same issue for me. Downgrading to node 16.10 fixed the memory leak with Jest. I was seeing 3.4GB heap sizes with node 16.14, down to ~400MB with node 16.10. |
Looking for some forward momentum here as well; our CI pipeline is locked at 16.10 and explicitly calls this out. It hasn't resolved the issue for all repositories, so I suspect this issue is more complex. The previous Chromium issue linked by @mbyrne00 is labeled as

Is there any solution in mind?
The Chromium issue points back to Node and Jest and how they implement certain features. As one user on the Chromium issue thread pointed out, this is the closest thing to an attempt to resolve the issue, so hopefully it gets legs: https://github.com/facebook/jest/pull/12205/files#diff-c0d5b59e96fdc7ffc98405e8afb46d525505bc7b1c24916b5c8482de5a186c00R1359-R1373 Even more interesting is this specific comment, where someone has published a Jest runtime lib to fix the problem until the PR is merged: #12205 (comment)
Just FYI, the interesting news on the node / v8 upstream bug can be found at #11956 ("Memory consumption issues on Node JS 16.11.0+") - where some mitigations for the upcoming jest 29 and a workaround (albeit a very slow one) for jest 27 were discussed just 4 days ago. As far as I understood @SimenB's intention with reopening this issue, it is NOT to discuss known causes of memory leaks, but to track the overall state of jest leakages outside of identified and known issues and causes. That is to say, if your leak goes away when downgrading to node 16.10, this is not the issue for you. Go to #11956 instead. :) |
I've run some tests across various configurations. Hope it helps someone.
Versions of Axios 1.x are causing issues with Jest (axios/axios#5101). Jest 28 and 29, where this issue is resolved, have other issues surrounding memory leaks (jestjs/jest#7874). Allow `>=0.25.0` for applications that cannot upgrade Jest at this moment.
From a preliminary run or two, it looks to me like going back to 16.10 is resolving these errors for us as well. |
All the info on the regression that specifically affects node >= 16.11 is found in this issue: #11956 |
Just spent about 2 days figuring out how to overcome this, until I discovered #11956. TL;DR: regression introduced in Node 16.11, fixed in 21.1.
In case anyone stumbles across this and wants a simple solution, node 20.10.0 contains a fix for this. |
Reading the linked issue, it says '21.1', but the fix might as well have already been backported to '20.x', leaving '18.x' waiting for one 🤷. For our team, switching our CI builds to '21.x' did the deal, even if this might introduce runtime confusion 😉.
I'm currently investigating memory leaks in Jest (after the #11956 fix), so I was interested in this case. I followed these steps:
The initial execution of the 30 test files barely reached 100 MB at the last file, so I added more duplicates, totaling 651 files. The tests slowly climbed to a heap size of close to 1 GB, but just before that, the heap was cleaned back down to the minimum (56 MB):

Results
I think it's safe to conclude that the initial issue is resolved. That's not to say there are no memory leaks, only that they are not reproducible using a simple repo. For information on my progress with the other leaks, see #15215.
You guys do an awesome job and we all appreciate it! 🎉
🐛 Bug Report
On a work project we discovered a memory leak choking our CI machines. Going down the rabbit hole, I was able to recreate the memory leak using Jest alone.

Running many test files causes a memory leak. I created a stupid simple repo with only Jest installed and 40 tautological test files.

I tried a number of solutions from #7311, but to no avail. I couldn't find any solutions in the other memory-related issues, and this seems like the most trivial repro I could find.
Workaround :'(

We run tests with the --expose-gc flag and add this to each test file:

To Reproduce
Steps to reproduce the behavior:
Expected behavior
Each test file should take the same amount of memory (give or take)
Link to repl or repo (highly encouraged)
https://github.com/javinor/jest-memory-leak
Run
npx envinfo --preset jest
Paste the results here: