
feat: add workerIdleMemoryLimit to support worker recycling in the event of node >16.11.0 memory leaks #13056

Merged
merged 51 commits into from Aug 5, 2022

Conversation

phawxby
Contributor

@phawxby phawxby commented Jul 22, 2022

Summary

#11956 (comment)

Test plan

Numerous new test cases added, though it's hard to test for the specific issue: the memory usage problem is extremely hard to replicate outside of very complicated applications.

Ideally I would like to test this end-to-end, but I have no idea how I would do that.

@phawxby phawxby marked this pull request as ready for review July 22, 2022 13:45
@phawxby phawxby changed the title feat: process recycling feat: add workerIdleMemoryLimit to support worker recycling in the event of node >16.11.0 memory leaks Jul 22, 2022
@phawxby
Contributor Author

phawxby commented Jul 22, 2022

I'm going to try and write some more tests for this, so even if it's approved, don't merge yet.

@@ -478,6 +480,7 @@ export type ProjectConfig = {
transformIgnorePatterns: Array<string>;
watchPathIgnorePatterns: Array<string>;
unmockedModulePathPatterns?: Array<string>;
workerIdleMemoryLimit?: number;
Member

should be part of global config, not project

Contributor Author

I can see the case for both. You may want to set a sensible global default, perhaps when running on a machine with known memory limitations, but then configure individual projects differently where you use fewer threads and can allow more memory. In all honesty I could go either way; if you want it removed, I can remove it.
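For illustration, the global-level placement under discussion might look something like this (a hypothetical sketch; at this point in the PR the option is a plain number of bytes, and the final placement was still being decided):

```javascript
// jest.config.js — hypothetical sketch of the option discussed above.
// The option name workerIdleMemoryLimit is from this PR; the value is
// illustrative. A worker whose idle memory usage exceeds the limit is
// recycled (restarted) rather than reused.
module.exports = {
  // ~500 MB, expressed in bytes
  workerIdleMemoryLimit: 500 * 1024 * 1024,
};
```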

@SimenB
Member

SimenB commented Jul 24, 2022

exciting! thanks for looking into it!

@phawxby
Contributor Author

phawxby commented Jul 25, 2022

@SimenB this should be better now. I've created edge case tests using actual workers (as opposed to mocked ones) to validate that they behave as expected.

@phawxby
Contributor Author

phawxby commented Jul 25, 2022

Are some of these tests expected to be failing?

@SimenB
Member

SimenB commented Jul 25, 2022

Are some of these tests expected to be failing?

There is some flake, but the failures now seem to be the new tests?

@phawxby
Contributor Author

phawxby commented Jul 25, 2022

Are some of these tests expected to be failing?

There is some flake, but the failures now seem to be the new tests?

Some of them seem to be a dependency bug. chalk/supports-color#135

@phawxby
Contributor Author

phawxby commented Jul 25, 2022

Are some of these tests expected to be failing?

There is some flake, but the failures now seem to be the new tests?

Some of them seem to be a dependency bug. chalk/supports-color#135

Actually I think I'm being dumb

@phawxby
Contributor Author

phawxby commented Jul 25, 2022

@SimenB do you have any good ideas why those 2 tests seem to be failing in CI? They work absolutely fine locally with the same commands.

Member

@SimenB SimenB left a comment

This new option is not supported in thread workers?

@phawxby
Contributor Author

phawxby commented Jul 27, 2022

This new option is not supported in thread workers?

I hadn't planned on it, but I actually don't think it would be too hard to port over. I'll take a quick crack at it.

@phawxby
Contributor Author

phawxby commented Jul 27, 2022

This new option is not supported in thread workers?

Actually, do we even know if this is necessary? This is primarily to deal with a memory leak in process workers. As far as I know there's currently no way to enable worker threads in Jest via a config/CLI option, so it may be completely unnecessary in threaded workers.

@SimenB
Member

SimenB commented Jul 28, 2022

As far as I know there's currently no way to enable worker threads in jest via a config/cli option. It may be completely unnecessary in threaded workers.

There's not, but the same capability should be added to both types of workers. jest-worker is used outside of Jest itself by many projects, so there should be feature parity as far as possible. Jest will also switch at some point in the future.

@phawxby phawxby requested a review from SimenB July 28, 2022 14:41
@phawxby
Contributor Author

phawxby commented Aug 1, 2022

I think this PR is generally done; I'm just not sure what to do about the flaky tests. They pass more often than not, but I know that's still not ideal.

Also, the worker memory limit is specified in bytes; I wonder if that should be MB to make the number a bit easier to understand.

@SimenB
Member

SimenB commented Aug 1, 2022

Also the worker memory limit is specified in bytes, I wonder if that should be MB to make the number a bit easier to understand.

We could do shorthand? Bytes if no unit, but support e.g. 500m (converted to 1024 * 1024 * 500 in normalize)
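A minimal sketch of the shorthand normalization being proposed here (a hypothetical helper, not the actual jest-config code): a bare number is taken as bytes, while suffixed strings such as `500m` are converted to bytes during normalize.

```typescript
// Hypothetical sketch of the proposed shorthand: a plain number is
// bytes; strings may carry a k/m/g suffix (an optional trailing "b"
// is tolerated, e.g. "500mb"). Not the actual jest-config code.
function parseMemoryLimit(limit: number | string): number {
  if (typeof limit === 'number') {
    return limit; // bytes if no unit
  }
  const match = /^(\d+(?:\.\d+)?)([kmg]?)b?$/i.exec(limit.trim());
  if (!match) {
    throw new Error(`Invalid memory limit: ${limit}`);
  }
  const multipliers: Record<string, number> = {
    '': 1,
    k: 1024,
    m: 1024 * 1024,
    g: 1024 * 1024 * 1024,
  };
  return Number(match[1]) * multipliers[match[2].toLowerCase()];
}
```

With this shape, `parseMemoryLimit('500m')` yields the `1024 * 1024 * 500` bytes mentioned above.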

@phawxby
Contributor Author

phawxby commented Aug 1, 2022

@SimenB done. I think I may have solved my flaky test problem too.

Member

@SimenB SimenB left a comment

thanks! this looks good to me 👍

@phawxby phawxby requested a review from SimenB August 4, 2022 21:47
Member

@SimenB SimenB left a comment

thanks!

@SimenB SimenB merged commit 37200eb into jestjs:main Aug 5, 2022
@phawxby
Contributor Author

phawxby commented Aug 6, 2022

@SimenB what's the release process/cycle for Jest?

@SimenB
Member

SimenB commented Aug 7, 2022

@SimenB what's the release process/cycle for Jest?

Stable release next week

@AndrewLeedham
Contributor

Was trying this out this morning and ran into a config validation issue:

$ yarn jest --version
29.0.0-alpha.3

$ yarn jest
Validation Warning:

  Unknown option "workerIdleMemoryLimit" with value 0.2 was found.
  This is probably a typing mistake. Fixing it will remove this message.

  Configuration Documentation:
  https://jestjs.io/docs/configuration

The jest.config.js has `workerIdleMemoryLimit: 0.2` at the root of the config object.

Seems to me like workerIdleMemoryLimit needs to be added to https://github.com/facebook/jest/blob/main/packages/jest-config/src/ValidConfig.ts ?
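The suggested fix is essentially to register the new key in that file's example config, which jest-validate compares user configs against to produce "Unknown option" warnings. Roughly (an illustrative shape, not the actual file contents):

```typescript
// Illustrative sketch only: jest-config's ValidConfig.ts exports an
// example config object used by jest-validate; a key missing from it
// triggers the "Unknown option" warning seen above. Registering the
// new key with an example value would silence it.
const initialOptions = {
  // ...existing valid options...
  workerIdleMemoryLimit: 500 * 1024 * 1024, // example value in bytes
};
```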

@github-actions

github-actions bot commented Sep 8, 2022

This pull request has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Sep 8, 2022