Just a question, not a bug report or feature request ...
I've now experimented with Jest on a number of different projects of varying sizes and requirements (e.g. some using snapshots, some not; some doing real async work, some not), and I have one consistent complaint: Jest is too slow. I'm filing this issue to find out whether this problem is on the team's radar and whether there are plans to improve speed. I see so much advertising of how fast Jest is, so I'm confused that my actual experience has been the opposite. (Is the advertised speed only in comparison to AVA?)
There's an issue or two and a PR pointing at the problem as it manifests in CI environments. But I'm also seeing this locally, though it's less a "bug" in this case because the tests still work, just slowly. Tests that blur by in Tape or Mocha plod along in Jest.
All the issues related to performance that I could find in this repo end up pointing at causes outside Jest. After my experiences, though, I question that dismissal. I've always found that Tape and Mocha are significantly faster in the same environment doing the same thing — never seen the opposite. When I have switched a project to Jest it has always been at the expense of performance.
I know this issue might seem "unactionable", but it's kind of my last-ditch effort to try to keep using Jest.
A wrinkle in the comparison with Tape above: apparently there's a significant difference between running Tape with and without Babel, such that Tape can actually be faster *with* Babel when testing a huge number of files o_O. I ran across this while working on stylelint: although locally Jest is significantly slower than babel-tape-runner was, it turns out to be faster than plain Tape without Babel once the total test count passes several thousand or so ¯\(°_o)/¯. Unfortunately, I still can't use Jest on CI because of memory failures ...
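(For what it's worth, one thing that has helped me with CI memory pressure elsewhere is capping Jest's worker count. This is just a sketch of documented Jest config options; the specific values are illustrative, not a recommendation from this thread.)

```javascript
// jest.config.js -- hedged sketch: maxWorkers is a standard Jest config
// option, but the value 2 here is only an example.
module.exports = {
  // Each Jest worker is a separate Node process with its own heap, so
  // capping the worker count reduces total memory use at the cost of
  // parallelism. The equivalent CLI flags are --maxWorkers=2 or, for a
  // single process, --runInBand.
  maxWorkers: 2,
};
```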
Anyway, I'm very sorry if any of this comes off as me being a stupid jerk. I clearly don't really understand what's going on, and am frustrated that these days I seem to be butting up against the limitations of each JS test runner I try; so I'm just hoping Jest can end up being the magic bullet that solves this 😬
@yaycmyk Thanks for chiming in. Yeah, I've used Jest in both environments. I'm ready to accept some slowness when using JSDOM and snapshots, because I know JSDOM can be slow and there's some I/O involved in reading snapshots. However, it's worth noting that before switching to Jest I was using JSDOM in Mocha with expect-jsx assertions, and those tests ran noticeably faster than the Jest tests that replaced them. So I wonder if there's room for improvement, I guess.
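(A related aside, not something discussed above: for suites that don't touch the DOM at all, Jest's documented `testEnvironment` option lets you skip JSDOM entirely, which in my experience can make a real difference. A minimal sketch:)

```javascript
// jest.config.js -- hedged sketch using the documented testEnvironment
// option: "node" avoids loading JSDOM for tests that never use the DOM.
module.exports = {
  testEnvironment: 'node',
};

// Individual files that do need the DOM can opt back in with a docblock
// pragma at the top of the test file:
//   /** @jest-environment jsdom */
```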
@davidtheclark Thanks for the detailed report! I'm going to close this issue, though, as we're constantly working to optimise performance. Hope you don't mind.
We've recently tracked down a memory leak in JSDOM that slows us down, and we believe it will be addressed soon.
Watch mode has also been rewritten, with some nice perf gains on larger repositories, and is ready to ship in the next major version.
If you have further ideas on how to improve performance, please file them as separate issues 🙂
This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.