
tsserver.js CPU/Memory Spike #58182

Open
amvFrontendMonkey opened this issue Apr 13, 2024 · 3 comments
Labels: Needs More Info (The issue still hasn't been fully clarified)

Comments

@amvFrontendMonkey

🔎 Search Terms

tsserver.js electron-nodejs memory cpu

🕗 Version & Regression Information

This performance issue began some time ago. Bisecting isn't an option due to our automated Swagger -> Zod processing.

I am looking for smoking guns (file-wise) so the investigation can be accelerated (please see the attached verbose tsserver logs).

⏯ Playground Link

No response

💻 Code

Performance related; not code

🙁 Actual behavior

Performance related; not code

🙂 Expected behavior

Performance related; not code

Additional information about the issue

Attached is a verbose tsserver trace log.

Note that the behaviour is triggered when TS IntelliSense kicks in, i.e. when a type lookup is made, an auto-import is attempted, etc.

I see a huge spike in memory and CPU. It returns after about 45 seconds.

tsserver.log.zip
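
For context, a verbose tsserver trace like the one attached is usually enabled on the editor side rather than in the project. A minimal sketch, assuming the client is VS Code (the editor is not stated in this issue):

    // settings.json; assumes the built-in VS Code TypeScript extension is the tsserver client
    {
      // Emit verbose tsserver traces (like the attached log) to the extension's log directory
      "typescript.tsserver.log": "verbose"
    }

The resulting log file can then be opened via the "TypeScript: Open TS Server log" command.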

@fatcerberus

Bisecting isn't an option due to automated swagger -> Zod processing.

Not even using every-ts?

@amvFrontendMonkey
Author

We make heavy use of Zod and autogenerate these types from upstream Swagger definitions (a rough sketch of what that looks like follows this comment). I strongly suspect this slowdown ultimately comes down to simply having too many types. Here's an example of the issues we've been having at the same time as this general TS slowdown.

So, long story short, I'd really like some insight into the attached tsserver logs, or some tips on how to interpret them ourselves. A tip or two on mitigating the slowdowns would be super useful too!
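
To illustrate the kind of generated code involved (all names below are hypothetical, not taken from the actual project), Swagger -> Zod codegen typically emits one schema plus one inferred type per API model:

    // Hypothetical sketch of Swagger -> Zod codegen output; real names and shapes differ.
    import { z } from "zod";

    // One schema per Swagger model...
    export const VehicleSchema = z.object({
      id: z.string().uuid(),
      make: z.string(),
      model: z.string(),
      listedAt: z.string().datetime(),
    });

    // ...plus an inferred type for the rest of the app to consume.
    export type Vehicle = z.infer<typeof VehicleSchema>;

    // Response schemas compose the model schemas, so each endpoint adds another
    // large structural type for the checker to derive on demand.
    export const VehicleSearchResponseSchema = z.object({
      results: z.array(VehicleSchema),
      total: z.number().int(),
    });
    export type VehicleSearchResponse = z.infer<typeof VehicleSearchResponseSchema>;

Multiplied across hundreds of models, every hover, completion, or auto-import can force the checker to re-derive these inferred types, which would be consistent with the spikes described above.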

@RyanCavanaugh
Member

It returns after about 45 seconds

I wasn't able to correlate where in the log this occurred.

Notable lines I did see:

Perf 323  [21:55:53.488] 2::updateOpen: elapsed time (in milliseconds) 5954.0864
Perf 1146 [21:56:02.693] 4::updateOpen: elapsed time (in milliseconds) 1280.1678
Perf 1161 [21:56:04.858] 10::encodedSemanticClassifications-full: elapsed time (in milliseconds) 2154.9555
Perf 1168 [21:56:09.939] 13::documentHighlights: elapsed time (in milliseconds) 5078.6349
Perf 1319 [21:56:29.667] 54::encodedSemanticClassifications-full: elapsed time (in milliseconds) 1917.7468
Perf 1393 [21:56:42.308] 78::encodedSemanticClassifications-full: elapsed time (in milliseconds) 1987.7479
Perf 1442 [21:56:53.190] 90::encodedSemanticClassifications-full: elapsed time (in milliseconds) 2137.3590

5-6 seconds for the initial updateOpen is on the high side, but given the project size (1,830 files), not that unusual.

For best performance, turn off semantic highlighting; IMO the gain in colorization is not worth the perf hit in large projects (a settings sketch follows at the end of this comment). That said, something weird happened here:

Info 1160 [21:56:02.703] request:
    {
      "seq": 10,
      "type": "request",
      "command": "encodedSemanticClassifications-full",
      "arguments": {
        "file": "/Users/penric000/dev/autotrack-fe/apps/web/app/[locale]/a/[slug]/page.tsx",
        "start": 5884,
        "length": 6698,
        "format": "2020"
      }
    }
Perf 1161 [21:56:04.858] 10::encodedSemanticClassifications-full: elapsed time (in milliseconds) 2154.9555

There's apparently some type in that span which is very, very expensive to compute. That approximate range shows up a few times in the log.

It's possibly also referenced right here:

Info 1167 [21:56:04.861] request:
    {
      "seq": 13,
      "type": "request",
      "command": "documentHighlights",
      "arguments": {
        "file": "/Users/penric000/dev/autotrack-fe/apps/web/app/[locale]/a/[slug]/page.tsx",
        "line": 288,
        "offset": 67,
        "filesToSearch": [
          "/Users/penric000/dev/autotrack-fe/apps/web/app/[locale]/a/[slug]/page.tsx"
        ]
      }
    }
Perf 1168 [21:56:09.939] 13::documentHighlights: elapsed time (in milliseconds) 5078.6349
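
For reference, the encodedSemanticClassifications-full requests in the log are issued for semantic highlighting. Assuming the client is VS Code (not stated in the thread), it can be switched off with a setting along these lines:

    // settings.json (user or workspace); assumption: VS Code is the tsserver client
    {
      // Turn off semantic token colorization, which is what drives the
      // encodedSemanticClassifications-full requests seen above
      "editor.semanticHighlighting.enabled": false
    }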

@RyanCavanaugh RyanCavanaugh added the Needs Proposal This issue needs a plan that clarifies the finer details of how it could be implemented. label Apr 17, 2024
@DanielRosenwasser DanielRosenwasser added Needs More Info The issue still hasn't been fully clarified and removed Needs Proposal This issue needs a plan that clarifies the finer details of how it could be implemented. labels Apr 19, 2024