
[Regression]:  ERROR  Invalid string length (RangeError: Invalid string length) #7079

Open · armenr opened this issue Sep 11, 2023 · 19 comments

armenr commented Sep 11, 2023

Last pnpm version that worked

8.7.0

pnpm version

These don't work:
8.7.4 and 8.7.5

Code to reproduce the issue

I don't have any specific code to reproduce it; it happens on any standard repo and, as far as I can tell, appears to be specific to workspace-based repos.

To test it, you can use the following boilerplate (it happens there, at least on my machine):

https://github.com/antfu/vitesse

  • Add a pnpm-workspace.yaml (a minimal example is sketched below)
  • Move everything into packages/<something>
  • Try to run an installation
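
For reference, a minimal pnpm-workspace.yaml for that layout might look like this (the packages/* glob is an assumption based on the structure described above):

packages:
  - 'packages/*'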

Expected behavior

Installation works without errors. We're just installing a set of packages, and it's not even a huge set.

Actual behavior

Attempting to install the packages in my workspace using pnpm 8.7.4:

❯ npx pnpm@latest i
Need to install the following packages:
  pnpm@8.7.4
Ok to proceed? (y) Y
Scope: all 4 workspace projects
 ERROR  Invalid string length

RangeError: Invalid string length
    at JSON.stringify (<anonymous>)
    at calcDepState (/Users/x3r0/.npm/_npx/de6729f694090229/node_modules/pnpm/dist/pnpm.cjs:122538:29)
    at /Users/x3r0/.npm/_npx/de6729f694090229/node_modules/pnpm/dist/pnpm.cjs:124082:69

Check current version of (working) pnpm

❯ pnpm --version
8.7.0

Run current working version of pnpm@8.7.0

❯ pnpm i
Scope: all 4 workspace projects
../../amplify/backend/function/arangoTest/node_modules/.pnpm/esbuild@0.19.2/node_modules/esbuild: Running postinstall script, done in 137ms
node_modules/.pnpm/simple-git-hooks@2.9.0/node_modules/simple-git-hooks: Running postinstall script, done in 93ms
node_modules/.pnpm/esbuild@0.18.20/node_modules/esbuild: Running postinstall script, done in 133ms
node_modules/.pnpm/cypress@13.1.0/node_modules/cypress: Running postinstall script, done in 307ms
node_modules/.pnpm/sharp@0.32.5/node_modules/sharp: Running install script, done in 1.1s
node_modules/.pnpm/vue-demi@0.14.6_vue@3.3.4/node_modules/vue-demi: Running postinstall script, done in 67ms
node_modules/.pnpm/json-editor-vue@0.10.15_vanilla-jsoneditor@0.17.10_vue@3.3.4/node_modules/json-editor-vue: Running preinstall script, done in 34ms
node_modules/.pnpm/json-editor-vue@0.10.15_vanilla-jsoneditor@0.17.10_vue@3.3.4/node_modules/json-editor-vue: Running postinstall script, done in 33ms
. postinstall$ npx simple-git-hooks
│ [INFO] Successfully set the pre-commit with command: npx lint-staged
│ [INFO] Successfully set all git hooks
└─ Done in 476ms
Done in 6.2s

Additional information

A regression? Okay, but of what?

I looked for any related issues, and found the following: #4949

The behavior I'm observing appears to be consistent with what's in that bug report, so I assume this is a regression.

Node.js version

18.16.1

Operating System

macOS

armenr commented Dec 8, 2023

This is still happening. No communication, nothing. Something appears to have gotten merged, and then this was closed without comment.

I'm just sitting here with a broken package manager, having to use a many-releases-old pnpm version as a workaround.

What's up with that?

armenr commented Dec 8, 2023

8.11.0 is still broken.

samtsai commented Jan 23, 2024

I'm facing the same issue with 8.14.3:

[2024-01-23T15:42:09.746Z]  ERROR  Invalid string length
[2024-01-23T15:42:09.746Z] 
[2024-01-23T15:42:09.746Z] RangeError: Invalid string length
[2024-01-23T15:42:09.746Z]     at JSON.stringify (<anonymous>)
[2024-01-23T15:42:09.746Z]     at calcDepState (/home/builder/.cache/node/corepack/pnpm/8.14.3/dist/pnpm.cjs:124197:29)
[2024-01-23T15:42:09.746Z]     at /home/builder/.cache/node/corepack/pnpm/8.14.3/dist/pnpm.cjs:125690:69

samtsai commented Jan 23, 2024

I was able to get further with pnpm install --fix-lockfile.

Never mind; I think it was failing on something earlier, from another change. I'm still hitting this once I get to the pnpm install -r --offline step.

zkochan commented Jan 23, 2024

We should fix it for sure, but I can't reproduce the issue.

This is probably the same issue as this one: #5056

It looks like using an object hash instead of JSON.stringify could be the solution, but I haven't been able to break JSON.stringify yet to verify the fix.
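
A minimal sketch of that idea (not pnpm's actual code; hashDepState is a hypothetical helper): feed the dependency-state object into a hash incrementally instead of materializing one giant JSON string:

const { createHash } = require('crypto');

// Walk the object and feed keys/values into the hash incrementally,
// so no single huge string ever has to be built.
function hashDepState(obj) {
  const hash = createHash('sha256');
  const visit = (value) => {
    if (value && typeof value === 'object') {
      for (const key of Object.keys(value).sort()) {
        hash.update(key + ':');
        visit(value[key]);
      }
      hash.update(';');
    } else {
      hash.update(String(value) + ';');
    }
  };
  visit(obj);
  return hash.digest('hex');
}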

As a temporary fix, I think you can just disable the side-effects cache by setting side-effects-cache to false.
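
In .npmrc, that workaround would be:

side-effects-cache=false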

@sammarks

Happening for me on 9.1.4; posting my comment from #8072 here:

I get the same error with 9.1.4, so ignore the 9.1.3 in the paths in the error report; I upgraded to 9.1.4, tried again, and hit the same issue.

pnpm: Invalid string length
    at writeBlockMapping (/Users/sammarks/Library/pnpm/global/5/.pnpm/pnpm@9.1.3/node_modules/pnpm/dist/pnpm.cjs:16817:20)
    at writeNode (/Users/sammarks/Library/pnpm/global/5/.pnpm/pnpm@9.1.3/node_modules/pnpm/dist/pnpm.cjs:16883:13)
    at writeBlockMapping (/Users/sammarks/Library/pnpm/global/5/.pnpm/pnpm@9.1.3/node_modules/pnpm/dist/pnpm.cjs:16808:14)
    at writeNode (/Users/sammarks/Library/pnpm/global/5/.pnpm/pnpm@9.1.3/node_modules/pnpm/dist/pnpm.cjs:16883:13)
    at Object.dump (/Users/sammarks/Library/pnpm/global/5/.pnpm/pnpm@9.1.3/node_modules/pnpm/dist/pnpm.cjs:16977:11)
    at yamlStringify (/Users/sammarks/Library/pnpm/global/5/.pnpm/pnpm@9.1.3/node_modules/pnpm/dist/pnpm.cjs:114456:32)
    at writeLockfiles (/Users/sammarks/Library/pnpm/global/5/.pnpm/pnpm@9.1.3/node_modules/pnpm/dist/pnpm.cjs:114470:23)
    at async Promise.all (index 0)
    at async _installInContext (/Users/sammarks/Library/pnpm/global/5/.pnpm/pnpm@9.1.3/node_modules/pnpm/dist/pnpm.cjs:192699:9)
    at async installInContext (/Users/sammarks/Library/pnpm/global/5/.pnpm/pnpm@9.1.3/node_modules/pnpm/dist/pnpm.cjs:192812:16)

I modified that bit of code in the 9.1.4 transpiled JS to be:

} else {
  pairBuffer += ": ";
}
pairBuffer += state.dump;
try {
  // This concatenation is what throws the RangeError once the combined
  // string exceeds V8's maximum string length.
  _result += pairBuffer;
} catch (err) {
  console.error('would have run into a problem here')
  console.info(_result.length)
  console.info(pairBuffer.length)
  process.exit(1)
}

And that outputs:

would have run into a problem here
19313147
533455214

So it looks like we're running into some very large strings here. Even if it were able to handle that, the lockfile would be a few hundred MB in size.
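
For context, the RangeError is V8 refusing to build a string longer than its hard cap, which Node exposes as buffer.constants.MAX_STRING_LENGTH (roughly 2**29 characters on typical 64-bit builds). A quick check with the lengths logged above:

const { constants } = require('buffer');

// Lengths printed by the instrumented writeBlockMapping above
const resultLength = 19313147;
const pairBufferLength = 533455214;

console.log(constants.MAX_STRING_LENGTH);
// The sum of the two parts exceeds the cap, so the concatenation throws
console.log(resultLength + pairBufferLength > constants.MAX_STRING_LENGTH); // true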

That was with:

auto-install-peers=false
dedupe-peer-dependents=true

Running it with:

auto-install-peers=false
dedupe-peer-dependents=false

results in the same error, but it seems to happen later in the process. This time here's the output I get:

would have run into a problem here
526644599
77564213

Happy to help debug some more if you tell me what variables you want to see, but unfortunately I'm not able to provide access to the specific repository where it's happening.

If it helps, here's my full config:

link-workspace-packages=deep
strict-peer-dependencies=false

auto-install-peers=false
resolution-mode=lowest-direct
dedupe-peer-dependents=false
disallow-workspace-cycles=true
resolve-peers-from-workspace-root=false

zkochan commented Jun 1, 2024

From the stack trace, it looks like the error happens because the lockfile is too big.

sammarks commented Jun 1, 2024

That's why I think it's more closely related to #8072 than to this particular issue. Coupled with my findings in my first comment on that issue, the peer-dependency graph being generated seems to grow exponentially if you have a project with a lot of dependencies:

In my case I have two repositories:

  • A monorepo with 200 packages inside it that get published on a regular basis. These packages all have several peer dependency relationships with other packages in the repository.
  • Another (much smaller) monorepo that depends on many of the 200 packages from the first monorepo.

My guess is that, since there are so many peer dependency relationships here, it's really stressing the new peer-dependency resolution algorithm and causing the size of the lockfile to balloon out of control.

The original performance issue I was experiencing was the time required to process the peer dependency graph to find cycles. That appears to have been fixed through optimization in 9.1.4, but the underlying issue of an extremely large peer graph still persists.

To be sure: this was never an issue in v8, and the installation still works in the smaller monorepo if I reinstall pnpm v8, so it must be related to the new peer dependency resolution.

Even when I disable auto-install-peers and dedupe-peer-dependents, though, this issue still appears to happen. I guess disabling those doesn't disable the new peer dependency resolution entirely? That would make sense, as I'm sure it's needed in other parts.

sammarks commented Jun 3, 2024

I did some more digging and exported the string it's trying to append to the lockfile. Here's a snippet of it:

[Screenshot: snippet of the exported lockfile string, listing internal packages]

These are our internal packages, which all depend on one another. The standard-sets line alone is 100MB in size. Coincidentally, it also has the most peer dependencies of all the packages in that list.

Some of the other packages that don't have as many peer dependencies are only 500KB in size, so this has to be growing exponentially somewhere.

Looking at the lockfile pnpm v8 generated, it looks like v8 outputs essentially just a list of the peer dependencies instead of the entire peer-dependency graph the way v9 does. Here's that same line inside a v8 lockfile (formatted to wrap for clarity):

version: 0.13.13(@apollo/client@3.5.10)(@thesisedu/feature-apollo-react@0.111.1)(@thesisedu/feature-assignments-react@0.142.5)(@thesisedu/feature-attachments-react@0.89.31)(@thesisedu/feature-classes-react@0.108.6)(@thesisedu/feature-course-assignments-react@0.101.4)(@thesisedu/feature-course-reports-react@0.64.16)(@thesisedu/feature-courses-react@0.122.6)(@thesisedu/feature-districts-web@0.77.9)(@thesisedu/feature-react@0.99.2)(@thesisedu/feature-reports-react@0.9.13)(@thesisedu/feature-sites-react@0.76.6)(@thesisedu/feature-sites-web@0.80.32)(@thesisedu/feature-tags-react@0.79.5)(@thesisedu/feature-users-react@0.18.11)(@thesisedu/feature-web@0.118.6)(@thesisedu/feature-widgets-react@0.30.1)(@thesisedu/feature@0.104.0)(@thesisedu/react@0.40.3)(@thesisedu/ui@0.29.0)(@thesisedu/web@0.139.1)(antd@4.24.8)(graphql@15.8.0)(react-dom@18.3.1)(react-native@0.67.2)(react-router-dom@6.0.0-beta.0)(react@18.3.1)(styled-components@6.1.8)

Looking at the package.json, that is consistent with the peer dependencies listed inside feature-standard-sets-react.

zkochan commented Jun 3, 2024

One line is 100MB? Can you attach the whole line? It shouldn't contain any private stuff, just the package names and versions.

v8 did not resolve peer dependencies of peer dependencies correctly. Unfortunately, when a peer dependency has a peer dependency of its own, resolution gets hard really fast, and to distinguish all the different combinations that can occur, the key becomes long.
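
As a rough illustration (hypothetical package names; this is not pnpm's exact key format), when a peer's own peers also have to be encoded, the suffix nests, and the key length compounds with depth and fan-out:

// Build a dependency key whose suffix records how each peer (and each
// peer's peers) was resolved. Purely illustrative, not pnpm's real format.
function suffix(pkg) {
  return (pkg.peers || [])
    .map((p) => `(${p.name}@${p.version}${suffix(p)})`)
    .join('');
}

function depKey(pkg) {
  return `${pkg.name}@${pkg.version}${suffix(pkg)}`;
}

const graphql = { name: 'graphql', version: '15.8.0' };
const apollo = { name: '@apollo/client', version: '3.5.10', peers: [graphql] };
const feature = { name: 'feature-a', version: '1.0.0', peers: [apollo, graphql] };

console.log(depKey(feature));
// feature-a@1.0.0(@apollo/client@3.5.10(graphql@15.8.0))(graphql@15.8.0)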

sammarks commented Jun 3, 2024

Here's a zip of the lockfile part for the package that's causing the issue. The 100MB line is with feature-standard-sets-react. The file should be ~530MB when extracted.

largepart.dat.zip

Well, it compressed down to 6MB 😛, so that's another giveaway that there's some serious repetition going on.

sammarks commented Jun 3, 2024

What about using a hash to represent the different combinations instead of listing all of them out? Unless you later parse that string of combinations (as I assume you do)?

zkochan commented Jun 6, 2024

We could hash that, although a lot of additional hashing would make installation slower. What is the installation speed at the moment in that workspace?

zkochan commented Jun 6, 2024

@sammarks try the PR I created: #8177

Let me know if it solves the issue.

sammarks commented Jun 6, 2024

I can time it later today when I'm at my machine, but it's not terrible: maybe 1 minute or so, with a 15-second pause for the peer-calculation math after building.

sammarks commented Jun 6, 2024

Took 2 minutes, 41 seconds before running into the error. Tried again on main, same error (as expected), same amount of time.

Tried again on shorter-peers-suffix: success in 54 seconds. The lockfile is only 2.2MB in size, and I confirmed the hashes are actually stored in the lockfile:

[Screenshot: lockfile entries showing hashed peer-dependency suffixes]

Looks like it works! 🎉

zkochan commented Jun 6, 2024

Unfortunately, this would be a breaking change... unless maybe I make it a really big number. Does it work if you change 200 to 1000 here?

  if (dirName.length > 200) {
    dirName = createBase32Hash(dirName)
  }
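
For reference, a minimal sketch of what that shortening does (not pnpm's real helper; a hex md5 digest stands in here for the base32 hash pnpm actually uses): once the peer-resolution suffix gets too long, it is replaced by a short, stable digest so directory names and lockfile keys stay bounded.

const { createHash } = require('crypto');

// Hypothetical stand-in for createBase32Hash, with a configurable threshold
function shortenDirName(dirName, limit = 1000) {
  if (dirName.length <= limit) return dirName;
  return createHash('md5').update(dirName).digest('hex');
}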

sammarks commented Jun 6, 2024

Still works with 1000; done in 51 seconds.

[Screenshot: lockfile entries after raising the threshold to 1000]

spreadsheets was a hash before and no longer is; standard-sets remains a hash, showing the 1000 threshold had an effect.

The lockfile is 2.5MB now, so still much smaller.

zkochan added a commit that referenced this issue Jun 7, 2024
zkochan commented Jun 10, 2024

🚢 9.3.0
