
🐛 BUG: Uploading large KV entries throws an exception #1151

Closed
mihaip opened this issue May 31, 2022 · 3 comments
Labels
bug Something that isn't working

Comments

mihaip commented May 31, 2022

What version of Wrangler are you using?

2.0.7

What operating system are you using?

macOS 11.6.5

Describe the Bug

I'm migrating a worker that uses a Workers Site from wrangler 1 to wrangler 2, and I'm running into an exception during the upload step:

✘ [ERROR] Error on remote worker: RangeError: Invalid string length

      at JSON.stringify (<anonymous>)
      at putKVBulkKeyValue
  (/Users/mihai/.npm-global/lib/node_modules/wrangler/wrangler-dist/cli.js:120856:18)
      at syncAssets
  (/Users/mihai/.npm-global/lib/node_modules/wrangler/wrangler-dist/cli.js:121017:5)
      at async start
  (/Users/mihai/.npm-global/lib/node_modules/wrangler/wrangler-dist/cli.js:121118:22)

It looks like putKVBulkKeyValue has some basic chunking (added in 18ac439), where it uploads 5,000 key/value entries at a time. However, if the values are large, serializing a batch with JSON.stringify can still run into the Node/V8 maximum string length limit (of 512MB).

Looking at Wrangler 1, it appears to have also enforced a 50MB maximum request size (https://github.com/cloudflare/wrangler/blob/c097240e1449e5ebf41f017014460bca21feb19d/src/kv/bulk.rs#L23) -- porting some form of size-based chunking may be reasonable here too.

In the meantime I've worked around this by lowering BATCH_KEY_MAX in my local copy.
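A size-aware batching pass along the lines suggested above could look like the following sketch. This is illustrative only, not wrangler's actual implementation: `batchKVEntries`, `BATCH_BYTE_MAX`, and the `KVEntry` shape are hypothetical names, and `string.length` counts UTF-16 code units, so it only approximates the serialized request size in bytes.

```typescript
interface KVEntry {
  key: string;
  value: string;
}

// Existing count-based cap (5,000 entries per request), plus a
// hypothetical size-based cap mirroring Wrangler 1's ~50 MB limit.
const BATCH_KEY_MAX = 5000;
const BATCH_BYTE_MAX = 50 * 1024 * 1024;

// Split entries into batches, starting a new batch whenever adding the
// next entry would exceed either the key-count or the byte-size limit.
function batchKVEntries(entries: KVEntry[]): KVEntry[][] {
  const batches: KVEntry[][] = [];
  let current: KVEntry[] = [];
  let currentBytes = 0;
  for (const entry of entries) {
    // Rough size estimate: code-unit lengths of key and value.
    const entryBytes = entry.key.length + entry.value.length;
    if (
      current.length > 0 &&
      (current.length >= BATCH_KEY_MAX ||
        currentBytes + entryBytes > BATCH_BYTE_MAX)
    ) {
      batches.push(current);
      current = [];
      currentBytes = 0;
    }
    current.push(entry);
    currentBytes += entryBytes;
  }
  if (current.length > 0) {
    batches.push(current);
  }
  return batches;
}
```

With this, three 30 MB values would land in three separate batches (each pair would exceed 50 MB), avoiding the oversized JSON.stringify call entirely.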

@mihaip mihaip added the bug Something that isn't working label May 31, 2022
petebacondarwin (Contributor) commented
Thanks for reporting this. Yes - we should be a bit more careful about the size of the batches.

threepointone (Contributor) commented
We added batching a while ago that should have fixed this. I'm closing this issue, but happy to discuss/reopen if I've missed something.

mihaip (Author) commented Jul 28, 2022

@threepointone I was experiencing this with batching (see the reference to 18ac439 in my comment). The problem is that the batching is purely count-based: if the values are large, the serialized request still ends up being too large, since there is a maximum string size.
