Fix crashes caused by browser string size limit for large captures #244
When making some very large captures, such as those with thousands of commands and the "Full Capture" option enabled, serialization of the resulting capture to JSON can fail because the output exceeds the browser's maximum string length. Even though the generated JSON string is already chunked to avoid other message port limits, that doesn't help with the generation of the JSON string itself.
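As a minimal sketch (not the extension's actual code) of how the failure shows up: V8 throws a `RangeError` ("Invalid string length") when the serialized output would exceed the engine's maximum string length, and that error can be caught to trigger a fallback path.

```ts
// Sketch: detect when JSON.stringify fails because the output string
// would be too long, so a chunked fallback can be attempted instead.
function trySerialize(capture: unknown): string | null {
  try {
    return JSON.stringify(capture);
  } catch (e) {
    if (e instanceof RangeError) {
      // The capture is too large to serialize as a single string;
      // signal the caller to use the chunked fallback.
      return null;
    }
    throw e;
  }
}
```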
This change checks for the error thrown when the resulting JSON string would be too long and handles it with a fallback implementation: it replaces the inner `commands` list of the capture with a placeholder string before serializing, then does some manual string concatenation to generate chunks that produce valid JSON when reassembled by the client.

Testing locally, this fixes the issue and lets me perform captures I wasn't able to before. I still get some captures that fail due to what I think are global memory limits on the extension or the JS runtime as a whole, but I don't know if there's a way to fix those without big changes to the app.
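Below is a rough sketch of the fallback idea described above, not the actual patch; the `commands` field name and the placeholder value are illustrative assumptions. The capture is serialized with its `commands` list swapped for a marker string, and the marker is then replaced by per-command JSON emitted as separate chunks, so no single string ever needs to hold the whole commands list.

```ts
const PLACEHOLDER = '__COMMANDS_PLACEHOLDER__';

// Yields string chunks that, when concatenated in order, form one valid
// JSON document equivalent to serializing the full capture at once.
function* serializeInChunks(capture: { commands: unknown[] }): Generator<string> {
  // Serialize the capture with the commands list replaced by a marker.
  const outer = JSON.stringify({ ...capture, commands: PLACEHOLDER });

  // In the serialized output the marker appears as a quoted JSON string.
  const marker = JSON.stringify(PLACEHOLDER);
  const splitAt = outer.indexOf(marker);

  // Everything before the commands list, plus the array's opening bracket.
  yield outer.slice(0, splitAt) + '[';

  // Each command is serialized individually, keeping every chunk small.
  for (let i = 0; i < capture.commands.length; i++) {
    yield (i > 0 ? ',' : '') + JSON.stringify(capture.commands[i]);
  }

  // The array's closing bracket, then the remainder of the outer object.
  yield ']' + outer.slice(splitAt + marker.length);
}
```

On the receiving side the chunks can simply be concatenated in order and passed to `JSON.parse`, so the client-side handling doesn't need to change beyond accepting multiple chunks.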