fix serialization for large buffers #16
Conversation
@sirpy you able to take a peek?
src/serializeBinary.ts
Outdated
```diff
 const ArrayBufferSerializer: Serializer<ArrayBuffer, string> = {
   id: "ArrayBuffer",
   isType: (o: any) => o instanceof ArrayBuffer,

   // from https://developers.google.com/web/updates/2012/06/How-to-convert-ArrayBuffer-to-and-from-String
   // modified to use Int8Array so that we can hold odd number of bytes
   toObject: async (ab: ArrayBuffer) => {
-    return String.fromCharCode.apply(null, new Int8Array(ab));
+    return new Int8Array(ab).reduce(function (data, byte) {
```
@fitouch according to https://www.measurethat.net/Benchmarks/Show/1219/23/arraybuffer-to-base64-string, using apply is an order of magnitude faster than going char by char.
I suggest you split the ab buffer into chunks of MAX_BUFFER_LENGTH each, do the charCode.apply on each of those chunks, and concat the results.
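The suggested chunking strategy could be sketched roughly like this (an illustrative sketch, not the actual patch; the function name and chunk size here are assumptions):

```typescript
// Assumed chunk size; the actual value is a tuning choice.
const MAX_BUFFER_LENGTH = 8 * 1024;

// Convert the buffer in fixed-size slices so String.fromCharCode.apply
// never receives more arguments than the engine's stack can handle.
function arrayBufferToString(ab: ArrayBuffer): string {
  const bytes = new Int8Array(ab);
  const parts: string[] = [];
  for (let i = 0; i < bytes.length; i += MAX_BUFFER_LENGTH) {
    // subarray creates a view, not a copy, so slicing stays cheap
    const chunk = bytes.subarray(i, i + MAX_BUFFER_LENGTH);
    parts.push(String.fromCharCode.apply(null, chunk as unknown as number[]));
  }
  return parts.join("");
}
```

Each slice is converted with the fast `apply` path, and only the final `join` touches all the data at once.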
@sirpy here's a new commit based on your suggestion, thank you
…charCode.apply on each of these buffers and concat them
src/serializeBinary.ts
Outdated
```diff
-      return data + String.fromCharCode(byte);
-    }, "");
+    const bytes = new Int8Array(ab);
+    const MAX_BUFFER_LENGTH = 8 * 1024;
```
Why 8k? What buffer sizes were you having issues with?
Also, please add a code comment about the bug you encountered and why this buffer-splitting strategy is required to solve it.
Sorry, 64k is the buffer size I had issues with. I just re-committed. This is for large base64 strings.
@fitouch merged and released
@sirpy thank you
Fixes the `Maximum call stack size exceeded` error which triggers when passing large buffers.
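The failure mode can be reproduced in isolation (the exact threshold is engine-dependent; the buffer size below is an assumption that comfortably exceeds V8's argument limit): `apply` pushes every array element onto the call stack as a separate argument, so a sufficiently large buffer overflows it.

```typescript
// A buffer far larger than the per-call argument limit of typical engines.
const big = new Int8Array(10_000_000);
let failed = false;
try {
  // Spreads 10M arguments onto the stack; in V8 this surfaces as
  // "RangeError: Maximum call stack size exceeded".
  String.fromCharCode.apply(null, big as unknown as number[]);
} catch (e) {
  failed = e instanceof RangeError;
}
```

This is why the fix converts the buffer in bounded chunks instead of one `apply` call over the whole thing.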