base64 encodeByteArray is very slow #1161
Labels: sunset (Issues/PRs auto-closed when repo was archived)
(reposted from grpc/grpc-web#1187)
TL;DR: grpc-web uses goog.crypt.base64.encodeByteArray to encode the payload. This function can only encode at a throughput of ~16MB/s. The js-base64 package is about 50x faster, so rewriting encodeByteArray using ideas from js-base64 (or another fast base64 package) would be greatly appreciated.
Background: we have a grpc-web-based app that performs bulk data upload, and it can sustain only ~15MB/s regardless of the underlying network speed. The reason is that the base64 encoder used by grpc-web (goog.crypt.base64) is quite slow. I ran a quick benchmark of goog.crypt.base64.encodeByteArray vs Base64.fromUint8Array (from https://www.npmjs.com/package/js-base64), and the former is about 50x slower. For example, encoding a random 16MB Uint8Array payload takes ~1000ms with goog.crypt.base64 and ~19ms with js-base64.
FYI, here's a node script that I used.
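A minimal sketch of such a benchmark (not the original script; it assumes the `google-closure-library` and `js-base64` npm packages are installed):

```js
// Rough benchmark sketch: encode a random 16MB Uint8Array with both
// encoders and report wall-clock time for each.
require('google-closure-library');
goog.require('goog.crypt.base64');
const { Base64 } = require('js-base64');
const crypto = require('crypto');

// 16MB of random bytes, roughly matching the upload payload size above.
const payload = new Uint8Array(16 * 1024 * 1024);
crypto.randomFillSync(payload);

let t = Date.now();
goog.crypt.base64.encodeByteArray(payload);
console.log('goog.crypt.base64.encodeByteArray:', Date.now() - t, 'ms');

t = Date.now();
Base64.fromUint8Array(payload);
console.log('js-base64 Base64.fromUint8Array:', Date.now() - t, 'ms');
```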
Result: ~1000ms for goog.crypt.base64.encodeByteArray vs ~19ms for js-base64's Base64.fromUint8Array on the 16MB payload.