Correct. The Node.js heap is limited in size. You could implement your own subclass of Buffer that reads incrementally, but you would likely still run into trouble with the resulting object.
If the file is that large, the resulting decoded object is highly likely not to fit on the heap either. You probably want some kind of SAX/oboe-style interface for receiving the data as well.
For this to work, both ends of pbf would need to change: the Pbf class would need to support reading from a stream, and the generated code would need to support streams as well.
You could reuse many of the methods from Pbf for parsing the tags, but there are some fundamental differences from what this project does.
Do you know what makes your file so large? If it includes large binary data, you might still have this issue even with streaming, unless the streaming also supports streaming a single field.
As far as I know, Protobuf is not designed to hold huge amounts of data in a single message; it's best suited for encoding many relatively small objects. If this 2 GB file consists of thousands of smaller PBF messages, the proper way to deal with it would be to use https://github.com/mafintosh/length-prefixed-stream to split the stream into individual messages, and then use pbf to decode each one individually.
I'm trying to open a large pbf file (>2 GB), but I get an error:
It seems Node.js buffers can't be bigger than 2 GB:
nodejs/node#6560 (comment)
So my question:
Is there a way to use a stream instead of a buffer as the argument to access the file? Or maybe another way to access files bigger than 2 GB?
Thanks.
node version: 4.2.6