I want to use bl to help me buffer a large HTTP stream into an async/callback-designed API.
So I want to pipe the HTTP stream into bl and read it using the shallowSlice/slice API.
My HTTP stream will be around 2 GB, so I want to free data as I read through it.
Will the consume API help me here, i.e. free the memory as I'm seeking? (If that's the case, I'll PR the documentation accordingly.)
Issue title: Does "consume" free up memory? (Nov 26, 2018)
Good question, with a complex answer. It would be great if you could pull-request something about this onto the README to inform others.
Have a look at BufferList.prototype.consume. You'll see that where the bytes you're consuming cross the boundary of individual Buffers in the backing array of Buffers, those Buffers get shift()ed off and freed. Where you're eating into an individual Buffer (which you invariably will, unless you happen to hit the precise boundary of a Buffer in the backing array), it does a slice() and keeps a reference only to the sliced part.
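To make the mechanism concrete, here is a simplified sketch (not bl's actual code) of how consuming works over a backing array of Buffers: whole Buffers that fall entirely inside the consumed range are shifted off and become eligible for GC, while a partially consumed Buffer is replaced by a slice of itself.

```javascript
// Simplified illustration of the consume() mechanism over a plain
// array of Buffers. This is a sketch, not bl's real implementation.
function consume (bufs, bytes) {
  while (bufs.length > 0 && bytes > 0) {
    if (bytes >= bufs[0].length) {
      // The whole leading Buffer is consumed: drop our reference to it
      bytes -= bufs[0].length
      bufs.shift()
    } else {
      // Partially consumed: keep only a slice over the unread tail
      bufs[0] = bufs[0].slice(bytes)
      bytes = 0
    }
  }
  return bufs
}

const bufs = [Buffer.from('aaaa'), Buffer.from('bbbb'), Buffer.from('cccc')]
consume(bufs, 6) // eats all of 'aaaa' and half of 'bbbb'
console.log(bufs.length)        // 2
console.log(bufs[0].toString()) // 'bb'
```

Note that in the partial case the original Buffer is only sliced, not copied, which leads directly into the caveat below.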
However, you have to keep in mind that Node.js does its own trickery with memory management of Buffers and maintains its own backing pool. Each Buffer you have is a slice of that pool and releasing it to garbage collection doesn't necessarily free that memory. Also, when you trim a Buffer by way of slice() you're also not freeing memory, Node.js will (likely) just hang onto the original Buffer and give you the sliced view of it. But, overall Node.js will keep pool memory consumption to a minimum, so if you're allocating large amounts of memory in Buffer form and then freeing them up via GC then you're also going to save on memory.
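You can see the "slice() doesn't free anything" point directly in Node.js: a slice is a view over the same backing memory, so the original allocation stays alive as long as any slice references it.

```javascript
// Buffer.slice() (equivalently, subarray()) returns a view over the same
// memory; no bytes are copied and nothing is freed.
const original = Buffer.from('hello world')
const view = original.slice(6) // 'world', same backing memory

view[0] = 0x57 // write 'W' through the view
console.log(original.toString())               // 'hello World'
console.log(view.buffer === original.buffer)   // true: shared ArrayBuffer
```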
In summary: mostly yes, but with minor caveats.
An alternative is to just use a Node.js Transform stream and pass your data through that: https://nodejs.org/api/stream.html#stream_implementing_a_transform_stream. That's how I'd deal with large data processing where you're doing a lot of discarding. Keep buffering to a minimum where you can (and bl is all about buffering).