
Does "consume" free up memory ? #64

Closed
131 opened this issue Nov 26, 2018 · 1 comment

131 commented Nov 26, 2018

I want to use bl to help me buffer a large HTTP stream into an async/callback-style API, so I want to pipe the HTTP stream into bl and read from it using the shallowSlice/slice API.

My HTTP stream will be around 2 GB, so I want to discard the data as I read through it. Will the consume API help me here, i.e. free the memory as I'm seeking? (If that's the case, I'll PR the documentation accordingly.)

@131 131 changed the title Does "consume" free up memory Does "consume" free up memory ? Nov 26, 2018

rvagg commented Nov 27, 2018

Good question, with a complex answer. It would be great if you could pull-request something about this onto the README to inform others.

Have a look at BufferList.prototype.consume. You'll see that where you're consuming bytes that cross the boundary of individual Buffers in the backing array of Buffers, they get shift()ed off and freed. Where you're eating into an individual Buffer, which you invariably do unless you happen to hit the precise boundary of a Buffer in the backing array, it does a slice() and only keeps a reference to the sliced part.
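A simplified sketch of that mechanic, using plain Buffers rather than bl's actual internals (the `consume` function and `bufs` array here are illustrative, not the library's real code):

```javascript
// Sketch of the consume(n) behaviour described above: whole Buffers
// that fall entirely within the consumed range are shift()ed off the
// backing array (dropping the list's reference so GC can reclaim them);
// a partially consumed Buffer is replaced by a slice() view.
function consume (bufs, bytes) {
  while (bufs.length) {
    if (bytes >= bufs[0].length) {
      // Consumed past this Buffer's boundary: drop it entirely.
      bytes -= bufs[0].length
      bufs.shift()
    } else {
      // Consumed into the middle of a Buffer: keep only a slice,
      // which is a view over the same memory from the offset onward.
      bufs[0] = bufs[0].slice(bytes)
      break
    }
  }
  return bufs
}

const bufs = [Buffer.from('abcd'), Buffer.from('efgh')]
consume(bufs, 6) // drops 'abcd' entirely, trims 'efgh' down to 'gh'
console.log(bufs.length, bufs[0].toString()) // 1 'gh'
```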

However, you have to keep in mind that Node.js does its own trickery with memory management of Buffers and maintains its own backing pool. Each Buffer you have is a slice of that pool, and releasing it to garbage collection doesn't necessarily free that memory. Likewise, when you trim a Buffer via slice() you're not freeing memory; Node.js will (likely) just hang on to the original Buffer and give you a sliced view of it. But overall, Node.js keeps pool memory consumption to a minimum, so if you're allocating large amounts of memory in Buffer form and then freeing it up via GC, you're still going to save on memory.
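You can see the view-not-copy behaviour of slice() directly: writing through the slice mutates the parent, because both reference the same underlying memory.

```javascript
// slice() returns a view over the same underlying memory, not a copy:
// mutating the slice mutates the original, and the parent allocation
// stays alive for as long as any slice still references it.
const buf = Buffer.from('hello world')
const view = buf.slice(6) // 'world'

view[0] = 0x57 // ASCII 'W'
console.log(buf.toString()) // 'hello World' -- the parent sees the write

// Small Buffers are carved out of Node's shared pre-allocation pool,
// whose size is exposed (and tunable) as Buffer.poolSize.
console.log(Buffer.poolSize) // 8192 by default
```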

In summary: mostly yes, but with minor caveats.

An alternative is to just use a Node.js Transform stream and pass your data through that: https://nodejs.org/api/stream.html#stream_implementing_a_transform_stream. That's how I'd deal with large data processing where you're doing a lot of discarding. Keep buffering to a minimum where you can (and bl is all about buffering).

@rvagg rvagg closed this as completed Nov 27, 2018