
How to read entire stream into buffer #403

Closed
bajtos opened this issue Apr 2, 2019 · 6 comments

bajtos commented Apr 2, 2019

What is the recommended/idiomatic way to read all data from a stream and store it in a buffer, preferably using async/await style?

Example:

const http = require('http');

const response = await httpGetAsync('http://example.com/');
const body = /* how to read all data into a buffer? */


function httpGetAsync(urlString) {
  return new Promise((resolve, reject) => {
    http.get(urlString, resolve).once('error', reject);
  });
}

There are many userland modules on npm (e.g. get-stream and bl). The official example in the Node.js docs uses .on('data') and .on('end') callbacks, along the lines of the sketch below.
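
A minimal sketch of that callback pattern (assuming readable is any Readable stream, e.g. the HTTP response above):

const chunks = [];
readable.on('data', (chunk) => chunks.push(chunk));
readable.on('error', (err) => { /* handle the error */ });
readable.on('end', () => {
  const body = Buffer.concat(chunks); // entire stream as a single Buffer
  console.log(body.length);
});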

It would be great to have a built-in/native way for converting a readable stream into a buffer ✌️

mcollina (Member) commented Apr 2, 2019

I totally agree. I personally use https://www.npmjs.com/package/concat-stream.
Do you see having something like that in core?

bajtos (Author) commented Apr 2, 2019

Do you see having something like that in core?

I would love to! But does it have any chance of being accepted? My main concern is about the "small core" argument.

mcollina (Member) commented Apr 2, 2019

I don't really know. Considering that async iterators will be out of experimental in Node.js 12, I would recommend the following:

const chunks = []
for await (let chunk of readable) {
  chunks.push(chunk)
}
console.log(Buffer.concat(chunks))
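
For instance, wrapped into a reusable helper (a sketch only; the name streamToBuffer is illustrative, not an existing API):

async function streamToBuffer(readable) {
  const chunks = [];
  for await (const chunk of readable) {
    chunks.push(chunk);
  }
  return Buffer.concat(chunks);
}

// usage with the httpGetAsync helper from the question:
// const body = await streamToBuffer(await httpGetAsync('http://example.com/'));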

bajtos (Author) commented Apr 2, 2019

Considering that async iterators will be out of experimental in Node.js 12, I would recommend the following

That will not work on Node.js 8.x, will it? In our project, we need to support Node.js 8.x for as long as it's supported under the LTS policy (December 2019 at the time of writing).

Would it make sense to add a new utility method, e.g. stream.readToBuffer(stream, cb)? I am thinking along the lines of the stream.finished and stream.pipeline APIs, as in the sketch below.
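
A rough sketch of what such a utility could look like (readToBuffer is the hypothetical API proposed here, not an existing Node.js function):

function readToBuffer(stream, cb) {
  const chunks = [];
  stream.on('data', (chunk) => chunks.push(chunk));
  stream.once('error', (err) => cb(err));
  stream.once('end', () => cb(null, Buffer.concat(chunks)));
}

// callback style works on Node.js 8.x:
// readToBuffer(response, (err, body) => { /* body is a Buffer */ });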

mcollina (Member) commented Apr 2, 2019

That will not work on Node.js 8.x, will it? In our project, we need to support Node.js 8.x for as long as it's supported under the LTS policy (December 2019 at the time of writing).

Yes, it will not work on Node.js 8.x.

I would just use concat-stream until then; it's well maintained.
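
For reference, a minimal concat-stream sketch (assuming response is the HTTP response stream from the question):

const concat = require('concat-stream');

response.pipe(concat((body) => {
  // body is a single Buffer with the entire stream contents
  console.log(body.length);
}));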

smitdesai1010 commented

You can do it using fetch in Node.js. Found the answer here:

fetch('https://assets-cdn.github.com/images/modules/logos_page/Octocat.png')
  .then(res => res.buffer())
  .then(buffer => console.log(buffer));
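
Note that res.buffer() is the node-fetch API; with the WHATWG fetch built into newer Node.js versions there is no .buffer(), so an equivalent sketch would be:

const res = await fetch('https://assets-cdn.github.com/images/modules/logos_page/Octocat.png');
const body = Buffer.from(await res.arrayBuffer()); // entire response body as a Buffer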
