# pipefy

Simple and dependency-free node/io.js module to transform a function into a pipeable stream.
pipefy returns a single buffer as the result of concatenating each chunk emitted by the readable input stream. This is usually enough for most cases, but when handling large amounts of data, huge in-memory buffers may have negative performance side effects.
## Installation

```bash
npm install pipefy --save
```
## Usage

```js
var fs = require('fs')
var pipefy = require('pipefy')
```
Instead of doing this (note that I've used the sync API for simplicity):
```js
function process(buf, path) {
  // mad science here...
  fs.writeFileSync(path, buf)
}

var data = fs.readFileSync('image.jpg')
process(data, 'new.jpg')
```
With pipefy you can do the same in a more idiomatic and efficient way:
```js
function process(buf, path) {
  // mad science here...
  fs.writeFileSync(path, buf)
}

fs.createReadStream('image.jpg')
  .pipe(pipefy(process, 'new.jpg'))
```
## API

### pipefy(fn, [args...])

Returns a `WritableStream`.
You can subscribe to stream events, such as `error` or `finish`, to handle the stream status.
pipefy also exposes the writable stream implementation it uses internally; see the source for hacking purposes.
## License

MIT - Tomas Aparicio