# pipefy

Simple and dependency-free node/io.js module to transform a function into a pipeable stream.
pipefy returns a unique buffer as the result of concatenating each chunk emitted by the readable input stream. This is enough for most use cases, but when handling large amounts of data, huge buffers may have negative performance side-effects.
## Installation
```bash
npm install pipefy --save
```
## Example
```js
var fs = require('fs')
var pipefy = require('pipefy')
```
Instead of doing this (note that I've used the sync API for simplification):
```js
function process(buf, path) {
  // mad science here...
  fs.writeFileSync(path, buf)
}

var data = fs.readFileSync('image.jpg')
process(data, 'new.jpg')
```
With `pipefy` you can do the same in a more idiomatic and efficient way:
```js
function process(buf, path) {
  // mad science here...
  fs.writeFileSync(path, buf)
}

fs.createReadStream('image.jpg')
  .pipe(pipefy(process, 'new.jpg'))
```
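The sync API above is used only for brevity. The same handler could equally rely on the asynchronous `fs.writeFile`; a minimal sketch:

```js
function process(buf, path) {
  // mad science here...
  fs.writeFile(path, buf, function (err) {
    if (err) throw err
  })
}
```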
## API
### pipefy(fn, args...)
Returns a `WritableStream`. You can subscribe to stream events such as `error` and `finish` to deal with the stream status.
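For instance, a minimal sketch of error and completion handling, reusing the `process` handler and `require` calls from the example above:

```js
var writer = fs.createReadStream('image.jpg')
  .pipe(pipefy(process, 'new.jpg'))

// fired if something goes wrong while reading or writing
writer.on('error', function (err) {
  console.error('stream error:', err)
})

// fired once all the input data has been written
writer.on('finish', function () {
  console.log('done!')
})
```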
### pipefy.Stream()
Writable stream implementation used internally by `pipefy`.
See the implementation for hacking purposes.
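As a rough sketch of the technique (hypothetical names, not pipefy's actual source), such a stream can be built by collecting chunks in `_write` and concatenating them once the `finish` event fires:

```js
var stream = require('stream')
var util = require('util')

// illustrative only: a writable stream that buffers its input
// and hands the concatenated result to a wrapped function
function ConcatStream(fn, args) {
  stream.Writable.call(this)
  this.buffers = []

  var self = this
  // once the input stream ends, concatenate every collected chunk
  // into a unique buffer and invoke the wrapped function with it
  this.on('finish', function () {
    fn.apply(null, [Buffer.concat(self.buffers)].concat(args))
  })
}

util.inherits(ConcatStream, stream.Writable)

// store each chunk emitted by the readable input stream
ConcatStream.prototype._write = function (chunk, encoding, done) {
  this.buffers.push(chunk)
  done()
}

function pipefy(fn) {
  var args = Array.prototype.slice.call(arguments, 1)
  return new ConcatStream(fn, args)
}
```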