Maroshii - 8 months ago
JavaScript Question

Node.js Piping the same stream into multiple (writable) targets

I need to run two commands in series that both need to read data from the same stream.
After piping a stream into another stream, its buffer is emptied, so I can't read from it again. This doesn't work:

var spawn = require('child_process').spawn;
var fs = require('fs');
var request = require('request');

var inputStream = request('');
var identify = spawn('identify', ['-']);

inputStream.pipe(identify.stdin);

var chunks = [];
identify.stdout.on('data', function (chunk) {
    chunks.push(chunk);
});
identify.stdout.on('end', function () {
    var size = getSize(Buffer.concat(chunks)); // width
    var convert = spawn('convert', ['-', '-scale', size * 0.5, 'png:-']);
    inputStream.pipe(convert.stdin); // fails: inputStream was already consumed
});

function getSize(buffer) {
    return parseInt(buffer.toString().split(' ')[2].split('x')[0], 10);
}

Request complains about this:

Error: You cannot pipe after data has been emitted from the response.

Swapping inputStream for a different readable stream yields the same issue, of course.
I don't want to write to a file; I want to somehow reuse the stream that request produces (or any other readable stream, for that matter).

Is there a way to reuse a readable stream once it finishes piping?
What would be the best way to accomplish something like the above example?

Answer Source

You cannot reuse the piped data: it has already been sent, and you cannot pipe a stream after its 'end' event. So you cannot process the same stream twice; you need two streams. Create a duplicate of the source by piping it into two streams. A PassThrough stream works well for this: it simply passes its input through to its output.

var spawn = require('child_process').spawn;
var PassThrough = require('stream').PassThrough;

var a = spawn('echo', ['hi user']);
var b = new PassThrough();
var c = new PassThrough();

a.stdout.pipe(b);
a.stdout.pipe(c);

var count = 0;
b.on('data', function (chunk) { count += chunk.length; });
b.on('end', function () {
    console.log(count);
    c.pipe(process.stdout);
});

which prints the byte count of "hi user\n" followed by the echoed text:

8
hi user