Richard Denton - 1 month ago
Node.js Question

Child process stdout and stderr column sizes in Node.js

I want to monitor real-time data coming from a child process in Node. The following snippet does this without issue.

"use strict";

var fs = require('fs');
var spawn = require('child_process').spawn;

var processMonitor, processListen, processDeauth;

var parseStreamDataIn = function(data) {
    var str = data.toString('utf8');
    console.log(str);
};

var init = function() {

    processMonitor = spawn('trafficmon', ['-w'], {'shell': '/bin/bash'});

    processMonitor.stdout.on('data', function (data) {
        //trafficmon uses stderr, nothing needed here...
    });

    processMonitor.stderr.on('data', parseStreamDataIn);

    processMonitor.on('close', function (code) {
        processMonitor = null;
    });
};

init();
The problem is that the data received and logged in parseStreamDataIn() is cut down to 80 characters (columns) per row, leaving out half the data I want.
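This kind of truncation usually comes from the child side, not from Node: the child's output goes to a pipe rather than a terminal, so there is no window size to query and many tools fall back to a default width of 80. You can observe the pipe-vs-terminal distinction from Node itself:

```javascript
// When a process's stdout is a pipe (as with child_process.spawn, or
// `node script.js | cat`), isTTY is undefined and no width is reported;
// on a real terminal both are set, and tools size their output from them.
console.log('stdout is a TTY:', Boolean(process.stdout.isTTY));
console.log('reported columns:', process.stdout.columns);
```

Run it once directly in a terminal and once piped through `cat` to see both cases.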

Example of trafficmon running by itself in a terminal window (which has been manually expanded to 120 columns wide):

EC:XX:XX:XX:XX:XX 131 1 0 0 1 128 120614 HTTPS SSL

Example of my node script running the same command and logging the output in the same-size terminal window:

EC:XX:XX:XX:XX:XX 131 1 0 0 1 128 120614 HTTPS SSL clus

As you can see, the tail end of the string is being chopped off.

Is there any way I can tell my child process shell to return more than 80 columns per row?


pty.js allows you to specify things like the number of columns.

var pty = require('pty.js');

var term = pty.spawn('bash', [], {
  name: 'xterm-color',
  cols: 80,
  rows: 30,
  cwd: process.env.HOME,
  env: process.env
});

term.on('data', function(data) {
  console.log(data);
});

term.resize(100, 40);
term.write('ls /\r');