philipper · 5 years ago
Bash Question

shell multipipe broken with multiple python scripts

I am trying to pipe the stdout of one Python script into another Python script as stdin, like so:

find ~/test -name "*.txt" | python | python

It should print output, but nothing comes out of it.

Please note that this works:

find ~/test -name "*.txt" | python | cat

And this works too:

echo "bla" | python

The first script prints out the new content of the .txt files:

from multitail import multitail
import sys

# read the list of filenames from stdin
filenames = sys.stdin.readlines()

# get rid of the trailing '\n'
for index, filename in enumerate(filenames):
    filenames[index] = filename.rstrip('\n')

for fn, line in multitail(filenames):
    print '%s: %s' % (fn, line),

When a new line ("hehe") is added to one of the .txt files, it prints:

/home/me/test2.txt: hehe

The second script simply prints out what it gets on stdin:

import sys

for line in sys.stdin:
    print "line=", line

There is something I must be missing. Please help me out, community :)

Answer

There's a hint if you run your second script interactively:

$ python
a
b
c
line= a

line= b

line= c
Note that I hit Ctrl+D to provoke an EOF after entering the 'c'. You can see that it slurps up all the input before it starts iterating over the lines. In your pipeline the upstream script keeps its output open and continuously sends data through, so that EOF never arrives and the second script never starts processing. You'll need to choose a different way of iterating over stdin, for example:

import sys

line = sys.stdin.readline()
while line:
    print "line=", line
    line = sys.stdin.readline()
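One way to convince yourself that the readline loop really does process lines as they arrive is to drive it through a pipe and read a reply back while stdin is still open. This is a minimal verification sketch (a hypothetical harness, not from the original answer), with the loop written via sys.stdout.write so the child runs the same under Python 2 and 3:

```python
import subprocess
import sys
import textwrap

# The readline-based loop from the answer, run as a child process.
child_src = textwrap.dedent("""
    import sys
    line = sys.stdin.readline()
    while line:
        sys.stdout.write("line= " + line)
        sys.stdout.flush()
        line = sys.stdin.readline()
""")

proc = subprocess.Popen([sys.executable, "-c", child_src],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE)

# Send one line and read the echo back BEFORE closing stdin:
# a reader that slurped all input first would block here instead.
proc.stdin.write(b"hello\n")
proc.stdin.flush()
first_reply = proc.stdout.readline()

proc.stdin.close()
proc.wait()
```

After the write, `first_reply` comes back immediately as `b"line= hello\n"`. If the child instead used Python 2's `for line in sys.stdin:`, the parent's `proc.stdout.readline()` would hang, because the file iterator fills a read-ahead buffer before yielding anything.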