
pass stdout as file name for command line util? [answer: use named pipes]

I'm working with a command line utility that requires passing the name of a file to write output to, e.g.

foo -o output.txt


The only thing it writes to stdout is a message indicating that it ran successfully. I'd like to be able to pipe everything that is written to output.txt to another command line utility. My motivation is that output.txt will end up being a 40 GB file that I don't need to keep, and I'd rather pipe the streams than work on massive files in a stepwise manner.

Is there any way in this scenario to pipe the real output (i.e. output.txt) to another command? Can I somehow magically pass stdout as the file argument?

Answer

Solution 1: Using process substitution

The most convenient way of doing this is process substitution. In bash, the >(other_command) expression is replaced by the path of a pipe (such as /dev/fd/63) connected to other_command's standard input, so foo writes to it as if it were an ordinary file:

foo -o >(other_command)
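
For example, if the goal is simply to avoid keeping the 40 GB file on disk, you could compress the stream on the fly. This is only a sketch: foo stands in for the actual utility, and gzip is just one possible consumer.

foo -o >(gzip -c > output.txt.gz)

foo's success message still appears on the terminal, while the data that would have gone into output.txt is fed straight to gzip.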

Solution 2: Using named pipes explicitly

You can do the above explicitly / manually with a named pipe, as follows (a combined end-to-end sketch is shown after the steps):

  1. Create a named pipe using the mkfifo command.

    mkfifo my_buf
    
  2. Launch your other command with that pipe as its input, in the background or in a separate terminal, since it will block until foo starts writing to the pipe

    other_command < my_buf
    
  3. Execute foo and let it write its output to my_buf

    foo -o my_buf
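
Putting the three steps together in one shell, a minimal end-to-end sketch might look like the following. other_command from step 2 is replaced here by gzip -c purely as an example consumer, and foo still stands in for the real utility.

    mkfifo my_buf
    gzip -c < my_buf > output.txt.gz &   # consumer reads from the pipe in the background
    foo -o my_buf                        # foo writes into the pipe as if it were a file
    wait                                 # wait for the background consumer to finish
    rm my_buf                            # the pipe holds no data itself and can be removed

Nothing is ever stored in my_buf; the pipe only passes bytes from foo to the reader, so the 40 GB file never touches the disk.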
    