Dargor - 1 year ago
Bash Question

Pipe a lot of files to stdin, extract first columns, then combine those in a new file

Suppose we have these two files:

$ cat ABC.txt

$ cat PQR.txt

And we want to form a new file with the 1st column of each file. This can be achieved by:

$ paste -d ' ' <(cut -d ' ' -f 1 ABC.txt) <(cut -d ' ' -f 1 PQR.txt)
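For concreteness, here is a sketch with made-up two-column files (the original contents of ABC.txt and PQR.txt were not shown, so these are hypothetical):

```shell
# Hypothetical file contents -- the originals were not shown.
printf 'a 1\nb 2\n' > ABC.txt
printf 'x 9\ny 8\n' > PQR.txt

# First column of each file, pasted side by side:
paste -d ' ' <(cut -d ' ' -f 1 ABC.txt) <(cut -d ' ' -f 1 PQR.txt)
# a x
# b y
```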

But I want to use this with tons of files as input, not only ABC.txt and PQR.txt. How can we generalize this to pass each file in the collection to cut and then pass all the outputs to paste? (I know this may be done better with awk, but I want to know how to solve it using this approach.)

Edit 1

I have discovered a dirty, dirty way of doing this:

$ str=''; for i in *.txt; \
do str="${str} <(cut -d ' ' -f 1 ${i})"; \
done ; \
str="paste -d ' ' $str"; \
eval $str

But please, free my soul with an answer that does not involve going to Computer Science Hell.

Edit 2

Each file can have n rows, if this matters.

Answer Source

Process substitution <(somecommand) doesn't pipe to stdin. It opens a pipe on a separate file descriptor, e.g. 63, and passes /dev/fd/63 as the argument. When this "file" is opened, the kernel* duplicates the fd instead of opening a real file.
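A quick way to see this for yourself (the exact fd number varies by shell and platform):

```shell
# The substituted "argument" is just a /dev/fd path, not a real file:
echo <(true)            # prints something like /dev/fd/63

# Any command that opens its file argument can read from it:
cat <(printf 'hello\n') # prints: hello
```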

We can do something similar by opening a bunch of file descriptors and then passing them to the command:

# Start a subshell so all the fds are automatically closed at the end
(
  # Open a new fd for each process substitution
  for file in ./*.txt; do
    exec {fds[n++]}< <(cut -d ' ' -f 1 "$file")
  done

  # fds now contains a list of fds like 12 14
  # prepend "/dev/fd/" to each of them
  parameters=( "${fds[@]/#//dev/fd/}" )

  paste -d ' ' "${parameters[@]}"
)

{var}< file is bash's syntax for dynamic file descriptor assignment: like var=4; cmd 4< file; but without having to hardcode the 4 — bash picks a free file descriptor and stores its number in var. exec opens it in the current shell.
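A minimal illustration of the dynamic-fd syntax on its own (which fd number bash picks is not guaranteed):

```shell
# Let bash pick a free fd, store its number in $fd, and open the
# process substitution on it in the current shell:
exec {fd}< <(printf 'first\nsecond\n')

read -r line <&"$fd"   # read one line from the chosen fd
echo "$line"           # prints: first

exec {fd}<&-           # close the fd when done
```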

* On Linux, FreeBSD, OpenBSD and XNU/OSX, anyway. This is not POSIX, but neither is <(..)