smitelli - 5 months ago
Bash Question

Piping a file into stdin with throttle

Suppose I have an executable, batch_processor, that reads lines of data from stdin and performs a complex task for each line of input. If my data were in a file called data.txt, I could run this job by typing:

batch_processor < data.txt

or:

cat data.txt | batch_processor

In each case, batch_processor consumes the source data as fast as it is able.

Now, suppose I want to intentionally throttle this process. If my file has 100,000 lines and I want the job to take 24 hours to reduce the impact on the system (that works out to a little over one line per second: 86,400 seconds / 100,000 lines is roughly 0.864 seconds per line), is there something I could insert into the pipeline to artificially add delay between each line?


How about this?

cat data.txt | while IFS= read -r x; do printf '%s\n' "$x"; sleep 0.86; done | batch_processor
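If pv (pipe viewer) is installed, its line mode can do the throttling without a shell loop at all. This is a sketch, assuming pv is available; -L takes integer rates, so 1 line per second gives roughly a 27-hour run for 100,000 lines, slightly slower than the 24-hour target:

```shell
# Throttle with pv's line mode:
#   -q    quiet (no progress meter on stderr)
#   -l    count and limit in lines rather than bytes
#   -L N  pass at most N lines per second through the pipe
#
# With the actual job this would be:
#   pv -q -l -L 1 < data.txt | batch_processor
#
# Runnable demo with a stand-in file and `cat` in place of
# batch_processor:
command -v pv >/dev/null 2>&1 || { echo 'pv not installed'; exit 0; }
printf 'one\ntwo\nthree\n' > /tmp/pv_demo.txt
pv -q -l -L 100 < /tmp/pv_demo.txt | cat
```

Unlike the loop, pv passes the data through byte-for-byte, so there is no risk of the shell mangling backslashes or whitespace.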

or you could use Python/Ruby/Perl/whatever in there, instead of the bash loop.
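For instance, a minimal sketch of the Python variant: a small stdin-to-stdout throttle dropped into the same pipeline. The script path, the delay argument, and the `cat` standing in for batch_processor in the demo are all illustrative:

```shell
# Write a tiny line throttle in Python; the delay in seconds per
# line is passed as an argument.  For the 24-hour job it would be:
#   cat data.txt | python3 /tmp/throttle.py 0.864 | batch_processor
cat > /tmp/throttle.py <<'EOF'
import sys, time

delay = float(sys.argv[1]) if len(sys.argv) > 1 else 1.0
for line in sys.stdin:
    sys.stdout.write(line)
    sys.stdout.flush()      # emit each line immediately, not in blocks
    time.sleep(delay)
EOF

# Runnable demo with a tiny delay and `cat` in place of batch_processor:
printf 'one\ntwo\nthree\n' | python3 /tmp/throttle.py 0.01 | cat
```

The explicit flush matters: without it, Python buffers stdout when it is a pipe, so batch_processor would receive lines in bursts rather than one at a time.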