JavaSa - 11 months ago
Bash Question

Running multiple scripts from bash in parallel without printing output to console

Suppose I have multiple file paths to run from the terminal.
I want them to run in parallel in the background, without printing their output to the console. (Their output should be saved to some other log path, which is defined in the Python file itself.)

The paths are in this format:

/home/Dan/workers/2/ etc.

When I try running a single worker in the background, it seems to work.

For example:

    cd /home/Dan/workers/1/
    python > /dev/null 2>&1 &

Then `ps -ef | grep python` does show the script running in the background, not printing to the console but writing to its predefined log path.

However, when I try to launch them all via a bash script, no Python scripts are running after the following code:

    for path in /home/Dan/workers/*
    if [-f path/ ]
    python > /dev/null 2>&1 &

Any idea what the difference is?
In the bash script I try to launch many scripts one after another, just as I did for a single script.

cxw
Answer Source
    for path in /home/Dan/workers/*
    do
        if [ -f "${path}/" ]; then    # added ${}, quotes, then/fi/done
            python "${path}/" > /dev/null 2>&1 &
            # or (cd "${path}" && python > /dev/null 2>&1 &) as Terje said
        fi
    done

Use ${path} instead of just path. path is the name of the variable, but what you want when you are testing the file is the value stored in path. To get that, prefix it with $. Note that $path will also work in most situations, but ${path} makes it clearer exactly which variable you mean. Especially when learning bash, I recommend sticking with the ${...} form.
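As a minimal illustration of the difference (using a hypothetical /tmp directory, not your worker paths): without $, the test operator sees the literal four-letter word path, not the value you stored.

```shell
#!/bin/sh
# Hypothetical demo directory, created just for this example.
path=/tmp/demo_worker
mkdir -p "${path}"

# Without $, [ tests for something literally named "path" in the
# current directory, which is almost certainly not what you meant:
[ -d path ] && echo "a directory literally named 'path' exists here"

# With ${}, [ tests the value stored in the variable:
[ -d "${path}" ] && echo "worker directory found"
```

The second test succeeds because ${path} expands to /tmp/demo_worker before [ ever runs.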

Edit: Put the whole name in double quotes in case ${path} contains any spaces.
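A quick sketch of why the quotes matter (the directory name here is a made-up example with a space in it):

```shell
#!/bin/sh
# Hypothetical directory whose name contains a space.
dir="/tmp/worker with space"
mkdir -p "${dir}"

# Quoted: [ receives the path as a single argument, so the test works.
[ -d "${dir}" ] && echo "quoted test ok"

# Unquoted, [ -d ${dir} ] would receive three separate words
# ("/tmp/worker", "with", "space") and fail with a syntax error.
```

Without the quotes, word splitting breaks the path apart before [ sees it, so the test either errors out or checks the wrong name.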