deinonychusaur - 1 year ago
Linux Question

Subprocess doesn't respect arguments when using multiprocessing

The main objective here is to create a daemon-spawning function. The daemons need to run arbitrary programs (i.e. use subprocess).


What I have so far in my daemonizer module is:

    import os
    from multiprocessing import Process
    from time import sleep
    from subprocess import call, STDOUT


    def _daemon_process(path_to_exec, std_out_path, args, shell):

        with open(std_out_path, 'w') as fh:
            args = (str(a) for a in args)

            if shell:
                fh.write("*** LAUNCHING IN SHELL: {0} ***\n\n".format(
                    " ".join([path_to_exec] + list(args))))
                retcode = call(" ".join([path_to_exec] + list(args)),
                               stderr=STDOUT, stdout=fh, shell=True)
            else:
                fh.write("*** LAUNCHING WITHOUT SHELL: {0} ***\n\n".format(
                    [path_to_exec] + list(args)))
                retcode = call([path_to_exec] + list(args),
                               stderr=STDOUT, stdout=fh, shell=False)

            if retcode:
                fh.write("\n*** DAEMON EXITED WITH CODE {0} ***\n".format(retcode))
            fh.write("\n*** DAEMON DONE ***\n")


    def daemon(path_to_executable, std_out=os.devnull, daemon_args=tuple(), shell=True):

        d = Process(name='daemon', target=_daemon_process,
                    args=(path_to_executable, std_out, daemon_args, shell))
        d.daemon = True
        d.start()
        sleep(1)


When trying to run this in bash (this will create a file called test.log in your current directory):

python -c"import daemonizer;daemonizer.daemon('ping', std_out='test.log', daemon_args=('-c', '5', ''), shell=True)"

It correctly spawns a daemon that launches ping, but it doesn't respect the arguments passed. This is true if shell is set to False as well. The log file clearly states that it attempted the launch with the arguments passed.

As a proof of concept, I created the following executable:

echo "ping -c 5" > ping_test
chmod +x ping_test

The following works as intended:

python -c"import daemonizer;daemonizer.daemon('./ping_test', std_out='test.log', shell=True)"

If I test the same call code outside of the Process target, it does work as expected.

So how do I fix this mess so that I can spawn processes with arguments?

I'm open to entirely different structures and modules, but they should be part of the standard library and compatible with Python 2.7.x. The requirement is that the daemon function should be callable several times asynchronously within a script, producing a daemon each time, and their target processes should be able to end up on different CPUs. The scripts also need to be able to end without affecting the spawned daemons, of course.

As a bonus, I noticed I needed a sleep at the end of daemon for the spawning to work at all; otherwise the script terminates too fast. Is there a way to get around that arbitrary hack, and/or how long do I really need to wait to be safe?
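One likely reason the sleep is needed: the multiprocessing documentation states that when a process exits, it attempts to terminate all of its daemonic child processes. With d.daemon = True, the child is killed as soon as the parent script ends, so the parent has to stay alive at least until the subprocess has been launched. A minimal sketch of that behavior (the worker function here is illustrative, not from the question):

```python
from multiprocessing import Process
from time import sleep

def worker():
    sleep(0.2)
    print("worker finished")

p = Process(target=worker)
p.daemon = True  # daemonic children are terminated when the parent exits
p.start()
# Without this pause the parent would exit immediately and the daemonic
# child would be terminated before it could print anything.
sleep(0.5)
```

For a child that must outlive the script entirely, a daemonic multiprocessing child is arguably the wrong tool; a traditional Unix double fork (or spawning via the shell with disown/nohup) is closer to what is wanted.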

Answer

Your arguments are being "used up" by the printing of them!

First, you do this:

args = (str(a) for a in args)

That creates a generator, not a list or tuple. So when you later do this:

    " ".join([path_to_exec] + list(args))

the call to list(args) consumes the generator, and the arguments will not be seen a second time. So when you then do this:

    [path_to_exec] + list(args)

you get an empty argument list!
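The consumption is easy to reproduce in isolation (a standalone sketch, not the poster's code):

```python
# A generator expression can be iterated only once.
args = (str(a) for a in (1, 2, 3))

first = " ".join(["ping"] + list(args))   # consumes the generator
second = ["ping"] + list(args)            # generator is now exhausted

print(first)   # ping 1 2 3
print(second)  # ['ping'] -- the arguments are gone
```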

You could fix this by removing the fh.write lines that log the command, but a much better fix is to simply create a list in the first place:

    args = [str(a) for a in args]

Then you can use args directly instead of list(args), and it will always contain the arguments.
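With a list comprehension the arguments survive repeated use (a standalone sketch, not the poster's code):

```python
# A list can be iterated any number of times.
args = [str(a) for a in (1, 2, 3)]

print(" ".join(["ping"] + args))  # ping 1 2 3
print(["ping"] + args)            # ['ping', '1', '2', '3']
```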
