Paul Paul - 1 year ago 104
Python Question

Spawn a few parallel processes and kill them after they finish

I need to make a script that, on some condition, spawns a parallel process (worker) and makes it do some IO job, then closes that process when it has finished.
But the processes do not seem to exit by default.

Here is my approach:

import multiprocessing

pool = multiprocessing.Pool(4)

def f(x):
    return True

r = pool.map_async(f, [1,2,3,4,5,6,7,8,9,10])

But if I run this in IPython and wait for all the prints, I can then run ps aux | grep ipython and see a lot of processes. So it looks like these workers are still alive.

Maybe I'm doing something wrong, but how can I make these processes terminate when they have finished their task? And what approach should I use if I want to spawn a lot of workers one by one (on receiving an RMQ message, for example)?

Answer Source

Pool spawns worker processes when you declare the pool. They do not get killed until the pool is shut down. Instead, they wait there for more work to appear in the queue.

If you change your code to:

from time import sleep

r = pool.map_async(f, [1,2,3,4,5,6,7,8,9,10])
pool.close()  # tell the pool no more work is coming, so workers can exit
print "check ps ax now"
sleep(10)

you will see the pool processes have disappeared.

Another thing: your program might not work as intended, because you declare function f after you declare your pool. I had to move pool = multiprocessing.Pool(4) so that it follows the declaration of f, though this may vary between Python versions. In any case, if you get odd "module has no attribute" exceptions, this is the reason.
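For the "spawn workers one by one" part of the question (e.g. one per RMQ message), one option — a sketch, not something from the answer above, with do_io_job and handle_message as hypothetical names — is a short-lived Process per job, which terminates as soon as its job is done:

```python
import multiprocessing

def do_io_job(msg):
    # placeholder for the real IO work done by the worker
    return msg

def handle_message(msg):
    # spawn one short-lived worker per message; it exits when the job is done
    p = multiprocessing.Process(target=do_io_job, args=(msg,))
    p.start()
    p.join()          # wait for the worker; afterwards the process is gone
    return p.exitcode  # 0 on a clean exit

if __name__ == "__main__":
    print(handle_message("hello"))
```

Unlike a Pool, such a process does not linger waiting for more work, at the cost of paying the process-startup overhead for every message.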

