Alexander Hartvig Nielsen - 6 months ago
Python Question

Creating a timeout function in Python with multiprocessing

I'm trying to create a timeout function in Python 2.7.11 (on Windows) with the multiprocessing library.

My basic goal is to return one value if the function times out and the actual value if it doesn't.

My approach is the following:

from multiprocessing import Process, Manager

def timeoutFunction(puzzleFileName, timeLimit):
    manager = Manager()
    returnVal = manager.list()

    # Create worker function
    def solveProblem(return_val):
        return_val[:] = doSomeWork(puzzleFileName)  # doSomeWork() returns list

    p = Process(target=solveProblem, args=[returnVal])
    p.start()

    p.join(timeLimit)
    if p.is_alive():
        p.terminate()
        returnVal = ['Timeout']

    return returnVal


And I call the function like this:

if __name__ == '__main__':
    print timeoutFunction('example.txt', 600)


Unfortunately this doesn't work, and I receive an EOF error from pickle.py.

Can anyone see what I'm doing wrong?

Thanks in advance,

Alexander

Edit: doSomeWork() is not an actual function, just a placeholder for some other work I do. That work is not done in parallel and does not use any shared variables. I'm only trying to run a single function and have it possibly time out.

Answer

You can use the Pebble library for this.

from pebble.process import concurrent
from pebble import TimeoutError

TIMEOUT_IN_SECONDS = 5

def function(foo, bar=0):
    return foo + bar

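# run function(1, bar=1) in a separate process, terminating it after TIMEOUT_IN_SECONDS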
task = concurrent(target=function, args=[1], kwargs={'bar': 1}, timeout=TIMEOUT_IN_SECONDS)
try:
    results = task.get()  # blocks until results are ready
except TimeoutError:
    results = 'timeout'

The documentation has more complete examples.

The library will terminate the function if it times out, so you don't need to worry about wasted I/O or CPU.
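
For reference, here is a minimal sketch of how the question's timeoutFunction could be rewritten on top of the same Pebble API, assuming doSomeWork is a module-level, importable function (on Windows anything run in a child process must be picklable, so it cannot be a nested function):

from pebble.process import concurrent
from pebble import TimeoutError

def timeoutFunction(puzzleFileName, timeLimit):
    # run doSomeWork(puzzleFileName) in a separate process and
    # terminate it if it exceeds timeLimit seconds
    task = concurrent(target=doSomeWork, args=[puzzleFileName],
                      timeout=timeLimit)
    try:
        return task.get()  # the list returned by doSomeWork()
    except TimeoutError:
        return ['Timeout']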
