Luigi T. - 5 months ago
Python Question

How can I request several pages without waiting for the output?

I need to request a list of URLs, passing a different variable via POST each time, and most importantly, I need to save the output of each request.

Is there a way with Python to make multiple URL requests run in the background, so that the requests are not done one at a time (as they are at the moment)?

for exrow in exdata:
    id = exrow['increment_id']
    # 'url' stands in for the endpoint being posted to (this line was garbled in the original)
    mage = requests.post(url, data={'id': id})
    output = mage.content

    if output:
        f.writelines(output + ',')


What you want are asynchronous requests. A result isn't returned immediately; instead, the request is handled in the background while your code continues, and at a later point, when you actually NEED the content, you retrieve it. There are several libraries built on requests that provide asynchronous calling of URLs; two of them, grequests and requests-futures, come recommended by the official requests project.

They work slightly differently, but hopefully one of them will be something you can use.

I don't have personal experience with either.
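As a sketch of the fire-now, collect-later pattern, here is a version using only the standard library's concurrent.futures (no extra dependency). The fetch function and the exdata list are placeholders for your real POST call and data; in your code fetch would call requests.post and return the response content:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Placeholder for the real request, e.g.:
#   return requests.post(url, data={'id': increment_id}).content
def fetch(increment_id):
    return 'result-for-%s' % increment_id

# Placeholder rows; in your code this is the real exdata
exdata = [{'increment_id': 1}, {'increment_id': 2}, {'increment_id': 3}]

with ThreadPoolExecutor(max_workers=10) as pool:
    # Submit all requests at once; each submit() returns a Future immediately
    futures = [pool.submit(fetch, row['increment_id']) for row in exdata]

    # Collect each result as soon as it completes, not one at a time
    results = [future.result() for future in as_completed(futures)]
```

Order of completion isn't guaranteed, so if you need the outputs matched back to their inputs, keep a dict mapping each Future to its row instead of a plain list.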