Marcel Wilson - 8 months ago
Python Question

grequests pool with multiple request.session?

I want to make a lot of URL requests to a REST web service, typically between 75k and 90k. However, I need to throttle the number of concurrent connections to the web service.

I started playing around with grequests in the following manner, but quickly started chewing up open sockets.

concurrent_limit = 30
urls = buildUrls()
hdrs = {'Host': 'hostserver'}
g_requests = (grequests.get(url, headers=hdrs) for url in urls)
g_responses = grequests.map(g_requests, size=concurrent_limit)

As this runs for a minute or so, I get hit with 'maximum number of sockets reached' errors.
As far as I can tell, each one of the requests.get calls in grequests uses its own session, which means a new socket is opened for each request.

I found a note on GitHub describing how to make grequests use a single session. But this seems to funnel all requests into a single shared pool, which would defeat the purpose of asynchronous HTTP requests.

s = requests.session()
rs = [grequests.get(url, session=s) for url in urls]
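For what it's worth, a single requests session does not have to mean a single connection: requests pools connections per host via urllib3, and the pool can be sized by mounting an HTTPAdapter on the session. A minimal sketch, with illustrative pool sizes:

```python
import requests
from requests.adapters import HTTPAdapter

s = requests.Session()
# pool_connections: number of per-host pools to cache
# pool_maxsize: connections kept open per host pool
adapter = HTTPAdapter(pool_connections=10, pool_maxsize=30)
s.mount('http://', adapter)
s.mount('https://', adapter)
```

A session configured this way can still serve many concurrent requests to the same host, up to pool_maxsize open connections.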

Is it possible to use grequests or gevent.Pool in a way that creates a number of sessions?

Put another way: how can I make many concurrent HTTP requests, either through queuing or connection pooling?
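One queue-free way to cap concurrency is a thread pool from concurrent.futures (standard library on Python 3, available as the futures backport on Python 2). This sketch uses a stand-in fetch function; for real traffic you would call session.get inside it:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

CONCURRENT = 30  # at most 30 requests in flight at once

def fetch(url):
    # stand-in for session.get(url); just echoes the url back
    return url

urls = ['http://hostserver/item/%d' % i for i in range(100)]

with ThreadPoolExecutor(max_workers=CONCURRENT) as pool:
    futures = [pool.submit(fetch, url) for url in urls]
    results = [f.result() for f in as_completed(futures)]
```

The executor throttles for you: only max_workers tasks run at a time, and the rest wait in its internal queue.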


I ended up not using grequests to solve my problem. I'm still hopeful it might be possible.

I used threading:

import requests
from threading import Thread
from Queue import Queue  # queue on Python 3

class MyAwesomeThread(Thread):
    """Threading wrapper to handle counting and processing of tasks"""
    def __init__(self, session, q):
        Thread.__init__(self)
        self.q = q
        self.count = 0
        self.session = session
        self.response = None

    def run(self):
        while True:
            url, host = self.q.get()
            httpHeaders = {'Host': host}
            self.response = self.session.get(url, headers=httpHeaders)
            # handle response here
            self.count += 1
            self.q.task_done()

q = Queue()
threads = []
for i in range(CONCURRENT):
    session = requests.session()
    t = MyAwesomeThread(session, q)
    t.daemon = True  # allows us to send an interrupt
    threads.append(t)

## build urls and add them to the Queue
for url, host in buildurls():
    q.put((url, host))

## start the threads
for t in threads:
    t.start()

q.join()  # block until every queued task is processed
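The pattern above can be exercised end to end without a live server by stubbing out the HTTP call. The names below mirror the answer, but FakeSession is a hypothetical stand-in for requests.session():

```python
from threading import Thread
try:
    from queue import Queue  # Python 3
except ImportError:
    from Queue import Queue  # Python 2

CONCURRENT = 5

class FakeSession(object):
    # stand-in for a requests session; echoes the url instead of fetching it
    def get(self, url, headers=None):
        return url

class Worker(Thread):
    def __init__(self, session, q, results):
        Thread.__init__(self)
        self.session = session
        self.q = q
        self.results = results

    def run(self):
        while True:
            url, host = self.q.get()
            # list.append is atomic in CPython, so no lock is needed here
            self.results.append(self.session.get(url, headers={'Host': host}))
            self.q.task_done()

q = Queue()
results = []
for _ in range(CONCURRENT):
    w = Worker(FakeSession(), q, results)
    w.daemon = True
    w.start()

for i in range(50):
    q.put(('http://hostserver/item/%d' % i, 'hostserver'))

q.join()  # all 50 tasks drained by at most CONCURRENT workers
```

q.join() blocks until task_done() has been called once per put(), which is what makes the daemon-thread shutdown safe.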