dalanmiller - 5 months ago
Python Question

When to use and when not to use Python 3.5 `await`?

I'm getting the flow of using `async`/`await` in Python 3.5, but I haven't seen a description of which things I should be `await`ing and which I should not, or where the difference would be negligible. Do I just have to use my best judgement, as in "this is an IO operation and thus should be `await`ed"?


By default all your code is synchronous. You can make it asynchronous by defining functions with `async def` and calling those functions with `await`. A more correct question is "When should I write asynchronous code instead of synchronous?" The answer is "When you can benefit from it". In most cases, as you noted, you will get a benefit when you work with I/O operations:

# Synchronous way:
download(url1)  # takes 5 sec.
download(url2)  # takes 5 sec.
# Total time: 10 sec.

# Asynchronous way:
await asyncio.gather(
    download(url1),  # takes 5 sec.
    download(url2),  # takes 5 sec.
)
# Total time: only 5 sec. (+ small overhead for using asyncio)
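A minimal, runnable sketch of that speed-up, using `asyncio.sleep()` as a stand-in for real network I/O (the URLs and the 0.2 s delay are made up for the demo):

```python
import asyncio
import time

async def download(url):
    await asyncio.sleep(0.2)  # stand-in for a real network request
    return "data from " + url

async def main():
    start = time.monotonic()
    results = await asyncio.gather(
        download("http://example.com/a"),
        download("http://example.com/b"),
    )
    return results, time.monotonic() - start

loop = asyncio.new_event_loop()
results, elapsed = loop.run_until_complete(main())
loop.close()
print(results)
print("elapsed: %.2f s" % elapsed)  # ~0.2 s, not 0.4 s
```

Both coroutines sleep concurrently, so the total time is roughly the longest single delay, not the sum.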

Of course, if you create a function that uses asynchronous code, that function should be asynchronous too (defined with `async def`). But any asynchronous function can freely use synchronous code. It makes no sense to cast synchronous code to asynchronous without some reason:

# extract_links(url) should be async because it uses async func download() inside
async def extract_links(url):  
    # download() was created async to get benefit of I/O
    data = await download(url)   
    # parse() doesn't work with I/O, no sense to make it async
    links = parse(data)  
    return links
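A runnable version of that pattern, with a simulated `download()` and a trivial `parse()` (both are stand-ins invented for the demo):

```python
import asyncio

async def download(url):
    await asyncio.sleep(0.1)  # simulated network I/O
    return '<a href="/one"></a><a href="/two"></a>'

def parse(data):
    # plain synchronous code: no I/O, so no need to make it async
    return [part.split('"')[1] for part in data.split("href=")[1:]]

async def extract_links(url):
    data = await download(url)   # await the I/O-bound part
    links = parse(data)          # call the CPU-only part normally
    return links

loop = asyncio.new_event_loop()
links = loop.run_until_complete(extract_links("http://example.com"))
loop.close()
print(links)  # ['/one', '/two']
```

Note that `parse()` is called without `await`: a coroutine can mix synchronous calls in freely, as long as they are fast.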

One very important thing: any long-running synchronous operation (> 50 ms, for example; it's hard to say exactly) will freeze all your asynchronous operations for that time:

async def extract_links(url):
    data = await download(url)
    links = parse(data)
    # if search_in_very_big_file() takes a long time to process,
    # all your running async funcs (somewhere else in the code) will be frozen;
    # you need to avoid this situation
    links_found = search_in_very_big_file(links)
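You can see the freeze directly with this small demo: a ticker coroutine tries to record a timestamp every 50 ms, while a blocking call (here just `time.sleep(0.3)`, standing in for `search_in_very_big_file()`) stalls the whole event loop:

```python
import asyncio
import time

async def ticker(ticks):
    # records a timestamp every 50 ms while the event loop is free
    for _ in range(6):
        ticks.append(time.monotonic())
        await asyncio.sleep(0.05)

def search_in_very_big_file(links):
    time.sleep(0.3)  # stand-in for a heavy synchronous computation
    return links

async def main():
    ticks = []
    task = asyncio.ensure_future(ticker(ticks))
    await asyncio.sleep(0.05)
    # blocking call: the event loop (and the ticker with it) freezes here
    search_in_very_big_file([])
    await task
    return ticks

loop = asyncio.new_event_loop()
ticks = loop.run_until_complete(main())
loop.close()
gaps = [b - a for a, b in zip(ticks, ticks[1:])]
print("longest gap between ticks: %.2f s" % max(gaps))  # ~0.3 s, not 0.05 s
```

While `time.sleep(0.3)` runs, no other coroutine gets a turn, so one inter-tick gap balloons to roughly the length of the blocking call.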

You can avoid it by calling long-running synchronous functions in a separate process (and awaiting the result):

from concurrent.futures import ProcessPoolExecutor

executor = ProcessPoolExecutor(2)

async def extract_links(url):
    data = await download(url)
    links = parse(data)
    # Now your main process can handle other async functions
    # while the separate process is running
    loop = asyncio.get_event_loop()
    links_found = await loop.run_in_executor(executor, search_in_very_big_file, links)

One more example: when you need to use requests inside asyncio. `requests.get` is just a synchronous long-running function, which you shouldn't call inside async code (again, to avoid freezing). But it runs long because of I/O, not because of long calculations. In that case, you can use `ThreadPoolExecutor` instead of `ProcessPoolExecutor` to avoid some multiprocessing overhead:

from concurrent.futures import ThreadPoolExecutor
import requests

executor = ThreadPoolExecutor(2)

async def download(url):
    loop = asyncio.get_event_loop()
    response = await loop.run_in_executor(executor, requests.get, url)
    return response.text
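A self-contained sketch of the same idea that doesn't need the network: `fake_get` is a made-up stand-in for `requests.get` that blocks its thread for 0.2 s. Because the blocking call runs in a worker thread via `run_in_executor`, two "downloads" overlap and the event loop stays responsive:

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(2)

def fake_get(url):
    # stand-in for requests.get: blocks its thread on (simulated) I/O
    time.sleep(0.2)
    return "body of " + url

async def download(url):
    loop = asyncio.get_event_loop()
    # the blocking call runs in a worker thread; the event loop stays free
    return await loop.run_in_executor(executor, fake_get, url)

async def main():
    start = time.monotonic()
    pages = await asyncio.gather(
        download("http://example.com/a"),
        download("http://example.com/b"),
    )
    return pages, time.monotonic() - start

loop = asyncio.new_event_loop()
pages, elapsed = loop.run_until_complete(main())
loop.close()
print(pages)
print("elapsed: %.2f s" % elapsed)  # ~0.2 s for both together
```

Swap `fake_get` for `requests.get` and you have the answer's pattern; threads are enough here because the function spends its time waiting, not computing.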