Parallel execution of asyncio functions

Let's start 2 async functions at the same time and wait until they both finish:

import asyncio
import time

def write(msg):
    print(msg, flush=True)

async def say1():
    await asyncio.sleep(1)
    write("Hello 1!")

async def say2():
    await asyncio.sleep(1)
    write("Hello 2!")

write("start")
loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.gather(
    say1(),
    say2()
))
write("exit")

loop.close()

Read it carefully.

If you run this, you will see that Hello 1! and Hello 2! appear at the same time, after 1 second, not after 2.

Note: we wrapped all print calls in a write function which forces a flush, to ensure prints reach the terminal at the time of the call and our experiment is not affected by stdout buffering.
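You can verify the timing yourself. Here is a sketch that measures the elapsed time (it uses asyncio.run, the entry point added in Python 3.7, instead of the manual loop above):

```python
import asyncio
import time

async def say(msg):
    await asyncio.sleep(1)
    print(msg, flush=True)

async def main():
    # both sleeps run concurrently, so this takes ~1s, not 2s
    await asyncio.gather(say("Hello 1!"), say("Hello 2!"))

start = time.monotonic()
asyncio.run(main())  # asyncio.run is the Python 3.7+ entry point
elapsed = time.monotonic() - start
print(f"elapsed: {elapsed:.1f}s", flush=True)
```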

Awaiting vs waiting

Asyncio is not multithreading or multiprocessing, yet it runs code in parallel🤯

Here is the trick: when run_until_complete runs the say1 coroutine, the interpreter executes it line by line, and when it sees await, it starts an asynchronous operation which will later be finished by some internal callback to the loop (such callbacks are hidden from us developers).

But right after starting that operation, it returns control to the event loop. So the asynchronous sleep is running and the loop has control, which means the loop is ready to start the next function, say2. When the first async sleep finishes, it fires its internal callback to the loop (again hidden from us) and the loop resumes execution of the say1 coroutine: the next operation is printing Hello 1!. After printing, control returns to the event loop again. Around the same time the loop receives the event that the second sleep has finished (if two events arrive at the same time they are not lost, they are simply queued).

So now Hello 2! is printed and the second coroutine also returns. run_until_complete(gather(l1, l2, l3)) will block until all of the l1, l2, l3 coroutines are done.
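The back-and-forth between coroutines and the loop can be made visible by logging each step. A sketch with two different sleep lengths so the order is deterministic (asyncio.run is the Python 3.7+ entry point; the log helper is ours):

```python
import asyncio
import time

start = time.monotonic()
events = []

def log(msg):
    events.append(msg)
    print(f"{time.monotonic() - start:.1f}s {msg}", flush=True)

async def task(name, delay):
    log(f"{name} started, handing control back to the loop")
    await asyncio.sleep(delay)
    log(f"{name} resumed by the loop's internal callback")

async def main():
    await asyncio.gather(task("A", 1), task("B", 2))

asyncio.run(main())
```

The trace shows both tasks starting at ~0s, A resuming at ~1s and B at ~2s: the loop interleaves them instead of running one after the other.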

It can be pictured like this (assume that all red lines are at the 0s time point, and all blue ones at 1s):

Asyncio parallel execution diagram

Note that events 7 and 9 may be swapped: if you run the code several times you may notice the first Hello printed after the second.
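By the way, gather also collects the return values of the coroutines, and it returns them in the order the coroutines were passed, regardless of which finished first. A sketch (fetch is a made-up stand-in for a real IO call, and asyncio.run is the Python 3.7+ entry point):

```python
import asyncio

async def fetch(n):
    await asyncio.sleep(1)  # stand-in for a real IO call
    return f"result {n}"

async def main():
    # results come back in argument order, total time is still ~1s
    return await asyncio.gather(fetch(1), fetch(2), fetch(3))

results = asyncio.run(main())
print(results)  # ['result 1', 'result 2', 'result 3']
```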

☝ BTW: async def functions are called coroutines. They can be awaited.
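Awaiting one coroutine from another directly is also possible; then they run sequentially rather than in parallel. A minimal sketch (step is a made-up example):

```python
import asyncio

async def step(n):
    await asyncio.sleep(0.1)
    return n * 2

async def main():
    a = await step(1)  # runs first
    b = await step(a)  # starts only after step(1) has finished
    return b

result = asyncio.run(main())
print(result)  # 4
```

Use await for steps that depend on each other, and gather for independent steps that can overlap.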

Why it is cool

Now just imagine that you can do any blocking IO operation the same way as sleep here (calling HTTP APIs, working with files, executing database queries): just start as many as you want and wait.
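For example, firing off a hundred requests at once. In this sketch, sleep stands in for network latency and fake_http_get and the URLs are made up; a real program would use an async HTTP client such as aiohttp:

```python
import asyncio
import time

async def fake_http_get(url):
    await asyncio.sleep(1)  # pretend this is network latency
    return f"{url}: 200 OK"

async def main():
    urls = [f"http://example.com/page{i}" for i in range(100)]
    # all 100 "requests" are in flight at the same time
    return await asyncio.gather(*(fake_http_get(u) for u in urls))

start = time.monotonic()
responses = asyncio.run(main())
elapsed = time.monotonic() - start
print(len(responses), "responses in", round(elapsed, 1), "s")  # ~1s total
```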

You can use server hardware with maximum efficiency without spawning processes or threads, which have a lot of overhead.

And it is a reality with Python 3.6+ asyncio!

When you will not be able to use it

When you need multiple CPU-bound operations in parallel. Coroutines should be used only for IO operations, e.g. an async HTTP client like aiohttp calling a server. Old blocking libraries like requests would block your thread, while aiohttp lets you do something else while you wait for the server response.

CPU-bound operations like machine learning, heavy math, or looping over huge arrays will still block your thread even if you wrap them in a coroutine. Because your CPU is busy calculating, it can't even return to the event loop. If you need to speed up CPU calculations, look at the batch processing explanation and example
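One standard escape hatch, if you must call a blocking or CPU-heavy function from async code, is to push it to an executor with run_in_executor. A sketch, where heavy_math is a made-up stand-in for real work:

```python
import asyncio

def heavy_math(n):
    # blocking, CPU-bound work: calling it inline would freeze the event loop
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    # run in the default thread pool so the loop stays responsive;
    # for truly CPU-bound work a ProcessPoolExecutor also sidesteps the GIL
    return await loop.run_in_executor(None, heavy_math, 1_000_000)

result = asyncio.run(main())
print(result)
```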


Did you know?
Asynchronous execution has been supported in JavaScript from the beginning (browsers, Node.js, Electron, etc.). Early versions just used callback functions to run something else after an async operation finished. But this created the callback-hell problem, so some time in the previous decade JavaScript got the same async/await interface that we have in Python 3.6+. To the user this interface looks like sequential execution, with the same parallel IO efficiency.
#asyncio #python
Feb 21, 2017
by Ivan Borshchov