Vitaly Shchurov


Python Asyncio: Basic Fundamentals

Asynchronous programming in Python isn't an easy subject, and it can be confusing when you first face it. It is, however, a very practical tool, and well worth the effort to learn.

This article will introduce you to the very fundamentals; another one, covering more advanced topics, will come out soon. Here, I'll focus mainly on the high-level, easy way to use asyncio, explain what's so good about it, and discuss when and why you should choose asyncio over threading.

Hope you enjoyed my article! If you find it helpful, I'll be grateful for your like :)

INTRO TO CONCURRENCY

Let's start with a quick overview of what we need to understand. First, asyncio is part of a broader concept called concurrency. Concurrent programming lets us speed up a program by having the computer do several things at the same time, or seemingly at the same time, instead of executing tasks one by one. There are three main types of concurrency: multiprocessing, (multi)threading and asynchronous programming.

When we want our machine to perform some intensive computations that we're too busy (or too lazy) to do ourselves, we'd better use multiprocessing (also called parallelism). In these cases, all the CPU cores of our brand-new (or not so new) computer work at the same time to finish an enormous task several times faster. This is true concurrency: the computer handles tasks at the same time better than Julius Caesar himself ever could (if only he'd had a laptop! Well, he had slaves instead :).

At times, our slave… machine doesn't work hard enough. A program that makes network requests or processes files on the hard drive spends most of its time waiting for responses. So, typically the process looks like this:

# send a request
# wait
# get a response
# process it

# do the same again and again

This waiting around can take up most of the program's time, so the program actually rests more than it works. That won't do!

In such cases, we can achieve faster performance by starting a few such processes seemingly at the same time. While one is idly waiting, another can, say, send a request and then lean back, letting the next one do a similar job. When a server response reaches us, the first process (thread/task) gets the CPU's attention again and finishes its job, and so on. So we don't really run things at the same time; we just make sure that our CPU always has something to do while some threads are on hold.

The best explanation of this strikingly beautiful and simple idea I've ever encountered is this: imagine a genius chess grandmaster who decides to put on a little show and play with, say, 20 other people. The grandmaster gets 10 seconds to plan his move, while all the other players get two minutes. If the grandmaster 'finishes off' his competitors one by one, it'll be hours before he's done.

To save himself some time, he can play all of them (almost) at the same time by moving from one board to the next, making a single move at each. This way, the game finishes many times faster just because our grandmaster (the CPU) keeps working on other boards (tasks) while his opponents are contemplating their moves. This is what threading and asyncio do.

For now, we've established that concurrency can speed up your code in three basic ways: multiprocessing, threading and asynchronous programming. Use multiprocessing for heavy computing (CPU-bound tasks) and threading/asyncio when there's a lot of waiting around (I/O-bound tasks: web requests, working with files, etc.).
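To make the I/O-bound case concrete, here's a minimal sketch using threads from the standard library. The fake_io function is my own stand-in for a network request; nothing here comes from the article itself:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_io(n):
    time.sleep(0.5)  # stands in for waiting on a network or disk response
    return n

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=5) as pool:
    # all five "requests" wait concurrently instead of one after another
    results = list(pool.map(fake_io, range(5)))
elapsed = time.perf_counter() - start
print(results, f"took {elapsed:.1f}s")  # ~0.5s instead of ~2.5s sequentially
```

The five half-second waits overlap, so the total is roughly one wait, not five.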

ASYNCIO AND THREADING: DIFFERENCE

Essentially, asyncio and threading differ in how threads (tasks, in asyncio) take turns.*

In threading, you don't get to control when one thread passes control to another; it can even happen in the middle of an operation like x += 1. What's more, this interruption can occur in a different place each time you run your code, because it's decided by your operating system.

Also, because threads operate on shared data, they can really mess it up (so-called race conditions). So it's considered good practice to guard against such situations by using locks, or better yet, libraries that implement them. Unfortunately, this doesn't come without some extra overhead.
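Here's a small sketch of that lock pattern (the counter variable and function names are my own illustration): several threads update shared data, and the lock serializes the x += 1 style update so no increments are lost:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:  # without this, the read-add-write steps can interleave
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # with the lock, reliably 400000
```

Without the lock, four threads racing on counter can (and sometimes do) lose updates, which is exactly the kind of bug that's hard to reproduce.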

In asyncio, we mark the exact place where our code gives up control using the await keyword. The code signals that it's ready to be put on the waiting list, and, meanwhile, the CPU can switch to other tasks.

By default, asyncio runs a bunch of tasks in a single thread, which essentially removes the problem of race conditions and locks (with all their overhead), while still saving a lot of time by cutting out the waiting.

This explanation might present asyncio in a much better light, but it doesn't mean that you should always opt for it (more about it in a minute).

ASYNCIO vs. THREADING

STRENGTHS OF ASYNCIO

asyncio is safer. You know exactly the points where your code will switch to the next task, which makes race conditions much harder to come by. Moreover, it makes debugging way easier compared to threading.

asyncio is lighter. Threads are managed by the operating system and therefore require more memory. asyncio almost always operates in one thread where it runs multiple tasks, which 'unburdens' it from the 'notorious' GIL and makes it much lighter.

asyncio is better for networking. Threads consume more memory since each thread has its own stack. With async code, all tasks share the same stack, which is kept small. That makes asyncio capable of supporting many thousands of simultaneous socket connections, so it's better suited for web development.

DRAWBACKS OF ASYNCIO*

threading is better for working with the filesystem. Operating systems usually don't support asynchronous file operations, so for working with files you'll need threading. There is a third-party library, aiofiles, which provides an async-compatible API for file operations, but under the hood it just delegates all the work to background threads (see StackOverflow). So, if no networking is involved, it's better to stick to threading.

many Python libraries are not asyncio-friendly. This means you can't really use these libraries with asyncio, or you have to implement a special mechanism to work around it (which comes with overhead). For example, the popular requests library doesn't support asyncio, so you either need threading or another library (e.g., aiohttp, which works beautifully with asyncio).

asyncio doesn't always make your code faster. Be careful to choose asyncio where it can serve its purpose best (for example, as noted, don't choose it for working with files). If raw speed is what you're after, you might want to consider Cython.

• since asyncio is single-threaded, it's not 'burdened' with the GIL like threading is, but it can't really benefit from multiprocessing either.

• although asyncio removes most of the hard-to-trace race conditions that occur with threading, some other problems remain. For instance, how will you communicate with a database that allows only a few connections? How will your program terminate connections gracefully when it receives a shutdown signal? How will you handle blocking disk access and logging? Etc.
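On the blocking-disk-access point, one hedged option (if you can use Python 3.9+) is asyncio.to_thread(), which does exactly what the filesystem drawback above describes: it hands the blocking call to a background thread so the event loop stays responsive. The helper name write_and_read is my own illustration:

```python
import asyncio
import tempfile

def write_and_read(text):
    # ordinary blocking file I/O
    with tempfile.NamedTemporaryFile(mode="w+") as f:
        f.write(text)
        f.seek(0)
        return f.read()

async def main():
    # the blocking call runs in a worker thread (Python 3.9+),
    # so the event loop is free to run other tasks meanwhile
    return await asyncio.to_thread(write_and_read, "hello")

result = asyncio.run(main())
print(result)  # hello
```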

DRAWBACKS OF THREADING**

This is an article about asyncio, I know, but let's quickly review some of threading's flaws as well, so you'll be more comfortable choosing between the two tools:

• threading bugs and race conditions in threaded programs can be really hard to debug. On one run, your program works just fine; the next time, the output is completely wrong, and there's sometimes no reliable way to find out what exactly went wrong.

• threads are resource-intensive. Threads require extra operating system resources to create: per-thread stack space that consumes process virtual memory (and may even be preallocated on 32-bit systems). To get around this problem on 32-bit systems, it's sometimes necessary to decrease the stack size with threading.stack_size(). However, that limits how deeply function calls can be nested, including recursion. Single-threaded coroutines don't have this problem.

• At very high concurrency levels (say, more than 5,000 threads), there can also be a speed impact due to context-switching costs.

• threading is an inefficient model for large-scale concurrency. The operating system continually shares CPU time among all threads, regardless of whether a thread is ready to do work. For instance, a thread may be waiting for data on a socket, yet the OS may still switch to and from that thread thousands of times before any actual work needs to be done. (In the async world, the select() system call is used to check whether a socket-awaiting coroutine needs a turn; if not, that coroutine isn't even woken up, avoiding the switching cost completely.)

To sum this up, "use async IO when you can; use threading when you must."(source)

BASIC ASYNCIO SYNTAX

The simplest way to do async programming is with the async/await syntax introduced in Python 3.5. Since then, this syntax has been a native feature of the language, which is why asynchronous functions are now called native coroutines.

Before that, using asyncio was syntactically different. I'll focus only on the new syntax, and you should always use it unless you have to work with some pre-Python 3.5 version, in which case you might want to read about the old way here or here.

Async programming is based on coroutines. A coroutine (or asynchronous function) is a function that can suspend its execution before reaching return, passing control to another coroutine for some time. So, in a sense, a coroutine is a specialized version of a Python generator, but not quite.

The main difference is that a coroutine suspends itself before returning (yielding) its result, and only temporarily: it gives up control to another coroutine and then resumes.

A function defined with async def automatically becomes a coroutine.

async def coroutine():
    print("This is an asynchronous function.")

As I've mentioned, the central focus of asyncio is how best to perform tasks that involve waiting. So, to tell the code to wait for something, we use the await keyword inside, and only inside, a coroutine (otherwise it's a syntax error):

import asyncio

async def main():
    print("Hello, ", end="")
    await asyncio.sleep(1)
    print("world!")

Here, we tell our code to stop and wait for a second. Because the time module is not async-compatible, we imported the asyncio module from the standard library instead. Besides the basic native async syntax, this is the module you'll mostly be interacting with.

Seems like we are good to go, but if you try to execute the main() function, nothing will happen (or you'll get a runtime warning).

RUNNING ASYNC CODE (EVENT LOOP)

Why is that? It's because async functions need to be explicitly run on something called an event loop. For the time being, I'll give you just the essential information about it without diving too deep.

In computer science, an event loop is a programming construct that waits for events (triggers) and then performs specific (programmed) actions.

You've actually had a lot of experience with them, even if you didn't know it. Every time you use a computer, you interact with event loops: your laptop listening for user input, your browser reacting to clicks, and so on.

Event loops schedule and run asynchronous tasks in Python (basically, manage them for you to make your life a bit easier).

For now, to run the loop, we'll use asyncio.run(), because it does all the heavy lifting for you, and you'll probably use it most of the time anyway. But bear in mind that an event loop can also be created and fine-tuned through the lower-level API. What's more, you can use other implementations: for example, if fast performance is crucial for your application, uvloop offers a faster event loop than asyncio's default.

When executed, asyncio.run() blocks the program until the passed-in coroutine finishes:

import asyncio

async def main():
    print("Hello, ", end="")
    await asyncio.sleep(1)
    print("world!")

asyncio.run(main())

You'll get 'Hello, ', and then, a second later, 'world!'. So, we created a uselessly simple task and ran it. And what fun that was :) Now you have a taste of the syntax.
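One detail worth seeing for yourself: calling a coroutine function doesn't run its body; it merely creates a coroutine object, which the event loop later drives. A short sketch (the 'done' return value is my own addition for illustration):

```python
import asyncio

async def main():
    print("Hello from the coroutine body")
    return "done"

c = main()                # does NOT run the body...
print(type(c).__name__)   # -> coroutine: just an object, no output yet
result = asyncio.run(c)   # ...the event loop actually runs it
print(result)             # -> done
```

This is exactly why calling main() alone earlier did nothing (or produced a "coroutine was never awaited" warning).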

RUNNING MULTIPLE TASKS

Now, let's see how to work with a bunch of tasks, which is the whole point of asyncio. To schedule a few tasks, we need to use asyncio.gather(). By the way, asyncio.sleep() is used here to imitate waiting for a web response:

import asyncio

async def task(num: int):
    print(f"Task {num}: request sent")
    await asyncio.sleep(1)
    print(f"Task {num}: response arrived")

async def main():
    await asyncio.gather(*[task(x) for x in range(1,4)])

if __name__ == '__main__':
    asyncio.run(main())

This was a primitive imitation of interaction with a server; after execution, we'll get:

Task 1: request sent
Task 2: request sent
Task 3: request sent
Task 1: response arrived
Task 2: response arrived
Task 3: response arrived

See how each task (coroutine) starts up, then sleeps and cedes control to another? When the sleeping is over, it wakes up and keeps going. This is exactly what asyncio does: it runs your coroutines, stops exactly where expected (that is, where something needs to be awaited), remembers the function's state in the meantime, and executes the next task.
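We can verify the overlap by timing it: with three one-second sleeps running concurrently, the whole gather() finishes in about one second, not three. The timing code is my own addition to the example above:

```python
import asyncio
import time

async def task(num):
    await asyncio.sleep(1)  # all three sleeps overlap on one event loop
    return num

async def main():
    start = time.perf_counter()
    results = await asyncio.gather(*[task(n) for n in range(1, 4)])
    elapsed = time.perf_counter() - start
    print(results, f"finished in {elapsed:.1f}s")  # ~1.0s, not ~3.0s
    return results, elapsed

results, elapsed = asyncio.run(main())
```

Note also that gather() returns the tasks' results in the order they were passed in, regardless of completion order.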

The question is, can we await anything we'd like? Actually, no. You can only await an awaitable object. What is that? It's an object that implements the __await__() method. Most of the time (but not always), an awaitable object will be a function defined with async def (that is, a coroutine).
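For the curious, here's a minimal sketch of a non-coroutine awaitable: a plain class whose __await__() delegates to asyncio.sleep(), which accepts an optional result argument. The class name Ready and the value 42 are my own inventions:

```python
import asyncio

class Ready:
    # any object with an __await__() method is awaitable
    def __await__(self):
        # delegate to asyncio.sleep(), which returns `result` when done
        return asyncio.sleep(0, result=42).__await__()

async def main():
    return await Ready()  # awaiting a plain object, not a coroutine

value = asyncio.run(main())
print(value)  # 42
```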

Hence the problem of async-incompatible libraries mentioned above. Libraries that saw daylight before asyncio weren't designed to work with asynchronous code, and adjusting them requires a lot of work. For that reason, they can't be used properly (there is a workaround, but it comes at the cost of lower-level programming and practically involves threading/multiprocessing at some point).

That being said, never use the requests library with asyncio; they're incompatible. Work with aiohttp instead, which is designed for asyncio. Likewise, the time module should never be used with asyncio. In fact, asyncio comes with its own asyncio.sleep() function, which will serve you just fine.
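To see why time.sleep() is off-limits, compare it with asyncio.sleep() under gather(). The blocking version below is my own illustrative anti-example, with the sleeps shortened to half a second:

```python
import asyncio
import time

async def friendly(n):
    await asyncio.sleep(0.5)  # suspends this task; others keep running
    return n

async def blocking(n):
    time.sleep(0.5)  # freezes the whole event loop; nothing overlaps
    return n

async def main():
    start = time.perf_counter()
    await asyncio.gather(friendly(1), friendly(2), friendly(3))
    async_time = time.perf_counter() - start  # ~0.5s: sleeps overlap

    start = time.perf_counter()
    await asyncio.gather(blocking(1), blocking(2), blocking(3))
    blocked_time = time.perf_counter() - start  # ~1.5s: back to back
    return async_time, blocked_time

async_time, blocked_time = asyncio.run(main())
print(f"{async_time:.1f}s vs {blocked_time:.1f}s")
```

A single blocking call anywhere in your async code stalls every task on the loop, which is why async-incompatible libraries are such a problem.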

PRACTICAL CHALLENGE

Now, let's get our hands dirty! Since we already mentioned aiohttp, I suggest we use it to solve a more or less practical challenge. First, using an aiohttp session, we'll download a web page:

import asyncio
import aiohttp

async def fetch(url: str):
    async with aiohttp.ClientSession() as session:
        response = await session.get(url)
        print(await response.text())

if __name__ == '__main__':
    asyncio.run(fetch("https://python.org/"))

Notice how we created an asynchronous context manager within our async function; it's pretty basic, as you can see. Nice, let's take it further and fetch a bunch of pages from the asyncio documentation and measure their lengths:

import aiohttp
import asyncio

async def fetch_and_measure(url: str):
    async with aiohttp.ClientSession() as session:
        response = await session.get(url)
        html = await response.text()
        return len(html)

async def main(urls: list):
    result = await asyncio.gather(*[fetch_and_measure(url) for url in urls])
    return result

if __name__ == '__main__':
    urls = [
        "https://docs.python.org/3/library/asyncio.html",
        "https://docs.python.org/3/library/asyncio-task.html",
        "https://docs.python.org/3/library/asyncio-stream.html",
        "https://docs.python.org/3/library/asyncio-sync.html",
        "https://docs.python.org/3/library/asyncio-subprocess.html",
    ]
    result = asyncio.run(main(urls))
    print(result)

Notice that we have gathered the tasks in a main() coroutine function. This is often considered good practice.

CHAINING COROUTINES

In the example above, we could create a separate function to evaluate the length of a web page. So, can we chain coroutines together? Yes, we can. In real life, such a function would do more complicated things (parsing, for example, or whatever you'd like to do with your downloaded pages), and it's good to keep that separate from fetching. Chaining asynchronous functions is very easy in Python; let's rework our example a little:

import aiohttp
import asyncio

async def measure(html: str) -> int:
    return len(html)

async def fetch(url: str):
    async with aiohttp.ClientSession() as session:
        response = await session.get(url)
        html = await response.text()
        length = await measure(html)
        return length

async def main(urls: list):
    result = await asyncio.gather(*[fetch(url) for url in urls])
    return result

if __name__ == '__main__':
    urls = [
        "https://docs.python.org/3/library/asyncio.html",
        "https://docs.python.org/3/library/asyncio-task.html",
        "https://docs.python.org/3/library/asyncio-stream.html",
        "https://docs.python.org/3/library/asyncio-sync.html",
        "https://docs.python.org/3/library/asyncio-subprocess.html",
    ]
    result = asyncio.run(main(urls))
    print(result)

Note how we create a measure() coroutine without awaiting anything inside it. But measure() itself is an awaitable object: it's a coroutine! That's why you await it when you call it from inside fetch().

CONCLUSION

Now you've learned what asyncio is, what its place in Python concurrency is, what advantages and flaws it has compared to threading, and how to create coroutines and run them on an event loop using the high-level asyncio API.

That's enough for the first encounter, but remember, asyncio is much more than that! It now supports a wide range of interactions with native Python features: there are asynchronous iterators, generators, context managers, loops, and even comprehensions! We've actually already used an async context manager.

There is a special asyncio.Queue() for producer-consumer tasks, and a lot of other tools and implementations (such as the high-performance uvloop event loop). I hope to introduce you to more asyncio theory and practice as soon as my next article is ready.
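As a small taste of that, here's a hedged producer-consumer sketch with asyncio.Queue(). The sentinel-based shutdown is just one common pattern, and all the names here are my own:

```python
import asyncio

async def producer(queue):
    for i in range(3):
        await queue.put(i)  # hand items over to the consumer
    await queue.put(None)   # sentinel: tells the consumer to stop

async def consumer(queue, consumed):
    while True:
        item = await queue.get()
        if item is None:
            break
        consumed.append(item)
        print(f"consumed {item}")

async def main():
    queue = asyncio.Queue()
    consumed = []
    # producer and consumer run concurrently on the same event loop
    await asyncio.gather(producer(queue), consumer(queue, consumed))
    return consumed

consumed = asyncio.run(main())
```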

There's one more thing to remember. The asyncio API includes interfaces for both end users and framework developers, and some methods are meant primarily for one group or the other. If you're developing an application, be careful to avoid the tools meant for framework developers and fine-tuning (unless you know what you're doing and really need them).



* 'Using Asyncio in Python 3' by Caleb Hattingh, p. 7-8.
** 'Using Asyncio in Python 3' by Caleb Hattingh, p. 11-13.
