
Asynchronous HTTP requests in Python

Matt ・ 2 min read

In Python, you can make HTTP requests to an API using the requests module
or the built-in urllib module.

However, requests and urllib3 are synchronous: only one HTTP call can be made at a time in a single thread. Sometimes you have to make multiple HTTP calls, and synchronous code will perform badly. To avoid this, you can use multi-threading or, since Python 3.4, the asyncio module.
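For comparison, the multi-threading approach mentioned above can hide the same network latency. A minimal sketch with concurrent.futures (using a dummy fetch function that simulates latency, standing in for a real requests.get call):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    """Stand-in for a blocking HTTP call such as requests.get(url)."""
    time.sleep(0.1)  # simulate network latency
    return f"response for {url}"

urls = [f"https://example.com/{i}" for i in range(10)]

# run up to 10 blocking calls at once, each in its own worker thread
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(fetch, urls))
```

With 10 workers, the 10 simulated calls finish in roughly 0.1 seconds instead of 1 second. Threads work well for I/O-bound code like this, but each thread has memory overhead, which is one reason asyncio scales better to thousands of concurrent calls.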

Test case

To show the time difference between sync and async code, I made a script that reads a file with 500 city names and performs an HTTP call to an API to retrieve information (location, population, and so on) for each city name.
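Both versions below are wrapped in a @timeit decorator whose definition isn't shown in the original. A minimal sketch that produces the "Finished ... in ... secs" output seen below might look like:

```python
import functools
import time

def timeit(func):
    """Print how long the decorated function took to run."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"Finished {func.__name__!r} in {elapsed:.4f} secs")
        return result
    return wrapper
```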

Sync code performance

Here is the sync version, using the requests module:

import requests

@timeit
def fetch_all(cities):
    responses = []
    # reuse one session so TCP connections are pooled across calls
    with requests.Session() as session:
        for city in cities:
            resp = session.get(f"https://geo.api.gouv.fr/communes?nom={city}&fields=nom,region&format=json&geometry=centr")
            responses.append(resp.json())
    return responses

Finished 'fetch_all' in 38.7053 secs

Async code performance

I used the aiohttp module for the async version, as the requests module doesn't support asyncio for now.

import asyncio

import aiohttp

async def fetch(session, url):
    """Execute an HTTP call asynchronously.
    Args:
        session: aiohttp session used to make the HTTP call
        url: URL to call
    Returns:
        resp: a dict-like object containing the HTTP response
    """
    async with session.get(url) as response:
        resp = await response.json()
        return resp

async def fetch_all(cities):
    """Gather many HTTP calls made asynchronously.
    Args:
        cities: a list of strings
    Returns:
        responses: a list of dict-like objects containing the HTTP responses
    """
    async with aiohttp.ClientSession() as session:
        tasks = []
        for city in cities:
            tasks.append(
                fetch(
                    session,
                    f"https://geo.api.gouv.fr/communes?nom={city}&fields=nom,region&format=json&geometry=centr",
                )
            )
        responses = await asyncio.gather(*tasks, return_exceptions=True)
        return responses

@timeit
def run(cities):
    responses = asyncio.run(fetch_all(cities))
    return responses

Finished 'run' in 3.0706 secs

Conclusion

As you can see, the async version is a lot faster than the sync version, so if you run into a situation where your code performs multiple I/O calls, you should consider concurrency to improve performance. However, the asynchronous version requires more work, as you can see. I recommend reading the Python asyncio documentation to learn more.

Thanks for reading.
