
Alexis Tacnet

Originally published at kernelpanic.io

The Modern Way To Call APIs In Python

Even if I was quite skeptical at first, I now truly believe that asynchronous programming in Python will shape the future of the language. The ecosystem is more and more developed, and a lot of amazing people are getting on board the async train to help Python developers manage this new way of coding. For sure, it's more complicated to write asynchronous than synchronous code, but a lot has changed in the past few years, making the entry barrier lower than it has ever been.

At Strio, we mainly use FastAPI for our front API. It is an asynchronous web framework that allows us to achieve good performance with little effort. During a request from our frontend, our other clients, or directly from our customers, we usually have to call a bunch of other parties: external APIs, AMQP brokers, SQL or NoSQL databases, ... Our API plays the role of glue between all kinds of services, and since all those calls take quite some time, we can leverage asynchronous code to maintain impressive performance for our clients.

Among the services that we call, we have many HTTP requests to make. In the modern era of microservices, this is typically what you expect from a front API: when something quite complex has to be done, you just call the service in charge of it and let it do the work. So, let's try to build the modern way of communicating asynchronously with those HTTP APIs.

The HTTP library

The first block that we need is the HTTP library. Why use an HTTP library? HTTP is a standard you can't really avoid, but the protocol can be hard at times and its specification is full of tricks. Fortunately, some people have written great HTTP libraries for Python that we can use.

The one I really like is httpx, created by the encode team, in which you will find some of the most famous Pythonistas out there. httpx is basically requests, but with async support, typing and even more. Its simple syntax, inspired by requests, makes it really easy to use and understand, especially for someone who is starting out with asynchronous code and needs to read it.

Let's just take a glance at how we could make a simple GET request, but asynchronously:

>>> import httpx
>>> async with httpx.AsyncClient() as client:
...     r = await client.get('https://ifconfig.co/json')
...
>>> r
<Response [200 OK]>

This is the asyncio interpreter, launched with python -m asyncio

The goal of this brick is to make the HTTP call and return the response as a typed and verified model. 90% of the time we will then need to access the JSON body of this response, and that body needs to be validated.

Response data validation

The next brick is the validation of the response body returned by httpx. Why do we need validation? It helps us as developers to build typing for the response, which is a good thing for auto-completion and developer productivity, but also to validate the data before going further. If the API returns something unexpected, we want to raise a friendly exception now, instead of an AttributeError later on in the request.

For this job, I selected pydantic, a really good library that makes checking and validating data simple. Let's see how we can integrate it in our request pipeline:

>>> from pydantic import BaseModel
>>> from ipaddress import IPv4Address
>>> class IfConfig(BaseModel):
...     ip: IPv4Address
...     ip_decimal: int
... 
>>> ifconfig = IfConfig.parse_obj(r.json())
>>> ifconfig
IfConfig(ip=IPv4Address('78.153.21.75'), ip_decimal=1807729179)

Now that we have static typing for this response data, we know that ifconfig.ip exists and is an IPv4Address. The data is also validated, so if the field ip was missing from the response, or if we tried to parse an IPv6 address, for example, we would get a ValidationError exception, and we could take action for this unusual event.
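
To make this concrete, here is a minimal illustration with a hand-written payload (not a real ifconfig.co response; the exact error message may vary between pydantic versions) showing what happens when the ip field is missing:

>>> from pydantic import ValidationError
>>> try:
...     IfConfig.parse_obj({"ip_decimal": 1807729179})
... except ValidationError as exc:
...     print(exc)
...
1 validation error for IfConfig
ip
  field required (type=value_error.missing)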

Creation of an API client

In order to glue those two bricks together, we can integrate them into a class that will be called an API client.

The role of this class is to abstract the API logic away from the different pieces of our codebase, so developers can ignore the underlying layer and focus only on what resources they want to obtain or modify. This class also allows us to keep the external API's definition in a single place, in order to test it and refactor it easily.

Let's create our IfConfigClient so that everything in our code, from our own API to our background jobs, can query this service easily:

from ipaddress import IPv4Address

from pydantic import BaseModel, ValidationError
from httpx import AsyncClient

IFCONFIG_URL = "https://ifconfig.co/"

class IfConfig(BaseModel):
    ip: IPv4Address
    ip_decimal: int

class IfConfigClient(AsyncClient):
    def __init__(self):
        super().__init__(base_url=IFCONFIG_URL)

    async def get_ifconfig(self):
        # base_url is prepended by httpx, so this calls https://ifconfig.co/json
        response = await self.get('json')

        try:
            ifconfig = IfConfig.parse_obj(response.json())
        except ValidationError:
            print("Something went wrong!")
            raise

        return ifconfig

The trick here is to make our client inherit from the AsyncClient class from httpx. This makes everything really simple to develop but also to use. Let's keep this code in an ifconfig.py file, that we can import in our python -m asyncio interpreter:

>>> from ifconfig import IfConfigClient
>>> async with IfConfigClient() as client:
...     ifconfig = await client.get_ifconfig()
... 
>>> ifconfig
IfConfig(ip=IPv4Address('78.153.21.75'), ip_decimal=1807729179)

The good way to call APIs

We built an API client that is quite modern: it is asynchronous, supports typing and validates data. More than that, this is the way I would like people to write API wrappers in the future, for several reasons.

Separation of concerns

With this configuration, the details of the API are completely hidden from the caller and can be tested properly on their own. If something changes in the API itself, like a new parameter on some endpoint, the change can be made in a single place without breaking all the code currently using this endpoint. This is something that we have been doing in the Python ecosystem for quite some time already, but it is now even simpler to create a client.
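
As a sketch of what this looks like in practice (a hypothetical FastAPI route, not part of the original example), the caller only knows about get_ifconfig(); the URL, the JSON layout and the validation all stay inside the client:

from fastapi import FastAPI

from ifconfig import IfConfigClient

app = FastAPI()

@app.get("/my-ip")
async def my_ip():
    # The route depends only on the client's method; if the external API
    # grows a new parameter, only IfConfigClient has to change.
    async with IfConfigClient() as client:
        ifconfig = await client.get_ifconfig()
    return {"ip": str(ifconfig.ip)}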

Mocking for tests

I really like unit tests for complex functions, but intercepting everything that leaves my code, like HTTP calls, is usually quite a mess. Thanks to our client, we have a standard way to patch it in our tests and even create fixtures with typing:

from ipaddress import IPv4Address
from unittest.mock import patch

import ifconfig

mock_ip = IPv4Address('78.153.21.75')
mock_ip_decimal = 1807729179
mock_ifconfig = ifconfig.IfConfig(ip=mock_ip, ip_decimal=mock_ip_decimal)

# patch.object replaces the async method with an AsyncMock (Python 3.8+),
# so we can assert on how it was awaited.
@patch.object(ifconfig.IfConfigClient, "get_ifconfig", return_value=mock_ifconfig)
def test_ifconfig_processing(get_ifconfig):
    ...
    get_ifconfig.assert_awaited_once()
    ...

Typing everywhere

The client now has integrated typing: no more JSON and dictionaries that you need to rummage through to get your data. You will have auto-completion with all the fields that you can access from the request response, with their types, in your favorite IDE. Coupled with mypy or pyright, it also allows you to perform static type checking ahead of tests and commits.
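
As a tiny example of what this buys us (assuming the IfConfig model from the ifconfig.py file above), a helper that works on the response can be checked statically:

from ifconfig import IfConfig

def is_private_address(ifconfig: IfConfig) -> bool:
    # The ip field and its IPv4Address type are known statically, so the
    # IDE can auto-complete it and mypy or pyright can verify this access.
    return ifconfig.ip.is_private

# Accessing a field the model does not declare, like ifconfig.hostname,
# would be flagged by the type checker before any test runs.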

Data validation

The API client is now in charge of taking an action if something uncommon happens in the API response, like a field disappearing, instead of letting someone else's code fail later when it tries to access that field. This is essential for testing, but also for the maintainability of the code when the API changes.
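
One possible way for the client to take that action (a variation on the get_ifconfig method above, with a hypothetical IfConfigError exception) is to translate the ValidationError into an error that belongs to our own codebase:

from httpx import AsyncClient
from pydantic import ValidationError

from ifconfig import IFCONFIG_URL, IfConfig

class IfConfigError(Exception):
    """Raised when the ifconfig.co response does not match our schema."""

class IfConfigClient(AsyncClient):
    def __init__(self):
        super().__init__(base_url=IFCONFIG_URL)

    async def get_ifconfig(self) -> IfConfig:
        response = await self.get('json')
        try:
            return IfConfig.parse_obj(response.json())
        except ValidationError as exc:
            # Raise a friendly, domain-specific error now, instead of an
            # AttributeError later on in the request.
            raise IfConfigError("Unexpected ifconfig.co payload") from exc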

I am truly happy to see new tools like httpx and pydantic emerge in the Python landscape; I think they make building clean code easier and enforce good standards for complex codebases. This new way of coding external requests will spread. Even the maintainer of the Elasticsearch Python client agrees when discussing the future of their Python clients.

Top comments (5)

Shawn McElroy

We are going to be starting a new project at work using fastapi. And this is exactly how we should consume that API from our main app. I didn't even think of this yet. Thanks.

🦄N B🛡

Is API versioning / hashing (and client side verification and enforcement thereof) a first class citizen of Strio's approach?

Regardless, thanks for writing this, it was worth the read if only for the introduction of httpx, into which I will look further.

Alexis Tacnet

Hi! Yes, this is also something we take into account. If you mean API versioning for the external APIs that we consume (Kubernetes, GitHub, GitLab, ...), this is also the goal of enforcing their schema with Pydantic. This way, we usually know well in advance that some endpoint has changed.

Loki Le DEV

Thanks! I've started to use FastAPI for my backend, but I didn't write a client yet, and now I know how to proceed!

rhymes

Love httpx as well!