Mahmoud Harmouch

A Deep Dive into Connecting FastAPI with SingleStore.

Hello there! This is my fourth day documenting my journey during the SingleStore Database Hackathon. In this article, we are going to explore SingleStore as a distributed cloud SQL database and how to connect it to a FastAPI application.

This tutorial is helpful for anyone looking to connect a FastAPI application to a remote MySQL server, especially one running on a SingleStoreDB cluster. The SingleStore docs don't cover connecting FastAPI to a remote SingleStore database, and I couldn't find a tutorial on the topic either. So, in this tutorial, you will learn how to connect a FastAPI app to a remote MySQL server, SingleStoreDB in this case.

Note that SingleStore provides brief documentation on accessing its databases from SQLAlchemy, but only in a synchronous manner. Since we want to unleash the full potential of FastAPI through asynchronous programming, and since a few things on the FastAPI side could use extra documentation, this blog post walks you through connecting SingleStore to FastAPI in a fully asynchronous fashion so that you can take advantage of all the benefits of asynchronous code.

👉 Table Of Contents (TOC).

Getting Started with SingleStore.

In this section, we are going to address some of the main features of SingleStore, and show you how to set up your own SingleStore database in the cloud.

What is SingleStore?


How SingleStoreDB Cloud Works

As their website puts it, SingleStore is a cloud service that unifies transactions and analytics in a single engine. This makes it perfect for developers like us :-), who need fast, modern enterprise applications. SingleStore is based on a distributed SQL architecture, delivering 10-100 millisecond response times on complex queries. It's also easy to scale, so your business can grow without any headaches.

SingleStoreDB is a cloud database service that makes it easy to deploy, manage, and scale your databases. It is available on AWS, GCP, and Azure. SingleStoreDB abstracts most operations, such as deployment, management, and diagnostics. It is purpose-built to power global real-time applications with high throughput and low latency.

SingleStoreDB enables you to move away from traditional sharded architectures that are complex to manage and scale. With SingleStoreDB, you can use a single database for mission-critical applications without compromising performance or availability.

Setting up SingleStore.

🔝 Go To TOC

I believe that the process of signing up and setting up a SingleStoreDB instance is relatively easy.

To create an account on SingleStore, you can proceed with the following steps:


SingleStore sign-up page.

  • Choose whichever sign-up option you prefer. Gmail is recommended; it's much faster.

  • Proceed with the sign-up process, and fill in the necessary information.


SingleStore ToS.

For more info, you can check out the SingleStore Guide.

What is a Workspace?

🔝 Go To TOC


Isolation and Scalability.

SingleStore workspaces are powerful, traditional deployments of the SingleStoreDB product. Each workspace has its own set of dedicated computing resources, making it excellent for running production workloads. You can attach and detach databases as needed, allowing you to quickly scale your data storage without having to worry about complex data management across different cloud regions.

Creating a Workspace.

🔝 Go To TOC

After setting up the account, you will be prompted to create a workspace. To set up a workspace, perform the following steps:

  • Give your workspace a meaningful name.


SingleStore Workspace page.

  • You should select the Cloud Provider & Region as close to your physical location as possible to minimize latency.
  • Enter a strong password or click generate a password.
  • You will be redirected to your SingleStore dashboard.


Spin Up a cloud cluster.

  • After a few minutes, you will be notified that Your Workspace Group has been created successfully.


Workspace creation complete.

Creating a MySQL database.

🔝 Go To TOC


Workspace database creation options.

There are two ways to create a MySQL database attached to a SingleStore Workspace: either by using the SQL Editor in the Cloud portal to run a CREATE DATABASE command, or by using the GUI and clicking the Create Database button.

Creating a MySQL database from the GUI is easy and only requires two clicks. To do so:

  • Click on the Create Database button.


Create and connect a database to a workspace.

  • You will be prompted to enter a database name.


Database creation form.

  • Then select the workspace that the database needs to be attached to.

  • Finally, click on the Create Database button.

To use the SQL Editor in the Cloud portal, type in CREATE DATABASE followed by the name of your new database. You can follow the docs for the exact syntax of the SQL command you need to enter.

This will create a new database with the specified name. You can see the newly created database in the side panel, or switch to it with the USE command.


SingleStore dashboard panel.

SingleStore and CRUD operations.

🔝 Go To TOC


SingleStore SQL Editor.

In order to interact with the database, let's type the following code:

use chat

The console will respond with the result, and we can now query and work with our database. For instance, to see the available tables inside the chat database, try the following command:

SHOW tables;

Note that there is also a default MySQL system database that holds metadata about the server, such as registered users, tables, schemas, etc.

The default MySQL users table.

With SingleStoreDB set up, we need to create our FastAPI app and connect it to this database.

Getting Started with FastAPI.

🔝 Go To TOC

Arguably, one of the easiest ways to build a robust and scalable backend is to use a framework such as FastAPI. In this section, we will go through a few simple, self-contained REST API endpoints.

We'll be using SQLAlchemy as our ORM (object-relational mapping) tool and Pydantic for data validation. SQLAlchemy will allow us to interact with our database easily and efficiently, while Pydantic will help ensure that the data being passed into our API is valid.

Project Structure.

🔝 Go To TOC

When developing web applications, you tend to reuse existing code. To make that code reusable, the recommended way to structure a FastAPI project is to create each app in a separate folder. This way, your codebase will look much more organized and structured.

Creating each app in a separate folder has another advantage. It makes it easier to reuse code from one app to another. For example, if you have created an authentication app, you can easily reuse the code in other apps that require authentication.

Of course, you are not required to follow this structure. But if you want to make your code reusable, I think this is the best practice. I'm particularly obsessed with this way of organizing code, inspired by the Django REST framework.

app
├── auth
│   ├── crud.py
│   ├── model.py
│   ├── router.py
│   └── schemas.py
├── config.py
├── main.py
├── users
│   ├── crud.py
│   ├── model.py
│   ├── router.py
│   └── schemas.py
└── utils
    ├── constants.py
    ├── crypt_util.py
    ├── jwt_util.py
    ├── mixins.py
    └── session.py

Notice that we have the root app folder, which contains multiple folders, each representing an app (as it is called in Django). Inside each app folder, we have four essential files:

  1. The crud file contains all CRUD queries to interact with the database.
  2. The model file contains all models related to this app.
  3. The schemas file contains all pydantic schemas for data validation.
  4. The router file contains all routes related to this app.

This makes it easy to reuse apps across projects and keep them well organized.
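
For instance, wiring the per-app routers into the main application could look something like the sketch below. This isn't code from the repo, just an illustration assuming each app exposes an APIRouter instance named router:

from fastapi import FastAPI

from app.auth.router import router as auth_router
from app.users.router import router as users_router

app = FastAPI()

# Each app plugs into the project through a single include_router call,
# so dropping the auth folder into another project is all it takes to reuse it.
app.include_router(auth_router)
app.include_router(users_router)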

Getting Started with SQLAlchemy.

🔝 Go To TOC

SQLAlchemy is a powerful Python library for working with relational databases. It gives developers easy ways to work with databases in their Python code. SQLAlchemy is one of the most widely used and highest-quality Python third-party libraries.

Getting started with SQLAlchemy is easy. In the following sections, I will walk you through the basics of setting up a database connection and running some basic queries. We'll also go through how to use SQLAlchemy's ORM layer to map database tables to Python objects.
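
As a first taste, here is a minimal, self-contained connectivity check using the asynchronous engine we will discuss in the next section. The connection URL below is a placeholder; we will build the real one from environment variables later in this post:

import asyncio

from sqlalchemy import text
from sqlalchemy.ext.asyncio import create_async_engine

# Placeholder credentials; replace them with your own SingleStore values.
DATABASE_URL = "mysql+aiomysql://admin:<password>@<workspace-host>:3306/chat"


async def ping() -> None:
    engine = create_async_engine(DATABASE_URL)
    async with engine.connect() as conn:
        result = await conn.execute(text("SELECT 1"))
        print(result.scalar())  # prints 1 if the connection works
    await engine.dispose()


asyncio.run(ping())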

Asynchronous SQLAlchemy Engine.

🔝 Go To TOC

MySQL is a powerful and popular database, and it's well suited for use with Python. In the following sections, we'll set up a connection to a MySQL database in an asynchronous manner using the create_async_engine function from SQLAlchemy, along with the databases package and the aiomysql extra dependency. We'll also adopt SQLAlchemy's declarative approach to define our users table.

It is important to note that there are two ways to declare a data table in SQLAlchemy: the classical (imperative) mapping and the declarative mapping. As the name suggests, the declarative approach is more readable and easier to understand, so that's what we will use in this project.
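
To make the difference concrete, here is roughly what the same table looks like in both styles. This is a simplified illustration, not the project's actual models:

from sqlalchemy import BIGINT, Column, MetaData, String, Table
from sqlalchemy.orm import declarative_base

# Classical (imperative) mapping: the table is described explicitly.
metadata = MetaData()
users_table = Table(
    "users",
    metadata,
    Column("id", BIGINT, primary_key=True, autoincrement=True),
    Column("email", String(50)),
)

# Declarative mapping: the table is derived from a plain Python class.
Base = declarative_base()


class Users(Base):
    __tablename__ = "users"

    id = Column(BIGINT, primary_key=True, autoincrement=True)
    email = Column(String(50))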

The Base class and Common Mixins.

🔝 Go To TOC

When building an application, it's crucial to adhere to the DRY principle: "Don't Repeat Yourself". This means defining a common base class from which all other classes inherit. For example, all tables in our app will have an id column. By defining this column in a common mixin, we avoid repetition and make our code more streamlined.

In addition, for monitoring purposes, we'll define a TimestampMixin with creation_date and modified_date columns. These can help us track CRUD operations in our app (e.g., when a particular row was created or modified). I think these columns are especially useful if you're implementing OTP functionality, so you can track when users last received their OTP code.

The SQLAlchemy docs are extremely helpful when getting started with the library; in particular, the section on mixins is handy. Using the declarative_mixin and declared_attr decorators, we can build powerful mixins that make working with SQLAlchemy much easier. Following that section of the docs, we can implement our CommonMixin class like the following:

import re
from sqlalchemy.orm import declared_attr, declarative_mixin
from sqlalchemy import BIGINT, Column, DateTime

@declarative_mixin
class CommonMixin:
    """define a series of common elements that may be applied to mapped
    classes using this class as a mixin class."""

    __name__: str
    __table_args__ = {"mysql_engine": "InnoDB"}
    __mapper_args__ = {"eager_defaults": True}

    id: int = Column(BIGINT, primary_key=True, autoincrement=True)

    @declared_attr
    def __tablename__(cls) -> str:
        split_cap = re.findall("[A-Z][^A-Z]*", cls.__name__)
        table_name = (
            "".join(map(lambda word: word.lower() + "_", split_cap[:-1]))
            + split_cap[-1].lower()
        )
        return table_name

As you can see, each of our data models has an id column used as the primary key. This is a common column shared among all of our data tables. The __tablename__ declared attribute takes a class name like AccessToken and converts it into a snake_case table name, access_token, when the tables are initialised.
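
If you want to see what that declared attribute produces, the same conversion can be extracted into a tiny standalone helper and checked in a REPL:

import re


def to_snake_case(name: str) -> str:
    # Mirrors the __tablename__ logic from CommonMixin above.
    split_cap = re.findall("[A-Z][^A-Z]*", name)
    return "".join(word.lower() + "_" for word in split_cap[:-1]) + split_cap[-1].lower()


assert to_snake_case("Users") == "users"
assert to_snake_case("AccessToken") == "access_token"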

We need to define our Base class using the registry and DeclarativeMeta classes. This will allow us to utilise mypy's static type-checking capabilities.

from sqlalchemy.orm import registry
from sqlalchemy.orm.decl_api import DeclarativeMeta

mapper_registry = registry()


class Base(metaclass=DeclarativeMeta):
    __abstract__ = True

    registry = mapper_registry
    metadata = mapper_registry.metadata

    __init__ = mapper_registry.constructor

The remaining part is to define our TimestampMixin mixin as follows:

import datetime
from sqlalchemy.orm import declared_attr, declarative_mixin
from sqlalchemy import Column, DateTime

@declarative_mixin
class TimestampMixin:
    # Pass the callable itself (no parentheses), so a fresh timestamp is
    # generated for every insert instead of once at import time.
    creation_date: datetime.datetime = Column(
        DateTime, default=datetime.datetime.utcnow
    )
    # Left empty by default; the application sets it whenever a row is updated.
    modified_date: datetime.datetime = Column(DateTime)

Now, to use these mixins, simply add them to your model class:

class Users(Base, CommonMixin, TimestampMixin):
   ...

Now, whenever you create or update a record of type Users, the timestamp fields can be set accordingly. Keep in mind that these Python-side defaults only kick in for inserts that go through SQLAlchemy itself; raw SQL queries, like the ones we will write with the databases package, need to set the timestamps explicitly.

The users table.

🔝 Go To TOC
A Database Relationship Diagram created using dbdiagram.io.

As designed during the first day of the hackathon, our users table looks like the following:

from enum import Enum
from typing import Optional

from pydantic import EmailStr
from sqlalchemy import Column, Integer, String

from app.utils.mixins import Base, CommonMixin, TimestampMixin


class ChatStatus(str, Enum):
    online = "online"
    offline = "offline"
    busy = "busy"
    dont_disturb = "don't disturb"


class UserStatus(int, Enum):
    active = 1
    disabled = 9


class UserRole(str, Enum):
    regular = "regular"
    admin = "admin"


class Users(Base, CommonMixin, TimestampMixin):
    first_name: str = Column(String(20))
    last_name: str = Column(String(20))
    email: EmailStr = Column(String(50), index=True)
    password: str = Column(String(120), index=True)
    phone_number: str = Column(String(20), nullable=True)
    bio: Optional[str] = Column(String(60), nullable=True)
    profile_picture: Optional[str] = Column(
        String(220), nullable=True
    )
    chat_status: Optional[ChatStatus] = Column(
        String(20), nullable=True
    )
    user_status: Optional[UserStatus] = Column(
        Integer, index=True, nullable=True
    )
    user_role: Optional[UserRole] = Column(
        String(20), nullable=True
    )

Notice that we have three indexed columns: email, password, and user_status. These are the most commonly filtered columns when fetching records, such as in the login endpoint. By indexing them, we can speed up record retrieval significantly.
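
For example, a login lookup filters on email, which is exactly where the index pays off. The helper below is a hypothetical sketch (the repo's actual CRUD functions may differ) that uses the databases connection we set up later in this post:

from app.utils.session import database


async def find_user_by_email(email: str):
    # This filter hits the index on `email`, so the lookup avoids a full
    # table scan even as the users table grows.
    query = "SELECT * FROM users WHERE email = :email"
    return await database.fetch_one(query, values={"email": email})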

Note that we are using SQLAlchemy v1.4 because the databases package doesn't currently support SQLAlchemy v2.0. SQLAlchemy v2.0 introduced a new function called mapped_column, which can be used instead of Column for more expressive column definitions.

Pydantic and Data Validation.

🔝 Go To TOC

As mentioned above, Pydantic will help us ensure that the data being passed into our API is valid. It's a tool which allows you to be much more precise with your data structures.

This is especially useful when working with APIs, as it can help you avoid passing in invalid data. For example, let's say you have a register endpoint that accepts first_name, last_name, email, and password. With Pydantic, you can define exactly what data is allowed for each field.

from typing import Optional

from pydantic import BaseModel, Field

from app.users.schemas import UserObjectSchema


class UserSchema(BaseModel):
    user: Optional[UserObjectSchema]
    token: Optional[dict[str, str]] = Field(
        # Example shaped as a dict to match the declared type.
        ..., example={"access_token": "eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9"}
    )
    status_code: int = Field(..., example=200)
    message: str = Field(..., example="You have successfully logged in!")


class ResponseSchema(BaseModel):
    status_code: int = Field(..., example=400)
    message: str = Field(..., example="Something went wrong!")


class UserCreate(BaseModel):
    first_name: str = Field(..., example="Mahmoud")
    last_name: str = Field(..., example="Harmouch")
    email: str = Field(..., example="business@wiseai.dev")
    password: str = Field(..., example="SEc11r3P@ssw0rD")
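
To see the validation in action, try feeding UserCreate an incomplete payload in a REPL; Pydantic rejects it before it ever reaches your database code:

from pydantic import ValidationError

try:
    UserCreate(
        first_name="Mahmoud",
        last_name="Harmouch",
        email="business@wiseai.dev",
        # password is missing on purpose
    )
except ValidationError as exc:
    print(exc)  # reports that the `password` field is required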

This ensures that only valid data is ever passed into your API, saving you a lot of time and headaches down the road. In our example, the register endpoint will validate the data and return either UserSchema or ResponseSchema according to some logic:

from typing import Union

from fastapi import APIRouter, Depends
from fastapi.security import OAuth2PasswordRequestForm

from app.auth.crud import register_user
from app.auth.schemas import ResponseSchema, UserCreate, UserSchema

router = APIRouter(prefix="/api/v1")

@router.post(
    "/auth/register",
    name="auth:register",
    response_model=Union[UserSchema, ResponseSchema],
    responses={
        201: {
            "model": UserCreate,
            "description": "Welcome to this blazingly fast chat app!",
        },
        400: {
            "model": ResponseSchema,
            "description": "User already signed up!",
        },
    },
)
async def register(user: UserCreate):
    results = await register_user(user)
    return results

This way, data is validated before it is ever stored in the database, saving you a lot of time and effort in the long run.
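
The register_user function itself lives in app/auth/crud.py. Its exact implementation is in the repo, but the general shape of the logic looks something like the sketch below; the find_user_by_email helper is the hypothetical one from earlier, and create_user is the raw INSERT helper shown later in this post:

from app.auth.schemas import UserCreate


async def register_user(user: UserCreate):
    # Reject duplicate sign-ups.
    if await find_user_by_email(user.email):
        return {"status_code": 400, "message": "User already signed up!"}

    # Persist the new row. (The real app also hashes the password first,
    # see utils/crypt_util.py, and issues a JWT for the response.)
    await create_user(user)

    return {
        "user": None,  # the real implementation returns the serialized user
        "token": None,  # ...and the freshly issued token
        "status_code": 201,
        "message": "Welcome to this blazingly fast chat app!",
    }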

Connecting to SingleStoreDB.

🔝 Go To TOC
Variables used to connect to the MySQL database.

To make SQLAlchemy create our data models on the SingleStoreDB server, we need to specify the MySQL connection URL. To do so, log in to your SingleStore dashboard and fill in the following variables in your .env file accordingly (a sketch of the matching Settings class follows the list):

  • SINGLESTORE_HOST: The hostname of your remote MySQL server, i.e. your SingleStore workspace endpoint.
  • SINGLESTORE_DATABASE: The name of the database you want to access, in our case, the chat database.
  • SINGLESTORE_USERNAME: The username you'll use for remote access. By default, this is admin.
  • SINGLESTORE_PASSWORD: The corresponding password for that user.
  • SINGLESTORE_PORT: The port number the MySQL endpoint is listening on.
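
For reference, a minimal app/config.py that reads these variables might look like the sketch below. This assumes Pydantic v1's BaseSettings (with python-dotenv installed for .env support); the repo's actual Settings class may differ slightly:

from pydantic import BaseSettings


class Settings(BaseSettings):
    SINGLESTORE_HOST: str
    SINGLESTORE_PORT: str  # kept as a string so it can be concatenated into the URL
    SINGLESTORE_USERNAME: str = "admin"
    SINGLESTORE_PASSWORD: str
    SINGLESTORE_DATABASE: str

    class Config:
        env_file = ".env"  # load values from the .env file at the project root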

Next, we need to establish a database connection by instantiating the Database class provided by the databases package. The code below will return a Database instance:

from databases import Database

from app.config import Settings

settings = Settings()

SQLALCHEMY_DATABASE_URL = (
    "mysql+aiomysql://"
    + settings.SINGLESTORE_USERNAME
    + ":"
    + settings.SINGLESTORE_PASSWORD
    + "@"
    + settings.SINGLESTORE_HOST
    + ":"
    + settings.SINGLESTORE_PORT
    + "/"
    + settings.SINGLESTORE_DATABASE
)
database = Database(SQLALCHEMY_DATABASE_URL)

As described above, the constructor takes a string as an argument specifying the client credentials, such as username, host, and password.

Now, we can call the various database instance methods to execute SQL queries like so:

from app.auth.schemas import UserCreate
import datetime

async def create_user(user: UserCreate):
    query = """
        INSERT INTO users (
          first_name,
          last_name,
          email,
          password,
          user_status,
          creation_date
        )
        VALUES (
          :first_name,
          :last_name,
          :email,
          :password,
          1,
          :creation_date
        )
    """
    values = {
        "first_name": user.first_name,
        "last_name": user.last_name,
        "email": user.email,
        "password": user.password,
        "creation_date": datetime.datetime.utcnow()
    }
    return await database.execute(query, values=values)

But, to run this query, we must first create the users table. This can be accomplished through SQLAlchemy as follows:

from sqlalchemy.ext.asyncio import create_async_engine

from app.utils.mixins import Base


async def init_models(database_url):
    engine = create_async_engine(
        database_url,
        pool_pre_ping=True,  # test connections before handing them out
        pool_size=30,
        max_overflow=30,
        echo_pool=True,
        future=True,
        echo=True,  # verbose output
        pool_recycle=3600,  # recycle connections every hour
    )

    async with engine.begin() as conn:
        # Drop and recreate every table; convenient for a demo, but destructive
        # if you have data you care about.
        await conn.run_sync(Base.metadata.drop_all)
        await conn.run_sync(Base.metadata.create_all)

    await engine.dispose()

As described previously, we use the create_async_engine function to create an asynchronous engine, and Base.metadata.create_all translates our models into MySQL CREATE statements. Our Users model is translated into the following CREATE statement:

CREATE TABLE users (
    id BIGINT NOT NULL AUTO_INCREMENT, 
    creation_date DATETIME, 
    modified_date DATETIME, 
    first_name VARCHAR(20), 
    last_name VARCHAR(20), 
    email VARCHAR(50), 
    password VARCHAR(120), 
    phone_number VARCHAR(20), 
    bio VARCHAR(60), 
    profile_picture VARCHAR(220), 
    chat_status VARCHAR(20), 
    user_status INTEGER, 
    user_role VARCHAR(20), 
    PRIMARY KEY (id)
)ENGINE=InnoDB

This command will be executed on the SingleStore cluster to create the users table in our chat database.

Note that translating a large number of data models is time-consuming and should only be done once, when the app starts up. We can therefore place this method inside FastAPI's startup event:

from app.utils.session import SQLALCHEMY_DATABASE_URL, database, init_models
from fastapi import FastAPI

chat_app = FastAPI(
    docs_url="/docs",
    redoc_url="/redocs",
    title="Realtime Chat App",
    description="Realtime Chat App Backend",
    version="1.0",
    openapi_url="/api/v1/openapi.json",
)

@chat_app.on_event("startup")
async def startup():
    await init_models(SQLALCHEMY_DATABASE_URL)
    await database.connect()
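
Since we open the connection pool on startup, it is good practice to close it when the application stops. A matching shutdown handler would look something like this:

@chat_app.on_event("shutdown")
async def shutdown():
    # Release the connection pool held by the databases instance.
    await database.disconnect()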

Running our app will result in the users table being created on SingleStoreDB, as shown in the following image:

`users` table created successfully.

Now, you can interact with the register endpoint by visiting http://localhost:8000/docs.

FastAPI app Auth endpoints.
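
If you prefer exercising the endpoint from code rather than the Swagger UI, a quick call from a Python shell does the trick (assuming the server is running locally and the httpx package is installed):

import httpx

payload = {
    "first_name": "Mahmoud",
    "last_name": "Harmouch",
    "email": "business@wiseai.dev",
    "password": "SEc11r3P@ssw0rD",
}

response = httpx.post("http://localhost:8000/api/v1/auth/register", json=payload)
print(response.status_code, response.json())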

Conclusion.

🔝 Go To TOC

Awesome. You've now completed the tutorial and should have a solid understanding of connecting to a remote MySQL server running on a SingleStore cluster from your FastAPI application. This will prove invaluable as you begin developing more complex applications.

I hope you found this tutorial helpful and informative. We covered a lot of ground, touching on FastAPI, MySQL, the databases package, SingleStoreDB, and more.

If you're still not sure how to use Database from databases, stay tuned for more tutorials on how you can create relational tables, add rows, fetch data, etc. Thanks for reading!

You can find the full code in the following repo:

brave-chat / brave-chat-server: Brave Chat server powered by FastAPI, SQLAlchemy, Redis, SingleStoreDB and friends.

Don't forget to smash that ⭐ button for future updates.

Happy Coding, folks; see you in the next one.
