Python has a handful of popular frameworks for building APIs, such as Flask and Django, but the framework we will be looking at, which has become one of the most popular over the last five years, is FastAPI. FastAPI is built on modern Python features such as asynchronous programming with an event loop, uses Pydantic for data validation, and generates standard OpenAPI specs from the Python metadata, which lets you build a well-documented, production-grade API very quickly. Let's dive in.
💡 The complete source code referenced in this guide is available on GitHub
https://github.com/dpills/fastapi-prod-guide
Setup
Make sure that you have Python 3 and Poetry installed; we will use Poetry for package management.
$ python3 --version
Python 3.11.4
$ poetry --version
Poetry (version 1.6.1)
Create your project folder and initialize the project.
$ poetry init
This command will guide you through creating your pyproject.toml config.
Package name [fastapi-quick-start-guide]:
Version [0.1.0]:
Description []:
Author [dpills, n to skip]: dpills
License []:
Compatible Python versions [^3.11]:
Would you like to define your main dependencies interactively? (yes/no) [yes] no
Would you like to define your development dependencies interactively? (yes/no) [yes] no
Generated file
[tool.poetry]
name = "fastapi-quick-start-guide"
version = "0.1.0"
description = ""
authors = ["dpills"]
readme = "README.md"
[tool.poetry.dependencies]
python = "^3.11"
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
Do you confirm generation? (yes/no) [yes]
Install the core FastAPI dependencies.
$ poetry add fastapi 'uvicorn[standard]'
Creating virtualenv fastapi-quick-start-guide-KBe0UMIg-py3.11 in /Users/dpills/Library/Caches/pypoetry/virtualenvs
Using version ^0.103.2 for fastapi
Using version ^0.23.2 for uvicorn
Updating dependencies
Resolving dependencies... (1.1s)
Package operations: 18 installs, 0 updates, 0 removals
• Installing idna (3.4)
• Installing sniffio (1.3.0)
• Installing typing-extensions (4.8.0)
• Installing annotated-types (0.6.0)
• Installing anyio (3.7.1)
• Installing pydantic-core (2.10.1)
• Installing click (8.1.7)
• Installing h11 (0.14.0)
• Installing httptools (0.6.0)
• Installing pydantic (2.4.2)
• Installing python-dotenv (1.0.0)
• Installing pyyaml (6.0.1)
• Installing starlette (0.27.0)
• Installing uvloop (0.18.0)
• Installing watchfiles (0.21.0)
• Installing websockets (11.0.3)
• Installing fastapi (0.103.2)
• Installing uvicorn (0.23.2)
Writing lock file
Whenever I start a new project I like to maintain quality standards, and automated quality tools make that easy. Let's go ahead and install mypy for static type checking, black for formatting, and ruff for linting, adding them to the dev dependencies.
$ poetry add -G dev ruff black mypy
Using version ^0.0.292 for ruff
Using version ^23.9.1 for black
Using version ^1.6.0 for mypy
Updating dependencies
Resolving dependencies... (0.3s)
Package operations: 7 installs, 0 updates, 0 removals
• Installing mypy-extensions (1.0.0)
• Installing packaging (23.2)
• Installing pathspec (0.11.2)
• Installing platformdirs (3.11.0)
• Installing black (23.9.1)
• Installing mypy (1.6.0)
• Installing ruff (0.0.292)
Writing lock file
Add the configurations for these quality tools to the pyproject.toml file, which should now look similar to this.
📝 pyproject.toml
[tool.poetry]
name = "fastapi-quick-start-guide"
version = "0.1.0"
description = ""
authors = ["dpills"]
readme = "README.md"
[tool.poetry.dependencies]
python = "^3.11"
fastapi = "^0.103.2"
uvicorn = { extras = ["standard"], version = "^0.23.2" }
[tool.poetry.group.dev.dependencies]
ruff = "^0.0.292"
black = "^23.9.1"
mypy = "^1.6.0"
[tool.black]
line-length = 88
[tool.ruff]
select = ["E", "F", "I"]
fixable = ["ALL"]
exclude = [".git", ".mypy_cache", ".ruff_cache"]
line-length = 88
[tool.mypy]
disallow_any_generics = true
disallow_subclassing_any = true
disallow_untyped_calls = true
disallow_untyped_defs = true
disallow_incomplete_defs = true
check_untyped_defs = true
no_implicit_optional = true
warn_redundant_casts = true
warn_unused_ignores = true
warn_return_any = true
strict_equality = true
disallow_untyped_decorators = false
ignore_missing_imports = true
implicit_reexport = true
plugins = "pydantic.mypy"
[tool.pydantic-mypy]
init_forbid_extra = true
init_typed = true
warn_required_dynamic_aliases = true
warn_untyped_fields = true
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
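With the configuration in place, the quality tools can be run against the project at any time, for example:
$ poetry run black .
$ poetry run ruff check .
$ poetry run mypy .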
Now that we have our dependencies installed, add the main Python file with the basic FastAPI example.
📝 main.py
import uvicorn
from fastapi import FastAPI
app = FastAPI()
@app.get("/")
async def read_root() -> dict[str, str]:
"""
Hello World
"""
return {"Hello": "World"}
@app.get("/items/{item_id}")
async def read_item(item_id: str) -> dict[str, str]:
"""
Get an Item
"""
return {"item_id": item_id}
if __name__ == "__main__":
uvicorn.run(
"main:app",
host="0.0.0.0",
port=8000,
log_level="debug",
reload=True,
)
Run the file with Python and open your web browser to http://localhost:8000/docs
$ poetry shell
Spawning shell within /Users/dpills/Library/Caches/pypoetry/virtualenvs/fastapi-quick-start-guide-KBe0UMIg-py3.11
$ python3 main.py
INFO: Will watch for changes in these directories: ['/Users/dpills/articles/fastapi-quick-start-guide']
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
INFO: Started reloader process [28609] using WatchFiles
INFO: Started server process [28611]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: 127.0.0.1:50323 - "GET /docs HTTP/1.1" 200 OK
INFO: 127.0.0.1:50323 - "GET /openapi.json HTTP/1.1" 200 OK
INFO: 127.0.0.1:50332 - "GET /items/test123 HTTP/1.1" 200 OK
ℹ️ FastAPI automatically generates an OpenAPI spec from the Python metadata and runs a Swagger UI for interactive documentation.
Congrats, you just created a simple API! You should see the interactive documentation in your web browser, where you can test making an API call. As we make changes to the code and save, the server will auto-reload, but you may need to refresh the page if the documentation has changed.
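You can also call the API directly from the command line, for example to fetch an item or the raw OpenAPI spec:
$ curl http://localhost:8000/items/test123
{"item_id":"test123"}
$ curl http://localhost:8000/openapi.json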
CRUD with MongoDB
Now that we have our environment set up and the API running, let's implement full CRUD (Create, Read, Update, Delete) functionality by creating some Todo APIs backed by a MongoDB database. Delete the initial read_root and read_item example endpoint functions.
Environment Setup
💡 Refer to Containers Demystified 🐳🤔 for a Docker container guide and MongoDB Quick Start Guide 🍃⚡️ for a MongoDB Guide
Create a .env file to store our secrets so they are not exposed in our source code.
📝 .env
MONGO_INITDB_ROOT_USERNAME=root
MONGO_INITDB_ROOT_PASSWORD=mySecureDbPassword1
MONGO_URI=mongodb://root:mySecureDbPassword1@localhost:27017/
Add the Docker Compose spec for MongoDB.
📝 docker-compose.yml
services:
db:
image: mongo:7.0.1
container_name: myAPIdb
restart: always
ports:
- 27017:27017
env_file:
- .env
volumes:
- type: volume
source: my_api_db_data
target: /data/db
volumes:
my_api_db_data:
Run the MongoDB with Docker Compose.
$ docker-compose up -d
[+] Building 0.0s (0/0) docker-container:unruffled_shockley
[+] Running 3/3
✔ Network fastapi-quick-start-guide_default Created
✔ Volume "fastapi-quick-start-guide_my_api_db_data" Created
✔ Container myAPIdb Started
$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
dfa4bbee67d3 mongo:7.0.1 "docker-entrypoint.s…" 2 minutes ago Up 2 minutes 0.0.0.0:27017->27017/tcp myAPIdb
Now that we have MongoDB running, let's add the Python dependencies for loading the environment variables and connecting to MongoDB.
$ poetry add python-dotenv pydantic-settings motor
Using version ^1.0.0 for python-dotenv
Using version ^2.0.3 for pydantic-settings
Using version ^3.3.1 for motor
Updating dependencies
Resolving dependencies... (0.1s)
Package operations: 4 installs, 0 updates, 0 removals
• Installing dnspython (2.4.2)
• Installing pymongo (4.5.0)
• Installing motor (3.3.1)
• Installing pydantic-settings (2.0.3)
Writing lock file
$ poetry add -G dev motor-types
Using version ^1.0.0b3 for motor-types
Updating dependencies
Resolving dependencies... (0.1s)
Package operations: 1 install, 0 updates, 0 removals
• Installing motor-types (1.0.0b3)
Use Pydantic settings to load the Mongo URI from the .env environment variables and set up the connection to MongoDB using Motor, an asynchronous driver, since we will be using async functions.
📝 main.py
import uvicorn
from fastapi import FastAPI
from motor.motor_asyncio import AsyncIOMotorClient
from pydantic_settings import BaseSettings, SettingsConfigDict
class Settings(BaseSettings):
mongo_uri: str
model_config = SettingsConfigDict(env_file=".env", extra="ignore")
settings = Settings()
db_client = AsyncIOMotorClient(settings.mongo_uri)
db = db_client.todoDb
app = FastAPI()
...
Create
The HTTP POST method is used to indicate that a user is creating something. Set up a POST route to create new todo entries. FastAPI is model driven and uses Pydantic models, which are Python classes that define and validate data types.
📝 main.py
from datetime import datetime
from pydantic import BaseModel
...
class Todo(BaseModel):
title: str
completed: bool = False
class TodoId(BaseModel):
id: str
@app.post("/todos", response_model=TodoId)
async def create_todo(payload: Todo) -> TodoId:
"""
Create a new Todo
"""
now = datetime.utcnow()
insert_result = await db.todos.insert_one(
{
"title": payload.title,
"completed": payload.completed,
"created_date": now,
"updated_date": now,
}
)
return TodoId(id=str(insert_result.inserted_id))
In the Swagger documentation we can see the new endpoint we just wrote, along with the schemas derived from the Pydantic models. Test it out to make sure the todo item is inserted successfully; we should see the Todo ID returned in the response.
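The endpoint can also be exercised from the command line; a quick check with curl might look like this (the returned id will differ):
$ curl -X POST http://localhost:8000/todos -H 'Content-Type: application/json' -d '{"title": "Write an article", "completed": false}'
{"id":"652bddc9a09fc3e748c3d5e7"}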
Read
The HTTP GET method is used to indicate that a user is requesting data from the server. Let's set up two GET routes: one for fetching a list of records and one for fetching a single record. For the response data model TodoRecord we can use inheritance to reuse the fields from the TodoId and Todo models.
📝 main.py
...
class Todo(BaseModel):
title: str
completed: bool = False
class TodoId(BaseModel):
id: str
class TodoRecord(TodoId, Todo):
created_date: datetime
updated_date: datetime
@app.get("/todos", response_model=list[TodoRecord])
async def get_todos() -> list[TodoRecord]:
"""
Get Todos
"""
todos: list[TodoRecord] = []
async for doc in db.todos.find():
todos.append(
TodoRecord(
id=str(doc["_id"]),
title=doc["title"],
completed=doc["completed"],
created_date=doc["created_date"],
updated_date=doc["updated_date"],
)
)
return todos
...
For fetching a single record we need to take in the todo id from the user, and since this will be a Mongo ObjectId we need to make sure it is valid when passed in. FastAPI and Pydantic offer a few field types for adding extra metadata and validation. In this case we can use Path, which lets us give a description as well as add regex validation to make sure the value is in a valid ObjectId format.
📝 main.py
...
MONGO_ID_REGEX = r"^[a-f\d]{24}$"
@app.get("/todos/{id}", response_model=TodoRecord)
async def get_todo(
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX)
) -> TodoRecord:
"""
Get a Todo
"""
doc = await db.todos.find_one({"_id": ObjectId(id)})
return TodoRecord(
id=str(doc["_id"]),
title=doc["title"],
completed=doc["completed"],
created_date=doc["created_date"],
updated_date=doc["updated_date"],
)
...
HTTP has well-defined response codes that are returned to indicate the status of a request. Typically, codes in the 200s are successes, codes in the 400s are handled client errors, and codes in the 500s are server-side errors or unhandled exceptions. FastAPI includes HTTPException, which allows known error conditions to be caught and a proper status code and helpful error message to be returned to the user. In this case, when an ID is provided but not found in the database, we can return an HTTP 404 Not Found status with details indicating that a Todo with the provided ID does not exist.
📝 main.py
from fastapi import FastAPI, HTTPException, Path
...
class NotFoundException(BaseModel):
detail: str = "Not Found"
MONGO_ID_REGEX = r"^[a-f\d]{24}$"
@app.get(
"/todos/{id}",
response_model=TodoRecord,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def get_todo(
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX)
) -> TodoRecord:
"""
Get a Todo
"""
doc = await db.todos.find_one({"_id": ObjectId(id)})
if not doc:
raise HTTPException(status_code=404, detail="Not Found")
return TodoRecord(
id=str(doc["_id"]),
title=doc["title"],
completed=doc["completed"],
created_date=doc["created_date"],
updated_date=doc["updated_date"],
)
Validate that this is working as expected with both a valid id and an unknown id.
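For example, using an id returned from the create endpoint and then an unknown (but validly formatted) id; the first call returns the full todo record and the second returns the 404 payload:
$ curl http://localhost:8000/todos/652bddc9a09fc3e748c3d5e7
$ curl http://localhost:8000/todos/000000000000000000000000
{"detail":"Not Found"}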
Update
The HTTP PUT and PATCH methods are used to update data. PUT requires the full record to be sent for the update, whereas PATCH allows partial updates. Let's add a PUT route that combines what we did for fetching a single todo (passing the Todo ID in the path) with what we did when creating a todo (providing a body payload with our updates).
📝 main.py
@app.put(
"/todos/{id}",
response_model=TodoId,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def update_todo(
payload: Todo,
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX),
) -> TodoId:
"""
Update a Todo
"""
now = datetime.utcnow()
update_result = await db.todos.update_one(
{"_id": ObjectId(id)},
{
"$set": {
"title": payload.title,
"completed": payload.completed,
"updated_date": now,
}
},
)
if update_result.matched_count == 0:
raise HTTPException(status_code=404, detail="Not Found")
return TodoId(id=id)
Delete
The HTTP DELETE method is used to delete data and follows the same pattern of passing the ID in the URL path.
📝 main.py
@app.delete(
"/todos/{id}",
response_model=bool,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def delete_todo(
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX),
) -> bool:
"""
Delete a Todo
"""
delete_result = await db.todos.delete_one({"_id": ObjectId(id)})
if delete_result.deleted_count == 0:
raise HTTPException(status_code=404, detail="Not Found")
return True
Full CRUD Functionality
This covers full CRUD (Create, Read, Update, Delete) functionality for working with our todo data, and our Python file should now look like this.
📝 main.py
from datetime import datetime
import uvicorn
from bson import ObjectId
from fastapi import FastAPI, HTTPException, Path
from motor.motor_asyncio import AsyncIOMotorClient
from pydantic import BaseModel
from pydantic_settings import BaseSettings, SettingsConfigDict
class Settings(BaseSettings):
mongo_uri: str
model_config = SettingsConfigDict(env_file=".env", extra="ignore")
settings = Settings()
db_client = AsyncIOMotorClient(settings.mongo_uri)
db = db_client.todoDb
app = FastAPI()
MONGO_ID_REGEX = r"^[a-f\d]{24}$"
class Todo(BaseModel):
title: str
completed: bool = False
class TodoId(BaseModel):
id: str
class TodoRecord(TodoId, Todo):
created_date: datetime
updated_date: datetime
class NotFoundException(BaseModel):
detail: str = "Not Found"
@app.post("/todos", response_model=TodoId)
async def create_todo(payload: Todo) -> TodoId:
"""
Create a new Todo
"""
now = datetime.utcnow()
insert_result = await db.todos.insert_one(
{
"title": payload.title,
"completed": payload.completed,
"created_date": now,
"updated_date": now,
}
)
return TodoId(id=str(insert_result.inserted_id))
@app.get(
"/todos/{id}",
response_model=TodoRecord,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def get_todo(
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX)
) -> TodoRecord:
"""
Get a Todo
"""
doc = await db.todos.find_one({"_id": ObjectId(id)})
if not doc:
raise HTTPException(status_code=404, detail="Not Found")
return TodoRecord(
id=str(doc["_id"]),
title=doc["title"],
completed=doc["completed"],
created_date=doc["created_date"],
updated_date=doc["updated_date"],
)
@app.get("/todos", response_model=list[TodoRecord])
async def get_todos() -> list[TodoRecord]:
"""
Get Todos
"""
todos: list[TodoRecord] = []
async for doc in db.todos.find():
todos.append(
TodoRecord(
id=str(doc["_id"]),
title=doc["title"],
completed=doc["completed"],
created_date=doc["created_date"],
updated_date=doc["updated_date"],
)
)
return todos
@app.put(
"/todos/{id}",
response_model=TodoId,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def update_todo(
payload: Todo,
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX),
) -> TodoId:
"""
Update a Todo
"""
now = datetime.utcnow()
update_result = await db.todos.update_one(
{"_id": ObjectId(id)},
{
"$set": {
"title": payload.title,
"completed": payload.completed,
"updated_date": now,
}
},
)
if update_result.matched_count == 0:
raise HTTPException(status_code=404, detail="Not Found")
return TodoId(id=id)
@app.delete(
"/todos/{id}",
response_model=bool,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def delete_todo(
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX),
) -> bool:
"""
Delete a Todo
"""
delete_result = await db.todos.delete_one({"_id": ObjectId(id)})
if delete_result.deleted_count == 0:
raise HTTPException(status_code=404, detail="Not Found")
return True
if __name__ == "__main__":
uvicorn.run(
"main:app",
host="0.0.0.0",
port=8000,
log_level="debug",
reload=True,
)
Utilities
Metadata
The Swagger OpenAPI docs that we have been working with are what people will reference when using our API, so it is useful to have them display relevant information. With the base FastAPI app object we can set the API title, version, and description (which supports Markdown) and update the docs location, which we will change to the base path /. Additionally, a root_path can be set in case your API is behind a reverse proxy at a subpath such as /api, but in this case we will be running the API directly and it will be at the root path.
📝 main.py
...
class Settings(BaseSettings):
mongo_uri: str
root_path: str = ""
model_config = SettingsConfigDict(env_file=".env", extra="ignore")
settings = Settings()
db_client = AsyncIOMotorClient(settings.mongo_uri)
db = db_client.todoDb
description = """
This is a fancy API built with [FastAPI🚀](https://fastapi.tiangolo.com/)
📝 [Source Code](https://github.com/dpills/fastapi-prod-guide)
🐞 [Issues](https://github.com/dpills/fastapi-prod-guide/issues)
"""
app = FastAPI(
title="My Todo App",
description=description,
version="1.0.0",
docs_url="/",
root_path=settings.root_path,
)
...
Open your browser to http://localhost:8000/ (since we changed the docs_url) and we now see our metadata show up!
Logging
All applications require logging to properly troubleshoot issues, and it is a good idea to use the Python logging module or a logging library. We can set the format and make use of logging levels, which we configure via our environment variable settings so the level can be adjusted per environment without changing the source code.
📝 main.py
import logging
import sys
...
from pydantic_settings import BaseSettings, SettingsConfigDict
class Settings(BaseSettings):
mongo_uri: str
root_path: str = ""
logging_level: str = "INFO"
model_config = SettingsConfigDict(env_file=".env", extra="ignore")
settings = Settings()
logging.basicConfig(
stream=sys.stdout,
level=settings.logging_level,
format="[%(asctime)s] %(levelname)s [%(name)s.%(funcName)s:%(lineno)d] %(message)s", # noqa: E501
datefmt="%d/%b/%Y %H:%M:%S",
)
logger = logging.getLogger("my-todos")
...
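The logger can then be used anywhere in the file; a minimal illustration (the messages are just for demonstration):
logger.info("Starting with log level %s", settings.logging_level)
logger.debug("This message only appears when LOGGING_LEVEL=DEBUG")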
Middleware
Middleware is useful for avoiding duplicated logic that every endpoint needs, such as CORS, performance metadata, or request logging.
Enabling CORS (Cross-Origin Resource Sharing) on an API is required when you want to allow a website hosted on another domain to call your API directly from the browser. For example, if I hosted a frontend React web app on my-app.com and wanted it to call my API hosted on my-api.com, I would need to add my-app.com to the allowed origins. Luckily this is very common middleware and FastAPI includes support for it out of the box.
📝 main.py
from fastapi.middleware.cors import CORSMiddleware
...
app = FastAPI(
title="My Todo App",
description=description,
version="1.0.0",
docs_url="/",
root_path=settings.root_path,
)
app.add_middleware(
CORSMiddleware,
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
allow_origins=[
"http://localhost:3000",
],
)
Middleware can also run code before and after an API function executes, which allows us to track performance and return it in a response header, as well as log the request information to the console, which can later be used for troubleshooting and metrics.
📝 main.py
import time
...
from typing import Any, Callable, TypeVar
...
from fastapi import FastAPI, HTTPException, Path, Request, Response
...
F = TypeVar("F", bound=Callable[..., Any])
@app.middleware("http")
async def process_time_log_middleware(request: Request, call_next: F) -> Response:
"""
Add API process time in response headers and log calls
"""
start_time = time.time()
response: Response = await call_next(request)
process_time = str(round(time.time() - start_time, 3))
response.headers["X-Process-Time"] = process_time
logger.info(
"Method=%s Path=%s StatusCode=%s ProcessTime=%s",
request.method,
request.url.path,
response.status_code,
process_time,
)
return response
...
We now see our middleware logging when making calls.
INFO: 127.0.0.1:60005 - "GET /todos HTTP/1.1" 200 OK
[15/Oct/2023 09:31:52] INFO [my-todos.process_time_log_middleware:91] Method=GET Path=/todos StatusCode=200 ProcessTime=0.004
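The timing is also returned as a response header; for example (the value will vary):
$ curl -si http://localhost:8000/todos | grep -i x-process-time
x-process-time: 0.004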
💡 Refer to How to rate limit FastAPI with Redis 📈 to see how to add rate limiting middleware
CRUD APIs With Utilities
📝 main.py
import logging
import sys
import time
from datetime import datetime
from typing import Any, Callable, TypeVar
import uvicorn
from bson import ObjectId
from fastapi import FastAPI, HTTPException, Path, Request, Response
from fastapi.middleware.cors import CORSMiddleware
from motor.motor_asyncio import AsyncIOMotorClient
from pydantic import BaseModel
from pydantic_settings import BaseSettings, SettingsConfigDict
class Settings(BaseSettings):
mongo_uri: str
root_path: str = ""
logging_level: str = "INFO"
model_config = SettingsConfigDict(env_file=".env", extra="ignore")
settings = Settings()
logging.basicConfig(
stream=sys.stdout,
level=settings.logging_level,
format="[%(asctime)s] %(levelname)s [%(name)s.%(funcName)s:%(lineno)d] %(message)s", # noqa: E501
datefmt="%d/%b/%Y %H:%M:%S",
)
logger = logging.getLogger("my-todos")
db_client = AsyncIOMotorClient(settings.mongo_uri)
db = db_client.todoDb
description = """
This is a fancy API built with [FastAPI🚀](https://fastapi.tiangolo.com/)
📝 [Source Code](https://github.com/dpills/fastapi-prod-guide)
🐞 [Issues](https://github.com/dpills/fastapi-prod-guide/issues)
"""
app = FastAPI(
title="My Todo App",
description=description,
version="1.0.0",
docs_url="/",
root_path=settings.root_path,
)
app.add_middleware(
CORSMiddleware,
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
allow_origins=[
"http://localhost:3000",
],
)
MONGO_ID_REGEX = r"^[a-f\d]{24}$"
F = TypeVar("F", bound=Callable[..., Any])
class Todo(BaseModel):
title: str
completed: bool = False
class TodoId(BaseModel):
id: str
class TodoRecord(TodoId, Todo):
created_date: datetime
updated_date: datetime
class NotFoundException(BaseModel):
detail: str = "Not Found"
@app.middleware("http")
async def process_time_log_middleware(request: Request, call_next: F) -> Response:
"""
Add API process time in response headers and log calls
"""
start_time = time.time()
response: Response = await call_next(request)
process_time = str(round(time.time() - start_time, 3))
response.headers["X-Process-Time"] = process_time
logger.info(
"Method=%s Path=%s StatusCode=%s ProcessTime=%s",
request.method,
request.url.path,
response.status_code,
process_time,
)
return response
@app.post("/todos", response_model=TodoId)
async def create_todo(payload: Todo) -> TodoId:
"""
Create a new Todo
"""
now = datetime.utcnow()
insert_result = await db.todos.insert_one(
{
"title": payload.title,
"completed": payload.completed,
"created_date": now,
"updated_date": now,
}
)
return TodoId(id=str(insert_result.inserted_id))
@app.get(
"/todos/{id}",
response_model=TodoRecord,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def get_todo(
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX)
) -> TodoRecord:
"""
Get a Todo
"""
doc = await db.todos.find_one({"_id": ObjectId(id)})
if not doc:
raise HTTPException(status_code=404, detail="Not Found")
return TodoRecord(
id=str(doc["_id"]),
title=doc["title"],
completed=doc["completed"],
created_date=doc["created_date"],
updated_date=doc["updated_date"],
)
@app.get("/todos", response_model=list[TodoRecord])
async def get_todos() -> list[TodoRecord]:
"""
Get Todos
"""
todos: list[TodoRecord] = []
async for doc in db.todos.find():
todos.append(
TodoRecord(
id=str(doc["_id"]),
title=doc["title"],
completed=doc["completed"],
created_date=doc["created_date"],
updated_date=doc["updated_date"],
)
)
return todos
@app.put(
"/todos/{id}",
response_model=TodoId,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def update_todo(
payload: Todo,
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX),
) -> TodoId:
"""
Update a Todo
"""
now = datetime.utcnow()
update_result = await db.todos.update_one(
{"_id": ObjectId(id)},
{
"$set": {
"title": payload.title,
"completed": payload.completed,
"updated_date": now,
}
},
)
if update_result.matched_count == 0:
raise HTTPException(status_code=404, detail="Not Found")
return TodoId(id=id)
@app.delete(
"/todos/{id}",
response_model=bool,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def delete_todo(
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX),
) -> bool:
"""
Delete a Todo
"""
delete_result = await db.todos.delete_one({"_id": ObjectId(id)})
if delete_result.deleted_count == 0:
raise HTTPException(status_code=404, detail="Not Found")
return True
if __name__ == "__main__":
uvicorn.run(
"main:app",
host="0.0.0.0",
port=8000,
log_level="debug",
reload=True,
)
Organization
As we have continued to add features, functions, and routes to our API, our single main.py file has become difficult to manage. Unorganized code and low quality standards make a project difficult to maintain in the long term, so let's add better standards and organization to our API.
APIRouters
For larger projects, FastAPI includes the concept of routers to create sections of an API. The main change is that within each section you create a router object and the routes use the @router.get syntax instead of @app.get. The router for a section can then be imported and included in the app with a tag category and prefix. We will use this pattern to restructure our API.
Project Structure
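Based on the file paths used below, the reorganized project ends up looking roughly like this (the auth router and test files are added in later sections):
app/
├── main.py
├── config.py
├── static_values.py
├── routers/
│   └── todos/
│       ├── models.py
│       └── todos.py
└── utilities/
    ├── db.py
    └── logger.py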
Move all environment variables to the central config file.
📝 app/config.py
from pydantic_settings import BaseSettings, SettingsConfigDict
class Settings(BaseSettings):
mongo_uri: str
root_path: str = ""
logging_level: str = "INFO"
model_config = SettingsConfigDict(env_file=".env", extra="ignore")
settings = Settings()
Add global static values that may be reused in multiple places in the API.
📝 app/static_values.py
MONGO_ID_REGEX = r"^[a-f\d]{24}$"
Move the database connection object creation and logging handler to the utilities folder.
📝 app/utilities/db.py
from motor.motor_asyncio import AsyncIOMotorClient
from app.config import settings
db_client = AsyncIOMotorClient(settings.mongo_uri)
db = db_client.todoDb
📝 app/utilities/logger.py
import logging
import sys
from app.config import settings
logging.basicConfig(
stream=sys.stdout,
level=settings.logging_level,
format="[%(asctime)s] %(levelname)s [%(name)s.%(funcName)s:%(lineno)d] %(message)s", # noqa: E501
datefmt="%d/%b/%Y %H:%M:%S",
)
logger = logging.getLogger("my-todos")
Move the models into their own file within the todos router folder.
📝 app/routers/todos/models.py
from datetime import datetime
from pydantic import BaseModel
class Todo(BaseModel):
title: str
completed: bool = False
class TodoId(BaseModel):
id: str
class TodoRecord(TodoId, Todo):
created_date: datetime
updated_date: datetime
class NotFoundException(BaseModel):
detail: str = "Not Found"
Move all of the todo-related functions into their router file. We will switch to the @router syntax and also adjust the paths so we can add the /v1/todos prefix when we include the router in the main app.
📝 app/routers/todos/todos.py
from datetime import datetime
from bson import ObjectId
from fastapi import APIRouter, HTTPException, Path
from app.static_values import MONGO_ID_REGEX
from app.utilities.db import db
from .models import NotFoundException, Todo, TodoId, TodoRecord
router = APIRouter()
@router.post("", response_model=TodoId)
async def create_todo(payload: Todo) -> TodoId:
"""
Create a new Todo
"""
now = datetime.utcnow()
insert_result = await db.todos.insert_one(
{
"title": payload.title,
"completed": payload.completed,
"created_date": now,
"updated_date": now,
}
)
return TodoId(id=str(insert_result.inserted_id))
@router.get(
"/{id}",
response_model=TodoRecord,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def get_todo(
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX)
) -> TodoRecord:
"""
Get a Todo
"""
doc = await db.todos.find_one({"_id": ObjectId(id)})
if not doc:
raise HTTPException(status_code=404, detail="Not Found")
return TodoRecord(
id=str(doc["_id"]),
title=doc["title"],
completed=doc["completed"],
created_date=doc["created_date"],
updated_date=doc["updated_date"],
)
@router.get("", response_model=list[TodoRecord])
async def get_todos() -> list[TodoRecord]:
"""
Get Todos
"""
todos: list[TodoRecord] = []
async for doc in db.todos.find():
todos.append(
TodoRecord(
id=str(doc["_id"]),
title=doc["title"],
completed=doc["completed"],
created_date=doc["created_date"],
updated_date=doc["updated_date"],
)
)
return todos
@router.put(
"/{id}",
response_model=TodoId,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def update_todo(
payload: Todo,
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX),
) -> TodoId:
"""
Update a Todo
"""
now = datetime.utcnow()
update_result = await db.todos.update_one(
{"_id": ObjectId(id)},
{
"$set": {
"title": payload.title,
"completed": payload.completed,
"updated_date": now,
}
},
)
if update_result.matched_count == 0:
raise HTTPException(status_code=404, detail="Not Found")
return TodoId(id=id)
@router.delete(
"/{id}",
response_model=bool,
responses={
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def delete_todo(
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX),
) -> bool:
"""
Delete a Todo
"""
delete_result = await db.todos.delete_one({"_id": ObjectId(id)})
if delete_result.deleted_count == 0:
raise HTTPException(status_code=404, detail="Not Found")
return True
Finally, we import our API router, include it in our FastAPI app, and adjust our uvicorn run function to point to the new app path app.main:app.
📝 app/main.py
import time
from typing import Any, Callable, TypeVar
import uvicorn
from fastapi import FastAPI, Request, Response
from fastapi.middleware.cors import CORSMiddleware
from app.config import settings
from app.routers.todos import todos
from app.utilities.logger import logger
description = """
This is a fancy API built with [FastAPI🚀](https://fastapi.tiangolo.com/)
📝 [Source Code](https://github.com/dpills/fastapi-prod-guide)
🐞 [Issues](https://github.com/dpills/fastapi-prod-guide/issues)
"""
app = FastAPI(
title="My Todo App",
description=description,
version="1.0.0",
docs_url="/",
root_path=settings.root_path,
)
app.add_middleware(
CORSMiddleware,
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
allow_origins=[
"http://localhost:3000",
],
)
F = TypeVar("F", bound=Callable[..., Any])
@app.middleware("http")
async def process_time_log_middleware(request: Request, call_next: F) -> Response:
"""
Add API process time in response headers and log calls
"""
start_time = time.time()
response: Response = await call_next(request)
process_time = str(round(time.time() - start_time, 3))
response.headers["X-Process-Time"] = process_time
logger.info(
"Method=%s Path=%s StatusCode=%s ProcessTime=%s",
request.method,
request.url.path,
response.status_code,
process_time,
)
return response
app.include_router(
todos.router,
prefix="/v1/todos",
tags=["todos"],
)
if __name__ == "__main__":
uvicorn.run(
"app.main:app",
host="0.0.0.0",
port=8000,
log_level="debug",
reload=True,
)
Now that we have updated our project structure, we can run our app from the base project folder using Python's -m flag.
$ python3 -m app.main
INFO: Will watch for changes in these directories: ['/Users/dpills/articles/fastapi-quick-start-guide']
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
INFO: Started reloader process [96492] using WatchFiles
INFO: Started server process [96494]
INFO: Waiting for application startup.
INFO: Application startup complete.
The API documentation now includes the /v1/todos prefix along with the todos category section, thanks to the router tags.
Security
Up until this point our API has not had any authentication or authorization built in, which leaves it vulnerable, makes all usage anonymous, and prevents us from differentiating per-user actions. It is important to build in modern auth mechanisms, a common option being OAuth2 with Bearer tokens. There are many different OAuth2 identity providers, and depending on how you deploy your application you can take advantage of integrated auth services, put a reverse proxy like NGINX in front of your API to handle the auth, or build the auth directly into your application code. We will build the auth directly into our API using GitHub OAuth and an Authorization Code flow.
GitHub OAuth Setup
Navigate to the GitHub OAuth Apps developer settings at https://github.com/settings/developers and create a new OAuth app.
Then generate the secret after the app creation.
Add the client id and secret to our .env file.
📝 .env
MONGO_INITDB_ROOT_USERNAME=root
MONGO_INITDB_ROOT_PASSWORD=mySecureDbPassword1
MONGO_URI=mongodb://root:mySecureDbPassword1@localhost:27017/
GITHUB_OAUTH_CLIENT_ID=0ec7d96992836a5fbb98
GITHUB_OAUTH_CLIENT_SECRET=e813b0cdb4d402d55d5d09cfd08dfd06xxxxxxxx
Also add them to the config settings so we can use them in our API.
📝 app/config.py
...
class Settings(BaseSettings):
mongo_uri: str
github_oauth_client_id: str
github_oauth_client_secret: str
root_path: str = ""
logging_level: str = "INFO"
model_config = SettingsConfigDict(env_file=".env", extra="ignore")
...
OAuth Callback
We will need to make HTTP calls to GitHub to get a user's access token and username. The most popular Python HTTP library is Requests, but since we are using asynchronous functions we will use the HTTPX library, which has async support.
$ poetry add httpx
Using version ^0.25.0 for httpx
Updating dependencies
Resolving dependencies... (0.2s)
No dependencies to install or update
Writing lock file
Add a new auth router folder and set up the OAuth callback endpoint.
Users will need to navigate to https://github.com/login/oauth/authorize?client_id=GITHUB_OAUTH_CLIENT_ID&redirect_uri=http://localhost:8000/v1/auth/callback with your app's client id in order to authorize it to receive their access token and identity. GitHub will redirect back to our endpoint with a code in the URL parameters, /v1/auth/callback?code=xxxx, and we can use this code to get the user's access token and username. We can then add the user to our database with a secure SHA-256 hash of their access token, which we can use to look up the user in the future.
📝 app/routers/auth/models.py
from pydantic import BaseModel
class OauthException(BaseModel):
detail: str
class OauthToken(BaseModel):
access_token: str
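The todos endpoints updated later in this guide also document 401 responses with an UnauthorizedException model, which lives alongside these models (the default detail string here is an assumption, mirroring NotFoundException):
class UnauthorizedException(BaseModel):
    detail: str = "Unauthorized"  # assumed default, matching the 401 responses added below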
The OAuth code will be sent back to our callback endpoint, where we exchange it for the user's access token.
📝 app/routers/auth/auth.py
import hashlib
from datetime import datetime
import httpx
from fastapi import APIRouter, HTTPException, Query
from app.config import settings
from app.utilities.db import db
from .models import OauthException, OauthToken
router = APIRouter()
@router.get(
"/callback",
response_model=OauthToken,
responses={
400: {"description": "Oauth Error", "model": OauthException},
},
)
async def oauth_callback(
code: str = Query(description="Authorization Code"),
) -> OauthToken:
"""
GitHub Oauth Integration Callback
"""
async with httpx.AsyncClient() as client:
token_result = await client.post(
"https://github.com/login/oauth/access_token",
json={
"client_id": settings.github_oauth_client_id,
"client_secret": settings.github_oauth_client_secret,
"code": code,
"redirect_uri": "http://localhost:8000/v1/auth/callback",
},
headers={"Accept": "application/json"},
)
data = token_result.json()
error = data.get("error")
if error:
raise HTTPException(
status_code=400,
detail=f"{data.get('error')}: {data.get('error_description')}",
)
access_token: str = data.get("access_token")
user_result = await client.get(
"https://api.github.com/user",
headers={"Authorization": f"Bearer {access_token}"},
)
user_data = user_result.json()
user = user_data.get("login")
await db.tokens.insert_one(
{
"user": user,
"access_token_hash": hashlib.sha256(access_token.encode()).hexdigest(),
"created_date": datetime.utcnow(),
},
)
return OauthToken(access_token=access_token)
Include the new auth router in our main.py file and add the GitHub authorization link to our API description.
📝 app/main.py
...
from app.routers.auth import auth
...
description = f"""
This is a fancy API built with [FastAPI🚀](https://fastapi.tiangolo.com/)
Authorize to get an Access Token from GitHub at <https://github.com/login/oauth/authorize?client_id={settings.github_oauth_client_id}&redirect_uri=http://localhost:8000/v1/auth/callback>
📝 [Source Code](https://github.com/dpills/fastapi-prod-guide)
🐞 [Issues](https://github.com/dpills/fastapi-prod-guide/issues)
"""
...
app.include_router(
auth.router,
prefix="/v1/auth",
tags=["auth"],
)
...
You can now test navigating to the GitHub authorization link in the API description and validate that an access token is returned.
{
"access_token": "gho_aBplc0MVRPFeeyE95UJPg209LRSp7V1xxxxx"
}
Validating an Access Token
Create a function in the auth file that takes in an access token, checks our DB cache to see if we already know the user, otherwise validates the token against the GitHub API, and raises a 401 error if it is still invalid.
📝 app/routers/auth/auth.py
from typing import Annotated
...
from fastapi import APIRouter, Depends, HTTPException, Query
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer
...
router = APIRouter()
security = HTTPBearer()
...
async def validate_access_token(
access_token: Annotated[HTTPAuthorizationCredentials, Depends(security)]
) -> str:
"""
Validate an access token
Returns the username or raises a 401 HTTPException
"""
access_token_hash = hashlib.sha256(access_token.credentials.encode()).hexdigest()
cached_token = await db.tokens.find_one({"access_token_hash": access_token_hash})
if cached_token:
user: str | None = cached_token.get("user")
if user:
return user
async with httpx.AsyncClient() as client:
user_result = await client.get(
"https://api.github.com/user",
headers={"Authorization": f"Bearer {access_token.credentials}"},
)
if user_result.status_code == 200:
user_data = user_result.json()
user = user_data.get("login")
if user:
await db.tokens.insert_one(
{
"user": user,
"access_token_hash": access_token_hash,
"created_date": datetime.utcnow(),
},
)
return user
raise HTTPException(
status_code=401,
detail="Unauthorized",
)
Since we will be looking up access token hashes in MongoDB, we can use a mongosh shell to add a unique index on that field to make lookups more performant. We can also add a TTL index so that we only cache tokens for 24 hours.
> use todoDb
switched to db todoDb
> db.tokens.createIndex( { "access_token_hash": 1 }, { unique: true } )
access_token_hash_1
> db.tokens.createIndex( { "created_date": 1 }, { expireAfterSeconds: 86400 } )
created_date_1
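If you prefer to create these indexes from Python rather than a mongosh shell, the same calls are available through PyMongo (a one-off sketch reusing the existing settings; where you run it is up to you):
from pymongo import MongoClient

from app.config import settings

client = MongoClient(settings.mongo_uri)
db = client.todoDb

# Unique index for fast access token hash lookups
db.tokens.create_index("access_token_hash", unique=True)

# TTL index so cached tokens expire after 24 hours
db.tokens.create_index("created_date", expireAfterSeconds=86400)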
💡 Refer to MongoDB Quick Start Guide 🍃⚡️ for a MongoDB Guide
Add Bearer Token Authorization to Endpoints
Add the Bearer token auth and the validate_access_token function to the todos endpoints.
📝 app/routers/todos/todos.py
...
from typing import Annotated
...
from fastapi import APIRouter, Depends, HTTPException, Path
from fastapi.security import (
HTTPAuthorizationCredentials,
HTTPBearer,
)
from app.routers.auth.auth import validate_access_token
from app.routers.auth.models import UnauthorizedException
...
security = HTTPBearer()
@router.post(
"",
response_model=TodoId,
responses={
401: {"description": "Unauthorized", "model": UnauthorizedException},
},
)
async def create_todo(
access_token: Annotated[HTTPAuthorizationCredentials, Depends(security)],
user: Annotated[str, Depends(validate_access_token)],
payload: Todo,
) -> TodoId:
...
@router.get(
"/{id}",
response_model=TodoRecord,
responses={
401: {"description": "Unauthorized", "model": UnauthorizedException},
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def get_todo(
access_token: Annotated[HTTPAuthorizationCredentials, Depends(security)],
user: Annotated[str, Depends(validate_access_token)],
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX),
) -> TodoRecord:
...
@router.get(
"",
response_model=list[TodoRecord],
responses={
401: {"description": "Unauthorized", "model": UnauthorizedException},
},
)
async def get_todos(
access_token: Annotated[HTTPAuthorizationCredentials, Depends(security)],
user: Annotated[str, Depends(validate_access_token)],
) -> list[TodoRecord]:
...
@router.put(
"/{id}",
response_model=TodoId,
responses={
401: {"description": "Unauthorized", "model": UnauthorizedException},
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def update_todo(
access_token: Annotated[HTTPAuthorizationCredentials, Depends(security)],
user: Annotated[str, Depends(validate_access_token)],
payload: Todo,
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX),
) -> TodoId:
...
@router.delete(
"/{id}",
response_model=bool,
responses={
401: {"description": "Unauthorized", "model": UnauthorizedException},
404: {"description": "Not Found", "model": NotFoundException},
},
)
async def delete_todo(
access_token: Annotated[HTTPAuthorizationCredentials, Depends(security)],
user: Annotated[str, Depends(validate_access_token)],
id: str = Path(description="Todo ID", pattern=MONGO_ID_REGEX),
) -> bool:
...
Validate that the auth is working correctly.
$ curl -v http://localhost:8000/v1/todos
< HTTP/1.1 403 Forbidden
{
"detail": "Not authenticated"
}
$ curl -v http://localhost:8000/v1/todos -H 'Authorization: Bearer BAD_TOKEN'
< HTTP/1.1 401 Unauthorized
{
"detail": "Unauthorized"
}
$ curl -v http://localhost:8000/v1/todos -H 'Authorization: Bearer gho_aBplc0MVRPFeeyE95UJPg209LRSp7V1xxxxx'
< HTTP/1.1 200 OK
[
{
"title": "Create CRUD APIs",
"completed": true,
"id": "652bddc9a09fc3e748c3d5e7",
"created_date": "2023-10-15T12:40:41.191000",
"updated_date": "2023-10-15T12:42:19.153000"
}
]
The todo endpoints now show a lock icon in the Swagger docs as well.
Integrate User Info
Up until this point we have been storing our todo records without usernames, which is fine for a single user, but in order to support multiple users we need to track which user owns each record. Then we can also add logic to our Read, Update, and Delete endpoints so users can only fetch, modify, and delete their own todos.
📝 app/routers/todos/models.py
...
class TodoRecord(TodoId, Todo):
user: str
created_date: datetime
updated_date: datetime
...
📝 app/routers/todos/todos.py
...
async def create_todo(
...
insert_result = await db.todos.insert_one(
{
"title": payload.title,
"completed": payload.completed,
"user": user,
"created_date": now,
"updated_date": now,
}
)
...
async def get_todo(
...
doc = await db.todos.find_one({"_id": ObjectId(id), "user": user})
...
return TodoRecord(
id=str(doc["_id"]),
title=doc["title"],
completed=doc["completed"],
user=doc["user"],
created_date=doc["created_date"],
updated_date=doc["updated_date"],
)
...
async def get_todos(
...
async for doc in db.todos.find({"user": user}):
todos.append(
TodoRecord(
id=str(doc["_id"]),
title=doc["title"],
completed=doc["completed"],
user=doc["user"],
created_date=doc["created_date"],
updated_date=doc["updated_date"],
)
)
...
async def update_todo(
...
update_result = await db.todos.update_one(
{"_id": ObjectId(id), "user": user},
{
"$set": {
"title": payload.title,
"completed": payload.completed,
"updated_date": now,
}
},
)
...
async def delete_todo(
...
delete_result = await db.todos.delete_one({"_id": ObjectId(id), "user": user})
...
Delete any existing todo entries from the database and test out creating and fetching a new todo record.
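For example, reusing the access token from earlier (your token and ids will differ), the records returned from the list endpoint should now include the user field:
$ curl -X POST http://localhost:8000/v1/todos -H 'Authorization: Bearer gho_aBplc0MVRPFeeyE95UJPg209LRSp7V1xxxxx' -H 'Content-Type: application/json' -d '{"title": "Add user info", "completed": false}'
$ curl http://localhost:8000/v1/todos -H 'Authorization: Bearer gho_aBplc0MVRPFeeyE95UJPg209LRSp7V1xxxxx'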
Since we are doing Mongo lookups with the new user field, add indexes for that field so the queries are more performant.
> use todoDb
switched to db todoDb
> db.todos.createIndex( { "user": 1 })
user_1
> db.todos.createIndex( { "_id": 1, "user": 1 })
_id_1_user_1
Testing
Testing is critical to make sure your API or application is working correctly, and running automated tests before merging or deploying updates gives you confidence that nothing has broken, which allows you to push releases more frequently. There are many different categories of testing, but we will take a look at implementing unit and integration tests and verifying our code coverage.
Test Environment Setup
Add the test, coverage and mock libraries.
$ poetry add -G dev pytest coverage mongomock-motor pytest_httpx pytest-asyncio
Using version ^7.4.2 for pytest
Using version ^7.3.2 for coverage
Using version ^0.0.21 for mongomock-motor
Using version ^0.21.1 for pytest-asyncio
Updating dependencies
Resolving dependencies... (0.3s)
Package operations: 9 installs, 0 updates, 0 removals
• Installing iniconfig (2.0.0)
• Installing pluggy (1.3.0)
• Installing sentinels (1.0.0)
• Installing mongomock (4.1.2)
• Installing pytest (7.4.2)
• Installing coverage (7.3.2)
• Installing mongomock-motor (0.0.21)
• Installing pytest-asyncio (0.21.1)
Add a testing boolean to the settings so we can indicate within the app that we are running tests.
📝 app/config.py
from pydantic_settings import BaseSettings, SettingsConfigDict
class Settings(BaseSettings):
mongo_uri: str
github_oauth_client_id: str
github_oauth_client_secret: str
root_path: str = ""
logging_level: str = "INFO"
testing: bool = False
model_config = SettingsConfigDict(env_file=".env", extra="ignore")
settings = Settings()
Update the database file to use the testing setting to decide whether we should load our mock in-memory MongoDB or the actual MongoDB connection.
📝 app/utilities/db.py
from motor.motor_asyncio import AsyncIOMotorClient, AsyncIOMotorDatabase
from app.config import settings
def get_db() -> AsyncIOMotorDatabase:
"""
Get MongoDB
"""
if settings.testing:
from mongomock_motor import AsyncMongoMockClient
mock_db: AsyncIOMotorDatabase = AsyncMongoMockClient().todoDb
return mock_db
else:
return AsyncIOMotorClient(settings.mongo_uri).todoDb
db = get_db()
Add a conftest.py file, which is used for global testing configuration. In it we set up the event loop, instantiate our test client, and add a test user to our mock database so we can properly test authenticated endpoints.
📝 app/conftest.py
import asyncio
import hashlib
from datetime import datetime
from typing import Any, Generator
import pytest
from httpx import AsyncClient
from app.main import app
from app.utilities.db import db
async def add_db_test_user() -> None:
"""
Add test user to Database
"""
await db.tokens.update_one(
{"user": "tester"},
{
"$set": {
"access_token_hash": hashlib.sha256("GOOD_TOKEN".encode()).hexdigest(),
"created_date": datetime.utcnow(),
}
},
upsert=True,
)
return None
@pytest.fixture(scope="session")
def event_loop() -> Generator[asyncio.AbstractEventLoop, Any, None]:
"""
Override Event Loop
"""
try:
loop = asyncio.get_running_loop()
except RuntimeError:
loop = asyncio.new_event_loop()
# Add test user to DB
loop.run_until_complete(add_db_test_user())
yield loop
loop.close()
@pytest.fixture()
def test_client() -> AsyncClient:
"""
Create an instance of the client
"""
return AsyncClient(app=app, base_url="http://test", follow_redirects=True)
Adding Tests
We can now add tests to our todo endpoints.
📝 app/routers/todos/test_todos.py
import pytest
from httpx import AsyncClient
pytestmark = pytest.mark.asyncio
AUTH_HEADER = {"Authorization": "Bearer GOOD_TOKEN"}
async def test_create_todo(test_client: AsyncClient) -> None:
"""
Test Creating a todo
"""
# No Bearer Token
r = await test_client.post("/v1/todos", json={"title": "test", "completed": False})
assert r.status_code == 403
# Invalid Bearer Token
r = await test_client.post(
"/v1/todos",
json={"title": "test", "completed": False},
headers={"Authorization": "Bearer BAD_TOKEN"},
)
assert r.status_code == 401
# Valid Bearer Token
r = await test_client.post(
"/v1/todos",
json={"title": "create_test", "completed": False},
headers=AUTH_HEADER,
)
assert r.status_code == 200
assert r.json().get("id")
async def test_get_todos(test_client: AsyncClient) -> None:
"""
Test Fetching todos
"""
# Get all Todos
r = await test_client.get("/v1/todos", headers=AUTH_HEADER)
assert r.status_code == 200
results = r.json()
assert results
# Get single Todo
todo_id = results[0].get("id")
r = await test_client.get(f"/v1/todos/{todo_id}", headers=AUTH_HEADER)
assert r.status_code == 200
# Unknown Todo ID
r = await test_client.get("/v1/todos/652d729bb8da04810695a943", headers=AUTH_HEADER)
assert r.status_code == 404
async def test_update_todo(test_client: AsyncClient) -> None:
"""
Test updating a todo
"""
# Get all Todos
r = await test_client.get("/v1/todos", headers=AUTH_HEADER)
assert r.status_code == 200
results = r.json()
assert results
# Update a Todo
todo_id = results[0].get("id")
r = await test_client.put(
f"/v1/todos/{todo_id}",
json={"title": "update_test", "completed": True},
headers=AUTH_HEADER,
)
assert r.status_code == 200
# Unknown Todo ID
r = await test_client.put(
"/v1/todos/652d729bb8da04810695a943",
json={"title": "update_test", "completed": True},
headers=AUTH_HEADER,
)
assert r.status_code == 404
async def test_delete_todo(test_client: AsyncClient) -> None:
"""
Test deleting a todo
"""
# Get all Todos
r = await test_client.get("/v1/todos", headers=AUTH_HEADER)
assert r.status_code == 200
results = r.json()
assert results
# Delete a Todo
todo_id = results[0].get("id")
r = await test_client.delete(
f"/v1/todos/{todo_id}",
headers=AUTH_HEADER,
)
assert r.status_code == 200
# Unknown Todo ID
r = await test_client.delete(
"/v1/todos/652d729bb8da04810695a943",
headers=AUTH_HEADER,
)
assert r.status_code == 404
Run the test suite with the coverage and pytest commands to generate coverage output. Make sure to set the TESTING environment variable first.
$ export TESTING=true && poetry run coverage run --source ./app -m pytest --disable-warnings
==================================================== test session starts =====================================================
platform darwin -- Python 3.11.4, pytest-7.4.2, pluggy-1.3.0
rootdir: /Users/depillsb/articles/fastapi-quick-start-guide
plugins: httpx-0.26.0, asyncio-0.21.1, anyio-3.7.1
asyncio: mode=Mode.STRICT
collected 4 items
app/routers/todos/test_todos.py .... [100%]
================================================ 4 passed, 1 warning in 1.23s ================================================
$ poetry run coverage html
Wrote HTML report to htmlcov/index.html
Opening the generated htmlcov/index.html in a web browser shows the coverage percentage for the app, and our todo endpoints should now be 100% covered!
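Optionally, the coverage settings can also live in pyproject.toml so the source path does not have to be passed on the command line (a sketch; the omit patterns are an assumption):
[tool.coverage.run]
source = ["app"]
omit = ["*/conftest.py", "*/test_*.py"]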
Production Deployment
In development we have been using uvicorn, which runs a single instance of our API, but in production we can handle more load by running multiple instances and load balancing traffic across them, and this is what gunicorn allows us to do. We can run gunicorn specifying the number of workers we want and tell it to use uvicorn for the workers.
Install gunicorn and add a gunicorn_conf.py file to the project.
$ poetry add gunicorn
Using version ^21.2.0 for gunicorn
Updating dependencies
Resolving dependencies... (0.2s)
Package operations: 1 install, 0 updates, 0 removals
• Installing gunicorn (21.2.0)
Writing lock file
📝 gunicorn_conf.py
import json
import multiprocessing
import os
workers_per_core_str = os.getenv("WORKERS_PER_CORE", "1")
max_workers_str = os.getenv("MAX_WORKERS", "10")
use_max_workers = None
if max_workers_str:
use_max_workers = int(max_workers_str)
web_concurrency_str = os.getenv("WEB_CONCURRENCY", None)
host = os.getenv("HOST", "0.0.0.0")
port = os.getenv("PORT", "8000")
bind_env = os.getenv("BIND", None)
use_loglevel = os.getenv("LOG_LEVEL", "info")
if bind_env:
use_bind = bind_env
else:
use_bind = f"{host}:{port}"
cores = multiprocessing.cpu_count()
workers_per_core = float(workers_per_core_str)
default_web_concurrency = workers_per_core * cores
if web_concurrency_str:
web_concurrency = int(web_concurrency_str)
assert web_concurrency > 0
else:
web_concurrency = max(int(default_web_concurrency), 2)
if use_max_workers:
web_concurrency = min(web_concurrency, use_max_workers)
accesslog_var = os.getenv("ACCESS_LOG", "-")
use_accesslog = accesslog_var or None
errorlog_var = os.getenv("ERROR_LOG", "-")
use_errorlog = errorlog_var or None
graceful_timeout_str = os.getenv("GRACEFUL_TIMEOUT", "60")
timeout_str = os.getenv("TIMEOUT", "60")
keepalive_str = os.getenv("KEEP_ALIVE", "5")
# Gunicorn config variables
worker_class = "uvicorn.workers.UvicornWorker"  # matches the -k flag used when launching gunicorn below
loglevel = use_loglevel
workers = web_concurrency
bind = use_bind
errorlog = use_errorlog
worker_tmp_dir = "/tmp/shm"
accesslog = use_accesslog
graceful_timeout = int(graceful_timeout_str)
timeout = int(timeout_str)
keepalive = int(keepalive_str)
# For debugging and testing
log_data = {
"loglevel": loglevel,
"workers": workers,
"bind": bind,
"graceful_timeout": graceful_timeout,
"timeout": timeout,
"keepalive": keepalive,
"errorlog": errorlog,
"accesslog": accesslog,
# Additional, non-gunicorn variables
"workers_per_core": workers_per_core,
"use_max_workers": use_max_workers,
"host": host,
"port": port,
}
print(json.dumps(log_data))
Running the app with Gunicorn runs 10 instances of the API and load balances across them.
$ gunicorn -k uvicorn.workers.UvicornWorker -c gunicorn_conf.py app.main:app
{"loglevel": "info", "workers": 10, "bind": "0.0.0.0:8000", "graceful_timeout": 60, "timeout": 60, "keepalive": 5, "errorlog": "-", "accesslog": "-", "workers_per_core": 1.0, "use_max_workers": 10, "host": "0.0.0.0", "port": "8000"}
[2023-10-15 14:51:23 -0400] [37562] [INFO] Starting gunicorn 21.2.0
[2023-10-15 14:51:23 -0400] [37562] [INFO] Listening at: http://0.0.0.0:8000 (37562)
[2023-10-15 14:51:23 -0400] [37562] [INFO] Using worker: uvicorn.workers.UvicornWorker
[2023-10-15 14:51:23 -0400] [37563] [INFO] Booting worker with pid: 37563
[2023-10-15 14:51:23 -0400] [37566] [INFO] Booting worker with pid: 37566
[2023-10-15 14:51:23 -0400] [37567] [INFO] Booting worker with pid: 37567
[2023-10-15 14:51:23 -0400] [37568] [INFO] Booting worker with pid: 37568
[2023-10-15 14:51:23 -0400] [37569] [INFO] Booting worker with pid: 37569
[2023-10-15 14:51:23 -0400] [37570] [INFO] Booting worker with pid: 37570
[2023-10-15 14:51:23 -0400] [37571] [INFO] Booting worker with pid: 37571
[2023-10-15 14:51:23 -0400] [37572] [INFO] Booting worker with pid: 37572
[2023-10-15 14:51:24 -0400] [37573] [INFO] Booting worker with pid: 37573
[2023-10-15 14:51:24 -0400] [37574] [INFO] Booting worker with pid: 37574
[2023-10-15 14:51:24 -0400] [37563] [INFO] Started server process [37563]
[2023-10-15 14:51:24 -0400] [37563] [INFO] Waiting for application startup.
[2023-10-15 14:51:24 -0400] [37563] [INFO] Application startup complete.
[2023-10-15 14:51:24 -0400] [37567] [INFO] Started server process [37567]
[2023-10-15 14:51:24 -0400] [37567] [INFO] Waiting for application startup.
[2023-10-15 14:51:24 -0400] [37567] [INFO] Application startup complete.
[2023-10-15 14:51:24 -0400] [37566] [INFO] Started server process [37566]
[2023-10-15 14:51:24 -0400] [37566] [INFO] Waiting for application startup.
[2023-10-15 14:51:24 -0400] [37566] [INFO] Application startup complete.
[2023-10-15 14:51:24 -0400] [37569] [INFO] Started server process [37569]
[2023-10-15 14:51:24 -0400] [37569] [INFO] Waiting for application startup.
[2023-10-15 14:51:24 -0400] [37569] [INFO] Application startup complete.
[2023-10-15 14:51:24 -0400] [37568] [INFO] Started server process [37568]
[2023-10-15 14:51:24 -0400] [37568] [INFO] Waiting for application startup.
[2023-10-15 14:51:24 -0400] [37568] [INFO] Application startup complete.
[2023-10-15 14:51:24 -0400] [37570] [INFO] Started server process [37570]
[2023-10-15 14:51:24 -0400] [37570] [INFO] Waiting for application startup.
[2023-10-15 14:51:24 -0400] [37570] [INFO] Application startup complete.
[2023-10-15 14:51:25 -0400] [37571] [INFO] Started server process [37571]
[2023-10-15 14:51:25 -0400] [37571] [INFO] Waiting for application startup.
[2023-10-15 14:51:25 -0400] [37571] [INFO] Application startup complete.
[2023-10-15 14:51:25 -0400] [37572] [INFO] Started server process [37572]
[2023-10-15 14:51:25 -0400] [37572] [INFO] Waiting for application startup.
[2023-10-15 14:51:25 -0400] [37572] [INFO] Application startup complete.
[2023-10-15 14:51:25 -0400] [37574] [INFO] Started server process [37574]
[2023-10-15 14:51:25 -0400] [37574] [INFO] Waiting for application startup.
[2023-10-15 14:51:25 -0400] [37574] [INFO] Application startup complete.
[2023-10-15 14:51:25 -0400] [37573] [INFO] Started server process [37573]
[2023-10-15 14:51:25 -0400] [37573] [INFO] Waiting for application startup.
[2023-10-15 14:51:25 -0400] [37573] [INFO] Application startup complete.
[2023-10-15 14:51:26 -0400] [37562] [INFO] Handling signal: winch
Containerization
Now let's containerize this application so it is easier to run and can be deployed to a cloud environment. Add a Dockerfile
in the base project folder that uses a Python 3.11 base image, uses a multi-stage build so Poetry is left out of the final image, copies over our code, installs our requirements, and runs our API on port 8000 with 10 workers.
📝 Dockerfile
FROM python:3.11-slim-bookworm as requirements-stage
RUN pip install poetry
COPY ./pyproject.toml ./poetry.lock /
RUN poetry export -f requirements.txt --output requirements.txt --without-hashes --without=dev
FROM python:3.11-slim-bookworm
COPY --from=requirements-stage /requirements.txt /requirements.txt
COPY ./pyproject.toml ./gunicorn_conf.py /
COPY ./app /app
RUN python3 -m pip install --no-cache-dir --upgrade -r requirements.txt
RUN mkdir -p /tmp/shm && mkdir /.local
ENV PORT 8000
EXPOSE 8000
ENTRYPOINT ["gunicorn", "-k", "uvicorn.workers.UvicornWorker", "-c", "gunicorn_conf.py", "app.main:app"]
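The build output below shows Docker loading a .dockerignore; an illustrative one (adjust to your project) keeps local artifacts and secrets such as the .env file out of the build context.
📝 .dockerignore
__pycache__/
*.pyc
.venv/
.env
.git/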
Build the container image
$ docker build . -t fastapi-todos:1.0.0
[+] Building 29.1s (14/14) FINISHED docker:desktop-linux
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 614B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [internal] load metadata for docker.io/library/python:3.11-slim-bookworm 1.8s
=> [auth] library/python:pull token for registry-1.docker.io 0.0s
=> [internal] load build context 0.0s
=> => transferring context: 132.09kB 0.0s
=> [requirements-stage 1/4] FROM docker.io/library/python:3.11-slim-bookworm@sha256:fda05d00fc47a4133a1b65bdd89007facf4ec0fa5fb737a35652699b18029830 3.5s
=> => resolve docker.io/library/python:3.11-slim-bookworm@sha256:fda05d00fc47a4133a1b65bdd89007facf4ec0fa5fb737a35652699b18029830 0.0s
=> => sha256:5f658eaeb6f6b3d1c7e64402784a96941bb104650e33f18675d8a9aea28cfab2 3.33MB / 3.33MB 0.3s
=> => sha256:3a21992ae6ea870866a08c3130aeedc758053117776270e79ce8f50c2d5ecb36 12.81MB / 12.81MB 0.7s
=> => sha256:fda05d00fc47a4133a1b65bdd89007facf4ec0fa5fb737a35652699b18029830 1.65kB / 1.65kB 0.0s
=> => sha256:b26e46d3c77da193a5057346b362bcb1db25ea53fda4e66c34655c1e7c838984 1.37kB / 1.37kB 0.0s
=> => sha256:e35901feb1c144a36a7e654cb313632ff45bb290a031fd98a41845722872b86e 6.95kB / 6.95kB 0.0s
=> => sha256:1bc163a14ea6a886d1d0f9a9be878b1ffd08a9311e15861137ccd85bb87190f9 29.18MB / 29.18MB 1.0s
=> => sha256:94f5f9f3c96b32388cb097a0adee2086ff4e0a298e5812c4389265c6eaac9bae 244B / 244B 0.5s
=> => sha256:82d03286152357f66ff0e02abec8a43980f8f73065d8ea9d122de8fa46e1e31a 3.39MB / 3.39MB 0.8s
=> => extracting sha256:1bc163a14ea6a886d1d0f9a9be878b1ffd08a9311e15861137ccd85bb87190f9 1.5s
=> => extracting sha256:5f658eaeb6f6b3d1c7e64402784a96941bb104650e33f18675d8a9aea28cfab2 0.1s
=> => extracting sha256:3a21992ae6ea870866a08c3130aeedc758053117776270e79ce8f50c2d5ecb36 0.5s
=> => extracting sha256:94f5f9f3c96b32388cb097a0adee2086ff4e0a298e5812c4389265c6eaac9bae 0.0s
=> => extracting sha256:82d03286152357f66ff0e02abec8a43980f8f73065d8ea9d122de8fa46e1e31a 0.2s
=> [requirements-stage 2/4] RUN pip install poetry 12.9s
=> [requirements-stage 3/4] COPY ./pyproject.toml ./poetry.lock / 0.0s
=> [requirements-stage 4/4] RUN poetry export -f requirements.txt --output requirements.txt --without-hashes --without=dev 0.6s
=> [stage-1 2/5] COPY --from=requirements-stage /requirements.txt /requirements.txt 0.0s
=> [stage-1 3/5] COPY ./pyproject.toml ./gunicorn_conf.py / 0.0s
=> [stage-1 4/5] COPY ./app /app 0.0s
=> [stage-1 5/5] RUN python3 -m pip install --no-cache-dir --upgrade -r requirements.txt 9.8s
=> exporting to image 0.3s
=> => exporting layers 0.3s
=> => writing image sha256:a6b09ad14d165189a8c9cbdb9b1dfffc525907ff71c053af60b37701852138c2 0.0s
=> => naming to docker.io/library/fastapi-todos:1.0.0 0.0s
Test running the image locally
$ docker run -p 8000:8000 --env-file .env fastapi-todos:1.0.0
[2023-10-15 19:02:23 +0000] [1] [INFO] Starting gunicorn 21.2.0
[2023-10-15 19:02:23 +0000] [1] [INFO] Listening at: http://0.0.0.0:8000 (1)
[2023-10-15 19:02:23 +0000] [1] [INFO] Using worker: uvicorn.workers.UvicornWorker
[2023-10-15 19:02:23 +0000] [7] [INFO] Booting worker with pid: 7
[2023-10-15 19:02:23 +0000] [8] [INFO] Booting worker with pid: 8
[2023-10-15 19:02:23 +0000] [9] [INFO] Booting worker with pid: 9
[2023-10-15 19:02:23 +0000] [10] [INFO] Booting worker with pid: 10
[2023-10-15 19:02:23 +0000] [11] [INFO] Booting worker with pid: 11
[2023-10-15 19:02:23 +0000] [12] [INFO] Booting worker with pid: 12
[2023-10-15 19:02:23 +0000] [13] [INFO] Booting worker with pid: 13
[2023-10-15 19:02:23 +0000] [7] [INFO] Started server process [7]
[2023-10-15 19:02:23 +0000] [7] [INFO] Waiting for application startup.
[2023-10-15 19:02:23 +0000] [7] [INFO] Application startup complete.
[2023-10-15 19:02:23 +0000] [15] [INFO] Booting worker with pid: 15
[2023-10-15 19:02:23 +0000] [8] [INFO] Started server process [8]
[2023-10-15 19:02:23 +0000] [8] [INFO] Waiting for application startup.
[2023-10-15 19:02:23 +0000] [8] [INFO] Application startup complete.
[2023-10-15 19:02:23 +0000] [17] [INFO] Booting worker with pid: 17
[2023-10-15 19:02:23 +0000] [18] [INFO] Booting worker with pid: 18
[2023-10-15 19:02:23 +0000] [9] [INFO] Started server process [9]
[2023-10-15 19:02:23 +0000] [9] [INFO] Waiting for application startup.
[2023-10-15 19:02:23 +0000] [9] [INFO] Application startup complete.
[2023-10-15 19:02:23 +0000] [10] [INFO] Started server process [10]
[2023-10-15 19:02:23 +0000] [10] [INFO] Waiting for application startup.
[2023-10-15 19:02:23 +0000] [10] [INFO] Application startup complete.
[2023-10-15 19:02:23 +0000] [11] [INFO] Started server process [11]
[2023-10-15 19:02:23 +0000] [11] [INFO] Waiting for application startup.
[2023-10-15 19:02:23 +0000] [11] [INFO] Application startup complete.
[2023-10-15 19:02:23 +0000] [12] [INFO] Started server process [12]
[2023-10-15 19:02:23 +0000] [12] [INFO] Waiting for application startup.
[2023-10-15 19:02:23 +0000] [12] [INFO] Application startup complete.
[2023-10-15 19:02:23 +0000] [13] [INFO] Started server process [13]
[2023-10-15 19:02:23 +0000] [13] [INFO] Waiting for application startup.
[2023-10-15 19:02:23 +0000] [13] [INFO] Application startup complete.
[2023-10-15 19:02:24 +0000] [15] [INFO] Started server process [15]
[2023-10-15 19:02:24 +0000] [15] [INFO] Waiting for application startup.
[2023-10-15 19:02:24 +0000] [15] [INFO] Application startup complete.
[2023-10-15 19:02:24 +0000] [17] [INFO] Started server process [17]
[2023-10-15 19:02:24 +0000] [17] [INFO] Waiting for application startup.
[2023-10-15 19:02:24 +0000] [17] [INFO] Application startup complete.
[2023-10-15 19:02:24 +0000] [18] [INFO] Started server process [18]
[2023-10-15 19:02:24 +0000] [18] [INFO] Waiting for application startup.
[2023-10-15 19:02:24 +0000] [18] [INFO] Application startup complete.
{"loglevel": "info", "workers": 10, "bind": "0.0.0.0:8000", "graceful_timeout": 60, "timeout": 60, "keepalive": 5, "errorlog": "-", "accesslog": "-", "workers_per_core": 1.0, "use_max_workers": 10, "host": "0.0.0.0", "port": "8000"}
[17/Oct/2023 19:02:46] INFO [my-todos.process_time_log_middleware:51] Method=GET Path=/ StatusCode=200 ProcessTime=0.002
192.168.65.1:20858 - "GET / HTTP/1.1" 200
[17/Oct/2023 19:02:47] INFO [my-todos.process_time_log_middleware:51] Method=GET Path=/openapi.json StatusCode=200 ProcessTime=0.026
192.168.65.1:20859 - "GET /openapi.json HTTP/1.1" 200
This image can be pushed to a central container registry like Docker Hub so it can be pulled down by any server.
$ docker image tag fastapi-todos:1.0.0 dpills/fastapi-todos:1.0.0
$ docker push dpills/fastapi-todos:1.0.0
The push refers to repository [docker.io/dpills/fastapi-todos]
28d44516d23e: Pushed
d3adbb568b7a: Pushed
7690ae0fc4d3: Pushed
1e5a71ad08c2: Pushed
e3f2fdf4ed2c: Pushed
801b21c3331c: Mounted from library/python
e02fcdad509e: Mounted from library/python
4e7e2e312a26: Mounted from library/python
fd887e1d7390: Mounted from library/python
32f2ee38f285: Mounted from library/python
1.0.0: digest: sha256:f346ee76439cc1ca8dfed2e5d369be855ced730b7542fb2c663e1c3454757976 size: 2412
Docker Compose
We have already been using a Compose file to run our local MongoDB database, but now we can update it to also run our API.
📝 docker-compose.yml
services:
db:
image: mongo:7.0.1
container_name: myAPIdb
restart: always
ports:
- 27017:27017
env_file:
- .env
volumes:
- type: volume
source: my_api_db_data
target: /data/db
api:
image: dpills/fastapi-todos:1.0.0
container_name: fastapi-todos
restart: always
ports:
- 8000:8000
env_file:
- .env
depends_on:
- db
volumes:
my_api_db_data:
You will need to update the MongoDB URI in the .env file to use the Compose service name instead of localhost when running with Docker Compose.
📝 .env
...
MONGO_URI=mongodb://root:mySecureDbPassword1@db:27017/
...
$ docker-compose up
[+] Building 0.0s (0/0) docker-container:unruffled_shockley
[+] Running 3/3
✔ Network fastapi-quick-start-guide_default Creat... 0.0s
✔ Container myAPIdb Created 0.0s
✔ Container fastapi-todos Created 0.0s
Attaching to fastapi-todos, myAPIdb
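With the stack running, a quick smoke test against the root and OpenAPI endpoints (the same requests that show up in the container logs above) confirms the API is reachable:
$ curl http://localhost:8000/
$ curl http://localhost:8000/openapi.json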
If you want to run this on just a single node with HTTPS, NGINX can be used to terminate the HTTPS connection and reverse proxy requests to the API. A certificate needs to be provisioned and added to the /etc/ssl/certs
directory for this to work properly.
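For local testing, one way to generate a self-signed certificate with the file names used in the nginx.conf below is the openssl command shown here (may require sudo); for a real deployment you would use a certificate from a trusted CA such as Let's Encrypt instead.
$ openssl req -x509 -newkey rsa:4096 -nodes -days 365 \
    -subj "/CN=todo.com" \
    -keyout /etc/ssl/certs/todos.key \
    -out /etc/ssl/certs/todos.pem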
📝 nginx.conf
worker_processes 3;
error_log /dev/stdout info;
events {
worker_connections 2048;
}
http {
include /etc/nginx/mime.types;
server {
listen 443 ssl;
server_name todo.com;
ssl_certificate /etc/ssl/certs/todos.pem;
ssl_certificate_key /etc/ssl/certs/todos.key;
location / {
proxy_pass http://api:8000;
}
}
}
📝 docker-compose.yml
services:
db:
image: mongo:7.0.1
container_name: myAPIdb
restart: always
ports:
- 27017:27017
env_file:
- .env
volumes:
- type: volume
source: my_api_db_data
target: /data/db
api:
image: dpills/fastapi-todos:1.0.0
container_name: fastapi-todos
restart: always
ports:
- 8000:8000
env_file:
- .env
depends_on:
- db
server:
image: nginx:1.25-alpine
container_name: nginx
restart: always
ports:
- 443:443
volumes:
- type: bind
read_only: true
source: /host/path/nginx.conf
target: /etc/nginx/nginx.conf
- type: bind
read_only: true
source: /etc/ssl/certs
target: /etc/ssl/certs
depends_on:
- api
volumes:
my_api_db_data:
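If you are using a self-signed certificate like the sketch above, you can verify the NGINX proxy locally with curl by skipping certificate verification and resolving the server_name to the local host:
$ curl -k --resolve todo.com:443:127.0.0.1 https://todo.com/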
This is now a secure single-node containerized API setup! 🎉
Kubernetes
Refer to Kubernetes Quick Start Guide ☁️⚡️🚀 for an in-depth Kubernetes tutorial
If you need to handle a large volume of requests, you will want to run multiple instances of the container across multiple nodes. Kubernetes is the industry standard for large-scale container orchestration. The MongoDB database can be set up through a Database-as-a-Service (DBaaS) offering such as MongoDB Atlas or deployed to the Kubernetes cluster with a Helm chart such as the Bitnami MongoDB Helm Chart. In order to deploy the API to a Kubernetes environment, it needs a Deployment, a Service, and a Secret that maps the environment variables into the container.
📝 deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
labels:
app: my-todos-api
name: my-todos-api
spec:
replicas: 3 # Scale up to 3 instances of the container
selector:
matchLabels:
app: my-todos-api
template:
metadata:
labels:
app: my-todos-api
spec:
containers:
- name: my-todos-api
image: dpills/fastapi-todos:1.0.0
imagePullPolicy: IfNotPresent
ports:
- containerPort: 8000
protocol: TCP
envFrom:
- secretRef:
name: my-todos-api-secret
resources:
limits:
cpu: "2"
memory: 2Gi
requests:
cpu: "1"
memory: 1Gi
📝 service.yaml
apiVersion: v1
kind: Service
metadata:
labels:
app: my-todos-api
name: my-todos-api-svc
spec:
ports:
- name: http
port: 8000
protocol: TCP
targetPort: 8000
selector:
app: my-todos-api
The Kubernetes Secret can be created with kubectl using the .env file. Remove the --dry-run=client -o yaml portion at the end to actually create the secret on a cluster.
$ kubectl create secret generic my-todos-api-secret --from-env-file=.env --dry-run=client -o yaml
apiVersion: v1
data:
GITHUB_OAUTH_CLIENT_ID: MGVjN2Q5Njk5MjgzNmE1ZmJiOTc=
GITHUB_OAUTH_CLIENT_SECRET: ZTgxM2IwY2RiNGQ0MDJkNTVkNWQwOWNmZDA4ZGZkMDZjZjVmYzZlYw==
MONGO_INITDB_ROOT_PASSWORD: bXlTZWN1cmVEYlBhc3N3b3JkMQ==
MONGO_INITDB_ROOT_USERNAME: cm9vdA==
MONGO_URI: bW9uZ29kYjovL3Jvb3Q6bXlTZWN1cmVEYlBhc3N3b3JkMUBkYjoyNzAxNy8=
kind: Secret
metadata:
creationTimestamp: null
name: my-todos-api-secret
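Putting it together, a minimal deployment sequence (assuming kubectl is pointed at your cluster and the manifests above are saved locally) looks like this, with a port-forward for a quick smoke test:
$ kubectl create secret generic my-todos-api-secret --from-env-file=.env
$ kubectl apply -f deployment.yaml -f service.yaml
$ kubectl port-forward svc/my-todos-api-svc 8000:8000
$ curl http://localhost:8000/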
We now have a highly-scalable cloud-native API setup! 🎉