Artur Bartosik

API Caching with ElastiCache Redis & AWS Lambda

In this article, we will not only explore the benefits of using Redis as a caching solution for a REST API, but I’ll also provide a practical example of how to implement Redis caching in a serverless environment using AWS Lambda with TypeScript. I’ll demonstrate how to call an external API and cache its responses using ElastiCache Redis, resulting in faster response times and improved reliability. By following my example, you will learn how to integrate Redis caching into your own serverless applications.

Code example

If you want to jump straight to the code sample and experiment with a working example, the repo is available on my GitHub right below:

https://github.com/luafanti/elasticache-redis-and-lambda

Deployment Overview

The infrastructure in the demo is provisioned using a CloudFormation template. The stack created with the default parameters provides the following resources:

  • VPC with a single public and private subnet, plus other base VPC components
  • NAT Gateway
  • ElastiCache Redis
  • TypeScript Lambda & HTTP API Gateway

Please note that the stack includes components (NAT & Redis) that will incur hourly costs even if you have the AWS Free Tier.

You can also create the stack in Multi-AZ mode. In that case, the stack will create:

  • VPC with two public and two private subnets
  • 2 NAT Gateways, one per private subnet
  • ElastiCache Redis in Multi-AZ mode - one master and one replica instance
  • the same TypeScript Lambda

In Multi-AZ mode, costs will be doubled due to two instances of NAT Gateway and Redis.

Installation is simple with:

npm install 
sam build
sam deploy

To remove the whole stack:

sam delete

Important points to clarify

  • Lambda needs to be deployed inside a VPC if you want to connect to managed Redis. ElastiCache is a service designed to be used internally within a VPC.
  • NAT Gateway is required in this setup. A Lambda running inside a VPC is never assigned a public IP address, so it can't connect to anything outside the VPC - in this case, our external API. A NAT Gateway solves this problem.
  • If you want to connect to the Redis cluster, e.g. from a local CLI, you have to set up a VPN connection or a bastion host.
  • Top-level await isn’t supported in this Lambda sample because the current setup uses CommonJS packaging. To enable it, esbuild needs to be configured to output ES module files with the .mjs extension (see the sketch below).
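
For reference, here is a minimal sketch of what such an esbuild configuration could look like, using esbuild's JavaScript API. The entry point path is only illustrative, and the demo repo drives esbuild through sam build rather than a standalone script like this.

```typescript
import { build } from "esbuild";

// Minimal sketch: emit ES modules with a .mjs extension so the Lambda runtime
// treats the bundle as ESM, which is what enables top-level await.
// The entry point path is hypothetical.
build({
  entryPoints: ["src/handlers/proxy.ts"],
  bundle: true,
  platform: "node",
  target: "node18",
  format: "esm",                   // output ES modules instead of CommonJS
  outExtension: { ".js": ".mjs" }, // rename emitted .js files to .mjs
  outdir: "dist",
}).catch(() => process.exit(1));
```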

In-Depth

REST APIs have become an integral part of our systems. GraphQL and gRPC can be good replacements for old-fashioned REST in some aspects, but personally, I still can't imagine avoiding this kind of API altogether. REST is ubiquitous, and we often integrate our services this way. Sometimes it's an internal API, sometimes an external one. The latter case is harder because we have no control over the performance and availability of that API.

A long response time from the external API automatically increases the overall request-handling time in our service. Unavailability of the external API impacts our app even more and requires special handling. The antidote to all this may be an additional layer of caching... and this is where Redis comes into the game.

Why Redis as a cache?

The main reason for choosing Redis is its high performance. Redis is considered one of the fastest key-value databases. There are several reasons behind that efficiency:

  • RAM-based data store. RAM access is several orders of magnitude faster than HDD or even SSD access - not to mention access over an API, which carries the highest latency.

Redis data access pyramid

  • Efficient data structures. Redis offers various data structures such as Lists, Sets, Hashes, Bitmaps, etc. All types are implemented in C and allocate memory through a custom wrapper around malloc called zmalloc. This allows Redis to choose different allocation libraries depending on the needs.
  • Event-driven architecture. Redis uses the reactor design pattern to multiplex I/O, handling thousands of incoming requests at the same time with just a single thread.

An additional reason in the case of AWS is that Redis is available as a managed service - ElastiCache Redis. It simplifies configuration and maintenance and shifts some of the responsibility to the provider.

Step-by-Step caching flow

The flow is very simple… ProxyLambda first checks whether the response from the External API already exists in the cache. As the cache key in this simple example, I just use the full request path together with the query params. For a complex API, I recommend a more advanced key-generation strategy. If the response object exists in the Redis cache, Lambda returns it directly. Otherwise, Lambda calls the External API as before, but additionally saves the response to the Redis cache. Thanks to this, a subsequent request to ProxyLambda for the same resource is returned from the cache instead of calling the External API (see the sketch after the diagrams).

Basically, the two diagrams below should explain it all.

API caching with Redis - component diagram

API caching with Redis -  sequence diagram
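
Translated into code, ProxyLambda boils down to a classic cache-aside pattern. Below is a minimal sketch of how it could look with the node-redis client; the environment variable names and the use of the built-in fetch (Node.js 18+ runtime) are assumptions for illustration, not the exact code from the repo.

```typescript
import { createClient } from "redis";
import type { APIGatewayProxyEventV2, APIGatewayProxyResultV2 } from "aws-lambda";

// Assumed environment variables: the ElastiCache endpoint and the External API base URL.
const redis = createClient({ url: process.env.REDIS_URL });
const EXTERNAL_API_BASE_URL = process.env.EXTERNAL_API_BASE_URL!;

export const handler = async (
  event: APIGatewayProxyEventV2
): Promise<APIGatewayProxyResultV2> => {
  // Reuse the connection across warm invocations.
  if (!redis.isOpen) {
    await redis.connect();
  }

  // Cache key: the full request path including query params.
  const cacheKey = event.rawQueryString
    ? `${event.rawPath}?${event.rawQueryString}`
    : event.rawPath;

  // 1. Check the cache first.
  const cached = await redis.get(cacheKey);
  if (cached) {
    return { statusCode: 200, body: cached };
  }

  // 2. Cache miss: call the External API and store the response for next time.
  const response = await fetch(`${EXTERNAL_API_BASE_URL}${cacheKey}`);
  const body = await response.text();
  await redis.set(cacheKey, body);

  return { statusCode: response.status, body };
};
```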

There is still the question of invalidating records in the cache. Here the strategy must fit your requirements. In this demo, objects in the cache live forever. One solution would be to hardcode an expiration time on save (redis.setEx(cacheKey, 86400, apiResponse)). A more elegant way would be to create a dedicated invalidation Lambda that removes objects from the cache when it receives an event saying that some resource in the External API has been removed or modified.
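
As a rough illustration of the second approach, here is a minimal sketch of such an invalidation Lambda. The event shape is entirely hypothetical - in practice it would come from whatever mechanism (EventBridge, SNS, etc.) notifies you that a resource in the External API has changed.

```typescript
import { createClient } from "redis";

const redis = createClient({ url: process.env.REDIS_URL });

// Hypothetical event shape: something upstream tells us which resource path changed,
// e.g. { "resourcePath": "/products/42" }.
interface InvalidationEvent {
  resourcePath: string;
}

export const handler = async (event: InvalidationEvent): Promise<void> => {
  if (!redis.isOpen) {
    await redis.connect();
  }
  // Drop the stale entry so the next request repopulates the cache from the External API.
  await redis.del(event.resourcePath);
};
```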

Pros and Cons of the Solution

🟢 Better performance. Responses from the Redis cache can be much faster than from the External API. Latency is also stable and doesn't depend on the current API load.

🟢 Improved reliability. Our fronting API can respond even if the External API is down.

🟢 Less load on the External API, as fewer requests reach it.

🟡 Additional work on management and maintenance of ElastiCache.

🟡 Additional cost of ElastiCache cluster.
