Vignesh M

A simple caching strategy for Node REST APIs, Part 1

Hello World, this is the beginning of a two-part series on "How to make your REST APIs blazing fast 🚀". It draws on my personal experience and on projects that I've built.

Some time ago, I was working on a marketplace platform where users could list their products for sale. The home page loaded a bunch of products, and along with each product's own data, it also loaded stats, previous sale history, recent listing data, and so on. We also let the user sort, filter, and perform other actions right on the page without reloading or re-fetching, for a quick experience. But this came at a cost: to send all this data, the API had to do a bunch of calculations, which ended up taking a few hundred milliseconds, typically in the 200-400ms range, and worse during high traffic. So we started looking into ways to improve this. This series talks about those methods.

Part 1: A simple caching strategy for Node REST APIs
Part 2: Cache invalidation 😭

So let's jump right into Part 1

Here is the endpoint we will work on. It takes in a query, fetches data from the database, processes it, and returns a JSON response.

// products/routes.js

router.get(
  '/',
  processQuery,
  productsController.index,
  responseHandler
)

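A quick note on how this chain works: the controller doesn't send the response itself; it attaches its result to res.locals and calls next(), so the middlewares after it can use the data. A minimal sketch of what productsController.index could look like (the Product model and query handling here are made up for illustration, not our actual code):

// products/controller.js — illustrative sketch, not the actual implementation

const Product = require('./model') // hypothetical Mongoose-style model

async function index(req, res, next) {
  try {
    // Fetch products for the (already validated) query
    const products = await Product.find(req.query)

    // Hand the result to the downstream middlewares instead of responding here
    res.locals.data = products
    return next()
  } catch (err) {
    return next(err)
  }
}

module.exports = { index }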

Okay, now let's add some cache 💸!

For this example, we will use node-cache. We will keep it in a single file so it can easily be replaced with any other cache storage by changing just a few lines.

First of all, install the node-cache package.

$ npm install node-cache --save

We will create a cache middleware so that it can easily be used with any endpoint we want. This is what the middleware looks like:

// middlewares/cache.js

const NodeCache = require('node-cache')

// stdTTL: time to live in seconds for every generated cache element.
const cache = new NodeCache({ stdTTL: 5 * 60 })

function getUrlFromRequest(req) {
  const url = req.protocol + '://' + req.headers.host + req.originalUrl
  return url
}

function set(req, res, next) {
  const url = getUrlFromRequest(req)
  cache.set(url, res.locals.data)
  return next()
}

function get(req, res, next) {
  const url = getUrlFromRequest(req)
  const content = cache.get(url)
  if (content) {
    return res.status(200).send(content)
  }
  return next()
}

module.exports = { get, set }


Let's go over the functions one by one.

  1. getUrlFromRequest takes the request and returns the complete request URL.
    We use this URL as the unique KEY for our cache.

  2. set saves our processed response (res.locals.data) to the cache with the complete URL as the KEY.

  3. get uses the URL as the KEY to retrieve the previously stored cached response. If it finds the data, it sends it back as the response; otherwise, the request is forwarded to the next middleware.
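
Since the rest of the app only ever calls get and set, swapping node-cache for another store means changing only this file. As a rough sketch, the same middleware backed by Redis could look like this (assuming the ioredis package and a local Redis instance; values are JSON-serialized because Redis stores strings, and error handling is omitted for brevity):

// middlewares/cache.js — same interface, backed by Redis (sketch; assumes ioredis)

const Redis = require('ioredis')

const redis = new Redis() // defaults to 127.0.0.1:6379
const TTL_SECONDS = 5 * 60

function getUrlFromRequest(req) {
  return req.protocol + '://' + req.headers.host + req.originalUrl
}

async function set(req, res, next) {
  const url = getUrlFromRequest(req)
  // Redis stores strings, so serialize the payload and expire it after TTL_SECONDS
  await redis.set(url, JSON.stringify(res.locals.data), 'EX', TTL_SECONDS)
  return next()
}

async function get(req, res, next) {
  const url = getUrlFromRequest(req)
  const content = await redis.get(url)
  if (content) {
    return res.status(200).send(JSON.parse(content))
  }
  return next()
}

module.exports = { get, set }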

Our cache middleware is ready! Let's plug it into our products route.

// products/routes.js

const cache = require('../middlewares/cache') // 👈 import our cache middleware

router.get( 
  '/',
  cache.get,  // 👈
  processQuery,
  productsController.index,
  cache.set, // 👈
  responseHandler
)

That's all! Our endpoint is already faster. But how 😯?

We have added our two middlewares, get and set, to the route. When a new request comes in, it first goes through cache.get. Since we don't have anything in the cache yet, the request passes down to the next middlewares and arrives at cache.set, which saves the response in the cache for the next 5 minutes.

Any request that comes in within the next 5 minutes will retrieve this cached response from cache.get and immediately return it to the user. No calculations are done. The database isn't touched.

By doing this we were able to bring down our response time to just a few milliseconds 🎉.
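
If you want to measure the difference yourself, one quick way is to time two back-to-back requests; here's a rough sketch using Node 18+'s built-in fetch (the URL and port are just placeholders for wherever the API is mounted):

// benchmark.js — rough sanity check; assumes the API is served at localhost:3000/products

async function timeRequest(label) {
  const start = Date.now()
  const res = await fetch('http://localhost:3000/products?sort=price')
  await res.json()
  console.log(`${label}: ${Date.now() - start}ms`)
}

async function main() {
  await timeRequest('first request (cache miss)')
  await timeRequest('second request (cache hit)')
}

main()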

But, yes, this is not the final solution; there are minor issues with this approach. Users on the site won't get real-time data: the data shown can be up to 5 minutes old. While this approach may work for some use cases, it was not an acceptable solution for us, because our users needed real-time data. So we had to look into this more. We had to look into Cache Invalidation 😈, which we will talk about in the next part. 👋

Follow me on Twitter | Github, I build and post cool stuff. 👨‍💻

Top comments (11)

José María CL

Thanks!

I'll leave my implementation here to help others figure out how to implement this approach.

// get-mydata-controller.ts
export const getMyDataController = async (req, res, next) => {
  try {
    const { id } = req.params;
    const myDataRepository: MyDataRepository = new PrismaMyDataRepository();
    const myData = await getMyData(
      id,
      myDataRepository
    );

    res.locals.data = myData; // make the result accessible to the middlewares

    next(); // allow the next middleware to handle the response data
  } catch (err) {
    res
      .status(500)
      .json({ data: null, error: err.message });
  }
};

// response-handler.ts
export function responseHandler(message = "Data successfully fetched") {
  return (req, res) => {
    res
      .status(200)
      .json({ data: res.locals.data, message });
  }
}

// routes.ts
router.get(
  "/:id/my-data",
  cacheLayer.get,
  getMyDataController,
  cacheLayer.set,
  responseHandler("My Data successfully fetched")
);

Sven Varkel

It seems that you're using the URL as the cache key, right? Things to consider:

  • Are the URLs public? I.e., circumstances change when the URLs are behind authentication, because then you cannot send back the same responses to all users, etc.
  • Do all the parameters come in via the URL? Are there any headers or anything else that may differ between requests? Should these be considered?

What's your cache backend?

Vignesh M

Both of these considerations depend on one's own project and which routes they want to cache. This example works best for data that depends only on the URL and not on any context like the current user.
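
For routes that do depend on the current user, one option is to fold that context into the cache key; a rough sketch (assuming an auth middleware populates req.user):

// Sketch: per-user cache key; assumes some auth middleware sets req.user
function getCacheKeyFromRequest(req) {
  const url = req.protocol + '://' + req.headers.host + req.originalUrl
  const userId = req.user ? req.user.id : 'anonymous'
  return userId + ':' + url
}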

Gijo Varghese

I wrote a similar one but saving data in Cloudflare edge servers coffeencoding.com/how-i-used-cloud...

Nikos Kanakis

Awesome Post 🏆

Nguyễn Nam Thắng

Can you explain the processQuery and responseHandler functions for me?

Vignesh M

Hey, processQuery & responseHandler are not related to caching and don't have anything to do with this article. They are just examples of common middlewares.


In our app, processQuery transforms some of our URL query params and checks whether they are valid.

responseHandler just converts the response data to JSON and returns a 200 status.
This may not be everyone's use case.
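
As a rough illustration (not the exact code from our app), middlewares like these might look something like:

// Illustrative sketches only — not the actual processQuery / responseHandler

function processQuery(req, res, next) {
  // Normalize a couple of query params and reject obviously invalid values
  const page = parseInt(req.query.page, 10) || 1
  const limit = Math.min(parseInt(req.query.limit, 10) || 20, 100)
  if (page < 1) {
    return res.status(400).json({ error: 'Invalid page number' })
  }
  res.locals.query = { page, limit }
  return next()
}

function responseHandler(req, res) {
  // Send whatever the previous middlewares left in res.locals.data as JSON
  return res.status(200).json(res.locals.data)
}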

Nguyễn Nam Thắng

Thanks bro

Nguyễn Nam Thắng

Thank you. Do you still have this tutorial? I don't see it on your GitHub.

fuka-michal

function getUrlFromRequest(req) {
  const url = req.protocol + '://' + req.headers.host + req.originalUrl
  return url
}

Because of req.protocol in the key, the cache is not as effective as it could be :)
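
For example, dropping the protocol lets http and https requests share the same entry (a quick sketch):

// Sketch: build the key without the protocol so http and https share entries
function getUrlFromRequest(req) {
  return req.headers.host + req.originalUrl
}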

shashidhar reddy

Hi, can you please share the index file as well?