
kapeel kokane
LRU cache Illustrated - For the visual learner

Hey There 👋🏾

Today's topic of discussion is the LRU cache. A few days back, I created a Twitter thread introducing the topic. Here's the thread if you want to take a look. 👇🏾


๐Ÿ’๐Ÿปโ€โ™‚๏ธ In today's post, we will be digging in a little deeper and also look at how one can implement an LRU cache in JavaScript.

Why cache anything? 🤔

The first question that we will tackle is

why do we need to cache anything in the first place?

Well, the way consumers use software follows a specific pattern, somewhat similar to the 80-20 rule: data that gets queried once is likely to get queried again from the same device.

And that makes sense. Whenever I open Twitter, my user information needs to be fetched every time, so it's an efficient choice to cache that information in my browser. The next time it is required, there is a faster way to fetch it.

Why not cache everything? 🤨

The next logical question would then be

If caching is actually that good and solves all our problems, why don't we cache everything?

Well, there is the issue of space constraints. In the previous example, if the browser started caching the info of every user whose profile I visit, sooner or later it would run out of memory. Hence, we need to think consciously about what to cache and for how long.

Cache replacement!

With that in mind, we now need to think about a scenario

What if the cache is at full capacity right now and we need to add another element to it?

That is where the LRU part comes into the picture: it stands for Least Recently Used. The logic is that something which was used (stored or accessed) a long time ago will most probably not be used again. There are other cache eviction (deletion) strategies apart from LRU:

  • First in first out: The one that was added first, gets deleted first, irrespective of when it got accessed.
  • Last in first out: The one that was added last, gets deleted first, irrespective of when it got accessed.
  • Least frequently used: The one that was accessed the fewest number of times, gets deleted first.
  • Random replacement: Any one of the cache items is randomly chosen and deleted.

There are several other strategies in addition to these. There is no one-size-fits-all strategy, and each of the strategies mentioned above is suited to different use cases. But in today's article, we will look specifically at the LRU cache.
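To make the contrast concrete, here is a minimal sketch (my own illustration, not from the thread) of a First-in-first-out cache built on a JavaScript Map. Note how, unlike LRU, reading an entry has no effect on which entry gets evicted:

```javascript
// Hypothetical FIFO cache sketch, for contrast with LRU.
// A Map remembers insertion order, so the first key returned by
// keys() is always the oldest (first-inserted) entry.
class FIFOCache {
  constructor(capacity) {
    this.cache = new Map();
    this.capacity = capacity;
  }
  get(key) {
    // Unlike LRU, a read does NOT change the eviction order.
    return this.cache.has(key) ? this.cache.get(key) : -1;
  }
  put(key, value) {
    this.cache.set(key, value);
    if (this.cache.size > this.capacity) {
      // Evict whatever was inserted first, ignoring accesses.
      this.cache.delete(this.cache.keys().next().value);
    }
  }
}

const fifo = new FIFOCache(2);
fifo.put('a', 1);
fifo.put('b', 2);
fifo.get('a');      // accessing 'a' does not protect it
fifo.put('c', 3);   // 'a' is still evicted, because it was added first
console.log(fifo.get('a')); // -1
```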

LRU illustrated

Let us assume that we have an LRU cache that can only hold 3 user details at a time and visualise what that would look like, using the put() method to add a user to the cache and the get() method to fetch the user's info from it. Before adding anything, this is what the cache looks like:
*(image: empty cache)*

Let's add 3 users. We are using string values for simplicity; in a real application, each value could be an object with different key/value data about the user.

cache.put('amy', "amy's details" )
cache.put('bob', "bob's details" )
cache.put('clint', "clint's details" )

The cache is now at full capacity and looks like this:
*(image: cache after 3 puts)*

Now, if we want to add a fourth user, dylan, to the cache, one of the previous users needs to be deleted. According to the least recently used principle, that would be amy.

cache.put('dylan', "dylan's details" )

*(image: dylan added to cache)*

But if we had accessed amy's user object before adding dylan to the cache, it would NOT have been the least recently used item anymore, and bob would have been chucked out instead.

cache.get('amy')
cache.put('dylan', "dylan's details" )

*(image: dylan added after amy accessed)*

I hope that gives you the gist of how this works. Let's dive into the code!

Let's code

We will code this up as a JavaScript class with the get and put methods in it.

Here's what the class looks like with its constructor:

class LRUCache {
  constructor(capacity) {
    this.cache = new Map();    // a Map remembers insertion order
    this.capacity = capacity;  // maximum number of entries
  }
}

Here's the get() method:

get(key) {
  if (!this.cache.has(key)) return -1;

  // Delete the entry and re-insert it, so that it moves
  // to the end of the Map's insertion order.
  const v = this.cache.get(key);
  this.cache.delete(key);
  this.cache.set(key, v);
  return v;
}

The first line just checks whether the item is present in the cache and returns -1 in case it is not.

But did you notice what happens when the item is present?

We access the value, delete the entry from the cache, and then add it again before returning the value. That is a trick you'll soon understand.
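The trick relies on a guarantee of JavaScript's built-in Map: it iterates keys in insertion order, and deleting a key and setting it again moves it to the end of that order. A quick standalone check:

```javascript
// A Map iterates its keys in insertion order.
const m = new Map([['a', 1], ['b', 2], ['c', 3]]);
console.log([...m.keys()]); // ['a', 'b', 'c']

// Deleting a key and setting it again moves it to the end of
// the order, which is how get() marks an entry "recently used".
m.delete('a');
m.set('a', 1);
console.log([...m.keys()]); // ['b', 'c', 'a']
```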

Let's look at the put() method before that:

put(key, value) {
  // If the key already exists, delete it first so that
  // re-setting it moves it to the end of the insertion order.
  if (this.cache.has(key)) {
    this.cache.delete(key);
  }
  this.cache.set(key, value);
  // Over capacity: evict the first (least recently used) key.
  if (this.cache.size > this.capacity) {
    this.cache.delete(this.cache.keys().next().value);
  }
}

In the first part here, if the cache already has the key that we are trying to add, we delete it first and then add it again. This is confusing too, right?

The next part will make it clear.

Notice what we are doing in case the cache has exceeded capacity: this.cache.delete(this.cache.keys().next().value). Because a Map iterates its keys in insertion order, this.cache.keys().next().value fetches the key that was written first out of all the keys currently in the Map.
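Seen in isolation: keys() returns an iterator, and the first .next() call on a fresh iterator yields the earliest-inserted key still present in the Map, i.e. the least recently used one.

```javascript
const m = new Map([['oldest', 1], ['newer', 2], ['newest', 3]]);
// keys() returns an iterator; the first .next() call yields
// the earliest-inserted key still present in the Map.
console.log(m.keys().next().value); // 'oldest'
```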

You see, in the get() method we deleted the key and set it again so that it ends up being the most recently added entry, and therefore does not show up as this.cache.keys().next().value, since it was recently accessed.

The deletion and re-adding inside the put() method serves a similar purpose. Basically, we are refreshing that particular entry in the cache!

And that's the magic! We now have a fully functional LRU cache written in JavaScript which we can play around with.
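Putting the class and both methods together, here is the earlier amy/bob/clint/dylan walkthrough as runnable code:

```javascript
class LRUCache {
  constructor(capacity) {
    this.cache = new Map();    // a Map remembers insertion order
    this.capacity = capacity;  // maximum number of entries
  }
  get(key) {
    if (!this.cache.has(key)) return -1;
    // Re-insert the entry so it becomes the most recently used.
    const v = this.cache.get(key);
    this.cache.delete(key);
    this.cache.set(key, v);
    return v;
  }
  put(key, value) {
    if (this.cache.has(key)) this.cache.delete(key);
    this.cache.set(key, value);
    if (this.cache.size > this.capacity) {
      // Evict the least recently used entry (first key in the Map).
      this.cache.delete(this.cache.keys().next().value);
    }
  }
}

const cache = new LRUCache(3);
cache.put('amy', "amy's details");
cache.put('bob', "bob's details");
cache.put('clint', "clint's details");

cache.get('amy');                      // refreshes amy
cache.put('dylan', "dylan's details"); // evicts bob, not amy

console.log(cache.get('bob')); // -1
console.log(cache.get('amy')); // "amy's details"
```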

Hope you liked that.
Cheers! 🙌🏾
