LRU Cache Design🛠

What is a Cache?

A cache is a hardware or software component that stores data so that future requests for that data can be served faster; the data stored in a cache might be the result of an earlier computation or a copy of data stored elsewhere.

What Is An LRU Cache?

A Least Recently Used (LRU) cache is a store of items that lets future access to those items be served quickly, and that uses the LRU policy for eviction: when the cache is full, the item that has gone unused the longest is removed first.

Why do we need Caches?

Basically, computers process data that ultimately lives on slow storage such as hard disks. Unfortunately, reading and writing data directly from/to disk is not efficient, so we try to predict which blocks of data the processor is most likely to need next and keep them in the cache.

Now you can guess that the cache is the nearest and fastest place for the processor to get data from (if it is found there). So why don't we just use caches for all data processing? There are two main reasons: first, caches store data temporarily and are volatile, so when the power is cut all the data disappears; second, caches are expensive and have limited capacity.

So we need to define the method by which we choose the blocks of data that go into our cache, in order to minimize how often we have to read from or write to the disk. This method depends on the principle of locality of reference.

Locality of reference

Since the cache is much smaller than main memory, we must decide which parts of main memory deserve to be loaded into it, and that decision is made based on locality of reference.

Types of Locality of reference

Spatial Locality of reference

Spatial locality says that when a memory location is accessed, locations in close proximity to it are likely to be accessed soon after. This is why memory is fetched in whole blocks (cache lines or pages) rather than single words: the neighbours of the requested word will probably be needed next.

Temporal Locality of reference

Temporal locality says that a memory location that was accessed recently is likely to be accessed again soon. This is exactly the assumption behind the Least Recently Used algorithm: since recently used items will probably be used again, the item that has gone unused the longest is the safest one to evict.
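As a rough illustration, here is a toy Python sketch of both kinds of locality in an ordinary loop (the effect is much more pronounced in lower-level languages, but the idea is the same):

```python
data = list(range(1_000_000))   # a large array traversed in order

total = 0
for x in data:
    # Spatial locality: the elements are laid out next to each other
    # and visited in order, so each block fetched into the cache is
    # fully used before it is evicted.
    total += x
    # Temporal locality: `total` is touched on every single iteration,
    # so it stays in the fastest level of the hierarchy the whole time.
```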

Removal policies

At some point, the cache will hit its maximum capacity and must evict some blocks to make room for new ones. Several eviction policies are commonly applied:

  • First in first out (FIFO)
  • Last in first out (LIFO)
  • Least recently used (LRU)
  • Most recently used (MRU)
  • Least-frequently used (LFU)

I will pick the LRU policy to design an LRU cache.

LRU Cache example

Let's assume that there is a cache of capacity 3.
The cache contains { 5 , 4 , 1 } (say 5 is the most recently used and 1 the least recently used), and the operation to be done is put(6). Since the cache is full, the LRU item, 1, is evicted to make room for 6, leaving { 6 , 5 , 4 }.


Design LRU Cache

Requirements

A Least Recently Used (LRU) cache should support the following operations (a usage sketch follows the constraints list below):

  • get(key) - Get the value (which will always be positive) of the key if the key exists in the cache; otherwise return -1.

  • put(key, value) - Set or insert the value if the key is not already present. When the cache reaches its capacity, it should invalidate the least recently used item before inserting the new item.

Constraints:

  • Cache capacity: initialized with a positive capacity.
  • Key: positive number.
  • Lookup of cache items must be O(1)
  • Addition to the cache must be O(1)
  • The cache should evict items using the LRU policy
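
To make the expected behaviour concrete, here is a small usage sketch. The numbers follow the classic LeetCode example, and LRUCache is the class sketched later in the Data Structures section:

```python
cache = LRUCache(2)      # capacity 2

cache.put(1, 1)
cache.put(2, 2)
print(cache.get(1))      # returns 1; key 1 is now the most recently used
cache.put(3, 3)          # cache is full, so key 2 (the LRU) is evicted
print(cache.get(2))      # returns -1, key 2 is gone
print(cache.get(3))      # returns 3
```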

Data Structures

Thinking about which data structures to use is the most important step of the design. To make it easier, let's analyze the requirements.

  • get(key): fast lookup in O(1) time complexity.
  • put(key, value): O(1) lookup, and if the size hits the capacity, eviction of the LRU item in O(1).

So let's think about data structures that might help us achieve these two operations in O(1) time complexity.
A hash table is what we look for most of the time when we need O(1) lookups, but how would it help us know which element is the LRU? There must be another helpful data structure.
Common answers are dynamic arrays, trees, or singly linked lists, but each of them faces a time complexity problem: arrays need O(n) to delete from the middle, trees need O(log n) per operation, and a singly linked list cannot unlink an arbitrary node in O(1) because it keeps no pointer back to the predecessor.
A data structure that can delete a known node in O(1) is the doubly linked list. So YES, we have the pair of data structures that can achieve the requirements (see the sketch after the list below).

  • Doubly linked list for fast removal.
    • A node contains key and value.
  • The doubly linked list is supported by a hash table for fast access.
    • The key is the input key and the value is a pointer to the corresponding node in the doubly linked list.
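
Here is a minimal Python sketch of this design. It is one possible implementation, not the only one (Python's collections.OrderedDict can give you the same behaviour with less code); sentinel head/tail nodes are used so we never have to special-case an empty list:

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}              # key -> Node, gives O(1) lookup
        # Sentinel nodes avoid edge cases when the list is empty.
        self.head = Node(0, 0)     # head.next is the most recently used
        self.tail = Node(0, 0)     # tail.prev is the least recently used
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        # Unlink a node from the list in O(1).
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_front(self, node):
        # Insert a node right after head, marking it most recently used.
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)         # accessing the key makes it
        self._add_front(node)      # the most recently used
        return node.value

    def put(self, key, value):
        if key in self.map:
            # Updating an existing key: drop the old node first.
            self._remove(self.map[key])
        elif len(self.map) == self.capacity:
            # Cache is full: evict the LRU node (the one before tail).
            lru = self.tail.prev
            self._remove(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._add_front(node)
```

Every operation touches only a constant number of pointers plus one hash table entry, which is exactly how both get and put stay O(1).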

Let's dive into an example with an LRU cache of size 3 (traced in code right after this list):

  • put(1,5)
  • put(2,17)
  • put(3,99)
  • put(4,1)
  • get(2)
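
Tracing these operations with the sketch from the previous section, reading each list left to right from most to least recently used:

```python
cache = LRUCache(3)

cache.put(1, 5)      # list: (1,5)
cache.put(2, 17)     # list: (2,17) (1,5)
cache.put(3, 99)     # list: (3,99) (2,17) (1,5)   <- the cache is now full
cache.put(4, 1)      # key 1 is the LRU, so it is evicted
                     # list: (4,1) (3,99) (2,17)
print(cache.get(2))  # returns 17 and moves key 2 to the front
                     # list: (2,17) (4,1) (3,99)
```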


Is LRU policy the best choice?

Not necessarily. Every policy has pros and cons; one could work perfectly under some conditions and poorly under others. So choosing the suitable policy depends on the access patterns of your business.

Helpful references

  • Implement An LRU Cache - The LRU Cache Eviction Policy ("LRU Cache" on LeetCode)
  • LeetCode problem
  • GeeksforGeeks

You can also find me on

Linkedin
Github
Twitter

Top comments (2)

Ahmed Khaled

This is a great article. Keep it up.

Amr Fahmy Hassan

Thank you bro 👨🏼‍💻