Santhosh Balasa
ExpiringLRUCache and The CLOCK Algorithm: A Fun Dive into Python Caching


Caching is a common technique for speeding up access to slow data sources by keeping a copy of the data in fast-access memory. In Python, we can implement this concept in many ways, but let's talk about an exciting one: The ExpiringLRUCache!

What is ExpiringLRUCache?

ExpiringLRUCache is a cache implementation that combines two powerful concepts: The Least Recently Used (LRU) cache eviction policy and an expiration time for cache entries. In simple terms, it remembers the most recently used items up to a specified limit and for a particular duration. After the duration expires or when the cache is full, it starts removing items, making room for the new ones. Sounds cool, right?
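Before we get to CLOCK, the LRU-plus-expiry idea itself fits in a few lines of standard-library Python. This is a toy sketch of the concept, not the actual ExpiringLRUCache implementation (the class name and layout here are my own):

```python
import time
from collections import OrderedDict

class SimpleExpiringLRU:
    """Toy LRU cache with per-entry expiry (illustrative sketch only)."""

    def __init__(self, capacity, ttl):
        self.capacity = capacity
        self.ttl = ttl                   # seconds each entry stays valid
        self._data = OrderedDict()       # key -> (value, expiry timestamp)

    def put(self, key, value):
        if key in self._data:
            del self._data[key]          # re-inserting refreshes position and TTL
        elif len(self._data) >= self.capacity:
            self._data.popitem(last=False)  # evict the least recently used entry
        self._data[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        item = self._data.get(key)
        if item is None:
            return None
        value, expires_at = item
        if time.monotonic() > expires_at:
            del self._data[key]          # entry has expired; drop it
            return None
        self._data.move_to_end(key)      # mark as most recently used
        return value
```

The OrderedDict keeps entries in access order, so the least recently used item is always at the front, ready to be evicted.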

The CLOCK Algorithm

But how does ExpiringLRUCache decide which item to remove first? Here comes the CLOCK algorithm! It's a smart, cheap approximation of the LRU policy. Each cache slot carries a reference bit that is set whenever the item is accessed. Imagine a clock hand sweeping over the slots: if an item's bit is set, it gets a "second chance", the bit is cleared, and the hand moves on. If the bit is already clear, it's time to say goodbye to that item!
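To make the clock hand concrete, here is a minimal sketch of CLOCK-style eviction combined with expiry. The class and its internals are my own invention for illustration, not the library's actual code:

```python
import time

class ClockCache:
    """Tiny CLOCK-eviction cache with expiry (illustrative sketch only)."""

    def __init__(self, capacity, ttl):
        self.capacity = capacity
        self.ttl = ttl
        self.slots = [None] * capacity  # each slot: [key, value, expires_at, ref_bit]
        self.index = {}                 # key -> slot position
        self.hand = 0                   # the clock hand

    def get(self, key):
        pos = self.index.get(key)
        if pos is None:
            return None
        slot = self.slots[pos]
        if time.monotonic() > slot[2]:  # expired entry: drop it
            self.slots[pos] = None
            del self.index[key]
            return None
        slot[3] = 1                     # set the reference bit ("second chance")
        return slot[1]

    def put(self, key, value):
        entry = [key, value, time.monotonic() + self.ttl, 1]
        pos = self.index.get(key)
        if pos is not None:             # key already cached: overwrite in place
            self.slots[pos] = entry
            return
        while True:                     # sweep the hand until we find a victim
            slot = self.slots[self.hand]
            if slot is None or slot[3] == 0:
                if slot is not None:
                    del self.index[slot[0]]   # evict the victim
                self.slots[self.hand] = entry
                self.index[key] = self.hand
                self.hand = (self.hand + 1) % self.capacity
                return
            slot[3] = 0                 # clear the ref bit and move on
            self.hand = (self.hand + 1) % self.capacity
```

Note that a recently read entry survives one pass of the hand (its bit gets cleared instead), while an untouched entry is evicted on sight. That's the whole trick: LRU-like behavior with just one bit per slot.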

How Can We Use ExpiringLRUCache?

Now, let's get our hands dirty with a fun example:

from expiringlrucache import ExpiringLRUCache

# Create a cache of size 3 with items expiring after 60 seconds
cache = ExpiringLRUCache(3, 60)

# Put some items in the cache
cache.put("apple", "delicious")
cache.put("banana", "yummy")
cache.put("cherry", "tasty")

# Get an item from the cache
print(cache.get("banana"))  # Output: "yummy"

# After 60 seconds...
print(cache.get("apple"))  # Output: None (as it's expired)

With just a few lines of code, we've implemented a speedy cache! 🚀

How to Improve It?

Though ExpiringLRUCache is quite efficient, there are a few ways to boost its performance:

  1. Eviction Policy: CLOCK only approximates LRU. For exact LRU behavior, consider a true LRU cache (e.g. one backed by an OrderedDict or a doubly linked list), at the cost of extra bookkeeping on every access.
  2. Parallelism: Under heavy concurrent access, a single lock becomes a bottleneck. Consider partitioning the cache into multiple segments, each with its own lock.
  3. Cache Size Tuning: Adjust the cache size based on its observed hit rate; a persistently low hit rate suggests the cache is too small.
  4. Dynamic Expiration Strategy: Assign longer expiration times to entries that are accessed more frequently, so hot items stay cached longer.
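The parallelism idea in particular is easy to sketch: shard keys across several independent caches, each guarded by its own lock, so threads touching different segments never contend. The names here (SegmentedCache, make_cache, DictCache) are hypothetical, not from any library:

```python
import threading

class SegmentedCache:
    """Shards keys across N independent caches, each with its own lock (sketch)."""

    def __init__(self, make_cache, segments=8):
        # make_cache is any zero-argument factory returning an object
        # with get(key) and put(key, value) methods
        self._segments = [make_cache() for _ in range(segments)]
        self._locks = [threading.Lock() for _ in range(segments)]

    def _pick(self, key):
        return hash(key) % len(self._segments)  # route each key to one segment

    def get(self, key):
        i = self._pick(key)
        with self._locks[i]:
            return self._segments[i].get(key)

    def put(self, key, value):
        i = self._pick(key)
        with self._locks[i]:
            self._segments[i].put(key, value)
```

Because each key always maps to the same segment, only threads that happen to hit the same segment ever wait on each other.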

Remember, it's all about finding the right balance based on your specific needs!


Caching is a powerful tool, and with Python's flexibility, we can customize it to our heart's content. ExpiringLRUCache and the CLOCK algorithm offer a low-overhead approach to caching that can make our applications noticeably faster. So, next time you find your program waiting for data, consider using an ExpiringLRUCache. It might just save your day!

Happy caching! 🎉
