When working on APIs, especially ones that display information, we often end up making many identical calls. So, to avoid overusing our resources and to gain speed, we can set up a cache directly in our Python code.
Today, we will see how to do it with lru_cache.
Setup
Like a lot of Python libraries, lru_cache is available through pip.
PyPI - lru_cache: https://pypi.org/project/lru_cache/
pip install lru_cache
How to use it
from cache import LruCache

@LruCache(maxsize=2, timeout=1)
def foo(num):
    return num
As you can see, it's really easy to use: just add the decorator above the function you want to cache.
Parameters
maxsize
maxsize is the maximum number of distinct calls to the method that can be cached at the same time.
"Distinct calls" means that at least one argument changed.
Example
foo(1) # foo is called and its result is added to the cache
foo(1) # Value returned directly from the cache
foo(2) # foo is called and its result is added to the cache
foo(3) # foo is called and its result is cached, evicting the result of foo(1) since the max size is 2
foo(2) # Value returned directly from the cache
If no value is given for this parameter, every distinct call will be cached.
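If you only need the maxsize behavior (without the timeout), the standard library's functools.lru_cache works the same way. A minimal sketch of the eviction sequence above, using only the stdlib:

```python
from functools import lru_cache

@lru_cache(maxsize=2)
def foo(num):
    return num

foo(1)  # miss: computed and cached
foo(1)  # hit: returned from the cache
foo(2)  # miss: computed and cached
foo(3)  # miss: cached, evicting foo(1), the least recently used entry
foo(2)  # hit: foo(2) is still cached

info = foo.cache_info()
print(info.hits, info.misses, info.currsize)  # 2 3 2
```

cache_info() is handy for verifying for yourself which calls were served from the cache.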
timeout
timeout is the number of seconds a call's result is kept before the method is called again.
foo(1) # Called at 10:00:00 - foo is called and its result is added to the cache
foo(1) # Called at 10:00:00 - Value returned directly from the cache
foo(2) # Called at 10:00:00 - foo is called and its result is added to the cache
foo(2) # Called at 10:00:00 - Value returned directly from the cache
foo(1) # Called at 10:00:01 - Value returned directly from the cache
foo(1) # Called at 10:00:02 - Previous value expired; foo is called and its result is added to the cache
If no value is given for this parameter, results are kept until the Python app restarts.
Invalidate cache
If you need to invalidate a value in your cache, you can use the following line:
foo.invalidate(num)
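For comparison, the stdlib decorator cannot invalidate a single entry; it only offers cache_clear(), which drops everything. A quick sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def foo(num):
    return num

foo(1)
foo(2)
print(foo.cache_info().currsize)  # 2 entries cached
foo.cache_clear()                 # drops the whole cache, not just one key
print(foo.cache_info().currsize)  # 0
```

Per-key invalidation like foo.invalidate(num) is one of the things the third-party package adds on top.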
And that's it! With only three lines, we can add caching and invalidate some data to improve performance and reduce resource usage!
I hope this helps you. If you have any questions (there are no dumb questions) or if some points are unclear, don't hesitate to ask in the comments or to contact me directly on LinkedIn.
Top comments (6)
One of your examples reads:
Why is the call marked with (*) not resolved by returning 2 from the cache? After each call, I expect the cache to be:
AFAICT the timeout has no bearing on this example.
What am I missing?
Did you set up a timeout value? Maybe this point is not completely clear. I've given an example of how to use it, but it's a general case. The examples for maxsize and timeout are there to explain how changing a parameter value interacts with the cache behavior. Also, just to be sure, I did a test with logs to check when the method is called,
and I have the following results:
As you can see, the foo log appears each time the method is called, and it is not called the second time I call foo(2).
So is the post not clear enough about the examples and the general way to use lru_cache?
You have verified that your example is incorrect. Your original example said:
It should be:
since, as you show:
the last result :: 2 does not call foo() but returns 2 from the cache.

Good catch, it's a mistake on my part from a copy-paste during translation.
lru_cache is already implemented in functools, which is bundled with Python, so there is no need to import another package:
docs.python.org/3/library/functool...
Yes, but it's not the same thing. As you can see in the screenshot, it says "Returns the same as lru_cache(maxsize=None)", and one thing I want to show is the timeout option from lru_cache. That's why I talked about lru_cache and not about this built-in Python feature. But it's a great point to let everyone know that the option is available by default in Python. Thanks for that.
Many times I had to add a code sample found online just to get this expiration feature, so now that it's available I want to share it! (Maybe the feature is not so recent, but I recently had to go back to a Python project where I needed this kind of caching.)