
Discussion on: Cache Me Outside: Speed Boosts in Python

Walker Harrison

This is neat, and I love the Fibonacci example. Would it ever make sense to do something like "pre-emptive caching," whereby, in this case, you preload a few answers along the way so that even if something isn't cached there are bookmarks that reduce the runtime? For example, would having the answers to powers of 2 baked in cut the required time in half before the function was ever called?

Ryan Palo

Yeah, it seems like it. It doesn't look like there is an easy way to do that with the standard library @lru_cache, but you could definitely do it if you rolled your own. I guess the other option if you were going to have it run for a long time (like for a web-app or something) would be to provide a preload function that just exercises those inputs artificially:

def preload(func, inputs):
    """Runs the function with the provided list of inputs,
    filling in the cache before expected client use."""
    for value in inputs:
        func(value)
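So for the Fibonacci example from the post, warming the cache along the lines you suggested might look something like this, using the preload function above (the specific inputs are just an illustration of the powers-of-2 idea):

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Naive recursive Fibonacci, sped up by the cache."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# Fill in a few "bookmarks" before any client ever calls fib
preload(fib, [16, 32, 64])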

I'm working on another post about decorators, so I'll work a preloaded memoization example into it :)
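In the meantime, a rough sketch of the rolled-your-own option could be a decorator that takes a seed dictionary of known answers (the names and values below are purely illustrative, not from the post):

from functools import wraps

def memoize(seed=None):
    """Memoizes a single-argument function, optionally starting
    from a pre-filled cache of known answers."""
    def decorator(func):
        cache = dict(seed) if seed else {}

        @wraps(func)
        def wrapper(n):
            if n not in cache:
                cache[n] = func(n)
            return cache[n]
        return wrapper
    return decorator

# Bake in the base cases plus one known power-of-2 answer up front
@memoize(seed={0: 0, 1: 1, 32: 2178309})
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)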

Walker Harrison

Awesome. Looking forward to your next post!

Ryan Palo

Just went out on Monday! dev.to/rpalo/unwrapping-decorators...