lonelyenvoy / python-memoization

A powerful caching library for Python, with TTL support and multiple algorithm options.

Use the cache directly without a decorator

mikhail opened this issue

I need to get/set values in an LFU cache directly, rather than through a function decorator. The use case looks like this:

def slow_function(*args, **kwargs):
    cache = choose_cache_out_of_many(*args)
    found = cache.get(*args, **kwargs)
    if found: return found

    result = slow_code()

    cache.set(result, *args, **kwargs)
    return result

Because there are multiple caches and the right one is only known inside the function being cached, this pattern cannot be expressed with a decorator.

How can I access memoization caches directly?
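For reference, each decorated wrapper already exposes a few cache-management helpers (per the project README, as far as I can tell), though nothing like a general get/set interface:

from memoization import cached

@cached(max_size=128)
def square(x):
    return x * x

square(3)
square(3)
print(square.cache_info())   # hit/miss statistics and current size for this wrapper's cache
square.cache_clear()         # drop every entry in this wrapper's cache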

I have 16 different caches and I need to keep them separate, so that overflowing cache 3 does not evict values from cache 1, and so on.

I was able to come up with a hacky way of handling this, but a natural implementation would still be welcome. Pseudocode:

import memoization as mem

def __init__(self):
  self.caches = []
  for _ in range(16):
    # each wrapper around get_next_state owns its own independent cache storage
    new_cache = mem.cached(custom_key_maker=self._custom_keys)(self.get_next_state)
    self.caches.append(new_cache)

def get_next_state(self, arg1, arg2, arg3, use_cache=True):
  if use_cache:
    cache_idx = get_cache_id(arg1)
    cache = self.caches[cache_idx]
    # re-enter this method through the selected wrapper; the inner call runs with
    # use_cache=False, so it falls through to the real computation instead of recursing again
    return cache(arg1, arg2, arg3, use_cache=False)
  # ... the actual slow computation goes here ...

This creates a once-recursive call that routes through one cache out of the list of caches; I also had to use custom_key_maker to ignore the use_cache flag so that it does not become part of the cache key.
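A somewhat cleaner variant of the same idea, sketched under the assumption that the expensive work can be factored into a standalone function: decorate that function once per bucket, so each wrapper keeps an independent LFU cache and no recursive call or key-maker trick is needed. (get_cache_id and the argument names are placeholders carried over from the pseudocode above.)

from memoization import cached, CachingAlgorithmFlag

def _compute_next_state(arg1, arg2, arg3):
    # placeholder for the actual slow computation
    ...

# 16 independent wrappers around the same function; each has its own LFU storage,
# so filling bucket 3 never evicts entries from bucket 1
_cached_computes = [
    cached(max_size=1024, algorithm=CachingAlgorithmFlag.LFU)(_compute_next_state)
    for _ in range(16)
]

def get_next_state(arg1, arg2, arg3, use_cache=True):
    if not use_cache:
        return _compute_next_state(arg1, arg2, arg3)
    # route the call through the wrapper that owns this argument's bucket
    return _cached_computes[get_cache_id(arg1)](arg1, arg2, arg3)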

Yes, upvote for sure. Any sophisticated caching usage quickly escapes the boundaries imposed by decorators.