syrusakbary / aiodataloader

Asyncio DataLoader for Python3

Cache the Data after future gets resolved

rajanjha786 opened this issue · comments

Hello,
I'm trying to use a Redis-backed dict instead of a regular dict as the cache for the DataLoader.
However, aiodataloader caches the future object directly instead of the resolved value, so the Redis-backed dict ends up storing the str representation of the future.

class ABCLoader(DataLoader):

    async def batch_load_fn(self, keys):
        # Some processing that returns one result per key
        return response

loader = ABCLoader(cache_map=RedisDict(namespace="ABC", expires=600))

# The value that gets stored in Redis is
key = "ABC:key1"
value = "Future:<Future pending>"

Is there any way to cache the actual value instead of the future?

It makes sense for the cache to store the future object rather than the actual value, because we await the dataloader.load method and it has to return a future, but that means the Redis-backed cache can't be used with the default implementation of the DataLoader.
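For context, a minimal usage sketch (reusing the hypothetical ABCLoader and RedisDict names from above) showing that load hands back a future which the caller awaits:

async def main():
    loader = ABCLoader(cache_map=RedisDict(namespace="ABC", expires=600))
    # load() returns a future; awaiting it yields the batched result for "key1"
    value = await loader.load("key1")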

One solution could be to override the load method of the DataLoader: register a callback on the future to save the resolved value in the cache, and when retrieving a value from the cache, wrap it in a future and return that. For example:

class RedisDataloader(DataLoader):

    async def load(self, key):
        ....
        if self.cache and key in self._cache:
            # Wrap the cached value in an already-resolved future
            cached_result = self._cache.get(key)
            future = self.loop.create_future()
            future.set_result(cached_result)
            return future
        ........
        if self.cache:
            def cache_value(fut):
                # Store the resolved value, not the future itself
                self._cache[key] = fut.result()
            future.add_done_callback(cache_value)
        ..........
        return future

Please let me know if there is a better way to do it.
Thanks

A more correct approach might be to wrap the RedisDict in a custom class which implements the behavior you need. That way you don't need to access the internal _cache member.
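For illustration, here is a minimal sketch of such a wrapper, assuming the DataLoader only touches the cache map through dict-style get / __setitem__ / __contains__ calls (worth double-checking against the aiodataloader source) and that RedisDict handles serialization of plain values; all names besides asyncio are taken from the question and are illustrative:

import asyncio

class RedisCacheMap:
    # Hypothetical wrapper around a Redis-backed dict: it persists resolved
    # values to Redis and keeps still-pending futures in an in-memory dict.

    def __init__(self, redis_dict):
        self._redis = redis_dict       # e.g. RedisDict(namespace="ABC", expires=600)
        self._pending = {}             # key -> not-yet-resolved future

    def get(self, key):
        # A future that is still pending only exists in memory
        if key in self._pending:
            return self._pending[key]
        # A resolved value comes back from Redis wrapped in a finished future
        if key in self._redis:
            future = asyncio.get_event_loop().create_future()
            future.set_result(self._redis[key])
            return future
        return None

    def __setitem__(self, key, future):
        # The DataLoader stores a pending future; write the value to Redis
        # only once that future resolves successfully
        self._pending[key] = future

        def store_value(fut):
            self._pending.pop(key, None)
            if not fut.cancelled() and fut.exception() is None:
                self._redis[key] = fut.result()

        future.add_done_callback(store_value)

    def __contains__(self, key):
        return key in self._pending or key in self._redis

loader = ABCLoader(cache_map=RedisCacheMap(RedisDict(namespace="ABC", expires=600)))

If you also rely on clear or prime, the wrapper would need the corresponding dict methods as well.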

Closing this since it's answered.