Cache answers
ModischFabrications opened this issue
People trying this service out are likely to request the same thing multiple times.
Caching those results would save noticeable execution time.
This is somewhat related to #24
Measure RAM usage and performance for a comparison: caching will speed up repeated requests, but it adds a small overhead to every request and will increase RAM usage.
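A quick stdlib-only way to get rough numbers for that comparison (the `lookup` function here is a placeholder for the real backend call, not part of this project):

```python
import time
import tracemalloc
from functools import lru_cache


@lru_cache(maxsize=1024)
def lookup(query: str) -> str:
    # Placeholder for the real backend work.
    time.sleep(0.005)
    return query.upper()


def measure(queries):
    """Return (seconds elapsed, peak bytes allocated) for a batch of queries."""
    tracemalloc.start()
    start = time.perf_counter()
    for q in queries:
        lookup(q)
    elapsed = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return elapsed, peak
```

Running the same batch twice shows the trade-off directly: the second pass hits the cache and skips the backend work, while `peak` grows with the number of cached entries.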
It would also be nice to have a "/cached" path to debug successful caching and to offer "offline" answers
Remember to add a "cached" field to the response object
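A minimal sketch of that response shape, framework-independent; the dict-based store and the function names (`answer`, `expensive_lookup`, `cached_entries`) are assumptions for illustration, not the project's actual API:

```python
import time

# Hypothetical in-process cache keyed by the request parameters.
_cache: dict = {}


def expensive_lookup(query: str) -> dict:
    # Stand-in for the real backend computation.
    time.sleep(0.01)
    return {"query": query, "result": query.upper()}


def answer(query: str) -> dict:
    """Return the response object, with a 'cached' field telling
    the client whether the value came from the cache."""
    if query in _cache:
        return {**_cache[query], "cached": True}
    result = expensive_lookup(query)
    _cache[query] = result
    return {**result, "cached": False}


def cached_entries() -> list:
    """Debug view for a hypothetical '/cached' path: list cache keys."""
    return sorted(_cache)
```

Wired into FastAPI, `answer` would back the normal route and `cached_entries` the "/cached" debug route.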
Problem: Uvicorn spawns multiple worker processes in parallel, which makes keeping a shared cache difficult. Redis and other external stores would solve that, but they bring a lot of overhead for a small service. It might be possible to cache requests in a middleware before they are passed to the backend