tiangolo / fastapi

FastAPI framework, high performance, easy to learn, fast to code, ready for production

Home Page: https://fastapi.tiangolo.com/


Is there a built-in way to cache route responses?

zrachlin opened this issue · comments

Description
Hi! I'm coming from Flask and am very new to FastAPI.

I'm wondering if there is a built-in way to cache the results of API requests so that they can be returned automatically when requested again. Some of the routes I plan to make call external APIs and do some data processing on the results, so they take a few seconds to finish. With Flask, I've been using the Flask-Caching extension, which lets me put a decorator above the route/view to denote that I want to cache (or memoize, if there are input arguments) its result. So it gets called the first time the request is made, but all subsequent requests with the same arguments/parameters return the cached result.
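For reference, this is roughly the pattern I mean with Flask-Caching (a minimal sketch; the route, helper, and timeout values are just placeholders):

```python
from flask import Flask, jsonify
from flask_caching import Cache

app = Flask(__name__)
# Simple in-memory backend; a real deployment would use Redis, Memcached, etc.
cache = Cache(app, config={"CACHE_TYPE": "SimpleCache"})

@cache.memoize(timeout=300)  # memoize keys the cache entry on the function arguments
def fetch_and_process(symbol):
    # Placeholder for the slow external API call and data processing.
    return {"symbol": symbol, "value": 42}

@app.route("/data/<symbol>")
@cache.cached(timeout=300)  # the view runs once per path, then serves the cached response
def get_data(symbol):
    return jsonify(fetch_and_process(symbol))
```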

I'd love to figure out how to make this happen with FastAPI. I've been looking through the documentation for something related to caching, and the only mention I could find was the following on the sub-dependency page:

Using the same dependency multiple times
If one of your dependencies is declared multiple times for the same path operation, for example, multiple dependencies have a common sub-dependency, FastAPI will know to call that sub-dependency only once per request.
And it will save the returned value in a "cache" and pass it to all the "dependants" that need it in that specific request, instead of calling the dependency multiple times for the same request.

I'm still trying to fully understand dependencies/sub-dependencies, so maybe this is what I'm looking for? But it seems to be talking about using something multiple times within a single request, rather than across separate requests.
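If I'm reading the docs right, they are describing something like this: a shared sub-dependency is only evaluated once per request, but the cache does not persist across requests (a minimal sketch; the names are made up):

```python
from fastapi import Depends, FastAPI

app = FastAPI()

def load_settings():
    # Imagine this is expensive. FastAPI calls it only once per request,
    # even though both dependencies below declare it.
    print("loading settings")
    return {"api_key": "secret"}

def client_a(settings: dict = Depends(load_settings)):
    return {"client": "a", **settings}

def client_b(settings: dict = Depends(load_settings)):
    return {"client": "b", **settings}

@app.get("/items")
def read_items(a: dict = Depends(client_a), b: dict = Depends(client_b)):
    # "loading settings" prints once for this request, but it prints again
    # on the next request -- this cache is per-request, not cross-request.
    return {"a": a, "b": b}
```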

Any help/direction would be greatly appreciated. Thanks!

@euri10 Ok cool, thanks. Do you have to declare all of your path operation functions as async in order to be compatible with aiocache? Or will they work as normal non-async functions, since FastAPI works asynchronously behind the scenes (from what I've read)?

I'm just quoting this excellent post because I couldn't explain it better: https://www.aeracode.org/2018/02/19/python-async-simplified/

There are four cases:

Calling sync code from sync code. This is just a normal function call - like time.sleep(10). Nothing risky or special about this.

Calling async code from async code. You have to use await here, so you would do await asyncio.sleep(10)

Calling sync code from async code. You can do this, but as I said above, it will block the whole process and make things mysteriously slow, and you shouldn't. Instead, you need to give the sync code its own thread.

Calling async code from sync code. Trying to even use await inside a synchronous function is a syntax error in Python, so to do this you need to make an event loop for the code to run inside.
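A minimal sketch of those four cases with plain asyncio (nothing FastAPI-specific; the sleeps just stand in for real work):

```python
import asyncio
import time

def sync_work():
    time.sleep(1)            # 1. sync from sync: a normal blocking call

async def async_work():
    await asyncio.sleep(1)   # 2. async from async: must be awaited

async def caller():
    # 3. sync from async: don't call sync_work() directly (it would block the
    #    event loop); hand it to a thread via the loop's default executor.
    loop = asyncio.get_running_loop()
    await loop.run_in_executor(None, sync_work)
    await async_work()

if __name__ == "__main__":
    # 4. async from sync: you need an event loop to run the coroutine.
    asyncio.run(caller())
```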

@zrachlin I would think you might want to cache the external APIs' results, not necessarily the responses from your API.

Otherwise, someone could, for example, use a stale/invalid authentication token and still receive the response, because the response generated with that old token is still in the cache.

To use aiocache you would benefit more from async functions. Otherwise you would need custom tricks.
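As a rough sketch (untested; the helper name, URL, and TTL are placeholders, and it assumes httpx for the external call), caching the external call with aiocache could look like this:

```python
import httpx
from aiocache import cached
from fastapi import FastAPI

app = FastAPI()

@cached(ttl=300)  # cache entries are keyed on the function arguments and kept for 5 minutes
async def fetch_external(symbol: str) -> dict:
    # The slow part: calling the external API and processing the data.
    async with httpx.AsyncClient() as client:
        response = await client.get(f"https://api.example.com/data/{symbol}")
    return response.json()

@app.get("/data/{symbol}")
async def get_data(symbol: str):
    # The path operation stays async so the cached coroutine can be awaited.
    return await fetch_external(symbol)
```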

But if you use normal def functions, you could probably use a regular Redis Python package and do it yourself. Redis keys can have a TTL, so you get stale-data invalidation easily. It shouldn't be that hard to implement.
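With normal def path operations, a do-it-yourself version with the plain redis package might look roughly like this (a sketch; the key scheme, TTL, and helper are arbitrary placeholders):

```python
import json

import redis
from fastapi import FastAPI

app = FastAPI()
# Synchronous Redis client; fine inside normal `def` path operations,
# which FastAPI runs in a threadpool.
r = redis.Redis(host="localhost", port=6379, db=0)

def fetch_external(symbol: str) -> dict:
    # Placeholder for the slow external API call and data processing.
    return {"symbol": symbol, "value": 42}

@app.get("/data/{symbol}")
def get_data(symbol: str):
    key = f"data:{symbol}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)
    result = fetch_external(symbol)
    # SETEX stores the value with a TTL, so stale entries expire on their own.
    r.setex(key, 300, json.dumps(result))
    return result
```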

Assuming the original issue was solved, it will be automatically closed now. But feel free to add more comments or create new issues.