Is there a built-in way to cache route responses? #651

Closed
zrachlin opened this issue Oct 24, 2019 · 5 comments
Labels: question, question-migrate

Comments

@zrachlin

Description
Hi! I'm coming from Flask and am very new to FastAPI.

I'm wondering if there is a built-in way to cache the results of API requests so that they can be returned automatically when requested again. Some of the routes I plan to make call external APIs and do some data processing on the results, so they take a few seconds to finish. With Flask, I've been using the Flask-Caching extension, which lets me put a decorator above the route/view to denote that I want to cache (or memoize, if there are input arguments) its result. The function gets called the first time the request is made, but all subsequent requests with the same arguments/parameters return the cached result.
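For context, here is roughly what that Flask-Caching pattern looks like (a minimal sketch from memory; check the Flask-Caching docs for the exact config keys, and the slow_report view is invented for illustration):

```python
from flask import Flask, jsonify
from flask_caching import Cache

app = Flask(__name__)
cache = Cache(app, config={"CACHE_TYPE": "simple"})  # simple in-memory cache


@app.route("/report")
@cache.cached(timeout=300)  # first call runs the slow code; repeats within 5 min return the cached response
def slow_report():
    # ...call external APIs and do the slow data processing here...
    return jsonify({"status": "done"})

# @cache.memoize(timeout=300) is the variant keyed on the function's arguments.
```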

I'd love to figure out how to make this happen with FastAPI. I've been looking through the documentation for something related to caching, and the only mention I could find was the following on the sub-dependency page:

Using the same dependency multiple times
If one of your dependencies is declared multiple times for the same path operation, for example, multiple dependencies have a common sub-dependency, FastAPI will know to call that sub-dependency only once per request.
And it will save the returned value in a "cache" and pass it to all the "dependants" that need it in that specific request, instead of calling the dependency multiple times for the same request.

I'm still trying to fully comprehend dependencies/sub-dependencies, so maybe this is what I'm looking for? But this seems to be about using something multiple times within a single request, rather than across separate requests.
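For illustration, here is roughly what that docs passage describes: the "cache" only lasts for one request. All names below (get_settings, dep_a, dep_b) are invented for the example.

```python
from fastapi import Depends, FastAPI

app = FastAPI()


def get_settings() -> dict:
    # Shared sub-dependency: called at most once per request.
    print("loading settings")
    return {"feature": "on"}


def dep_a(settings: dict = Depends(get_settings)) -> str:
    return settings["feature"]


def dep_b(settings: dict = Depends(get_settings)) -> str:
    return settings["feature"]


@app.get("/check")
def check(a: str = Depends(dep_a), b: str = Depends(dep_b)):
    # Within this single request, get_settings ran once and its return value
    # was reused by both dependants; a new request calls it again, so this is
    # not a cross-request response cache.
    return {"a": a, "b": b}
```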

Any help/direction would be greatly appreciated. Thanks!

@zrachlin added the question label Oct 24, 2019
@euri10
Contributor

euri10 commented Oct 24, 2019 via email

@zrachlin
Author

@euri10 Ok cool, thanks. Do you have to declare all of your path operation functions as async in order to be compatible with aiocache? Or will they work as normal non-async functions, since FastAPI works asynchronously behind the scenes (from what I've read)?

@euri10
Contributor

euri10 commented Oct 25, 2019

I'm just quoting this excellent post because I couldn't explain it better (https://www.aeracode.org/2018/02/19/python-async-simplified/).

There are four cases:

Calling sync code from sync code. This is just a normal function call - like time.sleep(10). Nothing risky or special about this.

Calling async code from async code. You have to use await here, so you would do await asyncio.sleep(10)

Calling sync code from async code. You can do this, but as I said above, it will block the whole process and make things mysteriously slow, and you shouldn't. Instead, you need to give the sync code its own thread.

Calling async code from sync code. Trying to even use await inside a synchronous function is a syntax error in Python, so to do this you need to make an event loop for the code to run inside.
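A minimal standard-library sketch of those four cases (the helper names are invented for illustration):

```python
import asyncio
import time


def sync_work() -> str:
    # Case 1: sync from sync -- just a normal, blocking function call.
    time.sleep(1)
    return "done"


async def async_work() -> str:
    # Case 2: async from async -- must be awaited.
    await asyncio.sleep(1)
    return "done"


async def call_sync_from_async() -> None:
    # Case 3: sync from async -- hand the blocking call to its own thread so it
    # doesn't stall the event loop (asyncio.to_thread needs Python 3.9+; older
    # versions can use loop.run_in_executor instead).
    print(await asyncio.to_thread(sync_work))


def call_async_from_sync() -> None:
    # Case 4: async from sync -- you need an event loop to run the coroutine.
    print(asyncio.run(async_work()))
```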

@tiangolo
Owner

@zrachlin I would think you might want to cache the external APIs' results, not necessarily the responses from your API.

Otherwise, someone could, for example, use a stale/invalid authentication token and still receive the response, because the result for that old token input (with its extra data) is still in the cache.

To use aiocache you would benefit more from async functions. Otherwise you would need custom tricks.
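A rough sketch of that async route with aiocache's cached decorator (assuming the aiocache and httpx packages; get_prices and the example URL are invented, so check the aiocache docs for the exact options):

```python
import httpx
from aiocache import cached
from fastapi import FastAPI

app = FastAPI()


@cached(ttl=60)  # memoize by arguments in aiocache's default in-memory cache for 60 s
async def get_prices(symbol: str) -> dict:
    async with httpx.AsyncClient() as client:
        response = await client.get("https://example.com/prices", params={"symbol": symbol})
        return response.json()


@app.get("/prices/{symbol}")
async def read_prices(symbol: str):
    # The first request hits the external API; repeats within the TTL are served from the cache.
    return await get_prices(symbol)
```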

But if you use normal def functions you could probably use a normal Redis Python package and do it yourself. Redis keys support a TTL, so you can get stale-data invalidation easily. It shouldn't be that hard to implement.
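And a rough sketch of that Redis approach with plain def functions (assuming the redis-py and requests packages and a local Redis server; fetch_weather, CACHE_TTL, and the example URL are invented):

```python
import json

import redis
import requests
from fastapi import FastAPI

app = FastAPI()
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)
CACHE_TTL = 300  # seconds before Redis drops the key (stale-data invalidation)


def fetch_weather(city: str) -> dict:
    # Cache the *external* API result, keyed by the input arguments.
    key = f"weather:{city}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    data = requests.get("https://example.com/weather", params={"city": city}).json()
    cache.setex(key, CACHE_TTL, json.dumps(data))  # SETEX = set the value with a TTL
    return data


@app.get("/weather/{city}")
def read_weather(city: str):
    # A plain `def` path operation: FastAPI runs it in a threadpool, so the
    # blocking Redis and requests calls don't stall the event loop.
    return fetch_weather(city)
```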

@github-actions
Contributor

Assuming the original issue was solved, it will be automatically closed now. But feel free to add more comments or create new issues.

@tiangolo changed the title from "[QUESTION] Is there a built-in way to cache route responses?" to "Is there a built-in way to cache route responses?" Feb 24, 2023
@tiangolo reopened this Feb 28, 2023
Repository owner locked and limited conversation to collaborators Feb 28, 2023
@tiangolo converted this issue into discussion #7958 Feb 28, 2023

This issue was moved to discussion #7958. You can continue the conversation there.
