Use the unwrapped call in `solve_dependencies` to determine whether a dependency is a coroutine or an (async) generator.

Since non-async dependencies are run in a threadpool, it is generally preferable to use async dependencies. For dependencies that will not change during the lifetime of the server (like settings), it is preferable to use `functools.cache` to avoid duplicate work. There are two issues with this.

The first is that
`@cache` caches the coroutine itself, not its result, so the first call will succeed, but subsequent calls will fail with `RuntimeError: cannot reuse already awaited coroutine`. This is fixable with @serhiy-storchaka's `reawaitable` decorator outlined in python/cpython#90780.

The second issue is that when solving dependencies, FastAPI does not consider the
`__wrapped__` attribute, so because the `lru_cache_wrapper` object is not a coroutine function, FastAPI will not attempt to await it. With such a dependency this results in an error such as `AttributeError: 'coroutine' object has no attribute 'foo'`. This is the issue this PR solves: it calls `inspect.unwrap` on the dependency call and uses the unwrapped call to determine what type it is.

If anyone comes across this in the meantime, a reasonable workaround is to wrap the cached function in an undecorated function.
(Related to #5077)