The `paginate` function runs into severe performance issues when datasets are in the thousands.

We are using fastapi_pagination with SQLAlchemy on a query that returns 8000 records. Since `paginate` requires a `Sequence`, we must load the entire query result into memory and only then does `paginate` slice it. The average response time is above 1 second.
Our solution was the following, which uses SQLAlchemy's native `offset()` and `limit()`. Obviously this does not carry over to other database ORMs, so a better approach might be to allow a `slice_function` (analogous to the existing `length_function`) so that a queryset-type object can be passed instead of only a sequence (a rough sketch of that idea is at the end of this issue).
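The original snippet isn't reproduced here, but a minimal sketch of the workaround looks roughly like this. It assumes a recent fastapi_pagination with 1-based `Params.page` and the exported `create_page` helper; the `paginate_query` name and exact signatures are illustrative, not the library's API.

```python
from typing import TypeVar

from fastapi_pagination import Page, Params, create_page
from sqlalchemy.orm import Query

T = TypeVar("T")


def paginate_query(query: Query, params: Params) -> Page[T]:
    # Count rows in the database instead of materialising all of them.
    total = query.count()

    # Let the database do the slicing via OFFSET/LIMIT, so only one
    # page of rows is ever loaded into memory.
    items = (
        query.offset((params.page - 1) * params.size)
        .limit(params.size)
        .all()
    )

    return create_page(items, total=total, params=params)
```

In an endpoint this would be called with the un-executed `Query` (e.g. `db.query(User)`) and the `Params` dependency, with the response model staying `Page[...]` as before.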
This may be helpful for people.
The offset/limit-based function is roughly 100x faster: average response time is about 0.01 seconds instead of over 1 second.
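To make the `slice_function` suggestion concrete, here is a rough sketch of what a generic paginator with such a hook could look like. This is purely an illustration of the proposal, not existing fastapi_pagination code, and all names here are hypothetical.

```python
from typing import Any, Callable, Optional, Sequence, Tuple, TypeVar

T = TypeVar("T")


def paginate(
    source: Any,
    page: int,
    size: int,
    length_function: Callable[[Any], int] = len,
    slice_function: Optional[Callable[[Any, int, int], Sequence[T]]] = None,
) -> Tuple[Sequence[T], int]:
    """Return (items, total) for one page of ``source``.

    By default ``source`` is treated as an in-memory sequence. Passing a
    ``slice_function`` lets a queryset-like object (e.g. a SQLAlchemy
    ``Query``) do the slicing in the database, and a matching
    ``length_function`` avoids loading every row just to count them.
    """
    total = length_function(source)
    offset = (page - 1) * size

    if slice_function is not None:
        items = slice_function(source, offset, size)
    else:
        items = source[offset : offset + size]

    return items, total
```

For SQLAlchemy the hooks would be something like `length_function=lambda q: q.count()` and `slice_function=lambda q, offset, limit: q.offset(offset).limit(limit).all()`.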