Performance issues with paginate() #273
Labels: question (further information is requested)
This may be helpful for people.

The `paginate` function runs into severe performance issues when datasets are in the thousands. We are using fastapi_pagination with SQLAlchemy, with a result of 8000 records. Since `paginate` requires a `Sequence`, we must load the entire query result into memory, and then `paginate` slices the result. Average response time is above 1 second with the `paginate` function in the package.

Our solution was to do the following, which uses the native `offset()` and `limit()` from SQLAlchemy. Obviously this does not apply to other database ORMs, so a better approach would potentially be to allow a `slice_function` (like the existing `length_function`) and to accept a `queryset`-type object instead of only a sequence.

Our solution

The new function is 100x faster (.01 seconds).
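A minimal sketch of such an OFFSET/LIMIT-based helper (the name `paginate_query` and its defaults are assumptions, not the exact code from the issue):

```python
# Hypothetical sketch: push pagination down to the database with OFFSET/LIMIT
# instead of materializing all rows in Python and slicing the resulting list.
from typing import Any, Dict

from sqlalchemy.orm import Query


def paginate_query(query: Query, page: int = 1, size: int = 50) -> Dict[str, Any]:
    # COUNT(*) runs in the database; no rows are loaded just to measure length.
    total = query.order_by(None).count()

    # Only one page of rows is fetched from the database.
    items = query.offset((page - 1) * size).limit(size).all()

    return {"items": items, "total": total, "page": page, "size": size}
```

With 8000 rows, this transfers at most `size` rows per request, which is where the reported ~100x speedup would come from.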
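The suggested `slice_function`/`length_function` generalization could look roughly like this (a hypothetical `paginate_any` helper, not part of fastapi_pagination):

```python
# Hypothetical shape of the proposal: accept any "queryset"-like object plus
# callables that know how to measure and slice it, so ORMs can do the work in SQL.
from typing import Any, Callable, Dict, Sequence, TypeVar

T = TypeVar("T")


def _default_length(seq: Sequence[T]) -> int:
    return len(seq)


def _default_slice(seq: Sequence[T], offset: int, limit: int) -> Sequence[T]:
    return seq[offset : offset + limit]


def paginate_any(
    source: Any,
    page: int = 1,
    size: int = 50,
    length_function: Callable[[Any], int] = _default_length,
    slice_function: Callable[[Any, int, int], Any] = _default_slice,
) -> Dict[str, Any]:
    total = length_function(source)
    items = list(slice_function(source, (page - 1) * size, size))
    return {"items": items, "total": total, "page": page, "size": size}


# For a SQLAlchemy Query, the callables push both steps into the database:
# paginate_any(
#     query,
#     length_function=lambda q: q.order_by(None).count(),
#     slice_function=lambda q, offset, limit: q.offset(offset).limit(limit).all(),
# )
```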
Hi @fsnlarson, see fastapi-pagination/fastapi_pagination/ext/sqlalchemy.py, lines 14 to 32 (at 5a9ad78).

That extension is the right solution. The docs should maybe be updated to show an example of using an extension.
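A hedged sketch of what such a docs example might look like (the model, schema, and session setup are illustrative assumptions, and it assumes a version of the extension whose `paginate` accepts a legacy SQLAlchemy `Query`):

```python
# Hypothetical usage sketch of fastapi_pagination.ext.sqlalchemy.paginate.
from fastapi import Depends, FastAPI
from pydantic import BaseModel
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, sessionmaker

from fastapi_pagination import Page, Params
from fastapi_pagination.ext.sqlalchemy import paginate

engine = create_engine("sqlite:///./app.db")
SessionLocal = sessionmaker(bind=engine)
Base = declarative_base()


class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)


class UserOut(BaseModel):
    id: int
    name: str

    class Config:
        orm_mode = True  # pydantic v1-style; lets Page[UserOut] read ORM rows


def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()


app = FastAPI()


@app.get("/users", response_model=Page[UserOut])
def list_users(params: Params = Depends(), db: Session = Depends(get_db)):
    # The extension issues COUNT plus OFFSET/LIMIT in SQL, so only one page
    # of rows is ever loaded into memory.
    return paginate(db.query(User), params)
```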