Very slow response during pagination for large datasets #286
I'm experiencing an issue where a GET request to an API endpoint takes roughly 3-30 seconds (without any pre- or postprocessors attached to it). The format of the result is attached below: (81,399 records, 10 per page, 8,140 pages)
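Those numbers hang together: to emit the `rel="last"` link, the server has to know the total row count so it can compute the last page number, which is why every paginated response triggers a COUNT. The arithmetic:

```python
import math

total_records = 81399
per_page = 10

# The last page number in the Link header is ceil(total / per_page),
# so a full COUNT over the table is needed on every paginated request.
last_page = math.ceil(total_records / per_page)
print(last_page)  # 8140
```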
Structure of model:
The Developer Tools request time:
I suspect this is because of the rel="last" in the Link header.
Could it be related to this issue? Stack Overflow: Why is SQLAlchemy count() much slower than the raw query?
A count query on the SQLAlchemy Model of the application highlighting the same:
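The point of that Stack Overflow answer is that SQLAlchemy's `query.count()` wraps the entire query in a subquery (`SELECT count(*) FROM (SELECT ...)`), which some databases execute far more slowly than counting a single indexed column. A minimal illustration of the two query shapes using the standard-library `sqlite3` module (the `record` table and its columns are made up for this example; the real model isn't shown in the issue):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE record (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO record (payload) VALUES (?)",
    [("row %d" % i,) for i in range(1000)],
)

# The shape SQLAlchemy's query.count() emits: count over a subquery
# that selects every column.
subquery_count = conn.execute(
    "SELECT count(*) FROM (SELECT record.id, record.payload FROM record)"
).fetchone()[0]

# The cheaper shape: count a single (primary-key) column directly.
direct_count = conn.execute(
    "SELECT count(record.id) FROM record"
).fetchone()[0]

print(subquery_count, direct_count)  # both 1000
```

Both queries return the same number; the difference is purely in how the database plans them, and on large tables the subquery form can be dramatically slower.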
(added a commit and referenced this issue on Feb 11, 2014)
Well, it seems like you fixed your issue while I was trying to give a hint on how to figure out what the slow part is.
I'd still be interested in how much you gained from that simple fix you proposed.
Here is my written answer in case it's helpful to anyone in the future:
I've experienced the same issue with .count() being slow. A much higher penalty (at least in my environments) was the serialization done in to_dict(). You can time each part separately to see which one dominates.
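A rough way to separate the two costs, sketched with only the standard library (the query and the per-row `to_dict()` serializer here are stand-ins, since the issue doesn't show the real model):

```python
import time

def timed(label, fn, *args):
    """Run fn, print how long it took, and return its result --
    a crude alternative to a full profiler."""
    start = time.perf_counter()
    result = fn(*args)
    print("%s: %.1f ms" % (label, (time.perf_counter() - start) * 1000.0))
    return result

# Stand-ins for the real query and serializer:
def run_count():
    return sum(1 for _ in range(81399))       # pretend COUNT(*)

def serialize(rows):
    return [{"id": r} for r in rows]          # pretend to_dict() per row

total = timed("count", run_count)
page = timed("to_dict", serialize, range(10))
```

Comparing the two printed timings against the real query and serializer tells you whether the COUNT or the serialization is eating the request time.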
Also, using the ProfilerMiddleware from Werkzeug will tell you which parts of your code are the bottleneck:
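A minimal sketch of wiring that up, assuming a plain WSGI callable since the issue doesn't show the app setup (on a modern Werkzeug the import path is `werkzeug.middleware.profiler`; older releases shipped it as `werkzeug.contrib.profiler`):

```python
from werkzeug.middleware.profiler import ProfilerMiddleware

def application(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello"]

# Wrap the WSGI app; each request then prints a per-call profile,
# restricted here to the 25 slowest entries.
# With Flask this would be: app.wsgi_app = ProfilerMiddleware(app.wsgi_app, ...)
application = ProfilerMiddleware(application, restrictions=[25])
```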
Even after using the primary_key_name function, the build tests fail. I'll have to look into the build logs to isolate the bug.
It took anywhere between 3 and 30 seconds for a standalone API call on that particular model before the temporary fix; now I haven't seen it go above 400 ms.