I'm experiencing an issue where a GET request on an API endpoint takes approx. 3-30 seconds (without any pre- or post-processors attached to it). The format of the result is attached below: (81399 records, 10 per page, 8140 pages)
Structure of model:
class TimeSeries(db.Model, ValidationMixin):
    id = db.Column(db.Integer, primary_key=True)
    channel_id = db.Column(db.Integer, db.ForeignKey('channel.id'))
    sensor_id = db.Column(db.Integer, db.ForeignKey('sensor.id'))
    time = db.Column(db.DateTime)
    value1 = db.Column(db.Unicode(255))
    value2 = db.Column(db.Unicode(255))
    value3 = db.Column(db.Unicode(255))
    value4 = db.Column(db.Unicode(255))
    # will validate nullability and string types
The Developer Tools request time:
I suspect this is because of the rel="last" in the Link header.
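For context on that suspicion: emitting a rel="last" link requires knowing the total row count, so every page request has to run a COUNT over the whole table in addition to the page query itself. The page arithmetic is trivial; a minimal sketch (the helper name is made up for illustration):

```python
import math

def last_page(total_records: int, per_page: int) -> int:
    # The rel="last" Link header needs the final page number,
    # which can only be computed from a full row count.
    return math.ceil(total_records / per_page)

# The numbers reported above: 81399 records, 10 per page
print(last_page(81399, 10))  # → 8140
```

So the expensive part is not computing 8140, it's the COUNT needed to get 81399 on every request.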
Would still be interested in how much you could gain from that simple fix you proposed.
Here is my written answer in case it's helpful to anyone in the future:
I've experienced the same issue with .count() being slow. A much higher penalty (at least in my environments) was the serialization done in to_dict().
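A quick way to see the two costs separately is to time the count query and the page fetch independently. Here is a minimal sketch using an in-memory SQLite table as a stand-in for the real database (table name and row count are made up for illustration):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE time_series (id INTEGER PRIMARY KEY, value1 TEXT)")
conn.executemany(
    "INSERT INTO time_series (value1) VALUES (?)",
    (("x",) for _ in range(100_000)),
)

# The count has to touch the whole table...
start = time.perf_counter()
total = conn.execute("SELECT COUNT(*) FROM time_series").fetchone()[0]
count_time = time.perf_counter() - start

# ...while the page fetch only reads ten rows.
start = time.perf_counter()
page = conn.execute(
    "SELECT id, value1 FROM time_series LIMIT 10 OFFSET 0"
).fetchall()
page_time = time.perf_counter() - start

print(f"count: {total} rows counted in {count_time:.4f}s")
print(f"page:  {len(page)} rows fetched in {page_time:.4f}s")
```

The same asymmetry shows up with ORM queries, and on top of it each request pays the per-row to_dict() serialization cost for the page it returns.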
You could try enabling SQLAlchemy query logging to figure out how long the individual queries take. You can do so like this:
app.config['SQLALCHEMY_RECORD_QUERIES'] = True

from flask_sqlalchemy import get_debug_queries
for query in get_debug_queries():
    print(query.statement, query.duration)
Also, using the ProfilerMiddleware from Werkzeug will tell you which parts of your code are the bottleneck:
from werkzeug.middleware.profiler import ProfilerMiddleware
app.wsgi_app = ProfilerMiddleware(app.wsgi_app, restrictions=[20])
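Under the hood, ProfilerMiddleware just runs each request under cProfile. If you can't attach middleware, the same measurement works on any callable with the standard library alone; slow_handler here is a made-up stand-in for a view function:

```python
import cProfile
import io
import pstats

def slow_handler():
    # Stand-in for a view: serialize many rows to dicts
    rows = [{"id": i, "value1": str(i)} for i in range(50_000)]
    return len(rows)

profiler = cProfile.Profile()
result = profiler.runcall(slow_handler)

# Print the top 5 entries by cumulative time
stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream)
stats.sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

The stats output pinpoints which function the request spends its time in, which is how I found to_dict() dominating in my case.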