Large StreamingResponse locks up server #8152
Description

I'm following the example @tiangolo suggested in #58 with StreamingResponse. In my case, I'm attempting to stream slices of an intervaltree in a chunked way, but the server locks up when the request is big.

Example

```python
@app.get('/datasets/{label}/intervals')
def intervals(label: str, begin: float, end: float):
    async def intervalGenerator():
        yield '['
        firstItem = True
        for r in db[begin:end]:  # db is an IntervalTree
            if not firstItem:
                yield ','
            yield json.dumps(db[label]['intervals'][r.data])
            firstItem = False
        yield ']'
    return StreamingResponse(intervalGenerator(), media_type='application/json')
```

The resulting JSON is consumed on the client side by oboe.js, a library built to handle chunked JSON on the fly, with mechanisms for rendering and then discarding chunks in the browser so that the whole response never has to be kept in memory. All goes well for small-ish requests, but when larger slices are requested, the whole server locks up (Control-C doesn't work; I have to […]).

Main question
Replies: 3 comments
@alex-r-bigelow you are using `json.dumps` to `yield` inside an `async` function. `json.dumps` is blocking, but as your function is `async`, it is running in the main loop, so it might be blocking your app.

Don't use `async`; use a normal function that returns a normal iterator.

In the latest versions, `StreamingResponse` can take a normal generator (not only an async generator).

That might solve your problem.
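A minimal sketch of that suggestion: the blocking `json.dumps` calls move into a plain (synchronous) generator, which recent Starlette/FastAPI versions accept in `StreamingResponse` and iterate off the event loop. The helper name `json_array_chunks` and the sample `items` below are illustrative, not from the thread:

```python
import json

def json_array_chunks(items):
    # Plain sync generator that emits a JSON array piece by piece.
    # In a recent FastAPI/Starlette this can be handed directly to
    # StreamingResponse(json_array_chunks(items), media_type='application/json')
    # instead of an async generator, so the blocking json.dumps calls
    # don't stall the event loop.
    yield '['
    first = True
    for item in items:
        if not first:
            yield ','
        yield json.dumps(item)
        first = False
    yield ']'

# Reassembling the chunks produces valid JSON:
body = ''.join(json_array_chunks([{'a': 1}, {'b': 2}]))
# body == '[{"a": 1},{"b": 2}]'
```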
Thanks! Updating to the latest version and using a normal generator seems to fix the main problem (at least the server-locking part; for anyone reaching this via Google, it looks like […]).
Thanks for reporting back and closing the issue!