This repository has been archived by the owner on Mar 15, 2020. It is now read-only.
Streaming requests and responses #71
Labels
enhancement: new feature or improvement of an existing one
refactor: this will require refactoring the code
Is your feature request related to a problem? Please describe.
There is currently no way to process a request as a stream or to send a response as a stream. Yet this is useful when the request's or the response's body is too large to load fully into memory.
Note: this is not the same as chunked requests or responses, which are determined by the `Transfer-Encoding` header.

Describe the solution you'd like
We should be able to read the request body as a stream or send a response as a stream.
Reading the request as a stream could work by iterating asynchronously over the `req` object, i.e. `async for chunk in req`. Sending a streamed response could work by having a generator `yield` chunks of the response. The generator could be registered by decorating it, e.g. `@res.stream`.
Describe alternatives you've considered
I thought about providing an `@stream` decorator for view handlers themselves. But that would not have allowed using the response object at all. This is because the generator is called by Starlette while the response is being sent, and the Bocadillo `Response` is pretty much a Starlette response builder which does its work before the response is even created.

Additional context
Useful Starlette features: `request.stream()`, `StreamingResponse`.
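For a sense of what `StreamingResponse` does on the wire, here is a rough, stdlib-only mimic of its send loop (this is an illustration of the ASGI message flow, not Starlette's actual code): each chunk of the body iterator becomes an `http.response.body` message with `more_body` flagging whether more follows.

```python
import asyncio


async def streaming_response_send(body_iterator, send):
    """Mimic of StreamingResponse: drain an async iterator into ASGI messages."""
    await send({"type": "http.response.start", "status": 200})
    async for chunk in body_iterator:
        await send({"type": "http.response.body", "body": chunk, "more_body": True})
    # A final empty-body message tells the server the response is complete.
    await send({"type": "http.response.body", "body": b"", "more_body": False})


async def numbers():
    # Example body generator: three small chunks, produced lazily.
    for i in range(3):
        yield str(i).encode()


messages = []


async def send(message):
    # Stand-in for the ASGI `send` callable: just record each message.
    messages.append(message)


asyncio.run(streaming_response_send(numbers(), send))
```

Because the body iterator is drained while messages are being sent, nothing is buffered, which is also why the generator runs after the response object has finished its work, as described above.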