Streaming Large Files #8229
-
I am working with the basic example in the documentation that shows how to work with files. Is it possible to stream a large file with this? What is the correct pattern? It's possible to stream files with Starlette, so how would integrating this functionality with FastAPI work? Thank you!
Replies: 15 comments
-
Seconded! Thanks for any pointers you might have.
-
First, for the case of @petermorrownavican, I added a new feature to support UploadFile (from Starlette) in FastAPI; the new documentation is here: https://fastapi.tiangolo.com/tutorial/request-files/#file-parameters-with-uploadfile

That would probably be the best way to deal with large files. It uses a standard "spooled" file (in memory up to some limit, then on disk), and you can interact with it as a normal file in Python.

Stream request content

If you have some specific use case that requires you to read the bytes as a stream of content, chunk by chunk (which also means you don't need the whole content/file before starting to read it), you can use the same code as in the Starlette example. You just have to declare a parameter to take the Starlette Request:

```python
async for chunk in request.stream():
    ...
```

Adapting the example in Starlette, it would be something similar to:

```python
from fastapi import FastAPI
from starlette.requests import Request
from starlette.responses import Response

app = FastAPI()


@app.post("/files/")
async def create_file(request: Request):
    body = b""
    async for chunk in request.stream():
        body += chunk
    response = Response(body, media_type="text/plain")
    return response
```

Stream response content

In the case of @matthdsm, you can return a Starlette StreamingResponse. It would be similar to:

```python
from fastapi import FastAPI
from starlette.responses import StreamingResponse

app = FastAPI()


@app.get("/files/")
async def read_stream():
    # some_generator is a placeholder for your own generator of bytes/str
    return StreamingResponse(some_generator, media_type="application/json")
```

But keep in mind that JSON is a format that needs the whole file to be ready before being parsed. Maybe you have a custom receiver that takes JSON lines separated by newline characters or something similar, but otherwise, if it's pure JSON, your frontend won't be able to parse the contents until you have the full JSON downloaded. If you need to send different arbitrary JSON messages/documents, you can also think of using WebSockets.
-
Hi @tiangolo Thanks for the comprehensive explanation! I got it to work using the example you provided. My front-end will have to be able to receive a JSON stream, since I'm outputting JSON objects. I've tried using complete JSON documents, but in my case that just doesn't work at all. Cheers
-
Thank you for your quick, thorough and thoughtful responses @tiangolo! The new feature to support UploadFile …
-
Awesome! Thanks for reporting back and closing the issue 👍 🌮 Just out of curiosity, are you using FastAPI at NAVICAN?
-
We are experimenting with it in dev to see if it meets our needs, but not in production (yet). We like it a lot so far though! 👍 🌯
-
That's great to hear! I would love to know how it goes.
-
@tiangolo I am returning a StreamingResponse as suggested above …
-
@tiangolo Is there a way to specify the chunk size (in seconds)? I am trying to receive chunks of 'x' seconds from a large audio file on which I can do further processing and send a streaming response.
-
Chunks are always sized in bytes. Any translation from len(bytes) -> seconds should be manually implemented. Wouldn't know how though, I'd guess it would depend on the file format.
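For uncompressed audio the bytes-to-seconds translation is simple arithmetic. A minimal sketch assuming raw PCM; the function name and defaults are illustrative, and note that compressed formats (MP3, AAC, ...) have no fixed bytes-per-second ratio:

```python
def pcm_chunk_size(seconds: float, sample_rate: int = 44100,
                   channels: int = 2, sample_width: int = 2) -> int:
    """Bytes needed to hold `seconds` of raw PCM audio.

    sample_width is bytes per sample (2 for 16-bit audio).
    This only holds for uncompressed PCM; compressed formats
    do not have a constant byte rate per second of audio.
    """
    return int(seconds * sample_rate * channels * sample_width)


# One second of CD-quality stereo (44.1 kHz, 2 channels, 16-bit):
print(pcm_chunk_size(1.0))  # 176400 bytes
```

You could then accumulate bytes from the stream until you have `pcm_chunk_size(x)` of them before processing each x-second window.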
-
How can I access the bytes in the chunk object, based on the code below?
-
I also wonder if we can set an actual chunk size when iterating through the stream. In the code below, if I am reading a large file (say 4 GB) and want to write each chunk to a file on the server, a small default chunk size triggers too many small write operations.
-
How can I use this Starlette request stream in a FastAPI endpoint that accepts things other than the file?
-
@tiangolo Greetings! Are we able to stream multiple files in a single request? Do we have anything to achieve this? Currently I am able to process one single large file with the help of request.stream().
