It's a good idea, and we've gone down this path before! Unfortunately our primary backend (GCS) doesn't support uploading chunks concurrently, so we ended up abandoning it. Still open to this for other backends, though!
I think you're on the right track in terms of configuration:

- It should not be the default behavior.
- You need to be able to configure the number of concurrent chunks.
Also, just a heads up: we ended up doing quite a bit of experimentation at Mux on performance for large files like this. One of the big reasons we ended up with our adaptive chunk size approach is that (up to a point) the fewer the chunks, the faster the total upload. Might be worth experimenting on that side of things as well.
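For what it's worth, an opt-in concurrency setting could look roughly like the sketch below. This is just an illustration, not this library's API: `uploadChunks`, `uploadChunk`, and the `parallel` option are all hypothetical names.

```javascript
// Hypothetical helper: upload chunks with a configurable concurrency limit.
// `uploadChunk(chunk, index)` stands in for the real per-chunk PUT request.
async function uploadChunks(chunks, uploadChunk, { parallel = 1 } = {}) {
  // parallel = 1 preserves sequential behavior as the default;
  // callers must opt in to concurrency explicitly.
  const results = new Array(chunks.length);
  let next = 0;

  // Each worker repeatedly claims the next unclaimed chunk index.
  // JS is single-threaded, so the check-and-increment between awaits
  // cannot race.
  async function worker() {
    while (next < chunks.length) {
      const i = next++;
      results[i] = await uploadChunk(chunks[i], i);
    }
  }

  const workerCount = Math.max(1, Math.min(parallel, chunks.length));
  await Promise.all(Array.from({ length: workerCount }, () => worker()));
  return results;
}
```

The worker-pool shape (rather than `Promise.all` over every chunk at once) keeps memory bounded, which matters for the very large files being discussed here.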
Are you interested in taking a PR to send chunks in parallel, and if so, would you be open to discussing the approach to take?
I need to upload 10-100 GB files and want to see if I can improve the speed by sending `n` chunks in parallel to my backend.
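For context, the kind of slicing I have in mind is sketched below — just a rough illustration, with `chunkRanges` and the 256 MiB size chosen arbitrarily for the example.

```javascript
// Illustrative sketch: compute the byte ranges that would be uploaded in
// parallel. Each range would correspond to one chunk's Content-Range header.
function chunkRanges(fileSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    // `end` is inclusive, matching Content-Range semantics.
    ranges.push({ start, end: Math.min(start + chunkSize, fileSize) - 1 });
  }
  return ranges;
}

// e.g. a 10 GiB file with 256 MiB chunks yields 40 ranges
```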