Concurrent uploads #7
Comments
Strictly speaking, no. But it's possible to launch multiple instances, e.g. to upload different directories.
Have you tested this (running multiple instances in parallel)? PS: this is very high on my wish-list as well, since the ACD Desktop App can't even do that, but then again - what can it do 💥 ?
To be clear - this is a great addition, but you have to consider that it will only be useful for people who have large upload bandwidths available. The server caps the connection speed at

Imagine your ISP limits your upload to

As I said, this only becomes interesting when you have multiples of

If you have a proper glass-fibre connection you might get there, but anything else - forget about it.

Example US providers - V.Fios: you need a plan above the

Alright, I am drifting off-topic, I'm afraid, but my point was simply to show that you need access to a very fast connection for this to be beneficial.
I wrote a tiny wrapper using
This is turning into a bit of a lonesome conversation 👽 but since I am testing this, I thought I should share my findings. After good results uploading 2 files in parallel, I realised that performance will also heavily depend on disk read speeds. If you are syncing files from an external

I am currently testing 12 files in parallel, but peak transfer rates are stalling at

Also noteworthy - CPU times are quite reasonable. Every process uses about
@chrisidefix If there are two "overlapping" writes to the sqlite database, the instance that tries to write later should crash because it cannot acquire a lock. However, there may also be unnecessary auth token refreshes. PS: My maximum upload speed is about 8 Mbit/s, so this isn't very high on my priority list.
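The locking behaviour described above can be reproduced with two plain `sqlite3` connections standing in for two instances (the table name and schema here are made up, not acd_cli's actual database layout):

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "nodes.db")
first = sqlite3.connect(path, timeout=0)   # timeout=0: fail instead of waiting
second = sqlite3.connect(path, timeout=0)

first.execute("CREATE TABLE nodes (id TEXT PRIMARY KEY)")
first.commit()

first.execute("BEGIN IMMEDIATE")           # first instance takes the write lock
first.execute("INSERT INTO nodes VALUES ('a')")

try:
    second.execute("BEGIN IMMEDIATE")      # second writer cannot acquire the lock
    outcome = "no conflict"
except sqlite3.OperationalError:
    outcome = "database is locked"

first.commit()
print(outcome)  # → database is locked
```

With the default `timeout` (5 seconds) the second writer would retry for a while before raising, which is why a real second instance may appear to hang briefly rather than crash immediately.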
It may become interesting again when you either: (1) want to upload many small files

Download speed for many folks could very well be above the limit, but you are right, it's a nice-to-have feature that may be too stressful to implement when thinking about its actual benefit.
* added QueuedLoader
  - concurrent transfers (#7)
  - retry on error (disabled by default)
* retry_on decorator added for transfer functions (jobs)
* api: add multiple read/write callbacks api for ul/dl
* api: progress printing removed
* api: fix for resuming of incomplete downloads
* db conn thread check disabled
* single file progress wrapper FileProgress added
* progress aggregator MultiProgress added
* progress speed determination improved
* download behavior changed to skip existing files
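A retry decorator for transfer jobs, like the `retry_on` named in the changelog, could be sketched as follows; the signature, defaults, and behaviour here are guesses, not acd_cli's actual API:

```python
import functools
import time

def retry_on(exc_types, tries=3, delay=0.1):
    """Retry the wrapped job when it raises one of exc_types.
    Parameters are assumptions for illustration only."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(tries):
                try:
                    return fn(*args, **kwargs)
                except exc_types:
                    if attempt == tries - 1:
                        raise          # out of retries: propagate the error
                    time.sleep(delay)
        return wrapper
    return decorator
```

Keeping retries disabled by default, as the changelog says, matches the usual caution that blind retries can mask persistent server errors.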
How does the concurrent download work?
There is now an
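For background, a queued worker-pool design like the `QueuedLoader` named in the commit could be sketched roughly like this; the class and method names and the structure are assumptions, not the project's actual code:

```python
import queue
import threading

class QueuedLoader:
    """Rough sketch of a queued transfer runner with worker threads."""

    def __init__(self, workers=1):
        self.jobs = queue.Queue()
        self.workers = workers

    def add_job(self, job):
        self.jobs.put(job)

    def _worker(self):
        while True:
            try:
                job = self.jobs.get_nowait()
            except queue.Empty:
                return          # queue drained, worker exits
            job()

    def run(self):
        threads = [threading.Thread(target=self._worker)
                   for _ in range(self.workers)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
```

The same pool serves uploads and downloads alike: each transfer is just a job callable placed on the queue, and the worker count bounds how many run concurrently.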
@yadayada Thanks for implementing this. This commit shows a significant performance improvement - I tested this feature and can confirm that even for slower upload speeds it is well worth using parallel uploads. It allows a much more continuous use of the available bandwidth. The only downside is an elevated use of CPU resources. Previously, when I ran 4 processes in parallel, CPU use would be at about 20% (~5% per process). Now for

All in all, you could consider making

UPDATE: I am continuously uploading large files (at least 2 GB in size) from an external USB drive running
I am also finding this a great feature, especially when uploading directories with many small files. With small files, uploading serially won't maximize the bandwidth usage. However, I'm not seeing a big jump in CPU usage with -x 8. Python's CPU usage is currently 20%-50%, hovering at 20+% most of the time.
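The small-file effect is easy to quantify with back-of-the-envelope numbers (all values below are assumptions, not measurements from this thread): each upload pays a fixed per-request overhead, so serial throughput collapses as files shrink.

```python
file_size_mb = 1.0     # assumed small file size
bandwidth = 10.0       # assumed raw upload bandwidth, MB/s
overhead_s = 0.5       # assumed per-file latency (handshake, metadata)

transfer_s = file_size_mb / bandwidth          # 0.1 s actually on the wire
serial_rate = file_size_mb / (transfer_s + overhead_s)
print(round(serial_rate, 2))  # → 1.67 MB/s, a sixth of the available bandwidth

# With N overlapping transfers the per-file overhead is amortized across
# streams, and throughput approaches min(bandwidth, N * serial_rate).
```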
I'm keeping one thread as the default for now because I'm not yet sure whether it's safe to insert into sqlite from different threads under all conditions. To my regret, sqlite3 seems to be safe for multiple processes, but not necessarily thread-safe.
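The "db conn thread check disabled" item in the changelog presumably refers to `check_same_thread=False` on the Python `sqlite3` connection. A minimal sketch of one common pattern, sharing a single connection across worker threads with writes serialized by a lock (the table name is made up, and this is not acd_cli's actual code):

```python
import sqlite3
import threading

# check_same_thread=False lets other threads use this connection;
# it does NOT make concurrent writes safe, hence the explicit lock.
conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("CREATE TABLE uploads (name TEXT)")
write_lock = threading.Lock()

def record(name):
    with write_lock:           # one writer at a time
        conn.execute("INSERT INTO uploads VALUES (?)", (name,))
        conn.commit()

threads = [threading.Thread(target=record, args=(f"file{i}",))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(conn.execute("SELECT COUNT(*) FROM uploads").fetchone()[0])  # → 4
```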
Hi,
Can this do concurrent uploads? If so, how many?
Thanks!