fix remaining time estimation for public link uploads #37053

Merged
merged 1 commit into master from fix-upload-progress on Mar 6, 2020

Conversation

karakayasemi
Contributor

@karakayasemi karakayasemi commented Feb 29, 2020

Description

Currently, the remaining time estimation relies on the new Date().getMilliseconds() difference between two consecutive fileupload.fileuploadprogressall events. getMilliseconds() only returns the milliseconds component of the current time, so whenever the seconds value changes the difference is meaningless. We are already using the jQuery-File-Upload package in file-upload, and it provides the upload bitrate. With this PR the code uses that bitrate instead.

In addition, to provide a smooth experience, the code uses a buffer mechanism. However, during the first bufferSize (20) events the buffer is still partly empty, so the time estimation is wrong: the remaining time rapidly increases at the beginning of the upload. This is resolved by dividing bufferTotal by the number of buffer slots that are actually filled instead of the full buffer size. The buffer logic is also explained with a comment line.
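For illustration, here is a minimal sketch of the bitrate-based estimation with the buffer smoothing described above. The names (updateRemainingTime, filledBufferSize, and so on) are assumptions for this sketch, not the actual identifiers in the upload code; only data.bitrate, data.loaded and data.total come from jQuery-File-Upload's progressall callback.

```js
// Illustrative sketch only -- variable and function names are assumptions,
// not the identifiers actually used in ownCloud's upload code.
var bufferSize = 20;                          // number of samples to smooth over
var buffer = new Array(bufferSize).fill(0);   // ring buffer of recent bitrates
var bufferTotal = 0;                          // running sum of buffered bitrates
var bufferIndex = 0;                          // next slot to overwrite
var filledBufferSize = 0;                     // how many slots hold real samples

// Called on every fileupload.fileuploadprogressall event.
function updateRemainingTime(data) {
    // data.bitrate is provided by jQuery-File-Upload, in bits per second.
    if (!data.bitrate || data.bitrate <= 0) {
        return null; // ignore zero/negative bitrates to avoid nonsense estimates
    }
    // Replace the oldest sample with the newest one and keep the sum updated.
    bufferTotal = bufferTotal - buffer[bufferIndex] + data.bitrate;
    buffer[bufferIndex] = data.bitrate;
    bufferIndex = (bufferIndex + 1) % bufferSize;
    filledBufferSize = Math.min(filledBufferSize + 1, bufferSize);

    // Divide by the number of filled slots, not bufferSize, so that the empty
    // slots do not drag the average down at the start of the upload.
    var smoothedBitrate = bufferTotal / filledBufferSize;  // bits/s
    var remainingBytes = data.total - data.loaded;         // bytes
    return remainingBytes / (smoothedBitrate / 8);         // seconds
}
```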

Related Issue

Motivation and Context

Fixing bugs.

How Has This Been Tested?

Test 1:

  • Upload a 5 GB file via a public link
  • Look at the estimated upload time (Previously it was wrong on Chrome)

Test 2:

  • Upload a 5 GB file as a logged-in user
  • Look at the estimated upload time

I repeated tests with IE, Opera, Chrome and Firefox browsers. All of them look okay.

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Database schema changes (next release will require increase of minor version instead of patch)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Technical debt
  • Tests only (no source changes)

Checklist:

  • Code changes
  • Unit tests added
  • Acceptance tests added
  • Documentation ticket raised:
  • Changelog item, see TEMPLATE

@update-docs

update-docs bot commented Feb 29, 2020

Thanks for opening this pull request! The maintainers of this repository would appreciate it if you would create a changelog item based on your changes.

@codecov

codecov bot commented Feb 29, 2020

Codecov Report

Merging #37053 into master will not change coverage.
The diff coverage is n/a.

Impacted file tree graph

@@            Coverage Diff            @@
##             master   #37053   +/-   ##
=========================================
  Coverage     64.75%   64.75%           
  Complexity    19135    19135           
=========================================
  Files          1270     1270           
  Lines         74909    74909           
  Branches       1329     1329           
=========================================
  Hits          48511    48511           
  Misses        26007    26007           
  Partials        391      391           
Flag         Coverage Δ        Complexity Δ
#javascript  54.17% <ø> (ø)    0.00 <ø> (ø) ⬆️
#phpunit     65.93% <ø> (ø)    19135.00 <ø> (ø) ⬆️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 6cbadba...b464c26. Read the comment docs.

@karakayasemi karakayasemi force-pushed the fix-upload-progress branch 2 times, most recently from bd6dd18 to 4c29a21 on February 29, 2020 at 21:35
@jvillafanez
Member

Couple of questions:

In addition, to provide a smooth experience, the code is using a buffer mechanism.

Do we need this? I think this "sugarcoating" is not only inaccurate, it also makes the code more complex.
In addition, it might be easier to understand if we used an average of the previous "known" bitrates to calculate the remaining time.

The other question is whether we're handling a possible "0" bitrate, mainly so nothing crashes.

@karakayasemi karakayasemi force-pushed the fix-upload-progress branch from 4c29a21 to b464c26 on March 3, 2020 at 07:12
@karakayasemi
Contributor Author

Couple of questions:

In addition, to provide a smooth experience, the code is using a buffer mechanism.

Do we need this? I think this "sugarcoating" is not only inaccurate, it also makes the code more complex.
In addition, it might be easier to understand if we used an average of the previous "known" bitrates to calculate the remaining time.

Yes, I agree on the complexity part. However, I have now tried calculating with average bitrates, and I even implemented different weighted averages to reduce the effect of a single bitrate; the buffer implementation still seems to perform slightly better. In addition, I found the commit that introduced this buffer algorithm with git blame. The committer states that he followed the same logic as the desktop client. IMHO, it can stay.

The other question is whether we're handling a possible "0" bitrate, mainly so nothing crashes.

Yes, we have a check for the 0 or negative bitrate case; the code ignores those samples.

@karakayasemi
Contributor Author

@cdamken, @voroyam if you confirm the issue is resolved, we can merge this one. Thanks.

@phil-davis
Contributor

When I have a local ownCloud and upload a ~2GB ISO file locally it actually takes about 20 seconds.

Before this change the progress bar would say "a few seconds" and then after a few seconds it would change to saying "2 minutes" "3 minutes"... and then near the end go back to estimating "a few seconds".

After this change the progress bar says "a few seconds" all the time (and on some runs might say "1 minute" early in the upload if the upload is going a bit slow).

On a 10 GB file it says "2 minutes" pretty consistently, sometimes estimating 3 or 4 minutes (I think the file system in the background slows down every now and then as it writes out the uploading file). Without the change, it seemed to always say "1 minute" or "a few seconds", i.e. it was too optimistic.

For me, this change is "a good thing".

It does need to be tried in an environment with "normal" internet speeds and a file size that will take 10 minutes or more to upload, to get a real idea of how "steady" the time estimate is.

@karakayasemi
Contributor Author

It does need to be tried in an environment with "normal" internet speeds and a file size that will take 10 minutes or more to upload, to get a real idea of how "steady" the time estimate is.

I made some tests by limiting the internet speed of my virtual machine. As far as I see, all test results are more stable than before.

@jvillafanez
Member

@karakayasemi check if https://github.com/owncloud/core/pull/36814/files#diff-87d6d7ffd7622e59f65a6e6be8de6a56R53-R59 could be useful here. It's mainly to adjust the thresholds in order to show more accurate info instead of "a few seconds"
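For context, a hedged example of the kind of threshold tuning the linked diff performs, assuming the remaining time is humanized with moment.js; the exact values and calls in #36814 may differ.

```js
// Assumed example only -- see the linked diff in #36814 for the real change.
// moment is expected to be available globally, as in ownCloud's web UI.
// By default moment reports short durations as "a few seconds"; tightening
// the thresholds makes short estimates more specific.
moment.relativeTimeThreshold('ss', 3);   // only <= 3 s reads "a few seconds"
moment.relativeTimeThreshold('s', 60);   // report exact seconds up to a minute
console.log(moment.duration(30, 'seconds').humanize()); // "30 seconds"
```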

@karakayasemi
Contributor Author

@karakayasemi check if https://github.com/owncloud/core/pull/36814/files#diff-87d6d7ffd7622e59f65a6e6be8de6a56R53-R59 could be useful here. It's mainly to adjust the thresholds in order to show more accurate info instead of "a few seconds"

@jvillafanez thank you for the suggestion. I guess increasing the precision needs a PM decision. Let's merge this as it is, only fixing the existing errors.

@karakayasemi karakayasemi merged commit a3b4a34 into master Mar 6, 2020
@delete-merged-branch delete-merged-branch bot deleted the fix-upload-progress branch March 6, 2020 08:50
@davitol
Contributor

davitol commented Apr 2, 2020

@karakayasemi Are we referring in this ticket to this time estimation?

[uploadBar: screenshot of the upload progress bar]

If so, it looks like the estimation moves randomly.

@karakayasemi
Contributor Author

karakayasemi commented Apr 3, 2020

@davitol Can you compare the new code performance with the old one?

If the first several calculated bitrates are slower on your instance, that can lead to this behavior. We are also using a smoothing algorithm here that averages the last 20 consecutive samples to calculate the remaining time. The first 20 samples should be collected quickly, but any fluctuating sample can affect the average significantly at the beginning of the upload; the small example below illustrates the effect.
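A made-up numeric illustration of why early samples dominate: with only a few samples buffered, a single slow bitrate moves the average (and therefore the estimate) far more than it does once all 20 slots are filled.

```js
// Hypothetical bitrates in Mbit/s, purely to illustrate the effect.
var earlyAverage = (10 + 10 + 1) / 3;    // = 7.0  -> estimate ~43% longer
var fullAverage  = (10 * 19 + 1) / 20;   // = 9.55 -> estimate ~4.7% longer
console.log(earlyAverage, fullAverage);
```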

@micbar
Contributor

micbar commented Apr 3, 2020

@karakayasemi Can you clarify if we need another fix? If yes, we would need it on the release-10.4.1 branch

@karakayasemi
Contributor Author

I do not think we need a new fix. Since bitrate fluctuation is a very common problem, the time estimation is inherently tricky. However, the new implementation performed better in my tests. @phil-davis also made a similar comment above: #37053 (comment)

@phil-davis
Contributor

When I tested this in a local git clone and uploaded a few GB just on my localhost I found that the estimation was more stable than previously. There is no perfect solution for this - software cannot predict the future (e.g. if the network suddenly slows to a trickle then obviously the existing estimate is now wrong). IMO it is an improvement. Nothing is broken. If someone wants to improve it more, then great - propose a PR to master... But the code in 10.4.1 is fine to release.

@jvillafanez
Member

I'm more inclined to provide a percentage based on the amount of data transferred, because it should be more predictable and more accurate, but that's a different thing to consider in the future.
The main problem with this kind of estimation is that, from a user's perspective, we can't tell whether it's a bug in the code, a bitrate fluctuation, or something else. We could provide a random amount of time and blame it on bitrate fluctuation.

I'm not saying to change anything (it's out of scope anyway), but I think it's something to take into account for future developments.

@davitol
Contributor

davitol commented Apr 3, 2020

@davitol Can you compare the new code performance with the old one?

Yes, I already did when I tested it and I did not see relevant differences. But since you ran some tests as you said and found it more stable, and it also went fine for @phil-davis, I'm OK with keeping it as it is for this version; let's see whether @jvillafanez's idea will be worthwhile in the future. Thank you all for the feedback.


Successfully merging this pull request may close these issues.

Estimate properly or not show ETA for uploaded files using the web UI.