
Flag to stop Rclone if google drive daily upload limit (750gb) reaches #3857

Closed
Dibbyo456 opened this issue Jan 6, 2020 · 13 comments

Comments

@Dibbyo456 (Contributor) commented Jan 6, 2020

I'm already aware of the --bwlimit and --max-transfer flags, but wouldn't it be more helpful if we could stop/exit Rclone when the daily upload limit is reached? Something like --stop-if-daily-upload-limit-reached

The main issue with this idea is that Google Drive does not return any distinctive error message when the daily upload limit is reached, which makes it harder to detect.

Similar discussion on forum.

@ncw (Member) commented Jan 6, 2020

I'd be delighted to implement this if someone can work out a reliable signal for it from google.

@ncw ncw added this to the Help Wanted milestone Jan 6, 2020
@Dibbyo456 (Contributor, Author) commented Jan 7, 2020

I have figured out a way to catch the upload-limit-reached signal.

Rclone throws this error when the project limit is reached:

2020/01/07 13:31:16 DEBUG : pacer: low level retry 1/1 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=xxxxxxxxx, userRateLimitExceeded)

But when the daily upload limit (750GB) is reached, Rclone starts throwing these errors:

2020/01/07 14:12:50 DEBUG : pacer: low level retry 10/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2020/01/07 14:12:50 DEBUG : xxx/xxx/xxx/xxx/xxx: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 1/10
2020/01/07 14:12:52 DEBUG : pacer: low level retry 3/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)

I had to upload dummy content to 3 of my Google Drive accounts to confirm the error messages, and all of them returned the same errors when the upload limit was reached.

The key part is: Error 403: User rate limit exceeded., userRateLimitExceeded
As you can see, this one does not give any extra error message like the first one; it is just a plain 403 error with no details. That is how you know the limit has been reached.

Also note that "User rate limit exceeded" has different casing between the two errors.
The first one (project limit) has: Error 403: User Rate Limit Exceeded.
But the second one has: Error 403: User rate limit exceeded.

I also tried to generate other errors by increasing checkers, burst, and tps, but none of them produced an error matching this one.

I am 99% positive that this would work. 🙂
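The heuristic described above (distinguishing the daily limit from the project quota by the exact error text and casing) can be sketched in Go like this. The function name and structure are illustrative, not rclone's actual code; the two message strings are taken from the logs in this thread:

```go
package main

import (
	"fmt"
	"strings"
)

// isDailyUploadLimitError reports whether a Drive 403 message looks like the
// plain daily-limit form. The daily limit returns exactly
// "User rate limit exceeded." (lower-case "rate limit"), while the project
// quota error starts "User Rate Limit Exceeded." and carries extra advice text.
func isDailyUploadLimitError(msg string) bool {
	return strings.TrimSpace(msg) == "User rate limit exceeded."
}

func main() {
	daily := "User rate limit exceeded."
	project := "User Rate Limit Exceeded. Rate of requests for user exceed configured project quota."
	fmt.Println(isDailyUploadLimitError(daily))   // true
	fmt.Println(isDailyUploadLimitError(project)) // false
}
```

An exact string comparison is deliberately strict here: both errors share the same HTTP code (403) and reason (userRateLimitExceeded), so the message text is the only discriminator available.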

@Dibbyo456 Dibbyo456 changed the title [Feature request] Flag to stop Rclone if google drive daily upload limit (750gb) reaches Flag to stop Rclone if google drive daily upload limit (750gb) reaches Jan 10, 2020
@ncw (Member) commented Jan 11, 2020

@Dibbyo456 thanks for that analysis - that looks interesting.

At the moment rclone only looks at the 403 part (or 429, which is also generated) and the userRateLimitExceeded part.

Ignoring the parts which are the same, we get:

  1. User rate limit exceeded.
  2. User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=xxxxxxxxx

where 1) is the permanent limit and 2) is the temporary one.

So what rclone could do is check whether the error text is exactly User rate limit exceeded. and then wrap it in one of these errors:

  1. FatalError - these stop the sync immediately
  2. NoRetryError - this doesn't stop the sync but won't cause low level or high level retries.
  3. NoLowLevelRetryError - this doesn't stop the sync or high level retries, but does stop low level retries.

Wrapping it in FatalError probably does what you want. Does this need to be gated behind a flag? It is a drive-specific option, so it could be something like --drive-stop-if-daily-upload-limit-reached.

This is perhaps specific enough that it doesn't need gating with a flag, not sure - what do you think?

Is it possible to get a sample of the two messages using -vv --dump responses? I'd like to see exactly what HTTP headers and JSON message are returned in the response - there might be a more reliable signal than looking at the error text.

@ncw ncw removed this from the Help Wanted milestone Jan 11, 2020
@ncw ncw added this to the v1.51 milestone Jan 11, 2020
@Dibbyo456 (Contributor, Author) commented Jan 12, 2020

@ncw

Wrapping it in FatalError probably does what you want

Yup.

It is a drive specific option so it could be something like --drive-stop-if-daily-upload-limit-reached.

Yup, sounds good to me.

This is perhaps specific enough that it doesn't need gating with a flag, not sure - what do you think?

I don't actually understand what you mean by this. :/

Is it possible to get a sample of the two messages using -vv --dump responses I'd like to see exactly what the HTTP headers and JSON message returned in the response is - there might be a more reliable signal than looking at the error text.

Temporary errors (API limits): temporary-err.log

Permanent errors (750GB reached): permanent-err.log

@ncw (Member) commented Jan 12, 2020

This is perhaps specific enough that it doesn't need gating with a flag, not sure - what do you think?

I don't actually understand what you mean by this. :/

Sorry! I meant making the error -> FatalError conversion without any flag being present. So you wouldn't need a --drive-stop-if-daily-upload-limit-reached flag.

I think I'm probably a bit nervous about that, so I'd prefer to have a flag.

I'll have a go at making one in a bit and post it for you to try.

Temporary errors (API limits): temporary-err.log

Permanent errors (750GB reached): permanent-err.log

Thank you for that.

Those are clearly coming from different servers, which is interesting - you could certainly tell them apart from the headers. There isn't really anything interestingly different in the response bodies, though.

So I think sticking with the original plan of looking at the error message is probably the best one.

Here they are for posterity.

Temporary error

2020/01/12 12:11:13 DEBUG : HTTP RESPONSE (req 0xc000a9e000)
2020/01/12 12:11:13 DEBUG : HTTP/1.1 403 Forbidden
Transfer-Encoding: chunked
Alt-Svc: quic=":443"; ma=2592000; v="46,43",h3-Q050=":443"; ma=2592000,h3-Q049=":443"; ma=2592000,h3-Q048=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000
Cache-Control: private, max-age=0
Content-Type: application/json; charset=UTF-8
Date: Sun, 12 Jan 2020 06:41:10 GMT
Expires: Sun, 12 Jan 2020 06:41:10 GMT
Server: GSE
Vary: Origin
Vary: X-Origin
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-Xss-Protection: 1; mode=block

3c5
{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "userRateLimitExceeded",
        "message": "User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project={{redacted}}",
        "extendedHelp": "https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project={{redacted}}"
      }
    ],
    "code": 403,
    "message": "User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project={{redacted}}"
  }
}

Permanent error

2020/01/12 14:53:38 DEBUG : HTTP RESPONSE (req 0xc0002f4300)
2020/01/12 14:53:38 DEBUG : HTTP/1.1 403 Forbidden
Content-Length: 167
Alt-Svc: quic=":443"; ma=2592000; v="46,43",h3-Q050=":443"; ma=2592000,h3-Q049=":443"; ma=2592000,h3-Q048=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000
Content-Type: application/json; charset=UTF-8
Date: Sun, 12 Jan 2020 09:23:38 GMT
Server: UploadServer
Vary: Origin
Vary: X-Origin
X-Guploader-Uploadid: {{redacted}}

{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "userRateLimitExceeded",
        "message": "User rate limit exceeded."
      }
    ],
    "code": 403,
    "message": "User rate limit exceeded."
  }
}

@ncw (Member) commented Jan 12, 2020

OK, I've given this a go! When it detects the limit it should write an ERROR to the log and convert the error to a fatal error. This should stop the sync.

Please test and see if it works :-) Thanks!

https://beta.rclone.org/branch/v1.50.2-131-g58064bdd-fix-3857-drive-upload-limit-beta/ (uploaded in 15-30 mins)

--drive-stop-on-upload-limit

Make upload limit errors be fatal

At the time of writing it is only possible to upload 750GB of data to
Google Drive a day (this is an undocumented limit). When this limit is
reached Google Drive produces a slightly different error message. When
this flag is set it causes these errors to be fatal. These will stop
the in-progress sync.

Note that this detection is relying on error message strings which
Google don't document so it may break in the future.

See: #3857

  • Config: stop_on_upload_limit
  • Env Var: RCLONE_DRIVE_STOP_ON_UPLOAD_LIMIT
  • Type: bool
  • Default: false

@Dibbyo456 (Contributor, Author) commented Jan 12, 2020

Working perfectly. 👍

2020/01/12 22:05:26 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2020/01/12 22:05:26 ERROR : Google drive root 'hfs100': Received upload limit error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
2020/01/12 22:05:26 ERROR : rclone2.log: Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
2020/01/12 22:05:26 ERROR : Cancelling sync due to fatal error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
2020/01/12 22:05:26 ERROR : Fatal error received - not attempting retries
2020/01/12 22:05:26 Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded

@ncw (Member) commented Jan 12, 2020

Nice one - thank you for testing :-)

I've merged this to master now, which means it will be in the latest beta in 15-30 minutes and released in v1.51.

@zenjabba commented Jan 20, 2020

What error level will it exit with when this is met?

@desimaniac commented Jan 21, 2020

What error level will it exit with when this is met?

7

@chutzimir commented May 25, 2020

At the time of writing it is only possible to upload 750GB of data to
Google Drive a day (this is an undocumented limit).

For what it's worth, this has been documented since at least 2018.

https://support.google.com/a/answer/172541

Individual users can only upload 750 GB each day between My Drive and all shared drives. Users who reach the 750-GB limit or upload a file larger than 750 GB cannot upload additional files that day. Uploads that are in progress will complete. The maximum individual file size that you can upload or synchronize is 5 TB.

@ncw (Member) commented May 26, 2020

For what it's worth, this has been documented since at least 2018.

https://support.google.com/a/answer/172541

Thanks for digging that up - not sure how I've never seen it before - very useful :-)

@silkyclouds commented Jun 8, 2020

Sorry guys, but is this --drive-stop-on-upload-limit supposed to be implemented?

2020/06/08 08:59:03 Fatal error: unknown flag: --drive-stop-on-upload-limit

I'm using:

rclone v1.49.5

  • os/arch: linux/amd64
  • go version: go1.12.10

I would really like to get this so I can run several rclone copies one after the other.

EDIT: Sorry, my fault. I upgraded from 1.49 to 1.52 and the problem is gone! Thanks for this param!

Akianonymus added a commit to Akianonymus/google-drive-upload that referenced this issue Oct 29, 2020
Handle edge case when curl http code is 000 in case of resumable uploads

properly handle other problems like rate limits, or any other api-side problem

follow # rclone/rclone#3857 (comment)

never retry if upload limit is reached (upload quota)
Akianonymus added a commit to labbots/google-drive-upload that referenced this issue Oct 29, 2020
Handle edge case when curl http code is 000 in case of resumable uploads

properly handle other problems like rate limits, or any other api-side problem

follow # rclone/rclone#3857 (comment)

never retry if upload limit is reached (upload quota)

6 participants