Flag to stop Rclone if google drive daily upload limit (750gb) reaches #3857
I'd be delighted to implement this if someone can work out a reliable signal for it from google.
I have figured out a way to catch the upload-limit-reached signal. Rclone throws this error when the project limit is reached:
But when the daily upload limit (750 GB) is reached, Rclone starts throwing this error:
I had to upload dummy content to 3 of my Google Drive accounts to confirm the error messages, and all of them returned the same error when the upload limit was reached. The main part is: Error 403: User rate limit exceeded., userRateLimitExceeded. Also note that "User rate limit exceeded" has different casing between the two errors. I also tried to generate other errors by increasing checkers, burst, and tps, but none of them produced errors like this one. I am 99% positive that this would work.
@Dibbyo456 thanks for that analysis - that looks interesting. At the moment rclone is only looking at the error reason, which is the same in both cases. Ignoring those parts which are the same, we get two messages, where 1) is the permanent limit and 2) is the temporary one. So what rclone could do is see if the error text is exactly "User rate limit exceeded." and wrap it in a FatalError, which would stop the sync. This is perhaps specific enough that it doesn't need gating with a flag, not sure - what do you think? Is it possible to get a sample of the two messages?
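A minimal sketch of that idea in Go - not rclone's actual code - assuming the failure surfaces as a *googleapi.Error and that the message strings observed above ("User rate limit exceeded." for the daily quota, different casing for the temporary throttle) stay stable on Google's side. The helper name is invented for illustration:

```go
package drivelimit

import (
	"errors"
	"strings"

	"github.com/rclone/rclone/fs/fserrors"
	"google.golang.org/api/googleapi"
)

// fatalIfDailyUploadLimit (hypothetical helper) promotes the permanent
// daily-quota 403 to a fatal error so retries stop; every other error is
// passed through untouched.
func fatalIfDailyUploadLimit(err error) error {
	var gerr *googleapi.Error
	if !errors.As(err, &gerr) || gerr.Code != 403 {
		return err // not a Drive API 403 - leave it to the normal retry logic
	}
	for _, item := range gerr.Errors {
		// The reason alone does not distinguish the two cases; per the
		// analysis above, the lower-case message text marks the daily limit.
		if item.Reason == "userRateLimitExceeded" &&
			strings.Contains(gerr.Message, "User rate limit exceeded") {
			return fserrors.FatalError(err) // stops the sync instead of retrying
		}
	}
	return err
}
```

Matching exact strings that Google does not document is inherently fragile, which is the argument for keeping a check like this opt-in behind a flag.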
Yup.
Yup, sounds good to me.
I actually do not understand what you mean by this. :/
Temporary errors (API limits): temporary-err.log
Permanent errors (750 GB limit reached): permanent-err.log
Sorry! I meant make the error -> FatalError conversion without any flag being present, so you wouldn't need a flag at all. I think I'm probably a bit nervous about that though, so I'd prefer to have a flag. I'll have a go at making one in a bit and post it for you to try.
Thank you for that. Those are clearly coming from different servers, which is interesting - you could certainly tell them apart from the headers. There isn't really anything interestingly different in the responses though, so I think sticking with the original plan of looking at the error message is probably the best one. Here they are for posterity:

Temporary error
Permanent error
OK I've given this a go! When it detects the limit it should write an ERROR in the log and convert the error to a Fatal error. This should stop the sync. Please test and see if it works :-) Thanks!

https://beta.rclone.org/branch/v1.50.2-131-g58064bdd-fix-3857-drive-upload-limit-beta/ (uploaded in 15-30 mins)

--drive-stop-on-upload-limit

Make upload limit errors be fatal

At the time of writing it is only possible to upload 750GB of data to Google Drive per day. When this flag is set, these errors become fatal and stop the in-progress sync.

Note that this detection relies on error message strings which Google do not document, so it may break in the future.

See: #3857
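For illustration only, a rough Go sketch of how such a flag could gate the conversion inside a shouldRetry-style decision; the Options struct, field, and function names are invented for this example rather than rclone's actual identifiers, and the string check is deliberately simplified:

```go
package drivelimit

import (
	"strings"

	"github.com/rclone/rclone/fs/fserrors"
)

// Options stands in for the backend options; StopOnUploadLimit is the
// hypothetical field set by --drive-stop-on-upload-limit.
type Options struct {
	StopOnUploadLimit bool
}

// shouldRetry decides whether to retry a failed Drive call. With the flag
// set, a daily upload-limit error is promoted to a fatal error, so the
// retry loop stops and the whole sync aborts with an ERROR in the log.
func shouldRetry(opt Options, err error) (bool, error) {
	if err == nil {
		return false, nil
	}
	// Simplified check based on the exact lower-case message of the
	// permanent limit (see the message analysis earlier in the thread).
	isUploadLimit := strings.Contains(err.Error(), "User rate limit exceeded")
	if opt.StopOnUploadLimit && isUploadLimit {
		return false, fserrors.FatalError(err) // fatal: do not retry
	}
	return true, err // everything else goes back through the normal retry/backoff
}
```

Without the flag the behaviour is unchanged, which matches the reasoning above that error-string matching is fragile enough to keep opt-in.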
Working perfectly.
Nice one - thank you for testing :-) I've merged this to master now, which means it will be in the latest beta in 15-30 mins and released in v1.51.
What error level will it exit with when this is met?
For what it's worth, this has been documented since at least 2018. https://support.google.com/a/answer/172541
Thanks for digging that up - not sure how I've never seen it before - very useful :-)
Sorry guys, but is this --drive-stop-on-upload-limit supposed to be implemented already?

2020/06/08 08:59:03 Fatal error: unknown flag: --drive-stop-on-upload-limit

I'm using rclone v1.49.5. I really would like this so I can start several rclone copies one after the other.

EDIT: Sorry, my fault. I upgraded from 1.49 to 1.52 and the problem is gone. Thanks for this param!
Handle edge case when curl HTTP code is 000 in case of resumable uploads; properly handle other problems like rate limits or any other API-side problem; follow rclone/rclone#3857 (comment); never retry if upload limit is reached (upload quota)
I'm already aware of the --bwlimit and --max-transfer flags, but wouldn't it be more helpful if we could stop/exit Rclone when the daily upload limit is reached? Something like --stop-if-daily-upload-limit-reached 🤔

The main issue with this idea is that Google Drive does not send any particular error message when the daily upload limit is reached, which makes it harder to determine. 😒

Similar discussion on forum.