
Server side copy in Google Drive stops working #1339

Closed
Arduingo opened this issue Apr 15, 2017 · 23 comments


@Arduingo

Arduingo commented Apr 15, 2017

I am using rclone v1.36
macOS 10.11.6
Copying Google Drive -> Google Drive:
rclone copy -vv -u -c remote:folder_1 remote:folder_2

After about 100GB of data has been transferred, the transfer stops with the following error:

DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)

After about 24h, rclone starts working again for roughly the same amount of data.

This was not happening a week ago. It seems Google now enforces a data transfer quota for server-side copies, or something has changed in how server-side copy is implemented.

@SpongeBobTheBuilder

I am now having this exact problem. Last week server-to-server copies would complete nearly instantly. Now it transfers exactly 100GB, then my account gets locked.

ERROR : /: Failed to copy: googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded

Thanks for posting this. I've been beating my head against the wall trying to figure out what I've done wrong.

@dany20mh

I have the same problem too. I can't copy anything server-side anymore, and I keep getting this error:

Failed to copy: googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded

@ncw
Member

ncw commented Apr 18, 2017

seems like Google now enforce a data transfer quota for server side copy or something has changed in the way server side copy is implemented

That would seem the likely conclusion. Are you using your own credentials or rclone's?

@dany20mh

I'm using rclone's credentials, but I will test my own client ID and report back as soon as I get a chance.

@dany20mh

dany20mh commented Apr 18, 2017

So I ran the server-side copy with my own client ID. Interestingly, at the beginning there was high API usage and it copied a few files, but all of a sudden requests dropped off and the API console shows nothing but errors.

[Screenshot: Google API console, 2017-04-18 1:40 PM]

If there is anything else you need me to check, let me know.

@SpongeBobTheBuilder

SpongeBobTheBuilder commented Apr 18, 2017

My guess is it copied around 100GB before it locked you out, right? The API errors are normal; it's what you'll see when you try to do the copy after you're locked out.

I'm leaning towards this being an intentional (non-transparent) change by Google. Think about the number of people out there who brag about their 100TB+ collections; you could easily sync all that data to multiple other accounts in seconds. It's definitely in Google's best interest not to allow server-side transfers at that scale. Just a hunch :)

@dany20mh

Maybe, but I hope it's something temporary and gets fixed.

@dany20mh

dany20mh commented Apr 19, 2017

@ncw
Well, I just found out that this problem is specific to 1.36: as soon as I reverted to 1.35, I could copy all of my files server-side, as fast as before and without any limit.

PS. Never mind, it got stuck again :(

@shenzo1

shenzo1 commented Apr 20, 2017

Having the same problem here. Transferred 120GB in 36h, then: error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded.

@CorentinB

Same problem here...

@kicker83

I can copy all my files, but when I scan my library in Plex over an rclone mount, I always get the 403. Here are my rclone mount flags:

rclone -v mount --allow-non-empty --allow-other --max-read-ahead 200M --drive-chunk-size 128M --checkers 40 --dir-cache-time 30m --transfers=32 gdrive-crypt: /mnt/gdrive-media &

@AloiSama

AloiSama commented Apr 28, 2017

@kicker83 This is how I've mounted my Google Drive. It has been working perfectly for about 2 weeks now and still is.

rclone mount \
--read-only \
--allow-non-empty \
--allow-other \
--max-read-ahead 200M \
--checkers 16 \
--quiet \
--stats 0 \
egd1:/LIBRARY/MOVIES /home/gd-movies/&

@kicker83

@AloiSama thanks so much! I'm going to configure it, and I'll tell you the results ;)

@ajkis

ajkis commented Apr 28, 2017

@AloiSama you can remove --checkers, as it doesn't do anything in mount, and also --max-read-ahead, since rclone's default is 128k and that's the maximum for the OS.

@disgrace2029

Has anyone managed to figure out if there is a fix for:
(error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)

This is what I get:
rclone-v1.36-windows-386\rclone-v1.36-windows-386>rclone -vv copyto Google:"TempFolder/Hawaii Five 0" Google:"temp3"
2017/04/29 15:02:56 DEBUG : rclone: Version "v1.36" starting with parameters ["rclone" "-vv" "copyto" "Google:TempFolder/Hawaii Five 0" "Google:temp3"]
2017/04/29 15:02:59 INFO : Google drive root 'temp3': Modify window is 1ms
2017/04/29 15:02:59 DEBUG : Google drive root 'temp3': Reading ""
2017/04/29 15:02:59 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Reading ""
2017/04/29 15:02:59 DEBUG : Google drive root 'temp3': Finished reading ""
2017/04/29 15:03:00 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Finished reading ""
2017/04/29 15:03:00 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Reading "Season 6/"
2017/04/29 15:03:01 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Reading "Season 3/"
2017/04/29 15:03:01 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Finished reading "Season 6/"
2017/04/29 15:03:01 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Reading "Season 4/"
2017/04/29 15:03:02 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Finished reading "Season 3/"
2017/04/29 15:03:02 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Reading "Season 5/"
2017/04/29 15:03:03 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Finished reading "Season 4/"
2017/04/29 15:03:03 DEBUG : pacer: Rate limited, sleeping for 1.94777941s (1 consecutive low level retries)
2017/04/29 15:03:03 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/04/29 15:03:03 DEBUG : pacer: Rate limited, sleeping for 2.082153551s (2 consecutive low level retries)
2017/04/29 15:03:03 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/04/29 15:03:03 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Reading "Season 2/"
2017/04/29 15:03:03 DEBUG : pacer: Rate limited, sleeping for 4.666145821s (3 consecutive low level retries)
2017/04/29 15:03:03 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/04/29 15:03:03 DEBUG : pacer: Rate limited, sleeping for 8.235010051s (4 consecutive low level retries)
2017/04/29 15:03:03 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/04/29 15:03:03 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Finished reading "Season 5/"
2017/04/29 15:03:03 DEBUG : pacer: Rate limited, sleeping for 16.287113937s (5 consecutive low level retries)
2017/04/29 15:03:03 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/04/29 15:03:04 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Reading "Season 1/"
2017/04/29 15:03:04 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Finished reading "Season 2/"
2017/04/29 15:03:04 DEBUG : Google drive root 'TempFolder/Hawaii Five 0': Finished reading "Season 1/"
2017/04/29 15:03:05 DEBUG : pacer: Rate limited, sleeping for 16.54916732s (6 consecutive low level retries)
2017/04/29 15:03:05 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/04/29 15:03:21 DEBUG : pacer: Rate limited, sleeping for 16.632969758s (7 consecutive low level retries)
2017/04/29 15:03:21 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/04/29 15:03:38 DEBUG : pacer: Rate limited, sleeping for 16.331776148s (8 consecutive low level retries)
2017/04/29 15:03:38 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/04/29 15:03:54 DEBUG : pacer: Rate limited, sleeping for 16.183117216s (9 consecutive low level retries)
2017/04/29 15:03:54 DEBUG : pacer: low level retry 3/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/04/29 15:03:59 INFO :
Transferred: 0 Bytes (0 Bytes/s)
Errors: 0
Checks: 0
Transferred: 0
Elapsed time: 1m2.7s
Transferring:

  • Season 6/Hawaii.Five-0.S06E01.720p.WEB-DL.x264.350MB-Pahe.in.mkv
  • Season 6/Hawaii.Five-0.S06E02.720p.WEB-DL.x264.350MB-Pahe.in.mkv
  • Season 6/Hawaii.Five-0.S06E03.720p.WEB-DL.x264.350MB-Pahe.in.mkv
  • Season 6/Hawaii.Five-0.S06E04.720p.WEB-DL.x264.350MB-Pahe.in.mkv

2017/04/29 15:04:11 DEBUG : pacer: Rate limited, sleeping for 16.480279449s (10 consecutive low level retries)
2017/04/29 15:04:11 DEBUG : pacer: low level retry 3/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
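The pacer lines in the log above show rclone's retry behaviour: each consecutive low-level 403 roughly doubles the sleep, capping out around 16 seconds. As a rough illustration only (this is not rclone's actual pacer code, just a sketch of the same schedule):

```shell
#!/bin/sh
# Sketch of an exponential-backoff schedule like the one visible in the
# "Rate limited, sleeping for ..." lines: double the delay on each
# consecutive retry, capped near 16s.
delay=1
for attempt in 1 2 3 4 5 6; do
  echo "retry $attempt: sleeping ${delay}s"
  if [ "$delay" -lt 16 ]; then delay=$((delay * 2)); else delay=16; fi
done
```

Note that once the quota trips, backing off does not help: every retry in the log still returns 403 until the 24h window resets.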

@Cubox
Contributor

Cubox commented Jul 18, 2017

I made a quick and dirty fix here: https://github.com/Cubox-/rclone/blob/patch-1/drive/drive.go#L795. It lets you keep copying files by downloading them and re-uploading them instead of using server-side copy.

Not optimal, but it works for me.

@Cubox
Contributor

Cubox commented Jul 18, 2017

Update on that fix: with 64 transfers and a good dedicated server it works very well (even better than server-side copies, if I had to guess).

@ncw is that a fix you think can be included in rclone?

@ncw
Member

ncw commented Jul 18, 2017

@Cubox- having a flag to disable optional features sounds like a good plan. I'd probably make it a bit more general, though. Can you make a new issue please and mention this one? We should make sure the drive docs get updated too!

I also note that you can disable server-side copy by making a copy of your remote. Say your remote is called drive:; make a copy of it called drive2:, then use rclone copy drive: drive2: - rclone won't do server-side copies between different remotes.
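That workaround can be sketched as follows (remote names are examples; the token value is a placeholder):

```shell
# In ~/.config/rclone/rclone.conf, duplicate the existing [drive] section
# under a new name, keeping the same type and credentials:
#
#   [drive]
#   type = drive
#   token = {...}
#
#   [drive2]
#   type = drive
#   token = {...}
#
# Because the source and destination are now different remotes, rclone
# streams the data through the local machine instead of issuing
# server-side copy requests:
rclone copy drive: drive2:
```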

@Massaguana

Here is one more user with this issue...

@eatnumber1

Does rclone do batching of copy requests? You should be able to get further before hitting the rate limit by sending batched requests.

https://developers.google.com/drive/v3/web/batch#batch-example-request
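For context, the Drive batch endpoint accepts only a limited number of sub-requests per call (the docs state 100), so a batching client would have to chunk its pending copies. A trivial sketch of that arithmetic, with a made-up workload:

```shell
#!/bin/sh
# Hypothetical workload: 250 pending copy requests, chunked into batches
# of at most 100 (the documented per-batch limit), i.e. ceiling division.
total=250
batch_size=100
batches=$(( (total + batch_size - 1) / batch_size ))
echo "$total copies -> $batches batch requests"
```

Batching reduces HTTP round trips, though each sub-request still counts against the per-user quota, so it may not move the 403 threshold much.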

@eatnumber1

eatnumber1 commented Apr 29, 2018

Also, you can work around the rate limit entirely by using the service account impersonation feature to impersonate multiple users within a G Suite domain, load balancing the copies across the users. Then you can change the file owners to the real destination owner. It would be quite nice if rclone supported doing this natively, as it's quite an involved process to do manually.

I've tested this by manually creating multiple users, then using rclone with an impersonating service account to copy a large directory, canceling it when it hits the rate limit, then switching to the next user.
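A rough sketch of that manual rotation as a script. The user names and remote names are invented, and the commands are only echoed here; newer rclone versions expose impersonation through the drive backend's --drive-impersonate option, which requires a service account with domain-wide delegation:

```shell
#!/bin/sh
# Spread server-side copies across several G Suite users, moving to the
# next user when one hits the quota (as described above). Each iteration
# would be cancelled manually on a 403 and resumed as the next user.
for user in alice@example.com bob@example.com carol@example.com; do
  echo "rclone copy --drive-impersonate $user src: dst:"
done
```

After the copies complete, ownership of the new files would still need to be transferred to the intended destination user, which this sketch does not cover.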

@TheLinuxGuy

I am facing similar rate limits. Would you have an example of how you did the rclone impersonation and then the cleanup/reset to the intended owner?

Is the impersonation feature something that needs to be enabled somewhere within the G Suite domain panel?

@Animosity022
Collaborator

This is quota-related on Google's side. Closing out.
