Server side copy in Google Drive stops working #1339
I am now having this exact problem. Last week server-side copies would complete nearly instantly. Now it transfers exactly 100GB, then my account gets locked. ERROR : /: Failed to copy: googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded. Thanks for posting this. I've been beating my head against the wall trying to figure out what I've done wrong.
I have the same problem too: I can't copy anything server-side anymore, and I keep getting the error: Failed to copy: googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded
That would seem the likely conclusion. Are you using your own credentials or rclone's?
I'm using rclone's credentials, but I will test my own client ID and report back as soon as I get a chance.
My guess is it copied around 100GB before it locked you out, right? The API errors are normal; it's what you'll see when you try to do the copy after you're locked out. I'm leaning towards this being an intentional (non-transparent) change by Google. Think about the number of people out there who brag about their 100TB+ collections. You could easily sync all that data to multiple other accounts in seconds. It's definitely in their best interest not to allow server-side transfers at that large a scale. Just a hunch :)
Maybe, but I hope it's something temporary that gets fixed.
@ncw PS. Never mind, it got stuck again :(
Having the same problem here. Transferred 120GB in 36h, then error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded.
Same problem here...
I can copy all my files, but when I scan my library in Plex with rclone mount, I always get the 403. Here are my rclone mount flags: `rclone -v mount --allow-non-empty --allow-other --max-read-ahead 200M --drive-chunk-size 128M --checkers 40 --dir-cache-time 30m --transfers=32 gdrive-crypt: /mnt/gdrive-media &`
@kicker83 This is how I've mounted my Google Drive. It has been working perfectly for about 2 weeks now and still is.

```shell
rclone mount \
  --read-only \
  --allow-non-empty \
  --allow-other \
  --max-read-ahead 200M \
  --checkers 16 \
  --quiet \
  --stats 0 \
  egd1:/LIBRARY/MOVIES /home/gd-movies/ &
```
@AloiSama thanks so much! I'm going to configure it, and I'll tell you the results ;)
@AloiSama you can remove --checkers, as it doesn't do anything in mount, and also --max-read-ahead, since rclone's default is 128k and that's the max for the OS.
Has anyone managed to figure out if there is a fix for this? This is what I get:

```
2017/04/29 15:04:11 DEBUG : pacer: Rate limited, sleeping for 16.480279449s (10 consecutive low level retries)
```
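The pacer message above is rclone backing off exponentially after repeated 403s. As a sketch only (not something suggested in this thread): later rclone releases added flags to throttle API calls proactively and to allow more low-level retries, which can keep the pacer from tripping as often. Flag availability varies by version, so check `rclone help flags` on your build first.

```shell
# Hedged sketch: cap Drive API transactions per second and allow more
# low-level retries. --tpslimit and --low-level-retries were added in
# rclone releases after this thread; verify they exist on your version.
rclone copy remote:folder_1 remote:folder_2 \
  --tpslimit 5 --low-level-retries 20 -v
```

This does not lift the underlying server-side copy quota; it only spreads requests out so lockouts arrive later.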
I made a quick and dirty fix here https://github.com/Cubox-/rclone/blob/patch-1/drive/drive.go#L795 that seems to let you keep copying files by downloading them and uploading them back again instead. Not optimal, but it works for me.
Update on that fix: with 64 transfers and a good dedicated server it works very well (even better than the server-side copies, if I had to guess). @ncw is this a fix you think could be included in rclone?
@Cubox- having a flag to disable optional features sounds like a good plan. I'd probably make it a bit more general though. Can you make a new issue please and mention this one? We should make sure the drive docs get updated too! I also note that you can disable server side copy by making a copy of your remote. So say your remote is called `drive:`, you make a copy of it called `drive2:`, then use `rclone copy drive: drive2:` - rclone won't do server side copies.
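The duplicate-remote trick above can be done by hand in rclone's config file: copy the existing remote's section under a new name so rclone sees two distinct remotes and falls back to download-and-reupload. The section names and values below are placeholders, not taken from this thread.

```ini
; ~/.config/rclone/rclone.conf (or ~/.rclone.conf on older versions)
; Duplicate the [drive] section verbatim under a second name.
[drive]
type = drive
client_id = YOUR_CLIENT_ID
client_secret = YOUR_CLIENT_SECRET
token = YOUR_TOKEN_JSON

[drive2]
type = drive
client_id = YOUR_CLIENT_ID
client_secret = YOUR_CLIENT_SECRET
token = YOUR_TOKEN_JSON
```

Then `rclone copy drive:folder_1 drive2:folder_2` transfers through your machine instead of issuing server-side copy calls.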
Here's one more user with this issue...
Does rclone do batching of copy requests? You should be able to get further before hitting the rate limit by sending batched requests. https://developers.google.com/drive/v3/web/batch#batch-example-request |
Also, you can work around the rate limit entirely by using the service account impersonation feature to impersonate multiple users within a G Suite domain, load balancing the copies across the users. Then you can change the file owners to the real destination owner. It would be quite nice if rclone supported doing this natively, as it's quite an involved process to do manually. I've tested this by manually creating multiple users, then using rclone with an impersonating service account to copy a large directory, canceling it when it hits the rate limit, then switching to the next user.
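For what it's worth, a rough sketch of the rotation described above. This assumes a service account with domain-wide delegation, and uses `--drive-impersonate` and `--drive-service-account-file`, which were added to rclone after this thread; the user addresses, remote names, and file name here are made up for illustration.

```shell
# Hedged sketch: retry the same copy as each delegated user in turn.
# Each pass skips files that already copied, so a rate-limited run
# simply resumes under the next identity.
for user in worker1@example.com worker2@example.com worker3@example.com; do
  rclone copy src:big_dir dst:big_dir \
    --drive-service-account-file sa.json \
    --drive-impersonate "$user" || true
done
```

Ownership of the copied files still ends up with whichever user performed each copy, so a final ownership transfer to the intended owner is needed, as the comment notes.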
I am facing similar rate limits. Would you have an example of how you did the rclone impersonation and then the cleanup/reset to the intended owner? Is the impersonation feature something that needs to be enabled somewhere within the G Suite domain panel?
This is quota related on Google. Closing out. |
I am using rclone v1.36
macOS 10.11.6
Copying Google Drive -> Google Drive
`rclone copy -vv -u -c remote:folder_1 remote:folder_2`
After about 100GB of data transferred, the transfer stops, showing the following error:

```
DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
```

After about 24h, rclone starts working again for about the same amount of data transferred.
This was not happening a week ago; it seems Google now enforces a data transfer quota for server-side copies, or something has changed in the way server-side copy is implemented.