
Error: User rate limit exceeded. File: application (Copy Folder). Line: 446 #86

Closed
eshao opened this issue Feb 27, 2019 · 8 comments


eshao commented Feb 27, 2019

Are you requesting a feature or reporting a bug?

Reporting a bug

If reporting a bug, is there already an issue open for this same bug?

There is a similar bug report, but it references a different line number.

What is the current behavior?

When running the script, lots of files eventually fail with the above error. Even after waiting more than 24 hours and resuming the script, the same error fills thousands of rows after only ~20 files copy successfully. The files I am copying are each around 100-500 MB.

Additionally, there are a lot of blank lines in my Copy Folder Log:

[screenshot: Copy Folder Log with many blank rows]

What is the expected behavior?

After waiting 24+ hours and resuming, I would expect to be able to copy more than 20 files.

How would you reproduce the current behavior (if this is a bug)?

Resume copying on my current folder.

Specify your

  • operating system: both Windows and OSX
  • browser: Chrome

eshao commented Feb 28, 2019

Related: every time I resume the script, it seems to start six copies of itself instead of only one.

[screenshot: execution log showing six simultaneous runs of the script]

ericyd added the bug label Mar 16, 2019

0x112 commented Apr 4, 2019

Same error here. Even when a copy fails with this error, it is still counted toward "Total files copied".

@Elliander

I am having the exact same issue, but it gets worse if you keep trying. After a little over 1,000 files are copied, it hangs in a pausing loop:

Paused due to Google quota limits - copy will resume in 1-2 minutes | 04-17-19 11:35:02 AM
Paused due to Google quota limits - copy will resume in 1-2 minutes | 04-17-19 11:41:42 AM
Paused due to Google quota limits - copy will resume in 1-2 minutes | 04-17-19 11:48:38 AM

If I manually pause it and try again later, it might not do anything. Re-pausing a few hours later produced a timestamp saying it was stopped manually a few minutes after the first stop, rather than a few hours after:

Stopped manually by user. Please use "Resume" button to restart copying | 04-17-19 4:14:56 PM
Stopped manually by user. Please use "Resume" button to restart copying | 04-17-19 4:17:12 PM

After waiting a day, if it hasn't hit the API error yet, it will copy 10 files before saying:

Error: API call to drive.files.copy failed with error: User rate limit exceeded. File: application (Copy Folder). Line: 495

(which increments the count, "total files copied")

and then it will continue this until it marks it as complete, rather than pausing the moment there is a problem.

Once this happens, it doesn't matter if I pause and try again a few hours later, or let it "complete" and try again a few hours later, it will copy the same 10 files it already copied and then give the API error again. It gets stuck in a loop, creating tens of duplicates of the same files if I keep trying to get it to finish, never copying anything more.
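
For reference, Google's usual advice for this error is exponential backoff: retry the failed copy after a growing delay, rather than logging it as copied and moving on. A minimal sketch around the same Drive.Files.copy call (Drive.Files.copy and Utilities.sleep are real Apps Script APIs; the wrapper name, retry count, and delays are my own illustrative choices, not the project's actual code):

```javascript
// Sketch only: retry drive.files.copy with exponential backoff instead of
// counting the failure as a success. The retry count and delays are
// illustrative assumptions.
function copyWithBackoff(fileId, destFolderId, title) {
  var maxRetries = 5;
  for (var attempt = 0; attempt < maxRetries; attempt++) {
    try {
      // Advanced Drive Service (v2): copy the file into the destination folder
      return Drive.Files.copy(
        { title: title, parents: [{ id: destFolderId }] },
        fileId
      );
    } catch (e) {
      if (String(e).indexOf('User rate limit exceeded') === -1) {
        throw e; // a different failure: surface it immediately
      }
      // back off 1s, 2s, 4s, 8s, 16s, plus up to 1s of jitter
      Utilities.sleep(Math.pow(2, attempt) * 1000 + Math.floor(Math.random() * 1000));
    }
  }
  throw new Error('Rate-limit retries exhausted for file ' + fileId);
}
```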

I initially thought the issue might be related to Google Backup and Sync running at the same time, since it downloads and syncs the spreadsheet and the file that says it should not be modified or deleted. But after deleting everything and starting over from scratch, with Backup and Sync not downloading anything, the same thing happened again. Only worse.

The second time around, it copied 95 files between the pause and the API error, so when it was resumed more than 24 hours later it just copied those same 95 files again (even though they were correctly marked as copied in the spreadsheet) and went straight back to the API error.

The pattern is extremely predictable. I can sit there and know exactly at what point it will hit the API error, and then it doesn't matter if I wait a minute, or an hour, or a day, it will behave the exact same way again.

Since it resumes immediately after pausing, I then tried manually pausing before the 95 were done copying (again), and it continued where it left off. I paused and resumed a few times hoping to break it out of the loop, but the moment it hit the 95th file it went back to the API errors.

Obviously the API error is wrong. The user rate limit has not been exceeded; otherwise it wouldn't be able to copy the same 95 files dozens of times, again and again. Rather, the script no longer seems to know which files remain to be copied.

It seems impossible to actually complete a copy operation involving thousands of files in the same folder no matter how many hours or days I wait. Worse still, because it's not done in alphabetical order, I can't even manually finish it if it was almost done. Instead, it looks like I will have to manually copy thousands of files from start to finish, which is going to be a real pain.

Interestingly though, when there are thousands of files across multiple folders I did not encounter this issue, so I think it is related in some way to the number of files in the same folder it is trying to read through.

Solution Idea:

Add a "fix resume" option that, when resuming, verifies that files already copied are marked as copied in the spreadsheet (to avoid duplicate files), makes "total files copied" reflect only the number of rows actually marked "copied" in the spreadsheet, and repairs whatever error triggers the loop in the first place, so that you don't have to start over from scratch.
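
A rough sketch of what that reconciliation pass could look like, assuming (hypothetically; the real log layout may differ) that each log row stores the source file ID in column 1 and a status string in column 2:

```javascript
// Sketch of a "fix resume" pass over the copy log. SpreadsheetApp and
// Logger are real Apps Script APIs; the column layout and the 'Copied'
// status string are assumptions about the log format.
function rebuildCopiedSet(logSpreadsheetId) {
  var sheet = SpreadsheetApp.openById(logSpreadsheetId).getSheets()[0];
  var rows = sheet.getDataRange().getValues();
  var copied = {};
  var total = 0;
  for (var i = 1; i < rows.length; i++) { // row 0 is the header
    if (rows[i][1] === 'Copied') {
      copied[rows[i][0]] = true;
      total++;
    }
  }
  // "total files copied" should reflect only rows actually marked Copied
  Logger.log('Files marked Copied: ' + total);
  return copied; // consult this set before copying a file again
}
```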

Workaround Idea:

If the copy operation checked the destination folder prior to copying, to ensure that duplicates are not created, it would allow us to start over when the copy operation breaks, and have it "skip over" everything already copied.

That way, if the problem is related to how long the operation was running, this would effectively fix it. You would just have to add checks for the looping behavior described above and, when it is caught, automatically "restart" the operation while deleting and recreating the "spreadsheet" and "do not delete" files.
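
A minimal sketch of that destination check using plain DriveApp calls (it assumes a file's name uniquely identifies it within the folder, which Drive does not enforce):

```javascript
// Sketch: skip the copy when a file with the same name is already in the
// destination folder. DriveApp's getFilesByName, hasNext, and makeCopy are
// real APIs; treating "same name" as "already copied" is the assumption.
function copyUnlessPresent(file, destFolder) {
  if (destFolder.getFilesByName(file.getName()).hasNext()) {
    return null; // copied on an earlier run; skip to avoid a duplicate
  }
  return file.makeCopy(file.getName(), destFolder);
}
```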

However, I suspect the problem is related to how many files are in the destination folder. I noticed that after a thousand files are copied, the spreadsheet doesn't show up inside the Google Drive folder unless I specifically search for it, and as more files fill the folder, the resume operation starts to fail as well, until I search for the spreadsheet manually and open it. Supporting this idea: when I transferred more files spread across many folders, there were no problems.

So, if the problem is related to too many files in the destination folder, the spreadsheet file probably needs to live somewhere other than where the files are piling up. One solution would be to create a subfolder in the destination directory: put the spreadsheet and "do not delete" file in the destination folder alongside that subfolder, and have all the copied files go into it. If that doesn't fix it, the files could go into multiple subfolders, say a thousand in each.
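
A sketch of that sharding idea, with the shard size and the "part N" naming as illustrative assumptions:

```javascript
// Sketch: cap each destination subfolder at SHARD_SIZE files by routing
// copy number N into subfolder "part floor(N / SHARD_SIZE) + 1". The
// DriveApp folder calls are real; the size and naming are assumptions.
var SHARD_SIZE = 1000;

function getShardFolder(destFolder, copiedCount) {
  var shardName = 'part ' + (Math.floor(copiedCount / SHARD_SIZE) + 1);
  var existing = destFolder.getFoldersByName(shardName);
  return existing.hasNext() ? existing.next() : destFolder.createFolder(shardName);
}
```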

@Ognisty321

Any news?

@newbieh4cker

@ericyd any news on this issue??


ericyd commented Jan 20, 2020

I don't have time to work on this project. I'm consolidating all the "user rate limit exceeded" errors into #82, which is the oldest open bug for this error. I will gladly review any PRs that resolve this issue.

ericyd closed this as completed Jan 20, 2020
@Eliz768653

Same here, I also get this error:

Error: API call to drive.files.copy failed with error: User rate limit exceeded. File: GDriveService (Copy Folder). Line: 93

@Ognisty321

He doesn't know how to fix it :/
