This repository has been archived by the owner on Apr 19, 2023. It is now read-only.

Script to Empty Google Drive Trash #436

Open
chris-mann-uk opened this issue Feb 25, 2019 · 6 comments

Comments

@chris-mann-uk

As the subject title suggests, is there a way to empty the trash folder via command line?

I'm thinking this could potentially run as a cron job once or twice a week, rather than being done manually through the web interface.

@mbenlioglu
Contributor

You can list the files in trash and remove them all with gdrive delete like this:

$ gdrive list -q "trashed = true" --no-header --max 0 | cut -d" " -f1 - | xargs -L 1 gdrive delete -r

What this does is basically list all files that are in trash, extract their file IDs, then pipe them to gdrive delete -r one by one.
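
If you want to drop this straight into cron, a minimal wrapper would be something along these lines (just a sketch; it assumes gdrive is on the crontab's PATH and already authenticated for the target account):

#!/bin/sh
# empty-trash.sh -- permanently delete everything in the Google Drive trash.
set -eu

gdrive list -q "trashed = true" --no-header --max 0 \
    | cut -d" " -f1 - \
    | xargs -L 1 gdrive delete -r

A crontab line such as 0 4 * * 0,3 /path/to/empty-trash.sh would then run it twice a week (Sunday and Wednesday at 04:00).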

I strongly recommend checking out #422 (comment), which also has some tips on safe ways of implementing such queries, especially if you're planning to use cron jobs. Start from this line:

Before I finish, I want to point out 2 things in this kind of query.

Cheers

@chris-mann-uk
Author

Thanks. The query seems to work but I very quickly run into problems with 'Error 403: Rate Limit Exceeded'.

The solution linking to the other issue was also one of mine, so I'm familiar with the code you provided a couple of weeks ago.

The situation I'm having is that I have a cron which runs at 3am every day. This creates a backup of my websites and databases, which is then sent over to Google Drive. The file is approximately 450MB.

Then, every Sunday, I have another cron which lists the backups that are older than one week and deletes them. It's this job that seems to trigger the 403 error, and I'm not sure how to work around it.
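
For reference, the Sunday job boils down to something like this (simplified; the real filenames and paths differ, and the date arithmetic assumes GNU date):

#!/bin/sh
# prune-backups.sh -- delete Drive backups older than one week.
set -eu

# Cutoff timestamp for the Drive API query (GNU date syntax).
CUTOFF=$(date -d '7 days ago' '+%Y-%m-%dT%H:%M:%S')

gdrive list --no-header --max 0 \
    -q "name contains 'backup' and modifiedTime < '$CUTOFF'" \
    | cut -d" " -f1 - \
    | xargs -L 1 gdrive delete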

@chris-mann-uk
Author

Another solution I've thought of is using Google Drive's versioning feature. So, rather than having multiple backup files that are just deleted once a week, the latest backup is always available - with older versions available for 30 days or 100 versions through the web interface.

I'm unsure how to go about overwriting an existing file, however, as the GDrive README seems to suggest downloading a file rather than uploading.

@mbenlioglu
Contributor

The solution linking to the other issue was also one of mine, so I'm familiar with the code you provided a couple of weeks ago.

Ah my bad sorry, didn't realize it was you :)

Error 403: Rate Limit Exceeded happens because everyone who clones this repo shares the same API keys (unfortunately), and those keys have a shared quota. So you need to create your own API key in the Google developer console and recompile the binaries to get your own individual quota. For a detailed tutorial, refer to #426.

This behavior will most likely change in a future release, so you won't need to recompile but can just add the API keys.
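
From memory, the rough sequence is something like the following (see #426 for the authoritative walkthrough; the file holding the credentials may differ between versions):

$ go get github.com/prasmussen/gdrive
$ cd $GOPATH/src/github.com/prasmussen/gdrive
# edit handlers_drive.go and replace the built-in client id/secret
# with your own OAuth credentials from the developer console
$ go build
$ ./gdrive about    # re-authenticate so the new key is used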

@mbenlioglu
Contributor

Another solution I've thought of is using Google Drive's versioning feature. So, rather than having multiple backup files that are just deleted once a week, the latest backup is always available - with older versions available for 30 days or 100 versions through the web interface.

I'm unsure how to go about overwriting an existing file, however, as the GDrive README seems to suggest downloading a file rather than uploading.

You can overwrite an existing file using gdrive update. To see its usage, execute $ gdrive help update.
Here's an example usage from the README:

Update file (create new revision)

$ gdrive update 0B3X9GlR6EmbnNTk0SkV0bm5Hd0E gdrive-osx-x64
Uploading gdrive-osx-x64
Updated 0B3X9GlR6EmbnNTk0SkV0bm5Hd0E at 2.0 MB/s, total 8.3 MB
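
In your case you'd point it at the existing backup file on Drive, something like this (the filename is a placeholder, and the ID is whatever gdrive list reports for your backup):

$ gdrive list -q "name = 'backup.tar.gz'"
$ gdrive update <id-from-the-list-output> backup.tar.gz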

@chris-mann-uk
Author

I've just done a trial run of the 'update' script and seem to have avoided a 403 error for now. Hopefully it will remain that way and, if so, it will be what I use going forward.

At 450MB a backup, using the versioning feature makes much more sense, as I don't have to worry about deleting files to free up space. Essentially, it will save me ~3GB a week, which is very useful when restricted to a 30GB G Suite plan.
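
For anyone who lands here later, the revised nightly job is roughly this (the paths and file ID are placeholders for my real values):

#!/bin/sh
# nightly-backup.sh -- build the backup archive and overwrite the same Drive file,
# letting Drive's revision history keep the older copies.
set -eu

BACKUP=/var/backups/site-backup.tar.gz
FILE_ID=replace-with-your-drive-file-id   # ID of the existing backup file on Drive

tar -czf "$BACKUP" /var/www /var/backups/db-dumps
gdrive update "$FILE_ID" "$BACKUP"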

If I run into any problems, I'll report back.
