Allow #clear! to accept block for conditional deletion #31
Conversation
* rescue NotFound error if the file was already deleted
```ruby
loop do
  batch_delete(files.lazy.map(&:name))
  files.each { |file| delete_file(file) if block.nil? || block.call(file) }
```
Hmm, but this implementation removes batched deletes; now each file will be deleted individually. You can still elegantly keep batching with `File::List#all` and lazy Enumerators:
```ruby
def clear!(&block)
  prefix = "#{@prefix}/" if @prefix
  files = get_bucket.files prefix: prefix
  all_files = files.all.lazy.select(&block)
  batch_delete(all_files)
end

def batch_delete(files)
  files.each_slice(100) do |file_batch|
    file_batch.each(&:delete) # can we batch deletes with google-cloud-storage gem?
  end
end
```
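The lazy-enumerator approach suggested above can be illustrated with plain Ruby, without any GCS calls: `select` on a lazy enumerator composes with `each_slice`, so filtering does not force the whole collection before slicing. This is only a sketch of the pattern; integers stand in for file objects.

```ruby
# Stand-in for a lazily filtered file list: keep only "matching" items.
files = (1..10).lazy.select(&:even?)

# Consume the lazy enumerator in fixed-size batches, as batch_delete would.
batches = []
files.each_slice(3) { |batch| batches << batch }

p batches  # => [[2, 4, 6], [8, 10]]
```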
I now realize that batched deletes must have been removed when migrating from the `google-api-client` gem to the `google-cloud-storage` gem. I'm wondering if batching is still possible.
I didn't find a way to do batch deletes while working on this PR; it looks like there is no such option for Google Cloud Storage.
Thanks for this PR, I will review it shortly.
Thanks for the PR, merged!
Enables conditional cache clearing, and also doesn't raise if some other process has already deleted the object you want to delete.
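The two behaviors described above can be sketched in plain Ruby without the `google-cloud-storage` gem. `FakeFile` and the `NotFound` error class here are hypothetical stand-ins (the real gem raises `Google::Cloud::NotFoundError`), used only to model the conditional-deletion check and the rescue for already-deleted files:

```ruby
# Hypothetical stand-in for the gem's not-found error.
NotFound = Class.new(StandardError)

# Hypothetical stand-in for a storage file object.
FakeFile = Struct.new(:name, :deleted) do
  def delete
    raise NotFound, "#{name} already deleted" if deleted
    self.deleted = true
  end
end

def delete_file(file)
  file.delete
rescue NotFound
  # Another process already removed the object; ignore, as the PR does.
end

files = [
  FakeFile.new("a.txt", false),
  FakeFile.new("b.txt", true),  # already deleted elsewhere
  FakeFile.new("c.log", false)
]
block = ->(f) { f.name.end_with?(".txt") }

# With no block, everything is deleted; with a block, only matching files.
files.each { |file| delete_file(file) if block.nil? || block.call(file) }

p files.map(&:deleted)  # => [true, true, false]
```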