
Allow #clear! to accept block for conditional deletion #31

Merged (1 commit) Feb 12, 2019

Conversation

@hwo411 (Contributor) commented Oct 29, 2018

  • rescue NotFound error if the file was already deleted

Enables conditional cache clearing and also doesn't raise if some other process just deleted the object you want to delete.
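
For illustration, a minimal sketch of the rescue this adds, assuming delete_file is the storage's per-file helper (it appears in the diff below) and that the google-cloud-storage gem raises Google::Cloud::NotFoundError for already-deleted objects:

def delete_file(file)
  file.delete
rescue Google::Cloud::NotFoundError
  # The object was already deleted by another process; safe to ignore.
end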

@hwo411 (Contributor, Author) commented Oct 31, 2018

Btw, I realized that all the clear! methods in Shrine storages take different arguments. Does it make sense to unify the interface of these methods in Shrine?
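
For context, a hedged sketch of what a unified, block-based signature could look like (each_object and delete_object are hypothetical helpers, not actual Shrine methods):

def clear!(&block)
  each_object do |object|
    # Delete everything when no block is given, otherwise only
    # the objects for which the block returns true.
    delete_object(object) if block.nil? || block.call(object)
  end
end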

@rosskevin (Contributor) commented Dec 21, 2018

I was just looking to do a similar thing (delete older than x), thanks for putting this up @hwo411. @renchap @janko-m do you have thoughts on unifying the clear! args or a preferred signature?
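
For example, a hedged usage sketch of "delete older than x" with this block API, assuming the block is yielded a Google::Cloud::Storage::File (which exposes #created_at):

# Delete only objects older than 30 days.
storage.clear! { |file| file.created_at < Time.now - 30 * 24 * 60 * 60 }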

@janko (Contributor) commented Dec 23, 2018

Since the #clear! method is not used in Shrine itself, I didn't think about unifying it. I think the most flexible option is a block-based API, the one the S3 storage uses and the one you implemented here, so definitely 👍 on this PR. I'm thinking of adding a block-based API to the FileSystem storage as well.
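
A rough sketch of what that block-based API could look like for the FileSystem storage (hypothetical, not part of this PR; assumes the storage's directory is a Pathname):

def clear!(&block)
  if block
    # Walk the directory tree and delete only the files the block approves.
    directory.find { |path| path.delete if path.file? && block.call(path) }
  else
    # No block: wipe everything under the storage directory.
    directory.children.each(&:rmtree)
  end
end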

Review comment on the diff (Contributor):

loop do
-  batch_delete(files.lazy.map(&:name))
+  files.each { |file| delete_file(file) if block.nil? || block.call(file) }

Hmm, but this implementation removes batching of deletes, and now each file will be deleted individually. You can still elegantly keep batching with File::List#all and lazy Enumerators:

def clear!(&block)
  prefix = "#{@prefix}/" if @prefix
  files = get_bucket.files prefix: prefix

  # lazy.select raises without a block, so only filter when one is given.
  all_files = files.all.lazy
  all_files = all_files.select(&block) if block

  batch_delete(all_files)
end

def batch_delete(files)
  files.each_slice(100) do |file_batch|
    file_batch.each(&:delete) # can we batch deletes with google-cloud-storage gem?
  end
end

I now realized that batching deletes must have been removed when migrating from google-api-client gem to google-cloud-storage gem. I'm wondering if it's still possible to have batching.

@hwo411 (Contributor, Author) replied:

I didn't find a way to do batch deletes while I was working on this PR; it looks like there is no such option for Google Cloud.

@renchap (Owner) commented Feb 12, 2019

Thanks for this PR, I will review it shortly.
For the batched delete issue, I opened a feature request in google-cloud-storage: googleapis/google-cloud-ruby#2914

@renchap (Owner) commented Feb 12, 2019

Thanks for the PR, merged!
I will open an issue for the batch delete to be implemented once google-cloud-storage supports it.
