
Caching Processed Images #227

Merged
merged 1 commit into from May 22, 2019

Conversation

kean (Owner) commented May 6, 2019

Introduction

  • As a user, I would like to store processed images on disk, not the original ones, because recreating the processed ones is expensive and storing the original ones is wasteful
  • As a user, I would like to store both processed and original images on disk so that I could create different variations of the same image by applying different processors, on demand

Technical Solution

Extend ImagePipeline to support caching processed images and allow users to configure the pipeline to store only the original image data, or only the processed image data, or both.

There are multiple prerequisites for making this work.

#229 Decoupling Image Decompression

Decompression is currently part of image processing, but it shouldn't be: when we already have processed image data, we will need to decompress it, but we won't need to process it again.
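To illustrate why decompression stands on its own, here is a common way to force an image to decompress by redrawing it off the main thread. This is only a sketch of the general technique, not Nuke's actual implementation:

```swift
import UIKit

// Sketch: forcing decompression by redrawing the image into a new bitmap,
// so the decoding cost is paid on a background queue rather than at
// display time. Not Nuke's actual code.
func decompress(_ image: UIImage) -> UIImage {
    let format = UIGraphicsImageRendererFormat()
    format.scale = image.scale
    let renderer = UIGraphicsImageRenderer(size: image.size, format: format)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: image.size))
    }
}
```

Note that this step applies to both freshly processed images and images restored from the data cache, which is why it can't remain coupled to processing.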

#228 Generating Keys

In Nuke 7, ImagePipeline simply uses request.urlString as a cache key, which isn't enough once image processing is involved. We need to be able to append image processor identifiers to these keys.
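One possible shape for such keys is sketched below. The protocol requirement and makeCacheKey helper are hypothetical names for illustration, not Nuke's actual API:

```swift
// Sketch: composing a data cache key from the request URL string plus the
// identifiers of the applied processors, so each processed variation of an
// image gets its own cache entry. Names here are hypothetical.
protocol ImageProcessing {
    /// A string uniquely identifying the processor and its parameters,
    /// e.g. "resize?w=44,h=44".
    var identifier: String { get }
}

func makeCacheKey(urlString: String, processors: [ImageProcessing]) -> String {
    // Appending each processor's identifier keeps the original-image key
    // (no processors) distinct from every processed variant.
    return processors.reduce(urlString) { $0 + $1.identifier }
}
```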

#233 Encoding Images

Nuke currently provides image decoding infrastructure: ImageDecoding, ImageDecoder, ImageDecoderRegistry. It would now also need to be able to encode processed images before writing them to the data cache.
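An encoding counterpart could mirror the decoding side. The protocol name and signature below are assumptions for illustration, not the API this PR ships:

```swift
import UIKit

// Sketch: an encoding counterpart to ImageDecoding. The protocol and the
// JPEG encoder below are hypothetical, mirroring the decoding infrastructure.
protocol ImageEncoding {
    /// Returns encoded image data, or nil if encoding fails.
    func encode(image: UIImage) -> Data?
}

struct JPEGImageEncoder: ImageEncoding {
    let quality: CGFloat // 0.0 (smallest) ... 1.0 (best quality)

    func encode(image: UIImage) -> Data? {
        return image.jpegData(compressionQuality: quality)
    }
}
```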

#235 ImagePipeline v2

The first version of ImagePipeline introduced in Nuke 7 has some technical debt which needs to be addressed. This debt primarily stems from the introduction of progressive decoding without rethinking the existing solution.

Before #227 Caching Processed Images can be implemented, this tech debt needs to be addressed.

Data Caching ✅

This one probably doesn't require any changes; we can use the existing DataCaching protocol for caching data:

public protocol DataCaching {
    func cachedData(for key: String) -> Data?
    func storeData(_ data: Data, for key: String)
}
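Because the protocol is this small, custom conformances are cheap to write. Below is a minimal in-memory conformance, useful in tests or as a stand-in; the InMemoryDataCache type is hypothetical, and the shipped DataCache persists to disk instead:

```swift
import Foundation

protocol DataCaching {
    func cachedData(for key: String) -> Data?
    func storeData(_ data: Data, for key: String)
}

// Sketch: a minimal, non-persistent DataCaching conformance. A serial queue
// makes reads and writes thread-safe, since the pipeline accesses the cache
// from background queues.
final class InMemoryDataCache: DataCaching {
    private var storage = [String: Data]()
    private let queue = DispatchQueue(label: "InMemoryDataCache")

    func cachedData(for key: String) -> Data? {
        return queue.sync { storage[key] }
    }

    func storeData(_ data: Data, for key: String) {
        queue.sync { storage[key] = data }
    }
}
```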

Usage

Enabling data cache for processed images:

ImagePipeline {
    $0.dataCache = /* create DataCache or a custom cache */
    $0.isDataCachingForProcessedImagesEnabled = true
}

If you only want to cache processed images, set isDataCachingForOriginalImageDataEnabled to false.

Future Improvements

Allow the user to change disk cache options per request.

Summary of Changes

  - ImagePipeline now caches processed images
  - Add isDataCacheForOriginalDataEnabled and isDataCacheForProcessedDataEnabled options to ImagePipeline.Configuration
  - ImageDecompressor is no longer ImageProcessor and it now runs on its own imageDecompressingQueue
  - ImageEncoder now supports macOS
@kean kean force-pushed the caching-processed-images branch from 4e9fef1 to ff4196a Compare May 22, 2019 06:01
@kean kean merged commit 09c13d5 into nuke8 May 22, 2019
@kean kean deleted the caching-processed-images branch May 24, 2019 20:30