
Caching Processed Images #227

merged 1 commit into from May 22, 2019

kean commented May 6, 2019


  • As a user, I would like to store processed images on disk, not the original ones, because recreating the processed ones is expensive and storing the original ones is wasteful
  • As a user, I would like to store both processed and original images on disk so that I could create different variations of the same image by applying different processors, on demand

Technical Solution

Extend ImagePipeline to support caching processed images and allow users to configure the pipeline to store only the original image data, or only the processed image data, or both.

There are multiple prerequisites for making this work.

#229 Decoupling Image Decompression

Decompression is currently part of image processing, but it shouldn't be: when we already have processed image data, we need to decompress it, but we won't need to process it again.
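To illustrate why the two steps need to be separate, here is a rough sketch of the resulting control flow. The helper names (processedKey, decode, decompress, process) are hypothetical and are not Nuke's actual internal API:

```swift
// Illustrative sketch only; helper names are hypothetical,
// not Nuke's actual internals.
func cachedImage(for request: ImageRequest) -> PlatformImage? {
    if let processedData = dataCache.cachedData(for: processedKey(for: request)) {
        // Processed data: decode and decompress, but do NOT re-apply processors.
        return decompress(decode(processedData))
    }
    guard let originalData = dataCache.cachedData(for: originalKey(for: request)) else {
        return nil // fall through to downloading
    }
    // Original data: decode, apply processors, then decompress.
    return decompress(process(decode(originalData), for: request))
}
```

The first branch is only possible once decompression is decoupled from processing.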

#228 Generating Keys

In Nuke 7, ImagePipeline simply uses request.urlString as a cache key, which isn't enough once image processing is involved. We need to be able to append image processor identifiers to these keys.
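A minimal, self-contained sketch of the idea — each processor contributes a stable string identifier that gets appended to the URL-based key, so the same URL processed differently yields different cache entries. The key format and type names here are illustrative, not the exact design landed in #228:

```swift
// Illustrative sketch; the actual key format in #228 may differ.
protocol ImageProcessing {
    // A stable identifier that uniquely describes the transformation.
    var identifier: String { get }
}

struct ResizeProcessor: ImageProcessing {
    let width: Int, height: Int
    var identifier: String { "resize:\(width)x\(height)" }
}

// Append processor identifiers to the URL-based key.
func cacheKey(urlString: String, processors: [ImageProcessing]) -> String {
    guard !processors.isEmpty else { return urlString }
    return urlString + processors.map { $0.identifier }.joined()
}
```

With no processors the key degrades to the Nuke 7 behavior (just the URL string), which keeps existing cache entries valid.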

#233 Encoding Images

Nuke currently provides image decoding infrastructure: ImageDecoding, ImageDecoder, ImageDecoderRegistry. But it would now also need to be able to encode processed images.
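An encoding counterpart could mirror the existing decoding protocol. This is a hypothetical sketch, not necessarily the final API landed in #233:

```swift
// Hypothetical counterpart to ImageDecoding; the actual protocol
// added in #233 may differ.
public protocol ImageEncoding {
    // Returns encoded image data (e.g. JPEG or PNG), or nil on failure.
    func encode(image: PlatformImage) -> Data?
}
```

A default encoder could, for example, choose JPEG for opaque images (smaller) and PNG for images with alpha.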

#235 ImagePipeline v2

The first version of ImagePipeline introduced in Nuke 7 has some technical debt which needs to be addressed. The primary source of that debt was the introduction of progressive decoding without rethinking the existing design.

This tech debt needs to be addressed before #227 Caching Processed Images can be implemented.

Data Caching

This one probably doesn't require any changes; we can use the existing DataCaching protocol for caching data:

public protocol DataCaching {
    func cachedData(for key: String) -> Data?
    func storeData(_ data: Data, for key: String)
}


Enabling data cache for processed images:

ImagePipeline {
    $0.dataCache = /* create DataCache or a custom cache */
    $0.isDataCachingForProcessedImagesEnabled = true
}

If you only want to cache processed images, set isDataCachingForOriginalImageDataEnabled to false.

Future Improvements

Allow user to change disk cache options per request:
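For example, a per-request override might look something like this. The option name and cases are illustrative only; this is a hypothetical future API, not something shipped in this PR:

```swift
// Hypothetical per-request disk cache policy; names are illustrative.
var request = ImageRequest(url: url)
request.options.diskCachePolicy = .storeProcessedImages

// Or disabling disk caching entirely for a one-off request:
request.options.diskCachePolicy = .disabled
```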

@kean kean force-pushed the master branch from e5cd16a to f1f5341 May 17, 2019
@kean kean force-pushed the caching-processed-images branch from 7a920a3 to 8e07cb5 May 19, 2019
@kean kean changed the base branch from master to nuke8 May 19, 2019
@kean kean force-pushed the caching-processed-images branch 4 times, most recently from 3fae4cd to 4e9fef1 May 20, 2019
  - ImagePipeline now caches processed images
  - Add isDataCacheForOriginalDataEnabled and isDataCacheForProcessedDataEnabled options to ImagePipeline.Configuration
  - ImageDecompressor is no longer an ImageProcessor and it now runs on its own imageDecompressingQueue
  - ImageEncoder now supports macOS
@kean kean force-pushed the caching-processed-images branch from 4e9fef1 to ff4196a May 22, 2019
@kean kean merged commit 09c13d5 into nuke8 May 22, 2019
1 check passed
continuous-integration/travis-ci/push The Travis CI build passed
@kean kean deleted the caching-processed-images branch May 24, 2019