Core Image is an image processing (and more) framework from Apple. It's easy to use, but it requires some boilerplate code. This guide is a starting point for using Core Image with Nuke.

There are multiple ways to use Core Image. This guide covers only one case: applying image filters to a UIImage on a background thread. It doesn't cover Core Image basics, but it does feature some boilerplate code.

Core Image Usage

Creating CIContext

Before we create and apply an image filter, we need an instance of the CIContext class:

```swift
let sharedCIContext = CIContext()
// let sharedCIContext = CIContext(options: [kCIContextPriorityRequestLow: true])
```

The kCIContextPriorityRequestLow option is a new addition in iOS 8:

If this value is true, use of the Core Image context from a background thread takes a lower priority than GPU usage from the main thread, allowing your app to perform Core Image rendering without disturbing the frame rate of UI animations.

Also new in iOS 7 is support for background renders: all background renders automatically use the slower Core Image CPU rendering path, so there is no need to manually switch between GPU and CPU rendering paths when the application enters the background.
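You normally don't need to force the CPU path yourself given this automatic fallback, but for reference, Core Image does let you opt into software rendering explicitly via the standard kCIContextUseSoftwareRenderer option (a sketch; whether you want this depends on your app):

```swift
// A sketch: forcing the CPU rendering path explicitly.
// kCIContextUseSoftwareRenderer is a standard CIContext option key;
// you rarely need it, since background renders fall back to the CPU automatically.
let cpuOnlyContext = CIContext(options: [kCIContextUseSoftwareRenderer: true])
```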

Applying Filters

And here's a UIImage extension that shows one way to use CIContext to apply an image filter and produce an output image:

```swift
extension UIImage {
    func applyFilter(context: CIContext = sharedCIContext, closure: (CoreImage.CIImage) -> CoreImage.CIImage?) -> UIImage? {
        // Get a CIImage from the receiver, whichever backing store it has.
        func inputImageForImage(_ image: UIImage) -> CoreImage.CIImage? {
            if let image = image.cgImage {
                return CoreImage.CIImage(cgImage: image)
            }
            if let image = image.ciImage {
                return image
            }
            return nil
        }
        guard let inputImage = inputImageForImage(self),
            let outputImage = closure(inputImage) else {
            return nil
        }
        guard let imageRef = context.createCGImage(outputImage, from: inputImage.extent) else {
            return nil
        }
        return UIImage(cgImage: imageRef, scale: self.scale, orientation: self.imageOrientation)
    }

    func applyFilter(filter: CIFilter?, context: CIContext = sharedCIContext) -> UIImage? {
        guard let filter = filter else {
            return nil
        }
        return applyFilter(context: context) {
            filter.setValue($0, forKey: kCIInputImageKey)
            return filter.outputImage
        }
    }
}
```

Now let's create a CIFilter and use our extension to apply it:

```swift
let filter = CIFilter(name: "CIGaussianBlur", withInputParameters: ["inputRadius": 10.0])
let processedImage = image.applyFilter(filter: filter)
```

Core Image in Nuke

Here's an example of a blur filter that implements Nuke's ImageProcessing protocol and uses our new extension:

```swift
/// Blurs image using CIGaussianBlur filter.
struct GaussianBlur: ImageProcessing {
    private let radius: Int

    /// Initializes the receiver with a blur radius.
    init(radius: Int = 8) {
        self.radius = radius
    }

    /// Applies CIGaussianBlur filter to the image.
    func process(_ image: UIImage) -> UIImage? {
        return image.applyFilter(filter: CIFilter(name: "CIGaussianBlur", withInputParameters: ["inputRadius": radius]))
    }

    /// Compares two filters based on their radius.
    static func ==(lhs: GaussianBlur, rhs: GaussianBlur) -> Bool {
        return lhs.radius == rhs.radius
    }
}
```
Performance Considerations

  • Chaining multiple CIFilter objects is much more efficient than using ProcessorComposition to combine multiple instances of the CoreImageFilter class.
  • Don’t create a CIContext object every time you render.
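To illustrate the first point, chaining can be done inside a single closure passed to the applyFilter extension above, so that the intermediate CIImage is never rendered and only one CIContext render happens at the end (a sketch; both filter names are standard built-in Core Image filters):

```swift
// A sketch: chaining two built-in filters in one render pass.
// The intermediate blurred CIImage is just a recipe; only the final
// output image is actually rendered by the shared CIContext.
let processed = image.applyFilter { input in
    let blurred = CIFilter(name: "CIGaussianBlur",
                           withInputParameters: [kCIInputImageKey: input,
                                                 "inputRadius": 8.0])?.outputImage
    guard let blurredImage = blurred else { return nil }
    return CIFilter(name: "CISepiaTone",
                    withInputParameters: [kCIInputImageKey: blurredImage,
                                          "inputIntensity": 0.7])?.outputImage
}
```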


  1. Core Image Programming Guide
  2. Core Image Filter Reference
  3. Core Image Tutorial: Getting Started
  4. WWDC 2014 Session 514 - Advances in Core Image
  5. Core Image Shop - sample project that lets you experiment with Core Image filters