
Accurate Instagram Filters Reconstruction

There are plenty of apps out there (e.g., Instagram) that let you apply color filters to images. You might want to clone their behavior, i.e. reconstruct the filters. This repository holds tools for reconstructing color filters semi-automatically and very accurately.

The truth is, folks like Instagram filters and keep trying to reproduce them, again and again. The problem with most of these attempts is that they rely on manually correcting colors. For me, it was more interesting to find a solution using a more robust method and some maths.

This appears to be the only attempt to provide an accurate color filter reconstruction. For instance, one of the following images was obtained by applying the Clarendon Instagram filter to the source image, while the other was derived via an accurate reconstruction. Try to guess which one was reconstructed.

(Images: the reconstruction and the Instagram result.)

To compare, here is the result of applying the same filter from a commercial set of Instagram-like filters.


How it works

This method is based on three-dimensional lookup tables and their two-dimensional representation: hald images. The core idea is simple: a sample hald image with a uniform color distribution is processed using a target color filter with an unknown transformation algorithm. The processed hald image can then be used as a filter for a very accurate approximation of that target color transformation.
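The capture step can be sketched in a few lines of NumPy. This is a simplified, plain hald layout, not the distortion-resistant one this repository actually generates, and `toy_filter` is a hypothetical stand-in for an unknown color filter:

```python
import numpy as np

level = 5
size = level * level  # 25: side length of the 3D LUT
# Identity hald: every pixel's value encodes its own (r, g, b)
# grid coordinate; red varies fastest, then green, then blue.
b, g, r = np.mgrid[0:size, 0:size, 0:size]
identity = np.stack([r, g, b], axis=-1) * 255.0 / (size - 1)
hald = identity.reshape(level**3, level**3, 3).round().astype(np.uint8)

def toy_filter(img):
    # hypothetical stand-in for an unknown per-pixel color filter
    return np.clip((img.astype(np.float64) - 128) * 1.2 + 128, 0, 255).astype(np.uint8)

# Running the filter over the hald records its transfer function:
# lut[b, g, r] is where the filter sends the color (r, g, b).
lut = toy_filter(hald).reshape(size, size, size, 3)
```

Because the hald contains every grid color exactly once, any per-pixel filter applied to it describes itself completely; intermediate colors are recovered later by trilinear interpolation.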

The resulting hald image can then be used in various software such as GraphicsMagick or Adobe Photoshop. You can use these hald images in your iOS or macOS app with CocoaLUT. Hald images can also be converted to the 3D LUT cube file format, which is common in a great number of video editing applications.
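As a sketch of what that conversion involves, here is a minimal `.cube` writer. It assumes the LUT is already in memory as an `(N, N, N, 3)` uint8 array indexed `[b][g][r]`; this is not the repository's own converter:

```python
def write_cube(lut, path):
    """Write an (N, N, N, 3) uint8 LUT, indexed [b][g][r], as a .cube file."""
    n = lut.shape[0]
    with open(path, "w") as f:
        f.write("LUT_3D_SIZE %d\n" % n)
        # .cube entries go red-fastest, then green, then blue,
        # one "R G B" float triple per line in the 0..1 range
        for bb in range(n):
            for gg in range(n):
                for rr in range(n):
                    r, g, b = lut[bb, gg, rr] / 255.0
                    f.write("%.6f %.6f %.6f\n" % (r, g, b))
```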


This method can capture color transformations only where no other variables are involved. For example, vignetting, scratches, gradients, and watermarks cannot be captured. It can also go wrong when different filters are applied to different parts of a single image.


To generate and convert hald images, you will need git and a pip-enabled Python interpreter.

$ git clone
$ cd color-filters-reconstruction
$ pip install -r requirements.txt 

The resulting hald images can be applied to any visuals in your application using GraphicsMagick bindings for Python, Ruby, PHP, JavaScript, and other programming languages, or using the CLI. No software from this repository is required.
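If you'd rather stay in pure Python, Pillow's `ImageFilter.Color3DLUT` can apply a hald image directly. A sketch, assuming a plain level-5 hald laid out red-fastest; the loader and file names are illustrative, not part of this repository:

```python
import numpy as np
from PIL import Image, ImageFilter

def hald_to_color3dlut(hald_img, level=5):
    size = level * level  # 25
    table = np.asarray(hald_img.convert("RGB"), dtype=np.float64) / 255.0
    # row-major pixel order in a plain hald is exactly the
    # red-fastest order Color3DLUT expects
    return ImageFilter.Color3DLUT(size, table.reshape(-1).tolist())

# usage (hypothetical file names):
# lut = hald_to_color3dlut(Image.open("halds/1.Clarendon.png"))
# Image.open("sample.jpg").filter(lut).save("out.jpg")
```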


  1. First, you'll need to create the identity image. Just run:

    $ ./bin/

    This will create a file named hald.5.png. The number in the filename stands for the square root of a 3D LUT size. For example, 5 means we're dealing with a 25×25×25 lookup table.


    This file doesn't look like other hald images: it is specifically designed to resist distortions which may occur during processing, such as vignetting, scratches, gradients, and JPEG artifacts.

  2. Process the identity image with the target software. Say, if you were using Instagram, you'd transfer the identity image to your device and post it with one of the filters applied. After that, you'll see the filtered identity image in your camera roll; just transfer it back.


    Before continuing, make sure that the resolution of your filtered identity image exactly matches that of the source one.

  3. Convert the filtered image to a true hald image:

    $ ./bin/ raw/1.Clarendon.jpg halds/

    Where halds/ is your output directory.

  4. That's it! You can now apply that resulting hald image to any input.

    $ gm convert sample.jpg -hald-clut halds/1.Clarendon.png out.jpeg

    (Images: the sample and the Clarendon result.)
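The last step is easy to batch over a directory of reconstructed filters. A small wrapper (a sketch: it just builds the same `gm convert` command line shown above; paths and naming are illustrative):

```python
import subprocess
from pathlib import Path

def gm_hald_cmd(source, hald, output):
    # the same invocation as in step 4, as an argv list
    return ["gm", "convert", str(source), "-hald-clut", str(hald), str(output)]

def apply_all(sample, halds_dir, out_dir):
    # run every reconstructed filter in halds_dir over one sample image
    Path(out_dir).mkdir(exist_ok=True)
    for hald in sorted(Path(halds_dir).glob("*.png")):
        out = Path(out_dir) / ("%s.%s.jpg" % (Path(sample).stem, hald.stem))
        subprocess.run(gm_hald_cmd(sample, hald, out), check=True)
```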

Advanced tips

While the default parameters give you high-quality hald filters, in some cases this is not enough.

If the target filter has heavy local distortions or significant gradients in the center of the image, some undesired effects may occur. The most noticeable one is color banding. Below are an original image and the same image processed with the Hudson-like filter, one of the most problematic in this regard.

# Create a hald image from the identity image processed by Instagram
$ ./bin/ raw/15.Hudson.jpg halds/
# Apply hald image to the sample image
$ gm convert girl.jpg -hald-clut halds/15.Hudson.png girl.15.jpg

(Images: the original and the Hudson result.)

You can see that in the processed image many objects look flat and posterized: the face, the hair, the chairs in the background. While posterization is a common image filter in its own right, it is not part of the Hudson filter.

If you look closely at the image with the Hudson-like filter applied, you'll see that it is noisy, and that's where the problem comes from.


Fortunately, you can apply a Gaussian blur to the three-dimensional lookup table to reduce that noise. You'll need to install SciPy first.

# The next line is only needed once
$ pip install scipy
$ ./bin/ raw/15.Hudson.jpg halds/ --smooth 1.5
$ gm convert girl.jpg -hald-clut halds/15.Hudson.png girl.15.fixed.jpg
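The smoothing itself is ordinary 3D Gaussian filtering of the LUT. A sketch of the idea with SciPy (not the repository's implementation; the `sigma` argument here only loosely corresponds to the `--smooth` value):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_lut(lut, sigma=1.5):
    # lut: (N, N, N, 3) float array; blur each channel over the
    # three color axes, leaving the channel axis untouched
    out = np.empty_like(lut)
    for c in range(3):
        out[..., c] = gaussian_filter(lut[..., c], sigma=sigma, mode="nearest")
    return out
```

Blurring in LUT space trades a little color accuracy for smooth gradients in the output, which is why it suppresses the banding.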

(Images: the Hudson result and the smoothed Hudson result.)

You can discover the additional options by executing ./bin/ --help.

Have fun with reverse engineering!
