Optimize images #14

Merged: 1 commit into mislav:gh-pages on Apr 29, 2016


@pathawks
Contributor

Saved 165.8 KB by running the images through ImageOptim's lossless compression.
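For anyone without the ImageOptim GUI, a rough command-line equivalent of this kind of lossless pass is sketched below. It uses optipng and jpegtran, two of the tools ImageOptim wraps; the paths and extensions are placeholders, not the exact settings used for this PR.

```sh
# Sketch only: approximate ImageOptim's lossless pass from the shell.
# Assumes optipng and jpegtran are installed; paths are placeholders.

# Recompress PNGs in place at the highest optimization level (pixel-lossless).
find . -name '*.png' -exec optipng -o7 {} \;

# Rewrite JPEGs with optimized Huffman tables and metadata stripped (pixel-lossless).
find . -name '*.jpg' | while read -r f; do
  jpegtran -optimize -copy none -outfile "$f.tmp" "$f" && mv "$f.tmp" "$f"
done
```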

@tobiasvl
Contributor

This is obviously commendable, but since binaries don't diff very well, this actually makes the repository itself bigger. I did a quick local merge, and the repo grew from 14M to 17M. It was 20M before pruning the remote-tracking branch and running git gc. Can the size be optimized further?
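For reference, that measurement can be reproduced roughly as sketched below; the merge source is a placeholder for the PR branch, not its real name.

```sh
# Sketch only: measure how much a local test merge grows the repository.
du -sh .git                                 # size before the merge

git checkout gh-pages
git merge --no-ff pathawks/optimize-images  # placeholder branch name

git fetch --prune origin                    # drop stale remote-tracking branches
git gc --prune=now                          # repack and drop unreachable objects

du -sh .git                                 # size after pruning and gc
```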

@Bluebie
Bluebie commented Apr 29, 2016

The ImageOptim compression backends are the gold standard of lossless compression. It's unlikely these images will be further compressible in the future without switching to entirely new image formats or applying lossy compression.
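(For anyone who wants to confirm that an optimized file really is lossless, a quick pixel-level check is sketched below; it assumes ImageMagick is installed and the file names are placeholders.)

```sh
# Sketch only: confirm an optimized PNG decodes to the same pixels as the original.
# `compare -metric AE` prints the number of differing pixels; 0 means lossless.
compare -metric AE original.png optimized.png null:
```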

@pathawks
Contributor

I don’t mind trying once more. I know that I had all the settings tuned for max compression and ran many trials, but I will take another look just to be sure 👍

@pathawks
Contributor
pathawks commented Apr 29, 2016 (edited)

I ran ImageOptim again (version 1.6.1, the most recent at the moment) and was able to squeeze a few more bytes out, bringing the total savings to ~~166636 bytes~~ 164902 bytes (compared to the current gh-pages branch).

You are right that merging this will actually make the repo itself take longer to clone. Most people, however, will access the images by browsing the website. For those users, these changes will make the images download faster 👍
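A savings figure like that can be double-checked against git itself by summing the tracked image blob sizes on each branch; the sketch below assumes standard git and awk, and the second branch name is a placeholder for this PR's branch.

```sh
# Sketch only: total byte size of tracked images on each branch.
# `git ls-tree -r -l` prints each blob's size in the fourth column.
git ls-tree -r -l gh-pages | awk '/\.(png|jpg|gif)$/ { sum += $4 } END { print sum }'
git ls-tree -r -l pathawks/optimize-images | awk '/\.(png|jpg|gif)$/ { sum += $4 } END { print sum }'  # placeholder branch name
# The difference between the two totals is the savings delivered to site visitors.
```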

@pathawks Optimize images (commit 5576961)
Lossless compression performed using ImageOptim 1.6.1
@mislav
Owner
mislav commented Apr 29, 2016

Thanks! I don't mind so much about the repo clone size.

@mislav mislav merged commit 5a71642 into mislav:gh-pages Apr 29, 2016