Saved 165.8KB by running images through ImageOptim's lossless compression.
This is obviously commendable, but since Git can't delta-compress re-encoded binaries well, every replaced image is stored as a second full copy, so this actually makes the repository itself bigger. I did a quick local merge, and the repo grew from 14M to 17M. It was 20M before pruning the remote-tracking branch and running `git gc`. Can the size be optimized further?
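Roughly what that local cleanup looks like, sketched here against a throwaway repo where a deliberately orphaned blob stands in for the objects left behind by the pruned branch (all paths and sizes are illustrative, not the actual gh-pages repo):

```shell
# Create a throwaway repo with one commit.
repo=$(mktemp -d)
git -C "$repo" init --quiet
git -C "$repo" -c user.name=t -c user.email=t@example.com \
    commit --quiet --allow-empty -m init

# Simulate leftover history: write a ~100KB blob into the object
# store without referencing it from any branch.
head -c 100000 /dev/urandom > "$repo/big.bin"
git -C "$repo" hash-object -w "$repo/big.bin" > /dev/null
rm "$repo/big.bin"

before=$(du -sk "$repo/.git" | cut -f1)

# Repack and drop unreachable objects immediately (no grace period).
git -C "$repo" gc --quiet --prune=now

after=$(du -sk "$repo/.git" | cut -f1)
echo "before=${before}K after=${after}K"
```

On a real clone you'd run `git remote prune origin` first so the stale remote-tracking branch stops keeping its objects reachable; `gc --prune=now` then reclaims the space, which matches the 20M → 17M drop above.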
The ImageOptim compression backends are the gold standard of lossless compression. It's unlikely these images can be compressed further without switching to entirely new image formats or applying lossy compression.
I don’t mind trying once more. I know that I had all the settings tuned for max compression and ran many trials, but I will take another look just to be sure 👍
I ran ImageOptim again (version 1.6.1, the most recent at the moment) and was able to squeeze out a few more bytes, bringing the total savings to 166,636 bytes (compared to the current gh-pages branch).
You are right that merging this will actually make the repo itself take longer to clone. Most people, however, will access the images by browsing the website, and for those users these changes will make the images download faster 👍
Lossless compression performed using ImageOptim 1.6.1
Thanks! I'm not too concerned about the repo clone size.