I am making use of a project called AdvanceCOMP; its compression options
allow zopfli (-4, or --shrink-insane). While going through PNGs for
Torchlight, there are dozens of PNGs that end up larger. I've attached the
gloves so you can review and test it yourself as appropriate.
advpng.exe -z -4 */*.png
43653 43653 100% wardrobe/dragon_gloves.png (Bigger 45713)
89125 89125 100% wardrobe/heavyleather_boots.png (Bigger 96720)
11412 10817 94% wardrobe/heavyleather_boots_alt01.png
90564 90564 100% wardrobe/heavyleather_chest.png (Bigger 98420)
89328 89328 100% wardrobe/heavyleather_gloves.png (Bigger 97619)
At worst the compression should cap at the same size. This means the root of
the problem is that some sections of the data compress better while others
end up being left uncompressed. I noted this in some of my own compression
experiments years ago.
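As a rough illustration of what I think is happening (not zopfli's actual code: the order-0 entropy estimate and the DYN_TABLE_BITS_GUESS constant are placeholders of my own), a DEFLATE encoder can emit a section as a stored block when compressing it would cost more, and enough of those fallbacks can push the whole stream past the original size:

    #include <math.h>
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Roughly 5 bytes of overhead per stored block: 3 header bits,
     * padding to a byte boundary, then LEN/NLEN (4 bytes). */
    #define STORED_HEADER_BITS   (5 * 8)
    /* Placeholder for the cost of emitting dynamic Huffman tables. */
    #define DYN_TABLE_BITS_GUESS 200

    /* Crude order-0 entropy estimate of a section, in bits.  A real
     * encoder also models matches; this is only illustrative. */
    static double estimated_bits(const uint8_t *data, size_t len)
    {
        size_t counts[256] = {0};
        for (size_t i = 0; i < len; i++)
            counts[data[i]]++;
        double bits = 0.0;
        for (int s = 0; s < 256; s++) {
            if (counts[s]) {
                double p = (double)counts[s] / (double)len;
                bits -= (double)counts[s] * log2(p);
            }
        }
        return bits;
    }

    /* Store the section raw when compressing it would cost more. */
    static int should_store_uncompressed(const uint8_t *data, size_t len)
    {
        double stored     = STORED_HEADER_BITS + (double)len * 8.0;
        double compressed = DYN_TABLE_BITS_GUESS + estimated_bits(data, len);
        return compressed >= stored;
    }

    int main(void)
    {
        uint8_t noisy[256], flat[256];
        for (int i = 0; i < 256; i++) {
            noisy[i] = (uint8_t)i;  /* every byte distinct: incompressible */
            flat[i]  = 'A';         /* one repeated byte: very compressible */
        }
        printf("noisy -> stored=%d\n", should_store_uncompressed(noisy, 256));
        printf("flat  -> stored=%d\n", should_store_uncompressed(flat, 256));
        return 0;
    }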
Viable solution:
Sections that were left uncompressed (causing the expansion) should instead compress by finding a match of identical length; the data in such a section is most likely part of another match. If the section sits at the beginning or end of another compressed section, that section's match should be truncated to allow the formerly uncompressed section to compress at a 1:1 rate (so long as the other match remains long enough to stay worth compressing).
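A sketch of that truncation rule in C. The Match struct and its field names are hypothetical, just to make the idea concrete; MIN_MATCH is DEFLATE's real minimum match length of 3:

    #include <stddef.h>
    #include <stdio.h>

    /* Hypothetical match record; the field names are my own, not
     * zopfli's internals. */
    typedef struct {
        size_t pos;   /* where the match starts in the stream   */
        size_t len;   /* match length                           */
        size_t dist;  /* backward distance to the earlier copy  */
    } Match;

    #define MIN_MATCH 3  /* DEFLATE's minimum match length */

    /* Truncate a neighbouring match so the uncompressed run
     * [run_pos, run_pos + run_len) can be covered by a match of its
     * own.  Refuse when the neighbour would drop below MIN_MATCH.
     * A run buried entirely inside the match is the "split" case
     * covered by the additional rule below. */
    static int truncate_for_run(Match *m, size_t run_pos, size_t run_len)
    {
        size_t m_end   = m->pos + m->len;
        size_t run_end = run_pos + run_len;

        if (run_pos <= m->pos && run_end > m->pos && run_end < m_end) {
            size_t cut = run_end - m->pos;      /* run covers the head */
            if (m->len - cut < MIN_MATCH)
                return 0;
            m->pos += cut;                      /* same dist stays valid */
            m->len -= cut;
            return 1;
        }
        if (run_pos > m->pos && run_pos < m_end && run_end >= m_end) {
            size_t cut = m_end - run_pos;       /* run covers the tail */
            if (m->len - cut < MIN_MATCH)
                return 0;
            m->len -= cut;
            return 1;
        }
        return 0;
    }

    int main(void)
    {
        Match m = { .pos = 100, .len = 20, .dist = 512 };
        if (truncate_for_run(&m, 115, 15))  /* run [115,130) hits the tail */
            printf("match now pos=%zu len=%zu\n", m.pos, m.len);
        return 0;
    }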
I am not aware of the full details of zlib compression, so an additional
rule is needed.
If a match is found in the middle of another match, the outer match should be
split into two matches that deliberately avoid the middle, giving the inner
match to the uncompressed section. This should only happen when these 3
matches take less space than 1 match and 1 non-match.
How much complexity this adds, and how much extra time it will take, I'm not
sure.
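A sketch of the cost check for that rule. The per-match and per-literal bit costs are placeholder constants of mine (real costs depend on the Huffman tables), but they show the shape of the comparison:

    #include <stdio.h>

    /* Placeholder DEFLATE costs in bits; real costs depend on the
     * Huffman tables, so these numbers are assumptions. */
    #define MATCH_COST_BITS   20  /* ~ length code + distance code */
    #define LITERAL_COST_BITS  8  /* ~ one literal symbol          */

    /* Split the outer match into head + tail (two matches) plus one
     * inner match only when that beats one match plus a literal run. */
    static int split_is_worthwhile(size_t run_len)
    {
        size_t split_cost = 3 * MATCH_COST_BITS;
        size_t keep_cost  = MATCH_COST_BITS + run_len * LITERAL_COST_BITS;
        return split_cost < keep_cost;
    }

    int main(void)
    {
        for (size_t len = 3; len <= 8; len++)
            printf("%zu literals: split %s\n", len,
                   split_is_worthwhile(len) ? "wins" : "loses");
        return 0;
    }

With these placeholder numbers the split starts winning once the run is longer than 5 literals; real Huffman tables would shift that threshold.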
Original issue reported on code.google.com by rtcv...@yahoo.com on 2 Nov 2013 at 9:00