I did some testing with the zstd dictionary feature, which significantly speeds up compression and decompression and increases the compression ratio too.
I'd like to create a set of dictionaries for future use in ipfs compression.
So far I've only built a dictionary for the Linux kernel source code, which achieves a compression ratio of 5.931x. It should work quite well on other C projects too (some testing is needed).
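For reference, a minimal sketch of the training and usage workflow with the zstd CLI (the sample corpus, dictionary name, and file paths here are placeholders, not the ones I used):

```shell
# Train a dictionary from a corpus of similar files (e.g. C sources).
# --maxdict caps the dictionary size; 112640 bytes is the zstd default.
zstd --train samples/*.c -o c_sources.dict --maxdict=112640

# Compress and decompress with the trained dictionary (-D).
zstd -D c_sources.dict main.c -o main.c.zst
zstd -D c_sources.dict -d main.c.zst -o main_restored.c
```

Note that the same dictionary must be available on both the compressing and the decompressing side, which is why a shared, versioned set of dictionaries matters here.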
There's no need for those files to have an open license (IMHO), since dictionary training just analyzes the dataset for common patterns; the files themselves aren't redistributed.
If you'd like to help: I'm looking for large datasets of files (which are considered compressible), to make the dictionaries for those file types as general as possible. Feel free to share links with me here, via mail, or via Matrix ( ruben_kelevra ).
More details on my research: ipld/specs#76 (comment)
(Please assign this ticket to me)