LZF-compress is a Java library for encoding and decoding data in LZF format, written by Tatu Saloranta (firstname.lastname@example.org).
The format differs slightly from some other adaptations, such as the one used
by the H2 database project (by Thomas Mueller):
although the internal block compression structure is the same, block identifiers differ.
This package uses the original LZF identifiers to be 100% compatible with the existing command-line
lzf tool(s).
The LZF algorithm itself is optimized for speed, with somewhat more modest compression.
Compared to the standard
Deflate (the algorithm gzip uses), LZF can be 5-6 times as fast to compress,
and twice as fast to decompress. The compression ratio is lower since no Huffman encoding is applied
after Lempel-Ziv substring elimination.
See the Wiki for more details; here's a "TL;DNR" version.
Both compression and decompression can be done either with a streaming approach:
```java
InputStream in = new LZFInputStream(new FileInputStream("data.lzf"));
OutputStream out = new LZFOutputStream(new FileOutputStream("results.lzf"));
InputStream compIn = new LZFCompressingInputStream(new FileInputStream("stuff.txt"));
```
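For instance, a complete streaming round trip might look like the sketch below; it uses in-memory buffers instead of files so it is self-contained, and the class name, helper method, and sample bytes are illustrative, not part of the library:

```java
import java.io.*;
import com.ning.compress.lzf.LZFInputStream;
import com.ning.compress.lzf.LZFOutputStream;

public class LZFStreamExample {
    // Compress the given bytes with LZFOutputStream, then read them back
    // through LZFInputStream; returns the decompressed bytes.
    static byte[] roundTrip(byte[] original) throws IOException {
        // Compress into an in-memory buffer (a FileOutputStream works the same way)
        ByteArrayOutputStream compressedBuf = new ByteArrayOutputStream();
        try (OutputStream out = new LZFOutputStream(compressedBuf)) {
            out.write(original);
        }

        // Decompress: LZFInputStream hands back the uncompressed bytes
        ByteArrayOutputStream restored = new ByteArrayOutputStream();
        try (InputStream in = new LZFInputStream(
                new ByteArrayInputStream(compressedBuf.toByteArray()))) {
            byte[] tmp = new byte[4096];
            int n;
            while ((n = in.read(tmp)) != -1) {
                restored.write(tmp, 0, n);
            }
        }
        return restored.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "some repetitive data, some repetitive data".getBytes("UTF-8");
        byte[] back = roundTrip(data);
        System.out.println("round trip ok: " + java.util.Arrays.equals(data, back));
    }
}
```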
or by block operation:
```java
byte[] compressed = LZFEncoder.encode(uncompressedData);
byte[] uncompressed = LZFDecoder.decode(compressedData);
```
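Put together, a block-mode round trip could look like this sketch (the class name and sample input are illustrative; note that `LZFDecoder.decode()` can throw a checked exception on corrupt input):

```java
import com.ning.compress.lzf.LZFDecoder;
import com.ning.compress.lzf.LZFEncoder;

public class LZFBlockExample {
    public static void main(String[] args) throws Exception {
        byte[] uncompressedData =
                "abcabcabcabc -- repetitive input compresses well".getBytes("UTF-8");

        // One-shot block compression: the whole input is in memory, no streaming needed
        byte[] compressed = LZFEncoder.encode(uncompressedData);

        // Decode back to the original bytes
        byte[] uncompressed = LZFDecoder.decode(compressed);

        System.out.println(new String(uncompressed, "UTF-8"));
    }
}
```

Block mode is the simplest option when the whole payload already fits in memory; the streaming classes are preferable for large files or network transfers.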
and you can even use the LZF jar as a command-line tool (its manifest points to 'com.ning.compress.lzf.LZF' as the class with the main() method to call), like so:

    java -jar compress-lzf-1.0.3.jar

(which will display the necessary usage arguments).
Besides Java support, LZF codecs / bindings exist for non-JVM languages as well:
- C: liblzf (the original LZF package!)
- C#: C# LZF
- Go: Golly
- Perl: Compress::LZF
- Python: Python-LZF
- Ruby: glebtv/lzf, LZF/Ruby
Check out jvm-compress-benchmark for a comparison of the space- and time-efficiency of this LZF implementation, relative to other available Java-accessible compression libraries.
Alternative High-Speed Lempel-Ziv Compressors
LZF belongs to a family of compression codecs called "simple Lempel-Ziv" codecs.
Since LZ compression is also the first part of
deflate compression (which is used,
along with simple framing, for
gzip), LZF can be viewed as the "first part of gzip"
(the second part being Huffman encoding of the compressed content).
There are many other codecs in this category, the most notable (and competitive) being LZ4 and Snappy.
All of them have very similar compression ratios (due to the same underlying algorithm, with differences coming from slight encoding variations and from efficiency differences in back-reference matching), and similar performance profiles regarding the ratio of compression to decompression speed.