This is the Wiki home for LZF Compression library ("compress-lzf", or, for historical reasons, "Ning compress").
Compress-LZF is a Java library for encoding and decoding data, written by Tatu Saloranta (email@example.com)
The primary compression format is LZF; but starting with version 0.9, there is also improved support for basic GZIP, the latter using the low-level Deflate functionality provided by the JDK (which is based on native zlib).
The LZF data format this library supports is compatible with the original LZF library by Marc A Lehmann. There are other LZF variants that differ from it, such as the one used by the H2 database project (by Thomas Mueller); although the internal block compression structure is the same, the block identifiers differ. This package uses the original LZF identifiers to be 100% compatible with existing command-line lzf tool(s).
The LZF algorithm itself is optimized for speed, with somewhat more modest compression: compared to GZIP, LZF can be 6-8 times as fast to compress, and 2-3 times as fast to decompress.
Finally, note that the library also provides a parallel compressor implementation (com.ning.compress.lzf.parallel.PLZFOutputStream), which can encode (compress) content using multiple processing cores: concurrent compression works on a chunk-by-chunk basis (64k max chunk size), so megabyte-sized content can be processed very efficiently.
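Usage is a sketch along these lines (the file name and content are hypothetical; apart from the output stream class, this is plain java.io usage):

```java
import com.ning.compress.lzf.parallel.PLZFOutputStream;
import java.io.FileOutputStream;
import java.io.OutputStream;

// Hypothetical input: a few megabytes of data to compress
byte[] largeContent = new byte[4 * 1024 * 1024];

// PLZFOutputStream splits the content into (up to) 64k chunks and
// compresses them concurrently on available cores; the output is
// standard LZF, readable by any LZF decoder.
OutputStream out = new PLZFOutputStream(new FileOutputStream("big-data.lzf"));
out.write(largeContent);
out.close(); // completes pending chunks and shuts down worker threads
```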
From Maven repository (http://repo1.maven.org/maven2/com/ning/compress-lzf/)
- 1.0.3 (15-Aug-2014)
Typical usage is via one of the programmatic interfaces:
- block-based interface ('LZFEncoder', 'LZFDecoder')
- streaming interface ('LZFInputStream', 'LZFOutputStream')
- or, for 'reverse' direction: 'LZFCompressingInputStream'
- or, for parallel compression: 'PLZFOutputStream'
- "push" interface (reverse of streaming): 'LZFUncompressor' (NOTE: only for uncompression/decompression)
When reading compressed data from a file, you can do it by simply creating an 'LZFFileInputStream' (or 'LZFInputStream' for other kinds of input) and using it for reading content:
InputStream in = new LZFFileInputStream("data.lzf");
(note, too, that the stream is buffered: there is no need for, or benefit from, wrapping it in a 'BufferedInputStream')
and similarly you can compress content using:
OutputStream out = new LZFFileOutputStream("results.lzf");
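Putting the two together, a write-then-read round trip might look like this (the file name is hypothetical):

```java
import com.ning.compress.lzf.LZFFileInputStream;
import com.ning.compress.lzf.LZFFileOutputStream;
import java.nio.charset.StandardCharsets;

byte[] original = "Hello, LZF!".getBytes(StandardCharsets.UTF_8);

// write: content is LZF-compressed on its way to disk
LZFFileOutputStream out = new LZFFileOutputStream("roundtrip.lzf");
out.write(original);
out.close();

// read: content is decompressed transparently
LZFFileInputStream in = new LZFFileInputStream("roundtrip.lzf");
byte[] restored = new byte[original.length];
int count = in.read(restored);
in.close();
```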
or, you can even do the reverse, reading uncompressed data and compressing it as you read:
InputStream compressingIn = new LZFCompressingInputStream(new FileInputStream("results.txt"));
Compressing and decompressing individual blocks is just as simple:
byte[] compressed = LZFEncoder.encode(uncompressedData);
byte[] uncompressed = LZFDecoder.decode(compressedData);
Finally, note that LZF-encoded chunks have a length of at most 65535 bytes; longer content is split into multiple such chunks. This is done transparently, so that you can compress/uncompress blocks of any size; chunking is handled by the LZF encoders and decoders.
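For illustration (size and content chosen arbitrarily), content larger than one 64k chunk round-trips through the block API unchanged:

```java
import com.ning.compress.lzf.LZFDecoder;
import com.ning.compress.lzf.LZFEncoder;

// 200000 bytes: more than 65535, so the encoder emits multiple chunks
byte[] data = new byte[200000];
for (int i = 0; i < data.length; ++i) {
    data[i] = (byte) (i % 64); // repetitive, hence very compressible
}

byte[] compressed = LZFEncoder.encode(data);     // transparently chunked
byte[] restored = LZFDecoder.decode(compressed); // chunks reassembled
```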
It is also possible to use the jar as a command-line tool, since its manifest points to 'com.ning.compress.lzf.LZF' as the class whose main() method to call.
This means that you can use it like:
java -jar compress-lzf-1.0.3.jar
(which will display necessary usage arguments)
Finally, the jar is also a valid (and extremely simple) OSGi bundle, so it works nicely in OSGi containers.
Check out jvm-compress-benchmark for a comparison of the space- and time-efficiency of this LZF implementation relative to other Java-accessible compression libraries.