
Merge pull request #44 from twitter/readme_update_build_docs

Update readme with build and troubleshooting instructions.
2 parents 6bb1b7f + 4f44b1b commit 6697da92f48ae7f121c80f584d6a7712d3b25236 @yuxutw yuxutw committed Apr 18, 2012
Showing with 50 additions and 2 deletions.
  1. +50 −2 README.md
@@ -16,11 +16,24 @@ This project builds off the great work done at [http://code.google.com/p/hadoop-
LZO is a wonderful compression scheme to use with Hadoop because it's incredibly fast, and (with a bit of work) it's splittable. Gzip is decently fast, but cannot take advantage of Hadoop's natural map splits because it's impossible to start decompressing a gzip stream starting at a random offset in the file. LZO's block format makes it possible to start decompressing at certain specific offsets of the file -- those that start new LZO block boundaries. In addition to providing LZO decompression support, these classes provide an in-process indexer (com.hadoop.compression.lzo.LzoIndexer) and a map-reduce style indexer which will read a set of LZO files and output the offsets of LZO block boundaries that occur near the natural Hadoop block boundaries. This enables a large LZO file to be split into multiple mappers and processed in parallel. Because it is compressed, less data is read off disk, minimizing the number of IOPS required. And LZO decompression is so fast that the CPU stays ahead of the disk read, so there is no performance impact from having to decompress data as it's read off disk.
+You can read more about Hadoop, LZO, and how we're using it at Twitter at [http://www.cloudera.com/blog/2009/11/17/hadoop-at-twitter-part-1-splittable-lzo-compression/](http://www.cloudera.com/blog/2009/11/17/hadoop-at-twitter-part-1-splittable-lzo-compression/).
+
### Building and Configuring
-To get started, see [http://code.google.com/p/hadoop-gpl-compression/wiki/FAQ](http://code.google.com/p/hadoop-gpl-compression/wiki/FAQ). This project is built exactly the same way; please follow the answer to "How do I configure Hadoop to use these classes?" on that page.
+To get started, see [http://code.google.com/p/hadoop-gpl-compression/wiki/FAQ](http://code.google.com/p/hadoop-gpl-compression/wiki/FAQ). This project is built exactly the same way; please follow the answer to "How do I configure Hadoop to use these classes?" on that page, or follow the summarized version here.
-You can read more about Hadoop, LZO, and how we're using it at Twitter at [http://www.cloudera.com/blog/2009/11/17/hadoop-at-twitter-part-1-splittable-lzo-compression/](http://www.cloudera.com/blog/2009/11/17/hadoop-at-twitter-part-1-splittable-lzo-compression/).
+LZO 2.x is required, and is most easily installed via your system's package manager. If you choose to install it manually for whatever reason (developer OSX machines are a common use case), do so as follows:
+
+1. Download the latest LZO release from http://www.oberhumer.com/opensource/lzo/
+1. Configure LZO to build a shared library (required) and use a package-specific prefix (optional but recommended): `./configure --enable-shared --prefix /usr/local/lzo-2.06`
+1. Build and install LZO: `make && sudo make install`
+
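The three steps above can be sketched as a single shell session. The version (2.06) and install prefix are just the examples used elsewhere in this README; substitute whichever release you actually download:

```shell
# Download, build, and install LZO from source.
# Version 2.06 is assumed here; check http://www.oberhumer.com/opensource/lzo/
# for the latest release.
wget http://www.oberhumer.com/opensource/lzo/download/lzo-2.06.tar.gz
tar -xzf lzo-2.06.tar.gz
cd lzo-2.06
# --enable-shared is required; the versioned prefix keeps the install self-contained.
./configure --enable-shared --prefix=/usr/local/lzo-2.06
make
sudo make install
```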
+Now let's build hadoop-lzo:
+
+ JAVA_HOME=/Library/Java/JavaVirtualMachines/1.6.0_29-b11-402.jdk/Contents/Home \
+ C_INCLUDE_PATH=/usr/local/lzo-2.06/include \
+ LIBRARY_PATH=/usr/local/lzo-2.06/lib \
+ ant clean test
Once the libs are built and installed, you may want to add them to the class paths and library paths. That is, in hadoop-env.sh, set
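As a sketch, the hadoop-env.sh additions might look like the following. The jar name and native-library paths here are purely illustrative placeholders; use the actual paths produced by your build:

```shell
# hadoop-env.sh (sketch): point Hadoop at the hadoop-lzo jar and native libs.
# Both paths below are placeholders, not real build output names.
export HADOOP_CLASSPATH=/path/to/your/hadoop-lzo-lib.jar
export JAVA_LIBRARY_PATH=/path/to/hadoop-lzo-native-libs:/path/to/standard-hadoop-native-libs
```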
@@ -33,6 +46,41 @@ Note that there seems to be a bug in /path/to/hadoop/bin/hadoop; comment out the
because it keeps Hadoop from keeping the alteration you made to JAVA_LIBRARY_PATH above. (Update: see [https://issues.apache.org/jira/browse/HADOOP-6453](https://issues.apache.org/jira/browse/HADOOP-6453)). Make sure you restart your jobtrackers and tasktrackers after uploading and changing configs so that they take effect.
+### Build Troubleshooting
+
+The following missing LZO header error suggests LZO was installed in a non-standard location and
+cannot be found at build time. Double-check that the environment variable C_INCLUDE_PATH is set to the
+LZO include directory, for example `C_INCLUDE_PATH=/usr/local/lzo-2.06/include`:
+
+ [exec] checking lzo/lzo2a.h presence... no
+ [exec] checking for lzo/lzo2a.h... no
+ [exec] configure: error: lzo headers were not found...
+ [exec] gpl-compression library needs lzo to build.
+ [exec] Please install the requisite lzo development package.
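A quick way to confirm the header path before re-running the build is to test for the header file directly. The path below assumes the `/usr/local/lzo-2.06` prefix used earlier; adjust it to your install:

```shell
# Verify that C_INCLUDE_PATH actually contains the LZO headers.
C_INCLUDE_PATH=/usr/local/lzo-2.06/include   # assumed prefix; adjust to your install
if [ -f "$C_INCLUDE_PATH/lzo/lzo2a.h" ]; then
    echo "lzo headers found"
else
    echo "lzo headers missing"
fi
```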
+
+The following `Can't find library for '-llzo2'` error suggests LZO was installed to a non-standard location and cannot be located at build time. This could be one of two issues:
+
+1. LZO was not built as a shared library. Double-check that the location where you installed LZO contains shared libraries (probably something like `/usr/lib64/liblzo2.so.2` on Linux, or `/usr/local/lzo-2.06/lib/liblzo2.dylib` on OSX).
+1. LZO was not added to the library path. Double-check that the environment variable LIBRARY_PATH points to the LZO `lib` directory (for example `LIBRARY_PATH=/usr/local/lzo-2.06/lib`).
+
+ [exec] checking lzo/lzo2a.h usability... yes
+ [exec] checking lzo/lzo2a.h presence... yes
+ [exec] checking for lzo/lzo2a.h... yes
+ [exec] checking Checking for the 'actual' dynamic-library for '-llzo2'... configure: error: Can't find library for '-llzo2'
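Similarly, you can confirm that a shared LZO library exists under LIBRARY_PATH before rebuilding. The filename differs by platform, as noted above, so the check below just globs for any `liblzo2` variant:

```shell
# Verify that LIBRARY_PATH contains a shared LZO library (.so on Linux, .dylib on OSX).
LIBRARY_PATH=/usr/local/lzo-2.06/lib   # assumed prefix; adjust to your install
if ls "$LIBRARY_PATH"/liblzo2.* >/dev/null 2>&1; then
    echo "lzo shared library found"
else
    echo "lzo shared library missing"
fi
```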
+
+The following "Native java headers not found" error indicates that the Java header files are not available:
+
+ [exec] checking jni.h presence... no
+ [exec] checking for jni.h... no
+ [exec] configure: error: Native java headers not found. Is $JAVA_HOME set correctly?
+
+Header files are not included in every Java install. Double-check that the JAVA_HOME you are using has an `include` directory. On OSX you may need to install a developer Java package.
+
+ $ ls -d /Library/Java/JavaVirtualMachines/1.6.0_29-b11-402.jdk/Contents/Home/include
+ /Library/Java/JavaVirtualMachines/1.6.0_29-b11-402.jdk/Contents/Home/include
+ $ ls -d /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/include
+ ls: /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/include: No such file or directory
+
### Using Hadoop and LZO
#### Reading and Writing LZO Data
