Example source code accompanying O'Reilly's "Hadoop: The Definitive Guide" by Tom White
Repository layout:

  app3/src/main/sh
  book
  ch02
  ch03
  ch04-avro
  ch04
  ch05
  ch07
  ch08
  ch09/src/main
  ch11
  ch12
  ch13
  ch14
  ch15
  ch16
  common
  experimental
  hadoop-examples
  hadoop-meta
  input
  snippet
  .gitignore
  README
  pom.xml

README

Example code for "Hadoop: The Definitive Guide, Third Edition" by Tom White.
Copyright (C) 2011 Tom White, 978-1-449-31152-0

http://www.hadoopbook.com/
http://oreilly.com/catalog/9781449311520/

The code is hosted at http://github.com/tomwhite/hadoop-book/. You can find code
for the first edition at http://github.com/tomwhite/hadoop-book/tree/1e, and
for the second edition at http://github.com/tomwhite/hadoop-book/tree/2e.

This version of the code has been tested with:
 * Hadoop 1.2.1/0.22.0/0.23.x/2.2.0
 * Avro 1.5.4
 * Pig 0.9.1
 * Hive 0.8.0
 * HBase 0.90.4/0.94.15
 * ZooKeeper 3.4.2
 * Sqoop 1.4.0-incubating
 * MRUnit 0.8.0-incubating

Before running the examples, you need to install Hadoop, Pig, Hive, HBase,
ZooKeeper, and Sqoop (as appropriate), as explained in the book.

You also need to install Maven.
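
As a quick sanity check that the prerequisites are on your path before
building, you can print their versions (the exact output depends on your
installation):

% hadoop version
% mvn -version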

Then you can build the code with:

% mvn package -DskipTests

By default Hadoop 1.2.1 is used. This can be changed by specifying the
hadoop.version property, e.g.

% mvn package -DskipTests -Dhadoop.version=1.2.0

There are profiles for different Hadoop major versions and distributions,
defined in hadoop-meta/pom.xml; you can select one with the hadoop.distro
property. For example, to use the default version of Hadoop 2:

% mvn package -DskipTests -Dhadoop.distro=apache-2
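
If you are unsure which profiles (and hence which hadoop.distro values) are
defined, Maven's standard help plugin can list them; this is a general Maven
facility rather than anything specific to this project. For example, from the
hadoop-meta directory:

% mvn help:all-profiles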

Again, you can specify hadoop.version to use a particular Hadoop 2 version:

% mvn package -DskipTests -Dhadoop.distro=apache-2 -Dhadoop.version=2.1.1-beta

You should then be able to run the examples from the book.
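For instance, the MaxTemperature example from Chapter 2 can be run in
standalone mode roughly as follows (a sketch, assuming a local Hadoop
installation and that HADOOP_CLASSPATH points at the examples jar produced by
the build; the jar's location may differ in your setup):

% export HADOOP_CLASSPATH=hadoop-examples.jar
% hadoop MaxTemperature input/ncdc/sample.txt output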

Chapter names for "Hadoop: The Definitive Guide", Third Edition

ch01 - Meet Hadoop
ch02 - MapReduce
ch03 - The Hadoop Distributed Filesystem
ch04 - Hadoop I/O
ch05 - Developing a MapReduce Application
ch06 - How MapReduce Works
ch07 - MapReduce Types and Formats
ch08 - MapReduce Features
ch09 - Setting Up a Hadoop Cluster
ch10 - Administering Hadoop
ch11 - Pig
ch12 - Hive
ch13 - HBase
ch14 - ZooKeeper
ch15 - Sqoop
ch16 - Case Studies

app1 - Installing Apache Hadoop
app2 - Cloudera's Distribution for Hadoop
app3 - Preparing the NCDC Weather Data