Spring for Apache Hadoop is a framework for application developers to take advantage of the features of both Hadoop and Spring.


The Spring for Apache Hadoop project provides extensions to Spring, Spring Batch, and Spring Integration to build manageable and robust pipeline solutions around Hadoop.

Spring for Apache Hadoop extends Spring Batch by providing support for reading from and writing to HDFS, running various types of Hadoop jobs (Java MapReduce, Streaming, Hive, Pig), and working with HBase. An important goal is to provide excellent support for non-Java developers so they can be productive with Spring for Apache Hadoop without having to write any Java code to use the core feature set.

Spring for Apache Hadoop also applies the familiar Spring programming model to Java MapReduce jobs by providing support for dependency injection of simple jobs as well as a POJO-based MapReduce programming model that decouples your MapReduce classes from Hadoop-specific details such as base classes and data types.

Docs

You can find out more details from the user documentation or by browsing the javadocs. If you have ideas about how to improve or extend the scope, please feel free to contribute.

Artifacts

For build dependencies to use in your own projects, see our Quick Start page.
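
For reference, a dependency declaration in a Gradle build typically looks like the sketch below. The group and artifact ids are the project's published coordinates; the version is only a placeholder, so check the Quick Start page for the current release and for any additional repositories you may need.

dependencies {
    // the version shown is only a placeholder; substitute the current Spring for Apache Hadoop release
    compile "org.springframework.data:spring-data-hadoop:2.4.0.RELEASE"
}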

Building

Spring for Apache Hadoop uses Gradle as its build system. To build the project, simply run:

gradlew

from the project root folder. This will compile the sources, run the tests, and create the artifacts. Note that by default the tests try to access a single-node Hadoop cluster running on localhost.

Supported distros

By default Spring for Apache Hadoop compiles against the Apache Hadoop 2.7.x stable release (hadoop27).

The following distros and versions are supported:

  • Apache Hadoop 2.7.x (hadoop27) default
  • Apache Hadoop 2.6.x (hadoop26)
  • Pivotal HD 3.0 (phd30)
  • Cloudera CDH5 (cdh5)
  • Hortonworks HDP 2.3 (hdp23)

For anyone using older distros and versions, we recommend using either one of these:

To compile against a specific distro version, pass the -Pdistro=<label> project property, like so:

gradlew -Pdistro=hadoop26 build

Note that the chosen distro is displayed on the screen:

Using Apache Hadoop 2.6.x [2.6.0]

In this case, the specified Hadoop distribution (Apache Hadoop 2.6.x in the output above) is used to create the project binaries.

CI Builds

The results for CI builds are available at Spring Data Hadoop: Project Summary - Spring CI

Testing

For its testing, Spring for Apache Hadoop expects a pseudo-distributed/local Hadoop installation available on localhost, with HDFS configured on port 8020. The local Hadoop setup allows the project classpath to be automatically used by the Hadoop job tracker. These settings can be customized in two ways:

  • Build properties

From the command line, override the defaults with hd.fs for the file system (to avoid confusion, include the scheme, such as 'hdfs://' or 's3://'; if none is specified, hdfs:// is used), hd.rm for the YARN resource manager, hd.jh for the job history server, and hd.hive for the Hive host/port information. For example, to run against HDFS at dumbo:8020 one would use the command below; a combined example using several of these properties follows this list.

gradlew -Phd.fs=hdfs://dumbo:8020 build

  • Properties file

Through the test.properties file under the src/test/resources folder (further tweaks can be applied through the hadoop-ctx.xml file under src/test/resources/org/springframework/data/hadoop).
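
As the combined example of the build properties mentioned above, the following invocation points the tests at a hypothetical cluster host named dumbo, overriding the file system, resource manager and job history locations (the host name is made up, and 8032 and 10020 are simply the common YARN resource manager and job history server ports; use whatever your cluster actually exposes):

gradlew -Phd.fs=hdfs://dumbo:8020 -Phd.rm=dumbo:8032 -Phd.jh=dumbo:10020 build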

Enabling HBase/Hive/Pig/WebHdfs Tests

Note that by default only the vanilla Hadoop tests run; you can enable additional tests (such as Hive or Pig) by adding the tasks enableHBaseTests, enableHiveTests, enablePigTests or enableWebHdfsTests (or enableAllTests for all of them), as shown below. Use the test.properties file to customize the default locations for these services as well.
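
For example, to run the build with the Hive tests enabled alongside the vanilla Hadoop tests, add the corresponding task to the invocation:

gradlew enableHiveTests build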

Disabling test execution

You can disable all tests by skipping the test task:

gradlew -x test
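
For instance, to clean, compile and package the artifacts without running any tests (clean and build are the standard Gradle tasks; adjust the task list to your needs):

gradlew clean build -x test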

Contributing

Here are some ways for you to get involved in the community:

  • Get involved with the Spring community on Stack Overflow, using the spring-data-hadoop tag to post and answer questions.
  • Create JIRA tickets for bugs and new features and comment and vote on the ones that you are interested in.
  • Watch for upcoming articles on Spring by subscribing to the Spring Blog.

GitHub is for social coding: if you want to write code, we encourage contributions through pull requests from forks of this repository. If you want to contribute code this way, please read the Spring Framework contributor guidelines.

Code of Conduct

This project adheres to the Contributor Covenant code of conduct. By participating, you are expected to uphold this code. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.

Staying in touch

Follow the project team (Mark, Thomas or Janne) on Twitter.

In-depth articles can be found at the Spring blog, and releases are announced via our news feed.