Commit

update documentation
jetoile committed Nov 2, 2017
1 parent 26ea0f9 commit d5265a8
Showing 5 changed files with 15 additions and 2 deletions.
5 changes: 4 additions & 1 deletion hadoop-unit-site/src/site/markdown/cli.md
@@ -19,7 +19,7 @@ Hadoop-unit can be used with common tools such as:
hdfs dfs -ls hdfs://localhost:20112/
```

**For Windows users, you may hit errors like `-classpath is not known`. These errors occur because your `JAVA_HOME` contains a space. If your `JAVA_HOME` points to `C:\Program Files\Java\...`, declare it as `C:\Progra~1\Java\...` instead.**

<div id="kafka-console-command"/>
# Kafka-console command
@@ -30,12 +30,14 @@ hdfs dfs -ls hdfs://localhost:20112/
```bash
kafka-console-consumer --zookeeper localhost:22010 --topic topic
```

<div id="hbase-shell"/>
# HBase Shell

* Download and unzip HBase
* Set the variable `HBASE_HOME`
* Edit the file `HBASE_HOME/conf/hbase-site.xml`:

```xml
<configuration>
<property>
@@ -54,6 +56,7 @@ kafka-console-consumer --zookeeper localhost:22010 --topic topic
```bash
hbase shell
```

<div id="hive-shell"/>
# Hive Shell

3 changes: 3 additions & 0 deletions hadoop-unit-site/src/site/markdown/focus.md
@@ -8,6 +8,7 @@
# Focus on ElasticSearch

Under the hood, to run ElasticSearch, this is how it works:

* ElasticSearch is downloaded
* A Java Process is run to start ElasticSearch
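The two steps above can be sketched as follows. This is a minimal Python sketch of the download-then-spawn pattern, not Hadoop Unit's actual implementation (which is written in Java); the function names, paths, and launch script are assumptions for illustration:

```python
import subprocess
import tarfile
import urllib.request
from pathlib import Path

def archive_name(url: str) -> str:
    """Derive the local archive file name from the download URL."""
    return url.rsplit("/", 1)[-1]

def download_and_start(url: str, workdir: str, launch_script: str) -> subprocess.Popen:
    """Download a component archive, unpack it, and start it as a child process.

    Mirrors the two steps described above; all names are illustrative.
    """
    work = Path(workdir)
    work.mkdir(parents=True, exist_ok=True)
    archive = work / archive_name(url)
    if not archive.exists():
        urllib.request.urlretrieve(url, archive)  # step 1: download the distribution
    with tarfile.open(archive) as tar:
        tar.extractall(work)                      # unpack next to the archive
    # step 2: spawn the component as a separate child process
    return subprocess.Popen([str(work / launch_script)])
```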

@@ -19,6 +20,7 @@ To know which version of ElasticSearch has to be downloaded, the variable ```ela
# Focus on Redis

Under the hood, to run Redis, this is how it works:

* Redis is downloaded
* An ant task is run to execute the ```make``` command
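The ant step can look like the fragment below. This is a hypothetical build fragment, not Hadoop Unit's actual build file; the property name `redis.src.dir` is an assumption:

```xml
<!-- Hypothetical ant target: run `make` in the unpacked Redis sources -->
<target name="build-redis">
  <exec executable="make" dir="${redis.src.dir}" failonerror="true"/>
</target>
```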

@@ -32,5 +34,6 @@ To know which version of Redis has to be downloaded, the variable ```redis.versi
# Focus on Oozie

To use oozie, you need:

* to download the [oozie's share libs](http://s3.amazonaws.com/public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.1.0/tars/oozie/oozie-4.2.0.2.6.1.0-129-distro.tar.gz)
* to edit the configuration file ```conf/hadoop-unit-default.properties``` and set the variable ```oozie.sharelib.path``` to the directory where you downloaded oozie's share libs
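In ```conf/hadoop-unit-default.properties```, the setting looks like this (the path below is a placeholder; point it to wherever you extracted the archive):

```
# hypothetical location; set this to your extracted share libs
oozie.sharelib.path=/opt/oozie-sharelib
```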
2 changes: 2 additions & 0 deletions hadoop-unit-site/src/site/markdown/howto-build.md
@@ -1,10 +1,12 @@
# Hadoop Unit build

To build Hadoop Unit, you need:

* jdk 1.8
* maven 3.0+

Run:

```bash
mvn install -DskipTests
```
1 change: 1 addition & 0 deletions hadoop-unit-site/src/site/markdown/index.md
@@ -45,6 +45,7 @@ Welcome to the Hadoop Unit wiki!
* [Why Hadoop Unit](why-hadoop-unit.html)

* Installation

* [Install the standalone Hadoop Unit mode](install-hadoop-unit-standalone.html)
* [Integrate Hadoop Unit in your maven project](maven-usage.html)

6 changes: 5 additions & 1 deletion hadoop-unit-site/src/site/markdown/why-hadoop-unit.md
@@ -14,9 +14,13 @@ Moreover, Hadoop Unit allow users to work on their laptop which bring a better f

As Hadoop's components have not been designed to run in the same JVM, a lot of problems occur.

In fact, Hadoop Unit runs each component in its own classloader to avoid classpath issues (for the standalone mode and for the maven plugin in embedded mode). To do this, it uses maven resolver, which is the dependency engine of maven, and hadoop mini cluster, which makes integration tests for hortonworks components much easier.

In fact, hadoop mini cluster relies on components such as MiniDFS and LocalOozie, which are available in the Hadoop ecosystem.

For the mode [Simple dependency usage](maven-usage.html#simple-dependency-usage), a *service provider interface* is used under the hood. This is why, if you need hbase for example, you do not need to add zookeeper to your test dependencies: maven's dependency transitivity takes care of it.
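For example, with the simple dependency usage, a single test dependency can be enough. A hypothetical pom fragment follows; the groupId, artifactId, and version are illustrative, see the maven-usage page for the real coordinates:

```xml
<dependency>
    <groupId>fr.jetoile.hadoop</groupId>
    <artifactId>hadoop-unit-hbase</artifactId>
    <version>x.y</version>
    <scope>test</scope>
</dependency>
<!-- zookeeper comes in transitively: no extra dependency is needed -->
```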

Moreover, Hadoop Unit manages the right bootstrap order of the components for you.
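Conceptually, this ordering is a topological sort over component dependencies. A minimal sketch follows; the component names and the dependency map are illustrative, not Hadoop Unit's actual code:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Illustrative dependency map: each component lists what must be started first.
DEPENDENCIES = {
    "zookeeper": [],
    "hdfs": [],
    "kafka": ["zookeeper"],
    "hbase": ["zookeeper", "hdfs"],
}

def bootstrap_order(deps: dict) -> list:
    """Return a start order in which every dependency precedes its dependents."""
    return list(TopologicalSorter(deps).static_order())
```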
