Update docstrings for Hadoop 0.20 changes

Hadoop's 0.20 API replaced OutputCollector with Context, and JobConf with Job.
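
For orientation, a minimal sketch of the emit call under the 0.20 API (the names here are illustrative, not taken from this commit):

    (import '(org.apache.hadoop.io Text LongWritable))

    ;; Under the 0.20 mapreduce API, the Context passed to map/reduce
    ;; replaces the old OutputCollector, so instead of
    ;;   (.collect collector (Text. word) (LongWritable. 1))
    ;; a map task emits with:
    (defn emit-word [context ^String word]
      (.write context (Text. word) (LongWritable. 1)))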
Commit edc9326b2a48f22a2953d13384bffc44dfceb8ee (parent b242d6a), committed by @davelambert
Showing with 6 additions and 8 deletions.
  1. +3 −5 src/clojure_hadoop/wrap.clj
  2. +3 −3 test/clojure_hadoop/examples/wordcount1.clj
src/clojure_hadoop/wrap.clj
@@ -30,7 +30,7 @@ Clojure jobs."
(fn [] (map (fn [^Writable v] (read-string (.toString v))) wvalues))])
(defn clojure-writer
- "Sends key and value to the OutputCollector by calling pr-str on key
+ "Sends key and value to the Context by calling pr-str on key
and value and wrapping them in Hadoop Text objects."
[^TaskInputOutputContext context key value]
(binding [*print-dup* true]
@@ -50,8 +50,7 @@ Clojure jobs."
Hadoop and returns a [key value] pair for f.
writer is a function that receives each [key value] pair returned by
- f and sends the appropriately-type arguments to the Hadoop
- OutputCollector.
+ f and sends the appropriately-typed arguments to the Hadoop Context.
If not given, reader and writer default to clojure-map-reader and
clojure-writer, respectively."
@@ -79,8 +78,7 @@ Clojure jobs."
Hadoop and returns a [key values-function] pair for f.
writer is a function that receives each [key value] pair returned by
- f and sends the appropriately-typed arguments to the Hadoop
- OutputCollector.
+ f and sends the appropriately-typed arguments to the Hadoop Context.
If not given, reader and writer default to clojure-reduce-reader and
clojure-writer, respectively."
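
For reference, a sketch of the behavior the updated clojure-writer docstring describes, assuming the body elided from the hunks above pr-str's each argument, wraps it in Text, and emits through the Context:

    (import '(org.apache.hadoop.io Text)
            '(org.apache.hadoop.mapreduce TaskInputOutputContext))

    ;; Sketch only; the real body is cut off in this diff.
    (defn clojure-writer-sketch
      [^TaskInputOutputContext context key value]
      (binding [*print-dup* true]
        (.write context (Text. (pr-str key)) (Text. (pr-str value)))))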
test/clojure_hadoop/examples/wordcount1.clj
@@ -41,8 +41,8 @@
"This is our implementation of the Mapper.map method. The key and
value arguments are sub-classes of Hadoop's Writable interface, so
we have to convert them to strings or some other type before we can
- use them. Likewise, we have to call the OutputCollector.collect
- method with objects that are sub-classes of Writable."
+ use them. Likewise, we have to call the Context.write method with
+ objects that are sub-classes of Writable."
[this key value ^MapContext context]
(doseq [word (enumeration-seq (StringTokenizer. (str value)))]
(.write context (Text. word) (LongWritable. 1))))
@@ -66,7 +66,7 @@
(defn tool-run
"This is our implementation of the Tool.run method. args are the
command-line arguments as a Java array of strings. We have to
- create a JobConf object, set all the MapReduce job parameters, then
+ create a Job object, set all the MapReduce job parameters, then
call the JobClient.runJob static method on it.
This method must return zero on success or Hadoop will report that
the job failed.
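
As a hedged sketch of the Job-based flow this docstring now points at (with 0.20's Job one typically calls waitForCompletion rather than the old JobClient.runJob; the job name and paths below are illustrative):

    (import '(org.apache.hadoop.conf Configuration)
            '(org.apache.hadoop.fs Path)
            '(org.apache.hadoop.io Text LongWritable)
            '(org.apache.hadoop.mapreduce Job)
            '(org.apache.hadoop.mapreduce.lib.input FileInputFormat)
            '(org.apache.hadoop.mapreduce.lib.output FileOutputFormat))

    (defn run-wordcount [in out]
      (let [job (Job. (Configuration.) "wordcount1")]
        ;; mapper/reducer classes omitted; a real job sets them here
        (doto job
          (.setOutputKeyClass Text)
          (.setOutputValueClass LongWritable))
        (FileInputFormat/addInputPath job (Path. in))
        (FileOutputFormat/setOutputPath job (Path. out))
        ;; Tool.run expects 0 on success, nonzero on failure
        (if (.waitForCompletion job true) 0 1)))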
