making the examples preformatted #2

Merged
merged 1 commit into from almost 2 years ago

2 participants

Yanick Champoux Andrew Grangaard
Yanick Champoux
yanick commented July 07, 2012

add CRs in the doc so that the examples are seen by POD as preformatted.
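For context on the fix: POD treats a paragraph as verbatim (preformatted) only when it begins with whitespace, and paragraphs must be separated by blank lines. Without a blank line before them, the indented example lines are folded into the preceding ordinary paragraph and reflowed by POD formatters. The added blank lines make each example a standalone verbatim paragraph, along these lines:

```pod
example mapper input:

    line1
    line2
    line3
```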

Andrew Grangaard spazm merged commit ce87eec into from July 09, 2012
Andrew Grangaard spazm closed this July 09, 2012
Andrew Grangaard
Owner

finally pushed the updated changes to cpan, Hadoop-Streaming-0.122420.

I have no excuse for taking this long; total steps involved:

    git pull
    dzil release
Yanick Champoux

\o/. Thanks! Both for the merge and the very handy module. :-)


Showing 1 unique commit by 1 author.

Jul 07, 2012
Yanick Champoux making the examples preformatted 34ec156

Showing 1 changed file with 5 additions and 0 deletions.

5  lib/Hadoop/Streaming.pm
@@ -89,11 +89,13 @@ Reduce jobs are provided a stream of key\tvalue lines.  multivalued keys appear
 Hadoop::Mapper consumes and chomps lines from STDIN and calls map($line) once per line.  This is initiated by the run() method.
 
 example mapper input:
+
     line1
     line2
     line3
 
 Hadoop::Mapper transforms this into 3 calls to map()
+
     map(line1)
     map(line2)
     map(line3)
@@ -103,6 +105,7 @@ Hadoop::Mapper transforms this into 3 calls to map()
 Hadoop::Reducer abstracts this stream into an interface of (key, value-iterator).  reduce() is called once per key, instead of once per line.  The reduce job pulls values from the iterator and outputs key/value pairs to STDOUT.  emit() is provided as a convenience for outputing key/value pairs.
 
 example reducer input:
+
     key1 value1
     key2 valuea
     key2 valuec
@@ -111,6 +114,7 @@ example reducer input:
     key3 valuebar
 
 Hadoop::Streaming::Reduce transforms this input into three calls to reduce():
+
     reduce( key,  iterator_over(qw(value1)) );
     reduce( key2, iterator_over(qw(valuea valuec valueb)) );
     reduce( key3, iterator_over(qw(valuefoo valuebarr)) );
@@ -118,6 +122,7 @@ Hadoop::Streaming::Reduce transforms this input into three calls to reduce():
 =item Hadoop::Streaming::Combiner interface
 
 The Hadoop::Streaming::Combiner interface is analagous to the Hadoop::Streaming::Reducer interface.  combine() is called instead of reduce() for each key.  The above example would produce three calls to combine():
+
     combine( key,  iterator_over(qw(value1)) );
     combine( key2, iterator_over(qw(valuea valuec valueb)) );
     combine( key3, iterator_over(qw(valuefoo valuebarr)) );
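The key-grouping behaviour the documentation describes — reduce() invoked once per key with an iterator over that key's values, rather than once per input line — can be sketched outside the module. This is a hypothetical Python rendering of the streaming protocol (the `group_reduce_input` name is mine, not part of Hadoop::Streaming), under the standard assumption that Hadoop sorts mapper output by key before it reaches the reducer:

```python
import itertools

def group_reduce_input(lines):
    """Group sorted "key\\tvalue" lines into (key, iterator-of-values) pairs.

    Hadoop streaming sorts mapper output by key before the reducer sees
    it, so grouping consecutive equal keys (itertools.groupby) suffices.
    """
    parsed = (line.rstrip("\n").split("\t", 1) for line in lines)
    for key, pairs in itertools.groupby(parsed, key=lambda kv: kv[0]):
        # Yield a lazy iterator over this key's values, mirroring the
        # (key, value-iterator) interface described in the POD.
        yield key, (value for _, value in pairs)

# The reducer-input example from the diff, as raw streaming lines:
stream = [
    "key1\tvalue1\n",
    "key2\tvaluea\n",
    "key2\tvaluec\n",
    "key2\tvalueb\n",
    "key3\tvaluefoo\n",
    "key3\tvaluebar\n",
]

# One (key, values) pair per distinct key -- three "calls to reduce()":
grouped = [(key, list(values)) for key, values in group_reduce_input(stream)]
```

Each element of `grouped` corresponds to one reduce() call in the POD example, e.g. `("key2", ["valuea", "valuec", "valueb"])` matches `reduce( key2, iterator_over(qw(valuea valuec valueb)) )`.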