From 192878c05d73ccb963c8fe97ff595a4244c2f361 Mon Sep 17 00:00:00 2001
From: Seshadri Chakkravarthy
Date: Tue, 30 May 2017 13:54:03 -0700
Subject: [PATCH] Removed valueTranslation part

---
 src/documentation/io/built-in-hadoop.md | 24 ++++--------------------
 1 file changed, 4 insertions(+), 20 deletions(-)

diff --git a/src/documentation/io/built-in-hadoop.md b/src/documentation/io/built-in-hadoop.md
index 90602578b87..722facb5f2f 100644
--- a/src/documentation/io/built-in-hadoop.md
+++ b/src/documentation/io/built-in-hadoop.md
@@ -217,28 +217,12 @@ org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(hcatConf, "my_databa
 Call Read transform as follows:
 
 ```java
-PCollection<KV<Long, String>> hcatData =
+PCollection<KV<Long, HCatRecord>> hcatData =
   p.apply("read",
-  HadoopInputFormatIO.<Long, String>read()
-  .withConfiguration(hcatConf)
-  .withValueTranslation(hcatOutputValueType);
+  HadoopInputFormatIO.<Long, HCatRecord>read()
+  .withConfiguration(hcatConf);
 ```
 
 ```py
   # The Beam SDK for Python does not support Hadoop InputFormat IO.
-```
-
-The `HCatInputFormat` key class is `java.lang.Long` `Long`, which has a Beam `Coder`. The `HCatInputFormat` value class is `org.apache.hive.hcatalog.data.HCatRecord` `HCatRecord`, which does not have a Beam `Coder`. Rather than write a new coder, you can provide your own translation method, as follows:
-
-```java
-SimpleFunction<HCatRecord, String> hcatOutputValueType = SimpleFunction<HCatRecord, String>()
-{
-  public String apply(HCatRecord record) {
-    return record.toString();
-  }
-};
-```
-
-```py
-  # The Beam SDK for Python does not support Hadoop InputFormat IO.
-```
+```
\ No newline at end of file