Hi everyone,
thanks for a great piece of software.
I am running Cloudera CDH 4.3 with Hive 0.10 on a three-node cluster.
I have created Hive "reading" and "writing" tables pointing to my Elasticsearch server.
Reading works fine, but when I try to push data as per your example:
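For context, the writing table was defined roughly along these lines (a sketch only: the column names and the index/type in es.resource are placeholders, and the storage-handler class name assumes the elasticsearch-hadoop release current at the time):

```sql
-- Hypothetical DDL for the ES-backed "writing" table; names are placeholders.
CREATE EXTERNAL TABLE es_tasks_write (
    done STRING,
    title STRING,
    owner STRING)
STORED BY 'org.elasticsearch.hadoop.hive.ESStorageHandler'
TBLPROPERTIES('es.resource' = 'tasks/task');
```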
INSERT OVERWRITE TABLE es_tasks_write select "false","titi","tata" from sample_table limit 10;
I get an error message:
2013-07-12 16:31:33,709 FATAL ExecReducer: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{},"value":{"_col0":"false","_col1":"titi","_col2":"tata"},"alias":0}
at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:258)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:506)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:447)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct cannot be cast to [Ljava.lang.Object;
at org.elasticsearch.hadoop.hive.HiveValueWriter.write(HiveValueWriter.java:155)
at org.elasticsearch.hadoop.hive.HiveValueWriter.write(HiveValueWriter.java:56)
at org.elasticsearch.hadoop.hive.HiveValueWriter.write(HiveValueWriter.java:40)
at org.elasticsearch.hadoop.serialization.ContentBuilder.value(ContentBuilder.java:242)
Could you please post your ES mapping and your Hive script (table creation and query)? It would help in reproducing the issue accurately.
Also, are there any particular configuration changes to Hive worth mentioning?