I have an ES-Hive table created using the statement below:
CREATE EXTERNAL TABLE stg.elastic_test (id int, name string)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES('es.resource' = 'work/test','es.nodes' = '#es_host:9200',
'es.field.read.empty.as.null' = 'yes','es.index.read.missing.as.empty'='yes');
When I insert data (only about 8 records) into this table, the insert completes successfully. But when I try to read the data with "select * from stg.elastic_test", it throws this error:
Failed with exception java.io.IOException:org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.IntWritable
With the same table, if I change id to string type and cast the values to string when inserting, it works fine. Can you tell me what the issue is here and how it can be fixed?
Thanks,
Nivetha
The error is caused by a type mismatch in your mapping: you are trying to read a long as an int. This typically occurs with automatic mapping and is explained in the docs (along with how to address it), namely here.
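One way to avoid the cast, assuming the `work/test` index was created by Elasticsearch's automatic mapping (which maps whole numbers to `long`), is to declare the column with Hive's 64-bit type so the Hive schema and the index mapping agree. A sketch of that variant of the table above, not the only possible fix:

```sql
-- Hypothetical variant of the original DDL: id is declared as bigint,
-- which matches the long type that Elasticsearch's automatic mapping
-- assigns to whole-number fields.
CREATE EXTERNAL TABLE stg.elastic_test (id bigint, name string)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES('es.resource' = 'work/test', 'es.nodes' = '#es_host:9200',
  'es.field.read.empty.as.null' = 'yes', 'es.index.read.missing.as.empty' = 'yes');
```

Alternatively, if `id` really should be an int on the Hive side, create the index with an explicit `integer` mapping for that field before the first insert, so automatic mapping never gets a chance to pick `long`.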