Spark SQL 1.3 - Exception in Elasticsearch when executing JOIN with DataFrame created from Oracle's table #449
Update: the same exception happens when joining two DataFrames that are both loaded from Elasticsearch, so it is not related to Oracle:

```scala
val hours = sqlContext.load("summary/hours", "org.elasticsearch.spark.sql")
hours.registerTempTable("HOURS_DF")
val days = sqlContext.load("summary/days", "org.elasticsearch.spark.sql")
days.registerTempTable("DAYS_DF")
val hoursAug = sqlContext.sql("SELECT H.Hour, D.Day " +
  "FROM HOURS_DF H, DAYS_DF D " +
  "WHERE H.User = D.User")
```
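For completeness, Spark plans the join lazily, so the serialization error only surfaces once an action materializes the result. A minimal sketch (assuming the `HOURS_DF` and `DAYS_DF` temp tables are registered as above; the final action is illustrative, not from the original report):

```scala
// Lazily planned join; nothing executes yet.
val hoursAug = sqlContext.sql(
  "SELECT H.Hour, D.Day FROM HOURS_DF H, DAYS_DF D WHERE H.User = D.User")

// An action such as collect() forces the shuffle, which is where
// Kryo serialization of the rows happens and the exception appears.
hoursAug.collect().foreach(println)
```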
Hi Costin, thanks a lot for the fix.
Better control over the internal data structure in EsScalaRow to help Kryo serialize/deserialize the data relates #449
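As background, a join shuffles rows between executors, which exercises the configured serializer. A hedged sketch of how Kryo is typically enabled and application classes registered in Spark 1.3 (the `Summary` case class and app name are hypothetical, purely for illustration):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical row type standing in for the data being shuffled.
case class Summary(user: String, hour: String)

val conf = new SparkConf()
  .setAppName("es-join-example") // assumed name
  // Switch from the default Java serializer to Kryo.
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Registering classes lets Kryo write compact class ids
  // instead of full class names.
  .registerKryoClasses(Array(classOf[Summary]))

val sc = new SparkContext(conf)
```

The fix referenced above adjusts the connector's internal row representation so that Kryo can round-trip it reliably; no user-side registration of connector classes is required.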
@FDmitriy I have tried to reproduce the error, to no avail (with a local, a locally-remote, and a fully remote Spark cluster). I even called the Kryo serialization directly with a
Thanks,
Hi Costin, I tried it again with the 2.1.0.rc1 and development snapshot versions, and the issue did not occur.
Thanks for confirming. Cheers!
Hi,
Environment: Spark 1.3.0
Elasticsearch: 1.4.4
Elasticsearch-Hadoop: 2.1.0.Beta4
Oracle Express 11g XE
I am trying to run a join between two DataFrames, one loaded from Elasticsearch and the other from Oracle, and I get the exception below.
This issue only happens when loading the Elasticsearch data using the Spark SQL "load" function. It doesn't happen when loading via the esDF function. However, when using esDF, the mapping of values to column names is broken (issue #451).
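To make the two loading paths concrete, here is a hedged sketch of both (the `summary/hours` index/type is taken from the update above; behavior described is as reported, not verified here):

```scala
import org.apache.spark.sql.SQLContext
// The connector's implicits add esDF to SQLContext.
import org.elasticsearch.spark.sql._

def loadBothWays(sqlContext: SQLContext) = {
  // Path 1: generic Spark SQL data-source API.
  // Reportedly fails later with a Kryo exception when joined.
  val viaLoad = sqlContext.load("summary/hours", "org.elasticsearch.spark.sql")

  // Path 2: connector-specific helper.
  // Reportedly joins fine but mixes up column names (#451).
  val viaEsDF = sqlContext.esDF("summary/hours")

  (viaLoad, viaEsDF)
}
```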
Code:
The elements of the Elasticsearch and Oracle DataFrames are printed successfully to the screen (the first two foreach{...} calls), and the exception happens when the join is executed.
Can you please advise if this is a bug and if yes, is there any workaround?
Thanks,
Dmitriy Fingerman