
Spark 2.1 support #915

Closed
acchen97 opened this issue Jan 11, 2017 · 7 comments

@acchen97
Contributor

This issue tracks adding support for Spark 2.1.

http://spark.apache.org/releases/spark-release-2-1-0.html

@jbaiera
Member

jbaiera commented Jan 17, 2017

The upgrade process didn't run into any issues. This should be easy to get in for 5.2.

@jbaiera jbaiera added the v5.2.0 label Jan 17, 2017
@pfcoperez
Contributor

pfcoperez commented Jan 20, 2017

@acchen97 @jbaiera I've run some simple examples and it seems to work nicely with Spark 2.1 in local mode.

Does this task cover thoroughly verifying the connector before Spark 2.1 is considered officially supported? Or is it just that a new release needs to be published against the Spark 2.1 dependencies (it currently fails in standalone mode because of serialization signature mismatches)?

If neither, which cases are failing on Spark 2.1?

Thanks

@acchen97
Contributor Author

@pfcoperez thanks for testing it. Yes, it's predominantly making sure everything works before we consider Spark 2.1 formally supported.

Let us know if anything else pops up for you.

@jbaiera jbaiera removed the v5.3.0 label Jan 31, 2017
@jbaiera
Member

jbaiera commented Jan 31, 2017

Released in 5.2.0

@jbaiera jbaiera closed this as completed Jan 31, 2017
@cjuexuan

Hi, I found that the latest elasticsearch-spark_2.11 version is 5.0.0-alpha4, which targets Spark 1.6, and that in elasticsearch-hadoop-5.2.0 the Scala version is 2.10. So I think we need an artifact built for Spark 2.x and Scala 2.11.

@jbaiera
Member

jbaiera commented Feb 23, 2017

@cjuexuan Please see the Spark section of the installation guide in the documentation. It outlines the correct artifacts to use with the different combinations of Spark and Scala versions.
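For reference, a Maven dependency for Spark 2.x with Scala 2.11 would look roughly like the snippet below. This is a sketch; double-check the artifact name and version against the installation guide for your exact Spark/Scala combination:

```xml
<!-- elasticsearch-hadoop connector for Spark 2.x, Scala 2.11 -->
<dependency>
  <groupId>org.elasticsearch</groupId>
  <artifactId>elasticsearch-spark-20_2.11</artifactId>
  <version>5.2.0</version>
</dependency>
```

Separate artifacts exist for Spark 1.3–1.6 (the `-13` variants) and for Scala 2.10 (the `_2.10` suffix), which is why matching the artifact to your cluster matters.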

@cjuexuan

@jbaiera, thanks!

jbaiera added a commit that referenced this issue May 8, 2017

4 participants