[Documentation] newHadoopRDD not a member of SparkContext #390
MLnick added a commit to MLnick/elasticsearch-hadoop that referenced this issue on Mar 23, 2015.
It is
Ok thanks! Then we should close the issue once the PR has been accepted.
costin pushed a commit that referenced this issue on Mar 30, 2015.
@eliasah can close this now I think :)
Following the documentation section about the Apache Spark support, I noticed the following when defining the context to read data from Elasticsearch:
Besides the extra parenthesis at the end, `newHadoopRDD` is not a member of `SparkContext`. So how do we define the RDD using the new Hadoop API?
Thanks! :)
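For reference, the `SparkContext` method that targets the new Hadoop MapReduce API is `newAPIHadoopRDD` (the old-API counterpart is `hadoopRDD`); `newHadoopRDD` does not exist, which is the documentation typo this issue reports. A minimal sketch of reading from Elasticsearch with the connector's `EsInputFormat` follows; the cluster address and the `radio/artists` index/type are placeholder values, not anything specified in this issue:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.{MapWritable, Text}
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.hadoop.mr.EsInputFormat

object EsReadSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("es-read"))

    // Hadoop configuration for the connector; values below are hypothetical
    val conf = new Configuration()
    conf.set("es.nodes", "localhost:9200")   // Elasticsearch node address
    conf.set("es.resource", "radio/artists") // index/type to read from

    // newAPIHadoopRDD (not newHadoopRDD) is the new-Hadoop-API entry point:
    // it takes the Configuration plus the InputFormat, key, and value classes
    val esRDD = sc.newAPIHadoopRDD(
      conf,
      classOf[EsInputFormat[Text, MapWritable]],
      classOf[Text],
      classOf[MapWritable])

    println(esRDD.count())
    sc.stop()
  }
}
```

Each record comes back as a `(Text, MapWritable)` pair: the document id and a writable map of its fields.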