
[Documentation] newHadoopRDD not a member of SparkContext #390

Closed
eliasah opened this issue Mar 11, 2015 · 3 comments

eliasah (Contributor) commented Mar 11, 2015

Following the documentation section about the Apache Spark support, I have noticed the following when defining the RDD that reads data from Elasticsearch:

val esRDD = sc.newHadoopRDD(conf, classOf[EsInputFormat[Text, MapWritable]],     
                              classOf[Text], classOf[MapWritable]))

Besides the extra parenthesis at the end, newHadoopRDD is not a member of SparkContext. So how do we define the RDD using the new Hadoop API?

Thanks! :)

MLnick added a commit to MLnick/elasticsearch-hadoop that referenced this issue Mar 23, 2015
MLnick (Contributor) commented Mar 23, 2015

It is sc.newAPIHadoopRDD(...). I have submitted a PR updating the docs.
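
For anyone landing here, a minimal sketch of the corrected call. The Configuration setup and the es.nodes/es.resource values are illustrative assumptions (adjust them to your cluster), not taken from the docs:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.{MapWritable, Text}
import org.elasticsearch.hadoop.mr.EsInputFormat

// Hadoop configuration carrying the es-hadoop connection settings
// (placeholder values: point es.nodes and es.resource at your own cluster/index)
val conf = new Configuration()
conf.set("es.nodes", "localhost:9200")
conf.set("es.resource", "index/type")

// newAPIHadoopRDD is the SparkContext method for the new Hadoop API
// (org.apache.hadoop.mapreduce); the key/value classes match EsInputFormat
val esRDD = sc.newAPIHadoopRDD(conf, classOf[EsInputFormat[Text, MapWritable]],
                               classOf[Text], classOf[MapWritable])

Each element of esRDD is a (Text, MapWritable) pair holding the document id and its source, so for example esRDD.first() returns one such pair.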

eliasah (Contributor, Author) commented Mar 23, 2015

OK, thanks! Then we should close the issue once the PR has been accepted.

costin pushed a commit that referenced this issue Mar 30, 2015
MLnick (Contributor) commented Mar 31, 2015

@eliasah can close this now I think :)

eliasah closed this as completed Mar 31, 2015