
Not able to locate scala.XML while adding 2.1.0.Beta3 version in dependency #374

Closed
deepakas opened this issue Feb 7, 2015 · 7 comments

deepakas commented Feb 7, 2015

I was able to process XML in spark using scala.xml.XML class. For some reason once I add the elasticsearch-hadoop version 2.1.0.Beta3 in mvn as a dependency it is not able to recognize the scala.xml class in the project. Any thoughts ?

Error:(2, 14) object xml is not a member of package scala
import scala.xml.XML
^
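For reference, in Scala 2.11 the scala.xml package moved out of scala-library into the separate scala-xml module, so a build that ends up resolving Scala 2.11 artifacts needs it declared explicitly. A hedged pom.xml sketch (the version shown is illustrative):

```xml
<!-- scala-xml is a separate artifact from Scala 2.11 onwards;
     without it, "object xml is not a member of package scala" -->
<dependency>
  <groupId>org.scala-lang.modules</groupId>
  <artifactId>scala-xml_2.11</artifactId>
  <version>1.0.2</version>
</dependency>
```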


costin commented Feb 7, 2015

Sorry, no. Better ask Maven: run mvn dependency:tree and see what happens to scala.xml.XML


costin commented Feb 7, 2015

Just looked into the dependencies. Since scala.xml is part of the Scala distribution, most likely you are not adding scala/spark/es-hadoop to the compile scope and are using them only at runtime. In other words, you are telling Maven to use Scala when running your code but not when compiling it.
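A minimal pom.xml sketch of what compile scope looks like (the artifact version here is illustrative):

```xml
<!-- compile is the default scope; a runtime scope would make the
     artifact visible to the JVM at run time but hide it from scalac -->
<dependency>
  <groupId>org.elasticsearch</groupId>
  <artifactId>elasticsearch-hadoop</artifactId>
  <version>2.1.0.Beta3</version>
  <scope>compile</scope>
</dependency>
```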


deepakas commented Feb 7, 2015

Thanks Costin for checking. It finds the scala.xml package when I use 2.1.0.Beta1 but not with 2.1.0.Beta2 or 2.1.0.Beta3. When I use 2.1.0.Beta1 I am getting another error; I am looking into the dependency.

15/02/07 16:43:15 INFO Slf4jLogger: Slf4jLogger started
Exception in thread "main" java.lang.ClassNotFoundException:
com.google.protobuf_spark.GeneratedMessage



deepakas commented Feb 8, 2015

I was able to resolve the ClassNotFoundException by adding the exclusion below. I am using spark-core_2.11 with version 1.2.0.

15/02/07 16:43:15 INFO Slf4jLogger: Slf4jLogger started
Exception in thread "main" java.lang.ClassNotFoundException:
com.google.protobuf_spark.GeneratedMessage

    <dependency>
      <groupId>org.elasticsearch</groupId>
      <artifactId>elasticsearch-hadoop</artifactId>
      <version>2.1.0.Beta1</version>
      <exclusions>
        <exclusion>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_2.10</artifactId>
        </exclusion>
      </exclusions>
    </dependency>



costin commented Feb 9, 2015

@deepakas It looks like the issue is caused by using Spark compiled against Scala 2.11, while es-hadoop/spark relies on Spark compiled with Scala 2.10 (that's what the official release uses, anyway).
I've created an issue (#376) to provide a 2.11-compatible release of the es-spark jar; es-hadoop will likely remain on Scala 2.10, as Scala 2.11 support was only added in Spark 1.2.0 and is not yet fully supported.

Closing this issue.

P.S. By the way, if you use only the spark module, you can also opt for elasticsearch-spark module instead.
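The mismatch above can be avoided by keeping the Scala version suffix consistent across all Scala artifacts in the build. A hedged pom.xml sketch (artifact IDs and versions are illustrative):

```xml
<!-- Illustrative only: the _2.10 suffix must match on every
     Scala-compiled artifact, or binary-incompatible classes mix -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.0</version>
</dependency>
<dependency>
  <groupId>org.elasticsearch</groupId>
  <artifactId>elasticsearch-spark_2.10</artifactId>
  <version>2.1.0.Beta3</version>
</dependency>
```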


deepakas commented Feb 9, 2015

Hi Costin,

Thank you very much for the clarification and suggestion. I will use the spark version in elasticsearch-hadoop.

Thanks, Deepak



costin commented Feb 9, 2015

@deepakas I've pushed a dev build for 2.11 - you can find it in Maven already (see the install chapter of the docs). Can you please try it out and let me know how it works for you? Let's continue the discussion on #376 - thanks!
