vcf2adam -> Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less; #871

Closed
jerryivanhoe opened this Issue Oct 27, 2015 · 18 comments

6 participants
jerryivanhoe commented Oct 27, 2015

Hi,
I installed version 0.17.1 on top of AWS EMR with Spark 1.3.1.
Then I loaded chromosome 1 from the 1000 Genomes project as a VCF and copied it into HDFS:
[hadoop@ip-172-31-21-10 jerry]$ adam-submit vcf2adam hdfs://chr1/chr1 hdfs://chr1/chr1.adam
Using SPARK_SUBMIT=/home/hadoop/spark/bin/spark-submit
15/10/27 10:05:51 INFO cli.ADAMMain: ADAM invoked with args: "vcf2adam" "hdfs://chr1/chr1" "hdfs://chr1/chr1.adam"
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
at org.bdgenomics.adam.cli.ADAMMain$.main(ADAMMain.scala:105)

Any idea would be highly appreciated.
Jerry

heuermh (Member) commented Oct 27, 2015

I believe this happens when Spark and/or ADAM have been built for Scala 2.11 and then are run with Scala 2.10. If you could confirm that the Scala versions of Spark and ADAM match, that should help.
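One quick way to check this is to compare the Scala version encoded in the ADAM distribution name against the version your Spark build reports. A minimal sketch (the distribution name mirrors the one mentioned later in this thread; adjust paths for your install):

```shell
# The Scala version is encoded in the ADAM distribution name,
# e.g. adam-distribution_2.10-0.17.1 was built for Scala 2.10.
dist="adam-distribution_2.10-0.17.1"
scala_version="${dist#adam-distribution_}"   # strip prefix     -> "2.10-0.17.1"
scala_version="${scala_version%%-*}"         # keep up to first '-' -> "2.10"
echo "$scala_version"

# Compare against the Scala version your Spark build reports:
#   spark-submit --version 2>&1 | grep -i scala
```

If the two versions disagree, either rebuild ADAM for the matching Scala version or download the matching distribution artifact.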

jerryivanhoe commented Oct 28, 2015

Thank you, Michael.
I installed two older versions:
drwxr-xr-x 6 hadoop hadoop 4096 Oct 28 05:36 adam-distribution_2.10-0.17.1
drwxr-xr-x 6 hadoop hadoop 4096 Oct 28 05:36 adam-distribution_2.10-0.18.1
This looks a bit better, because the error now appears later, e.g. with adam-distribution_2.10-0.17.1:
15/10/28 05:40:56 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
Exception in thread "main" java.lang.IllegalArgumentException: java.net.UnknownHostException: chr1
at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:377)
I attached the full output.
convert_1
convert_2
convert_3
convert_4

laserson (Contributor) commented Oct 28, 2015

HDFS URIs typically look like this:

hdfs://namenode-host/path/to/whatever

So Hadoop/Spark thinks that the host running the namenode server is called chr1. You probably meant something like

hdfs:///chr1/chr1

(note the difference in the slashes). This will use the default namenode set in the associated Hadoop client config.
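The difference can be sketched with a bit of illustrative shell parsing of the authority component (the corrected adam-submit line assumes the same paths as above):

```shell
# Hadoop treats everything between '//' and the next '/' as the namenode host.
uri="hdfs://chr1/chr1"
authority="${uri#hdfs://}"; authority="${authority%%/*}"
echo "authority=$authority"   # host resolves to "chr1" -> UnknownHostException

uri="hdfs:///chr1/chr1"
authority="${uri#hdfs://}"; authority="${authority%%/*}"
echo "authority=$authority"   # empty authority -> default namenode (fs.defaultFS)

# Corrected invocation:
#   adam-submit vcf2adam hdfs:///chr1/chr1 hdfs:///chr1/chr1.adam
```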

jerryivanhoe commented Oct 28, 2015

WOW, Uri! Thanks so much! It's running now.

It's only using 2 out of 4 cluster nodes, but that's another issue, which I will try to fix next!

best!
-Jerry

heuermh (Member) commented Oct 28, 2015

Good to hear!

There is a minor issue with the 0.18.1 distribution, reported here (#872). If you run into that, just move or remove the -sources.jar as shown in the issue description.

laserson (Contributor) commented Oct 28, 2015

Great, closing this. Definitely report it if you have additional issues.

@laserson laserson closed this Oct 28, 2015

jerryivanhoe commented Oct 29, 2015

Perfect job! Thanks a lot, Michael and Uri!

ankushreddy commented Jan 28, 2016

Hi @laserson,
I'm still facing the same error when trying to run ADAM 0.18.1:

./adam-submit transform /shared/avocado_test/NA06984.454.ssaha.SRP000033.2009_10.bam /shared/avocado_out/NA06984.454.ssaha.SRP000033.2009_10.bam_tags.adam -add_md_tags /shared/avocado_test/human_b36_male.fa
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using SPARK_SUBMIT=/usr/hdp/2.2.4.2-2/spark-1.4.1/bin/spark-submit
16/01/28 14:33:15 INFO cli.ADAMMain: ADAM invoked with args: "transform" "/shared/avocado_test/NA06984.454.ssaha.SRP000033.2009_10.bam" "/shared/avocado_out/NA06984.454.ssaha.SRP000033.2009_10.bam_tags.adam" "-add_md_tags" "/shared/avocado_test/human_b36_male.fa"
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
at org.bdgenomics.adam.cli.ADAMMain.apply(ADAMMain.scala:120)
at org.bdgenomics.adam.cli.ADAMMain$.main(ADAMMain.scala:77)
at org.bdgenomics.adam.cli.ADAMMain.main(ADAMMain.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/01/28 14:33:15 INFO util.Utils: Shutdown hook called

I am using Spark 1.4.1 and Hadoop 2.6.0.

Is there any workaround for this issue? I used this Maven command to build:
mvn clean package -Dscala-2.10.5 -DskipTests

and also:
mvn clean package -DskipTests

Even then I am facing the same issue.

fnothaft (Member) commented Jan 28, 2016

Hi @ankushreddy,

We do not have a -Dscala-2.10.5 switch in our build. Are you running a Spark build that was built for Scala 2.10 or 2.11?

ankushreddy commented Jan 28, 2016

Hi @fnothaft,

We are running Spark on Scala 2.10.

fnothaft (Member) commented Jan 28, 2016

When you run git status in the ADAM repository, what do you get?

@fnothaft fnothaft reopened this Jan 28, 2016

ankushreddy commented Jan 29, 2016

I actually downloaded the source code of ADAM 0.18 from
https://github.com/bigdatagenomics/adam/releases

and compiled it on my Hadoop cluster. I also cloned the repository fresh to my desktop, where git status reports:

C:\Users\asugured\Documents\GitHub\adam [master]> git status
On branch master
Your branch is up-to-date with 'origin/master'.
nothing to commit, working directory clean

fnothaft (Member) commented Jan 29, 2016

Which release of ADAM 0.18 did you download? We package both Scala 2.10 and 2.11 artifacts with each release. The error you're getting (a missing method from inside the Scala standard library) indicates that you have a Scala version mismatch between your Spark and ADAM distros. This can happen if you downloaded the ADAM 0.18 Scala 2.11 artifacts but are running Spark built for Scala 2.10. Can you run git log | head -n 1 | awk '{ print $2 }'? This will give us the commit hash of the latest commit, which will confirm which set of ADAM artifacts you've pulled down. You can also run grep scala.version pom.xml, which will return either 2.10 or 2.11.
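The two checks above can be run from the repository root. A small sketch of the awk pipeline, simulated on sample `git log` output (the hash below is a placeholder, not a real ADAM commit):

```shell
# 1. Commit hash of the checkout; the first line of `git log` is "commit <hash>":
#      git log | head -n 1 | awk '{ print $2 }'
# Simulated here on sample output:
printf 'commit 0123abc\nAuthor: ...\n' | head -n 1 | awk '{ print $2 }'

# 2. Scala version the POM is configured for (returns 2.10 or 2.11):
#      grep scala.version pom.xml
```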

faissl-asu commented Jan 29, 2016

I performed the "grep scala.version pom.xml" in Ankush's ADAM directory and got:

Scala Version: 2.10

It sounds like we need to update the pom.xml file?

ankushreddy commented Jan 29, 2016

Thanks for the reply, @fnothaft and @faissl-asu. We actually have Spark built on Scala 2.10, so we need to download adam-0.18_2.10 for Scala 2.10 and adam-0.18_2.11 for Scala 2.11.

Correct me if I'm wrong, @fnothaft.

Thanks for directing me in the correct way :)

Thanks & Regards,
Ankush Reddy.

fnothaft (Member) commented Jan 29, 2016

> Thanks for the reply, @fnothaft and @faissl-asu. We actually have Spark built on Scala 2.10, so we need to download adam-0.18_2.10 for Scala 2.10 and adam-0.18_2.11 for Scala 2.11.

This is correct.

> I performed the "grep scala.version pom.xml" in Ankush's ADAM directory and got:
> Scala Version: 2.10
> It sounds like we need to update the pom.xml file?

No, this looks like the correct version if you're running Spark built for Scala 2.10.

> Thanks for directing me in the correct way :)

Definitely! Hopefully we can resolve this soon.

fnothaft (Member) commented Jul 6, 2016

Closing as resolved.

@fnothaft fnothaft closed this Jul 6, 2016
