first run of ADAM #1582

Closed
leoSeattle opened this Issue Jun 27, 2017 · 7 comments

4 participants

leoSeattle commented Jun 27, 2017

Hi,
I installed ADAM, and there were no errors during installation, but running ./scripts/jenkins-test was not successful.
Here is my system:
Ubuntu 16.04
Hadoop: 2.7.3
Spark: 2.0.2
Scala: 2.10
Adam: 2.10

Here is output of running ./scripts/jenkins-test :

make a tempdir for writing maven cruft to

ADAM_MVN_TMP_DIR=$(mktemp -d -t adamTestMvnXXXXXXX)
mktemp -d -t adamTestMvnXXXXXXX
++ mktemp -d -t adamTestMvnXXXXXXX

  • ADAM_MVN_TMP_DIR=/tmp/adamTestMvnrvTbc7q

add this tempdir to the poms...

find . -name pom.xml
-exec sed -i.bak
-e "s:sun.io.serialization.extendedDebugInfo=true:sun.io.serialization.extendedDebugInfo=true -Djava.io.tmpdir=${ADAM_MVN_TMP_DIR}:g"
{} ;

  • find . -name pom.xml -exec sed -i.bak -e 's:sun.io.serialization.extendedDebugInfo=true:sun.io.serialization.extendedDebugInfo=true -Djava.io.tmpdir=/tmp/adamTestMvnrvTbc7q:g' '{}' ';'
    find . -name "*.bak" -exec rm -f {} ;
  • find . -name '*.bak' -exec rm -f '{}' ';'

variable declarations

export PATH=${JAVA_HOME}/bin/:${PATH}

  • export PATH=/bin/:/usr/local/share/spark/spark-2.0.2/bin:/home/leo/bin:/home/leo/.local/bin:/home/leo/bin:/home/leo/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
  • PATH=/bin/:/usr/local/share/spark/spark-2.0.2/bin:/home/leo/bin:/home/leo/.local/bin:/home/leo/bin:/home/leo/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
    export MAVEN_OPTS="-Xmx1536m -XX:MaxPermSize=1g -Dfile.encoding=utf-8"
  • export 'MAVEN_OPTS=-Xmx1536m -XX:MaxPermSize=1g -Dfile.encoding=utf-8'
  • MAVEN_OPTS='-Xmx1536m -XX:MaxPermSize=1g -Dfile.encoding=utf-8'
    DIR=$( cd $( dirname ${BASH_SOURCE[0]} ) && pwd )
    cd $( dirname ${BASH_SOURCE[0]} ) && pwd
    dirname ${BASH_SOURCE[0]}
    +++ dirname ./scripts/jenkins-test
    ++ cd ./scripts
    ++ pwd
  • DIR=/home/leo/adam/scripts
    PROJECT_ROOT=${DIR}/..
  • PROJECT_ROOT=/home/leo/adam/scripts/..
    VERSION=$(grep "<version>" ${PROJECT_ROOT}/pom.xml | head -2 | tail -1 | sed 's/ *<version>//g' | sed 's/<\/version>//g')
    grep "<version>" ${PROJECT_ROOT}/pom.xml | head -2 | tail -1 | sed 's/ *<version>//g' | sed 's/<\/version>//g'
    ++ grep '<version>' /home/leo/adam/scripts/../pom.xml
    ++ sed 's/<\/version>//g'
    ++ sed 's/ *<version>//g'
    ++ tail -1
    ++ head -2
  • VERSION=0.23.0-SNAPSHOT
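For reference, the version-extraction pipeline in the trace above can be reproduced in isolation. The pom snippet and /tmp path below are stand-ins, not files from the ADAM tree, and a single `<version>` element is used so `head -1` suffices:

```shell
# Write a minimal stand-in pom.xml (contents are illustrative only)
cat > /tmp/pom-example.xml <<'EOF'
<project>
  <version>0.23.0-SNAPSHOT</version>
</project>
EOF

# Same pipeline shape as jenkins-test: grab the <version> line, then
# strip the leading spaces plus opening tag, then the closing tag
VERSION=$(grep "<version>" /tmp/pom-example.xml | head -1 | sed 's/ *<version>//' | sed 's/<\/version>//')
echo "${VERSION}"
```

This prints `0.23.0-SNAPSHOT`, matching the `VERSION=0.23.0-SNAPSHOT` assignment in the trace.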

is the hadoop version set?

if ! [[ ${HADOOP_VERSION} ]];
then
echo "HADOOP_VERSION environment variable is not set."
echo "Please set this variable before running."

exit 1

fi

  • [[ -n 2.7.3 ]]

is the spark version set?

if ! [[ ${SPARK_VERSION} ]];
then
echo "SPARK_VERSION environment variable is not set."
echo "Please set this variable before running."

exit 1

fi

  • [[ -n 2.0.2 ]]

this next line is supposed to fail

set +e

  • set +e

echo "Rewriting POM.xml files to Scala 2.10 and Spark 1 should error..."

  • echo 'Rewriting POM.xml files to Scala 2.10 and Spark 1 should error...'
    Rewriting POM.xml files to Scala 2.10 and Spark 1 should error...
    ./scripts/move_to_spark_1.sh
  • ./scripts/move_to_spark_1.sh
    POM is already set up for Spark 1 (Spark 1/2 artifacts are missing -spark2 suffix in artifact names).
    Cowardly refusing to move to Spark 1 a second time...
    if [[ $? == 0 ]];
    then
    echo "Running move_to_spark_1.sh when POMs are set up for Spark 1 should fail, but error code was 0 (success)."
    exit 1
    fi
  • [[ 1 == 0 ]]

./scripts/move_to_scala_2.10.sh

  • ./scripts/move_to_scala_2.10.sh
    Scala version is already set to 2.10 (Scala artifacts have _2.10 version suffix in artifact name).
    Cowardly refusing to move to Scala 2.10 a second time...
    if [[ $? == 0 ]];
    then
    echo "Running move_to_scala_2.10.sh when POMs are set up for Scala 2.10 should fail, but error code was 0 (success)."
    exit 1
    fi
  • [[ 1 == 0 ]]

set -e

  • set -e

are we testing for spark 2.0.0? if so, we need to rewrite our poms first

if [ ${SPARK_VERSION} == 2.0.0 ];
then

echo "Rewriting POM.xml files for Spark 2."
./scripts/move_to_spark_2.sh

# shouldn't be able to move to spark 2 twice
set +e
./scripts/move_to_spark_2.sh
if [[ $? == 0 ]];
then
    echo "We have already moved to Spark 2, so running move_to_spark_2.sh a second time should fail, but error code was 0 (success)."
    exit 1
fi
set -e

fi

  • '[' 2.0.2 == 2.0.0 ']'

are we testing for scala 2.11? if so, we need to rewrite our poms to 2.11 first

if [ ${SCALAVER} == 2.11 ];
then
echo "Rewriting POM.xml files for Scala 2.11."
./scripts/move_to_scala_2.11.sh

# shouldn't be able to move to scala 2.11 twice
set +e
./scripts/move_to_scala_2.11.sh
if [[ $? == 0 ]];
then
    echo "We have already moved to Scala 2.11, so running move_to_scala_2.11.sh a second time should fail, but error code was 0 (success)."
    exit 1
fi
set -e

fi

  • '[' == 2.11 ']'
    ./scripts/jenkins-test: line 79: [: ==: unary operator expected
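The `unary operator expected` message comes from `[ ${SCALAVER} == 2.11 ]` with SCALAVER unset: the unquoted expansion vanishes, so `[` sees `== 2.11` with no left operand. Quoting the expansion (or using `[[ ]]`) avoids the error. A minimal reproduction, with illustrative echoes:

```shell
# SCALAVER empty, as in the log (the script never set it)
SCALAVER=""

# Unquoted, as on jenkins-test line 79: expands to `[ == 2.11 ]`, so `[`
# reports "unary operator expected" and exits nonzero
[ ${SCALAVER} == 2.11 ] 2>/dev/null && echo "matched" || echo "comparison errored"

# Quoted: expands to `[ "" == 2.11 ]`, a well-formed comparison that is false
if [ "${SCALAVER}" == 2.11 ]; then
  echo "scala 2.11"
else
  echo "not 2.11"
fi
```

Because the script does not `set -e` around this test, the error is only a warning here: the comparison fails, the `then` branch is skipped, and the script continues.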

print versions

echo "Testing ADAM version ${VERSION} on Spark ${SPARK_VERSION} and Hadoop ${HADOOP_VERSION}"

  • echo 'Testing ADAM version 0.23.0-SNAPSHOT on Spark 2.0.2 and Hadoop 2.7.3'
    Testing ADAM version 0.23.0-SNAPSHOT on Spark 2.0.2 and Hadoop 2.7.3

first, build the sources, run the unit tests, and generate a coverage report

mvn clean
-Dhadoop.version=${HADOOP_VERSION}
-Dspark.version=${SPARK_VERSION}

  • mvn clean -Dhadoop.version=2.7.3 -Dspark.version=2.0.2
    OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=1g; support was removed in 8.0
    [INFO] Scanning for projects...
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Build Order:
    [INFO]
    [INFO] ADAM_2.10
    [INFO] ADAM_2.10: Core
    [INFO] ADAM_2.10: APIs for Java
    [INFO] ADAM_2.10: CLI
    [INFO] ADAM_2.10: Assembly
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building ADAM_2.10 0.23.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [INFO]
    [INFO] --- maven-clean-plugin:3.0.0:clean (default-clean) @ adam-parent_2.10 ---
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building ADAM_2.10: Core 0.23.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [INFO]
    [INFO] --- maven-clean-plugin:3.0.0:clean (default-clean) @ adam-core_2.10 ---
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building ADAM_2.10: APIs for Java 0.23.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [INFO]
    [INFO] --- maven-clean-plugin:3.0.0:clean (default-clean) @ adam-apis_2.10 ---
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building ADAM_2.10: CLI 0.23.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [INFO]
    [INFO] --- maven-clean-plugin:3.0.0:clean (default-clean) @ adam-cli_2.10 ---
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building ADAM_2.10: Assembly 0.23.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [INFO]
    [INFO] --- maven-clean-plugin:3.0.0:clean (default-clean) @ adam-assembly_2.10 ---
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] ADAM_2.10 .......................................... SUCCESS [ 0.459 s]
    [INFO] ADAM_2.10: Core .................................... SUCCESS [ 0.042 s]
    [INFO] ADAM_2.10: APIs for Java ........................... SUCCESS [ 0.005 s]
    [INFO] ADAM_2.10: CLI ..................................... SUCCESS [ 0.031 s]
    [INFO] ADAM_2.10: Assembly ................................ SUCCESS [ 0.005 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 0.972 s
    [INFO] Finished at: 2017-06-27T08:31:55-07:00
    [INFO] Final Memory: 9M/208M
    [INFO] ------------------------------------------------------------------------

if this is a pull request, we need to set the coveralls pr id

if [[ ! -z $ghprbPullId ]];
then
COVERALLS_PRB_OPTION="-DpullRequest=${ghprbPullId}"
fi

  • [[ ! -z '' ]]

coveralls token should not be visible

set +x +v

  • set +x +v
    Coveralls token is not set. Exiting...
Member

fnothaft commented Jun 27, 2017

Hi @leoSeattle. The ./scripts/jenkins-test script is not expected to run correctly outside of our continuous integration environment, as it depends on private configuration values.

leoSeattle commented Jun 27, 2017

Thanks for the quick response. So here is a real error:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] ADAM_2.10 .......................................... SUCCESS [ 15.387 s]
[INFO] ADAM_2.10: Core .................................... SUCCESS [03:09 min]
[INFO] ADAM_2.10: APIs for Java ........................... SUCCESS [ 16.876 s]
[INFO] ADAM_2.10: CLI ..................................... SUCCESS [ 38.317 s]
[INFO] ADAM_2.10: Assembly ................................ SUCCESS [ 11.778 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:32 min
[INFO] Finished at: 2017-06-27T08:44:43-07:00
[INFO] Final Memory: 66M/301M
[INFO] ------------------------------------------------------------------------
leo@leo-ThinkPad-T510:~/adam$
leo@leo-ThinkPad-T510:~/adam$ ./bin/adam-submit transformAlignments adam-core/src/test/resources/small.sam /tmp/small.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using SPARK_SUBMIT=/usr/local/share/spark/spark-2.0.2/bin/spark-submit
Exception in thread "main" java.lang.IncompatibleClassChangeError: Implementing class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.bdgenomics.adam.cli.TransformAlignments.<init>(TransformAlignments.scala:128)
at org.bdgenomics.adam.cli.TransformAlignments$.apply(TransformAlignments.scala:41)
at org.bdgenomics.adam.cli.TransformAlignments$.apply(TransformAlignments.scala:36)
at org.bdgenomics.adam.cli.ADAMMain.apply(ADAMMain.scala:126)
at org.bdgenomics.adam.cli.ADAMMain$.main(ADAMMain.scala:65)
at org.bdgenomics.adam.cli.ADAMMain.main(ADAMMain.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
leo@leo-ThinkPad-T510:~/adam$

Member

devin-petersohn commented Jun 27, 2017

What Scala version are you using?

leoSeattle commented Jun 27, 2017

I didn't install Scala. I installed Spark 2.0.2 which has Scala 2.11.8. Should I install Scala?

Member

devin-petersohn commented Jun 27, 2017

The first thing to do is to run the scripts/move_to_spark_2.sh and scripts/move_to_scala_2.11.sh. You will have to rebuild after you run those two scripts. Try the example again and let me know if it works.
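Roughly, the Scala half of that rewrite amounts to swapping the `_2.10` artifact suffixes in the poms for `_2.11` (the real scripts also guard against running twice). The stand-in snippet and sed one-liner below only sketch the idea; they are not the actual scripts:

```shell
# Stand-in pom fragment with a Scala 2.10 artifact suffix (illustrative path)
printf '<artifactId>adam-core_2.10</artifactId>\n' > /tmp/pom-snippet.xml

# Core of the rewrite: swap the _2.10 suffix for _2.11
sed 's/_2\.10/_2.11/g' /tmp/pom-snippet.xml > /tmp/pom-snippet-2.11.xml
cat /tmp/pom-snippet-2.11.xml
```

After running the real scripts, the rebuild is what matters: the assembly jar must be compiled against the same Spark major version and Scala version that spark-submit provides at runtime, otherwise class-loading fails with errors like the IncompatibleClassChangeError above.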

leoSeattle commented Jun 27, 2017

It worked!
Thank you so much!

Member

devin-petersohn commented Jun 27, 2017

No problem, enjoy.

Closing issue as resolved.

@heuermh heuermh modified the milestone: 0.23.0 Jul 22, 2017

@heuermh heuermh added this to Completed in Release 0.23.0 Jan 4, 2018
