Remove python and R profiles from release script #1842

Merged

fnothaft merged 3 commits into bigdatagenomics:master from heuermh:release-script on Dec 28, 2017

Conversation

@heuermh
Member

heuermh commented Dec 21, 2017

Opened a new branch for updates to the release script.

I'm not confident this will work: the build-from-source instructions look for the assembly jar relative to ${ADAM_HOME}, which will keep moving during a release build (once at the normal depth, and then nested for the build from the checked-out release tag).

# put adam jar on the pyspark path
ASSEMBLY_DIR="${ADAM_HOME}/adam-assembly/target"
ASSEMBLY_JAR="$(ls -1 "$ASSEMBLY_DIR" | grep "^adam[0-9A-Za-z\.\_-]*\.jar$" | grep -v -e javadoc -e sources || true)"
export PYSPARK_SUBMIT_ARGS="--jars ${ASSEMBLY_DIR}/${ASSEMBLY_JAR} --driver-class-path ${ASSEMBLY_DIR}/${ASSEMBLY_JAR} pyspark-shell"
# put adam jar on the SparkR path
ASSEMBLY_DIR="${ADAM_HOME}/adam-assembly/target"
ASSEMBLY_JAR="$(ls -1 "$ASSEMBLY_DIR" | grep "^adam[0-9A-Za-z\_\.-]*\.jar$" | grep -v javadoc | grep -v sources || true)"
export SPARKR_SUBMIT_ARGS="--jars ${ASSEMBLY_DIR}/${ASSEMBLY_JAR} --driver-class-path ${ASSEMBLY_DIR}/${ASSEMBLY_JAR} sparkr-shell"
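
One possible workaround, sketched here only for discussion and not part of this change: resolve the root directory from the script's own location rather than from a pre-set ${ADAM_HOME}. This assumes the helper script lives one level below the repository root (e.g. in bin/), which I haven't verified for every call site.

# hypothetical: derive the install root from this script's location so the
# jar lookup survives the nested checkout made by the maven release plugin
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ADAM_HOME="${ADAM_HOME:-$(dirname "$SCRIPT_DIR")}"

ASSEMBLY_DIR="${ADAM_HOME}/adam-assembly/target"
ASSEMBLY_JAR="$(ls -1 "$ASSEMBLY_DIR" | grep "^adam[0-9A-Za-z\.\_-]*\.jar$" | grep -v -e javadoc -e sources || true)"
if [ -z "$ASSEMBLY_JAR" ]; then
    echo "No ADAM assembly jar found under $ASSEMBLY_DIR" >&2
    exit 1
fi
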
@AmplabJenkins

AmplabJenkins commented Dec 21, 2017

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/ADAM-prb/2523/
Test PASSed.

@heuermh

Member

heuermh commented Dec 21, 2017

The more I play with this, the less likely I think it is to work.

mvn --batch-mode -P distribution,python,r ... now fails for me unless I first activate a conda environment, and I believe adding that step to the release script may also run into issues with the outer and inner Maven release builds:

[INFO] --- exec-maven-plugin:1.5.0:exec (dev-python) @ adam-python-spark2_2.11 ---
pip install -e .
bash: pip: command not found
make: *** [develop] Error 127
[ERROR] Command execution failed.
org.apache.commons.exec.ExecuteException: Process exited with an error: 2 (Exit value: 2)
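
For what it's worth, a pre-flight check along these lines (purely hypothetical, not something the release script or the python profile does today) would at least fail fast with a clearer message:

# hypothetical guard before enabling the python profile: require pip up front
# instead of failing mid-build with "pip: command not found"
if ! command -v pip > /dev/null 2>&1; then
    echo "pip not found on PATH; activate a conda or virtualenv environment first" >&2
    exit 1
fi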

I know we wanted to add the Python egg #1667 and R tarball #1668 to the release distribution tarball, but I'm not seeing how to make that work in a single Maven call (i.e. the inner call to Maven on the checked-out release tag). In fact, it is not currently working for the snapshots uploaded by Jenkins:

$ wget https://oss.sonatype.org/content/repositories/snapshots/org/bdgenomics/adam/adam-distribution-spark2_2.11/0.23.0-SNAPSHOT/adam-distribution-spark2_2.11-0.23.0-20171221.152447-595-bin.tar.gz
$ tar xvfz adam-distribution-spark2_2.11-0.23.0-20171221.152447-595-bin.tar.gz
$ cd adam-distribution-spark2_2.11-0.23.0-SNAPSHOT
$ ls | cat
CHANGES.md
CONTRIBUTING.md
LICENSE.txt
README.md
bin
javadocs
repo
scaladocs
$ find . -name "*.tar.gz"
$ find . -name "*.egg"

I'm thinking we may need to remove the python and r profiles from the release script, cut the release without them, build the Python and R artifacts after the fact from the release tag, and then upload them separately to the GitHub releases page.
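
Roughly, the post-release flow might look like the following; the tag name, conda environment name, goals, and artifact locations here are placeholders rather than anything taken from the actual build:

# hypothetical post-release steps, run by hand outside the maven-release-plugin
git checkout <release-tag>
source activate <conda-env-with-pip-and-R>   # provides pip (and R) on the PATH
mvn --batch-mode -P python,r package         # exact goals/profiles to be confirmed
# then attach the resulting Python egg and R source tarball to the
# corresponding entry on the GitHub releases page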

Ping @fnothaft @pbashyal-nmdp

@fnothaft

Member

fnothaft commented Dec 22, 2017

I'm thinking we may need to remove the python and r profiles from the release script, cut the release without them, build the Python and R artifacts after the fact from the release tag, and then upload them separately to the GitHub releases page.

Fine by me.

@antonkulaga

Contributor

antonkulaga commented Dec 23, 2017

Maybe you can also update it to use Spark 2.2.1 and Scala 2.11.12?

@heuermh heuermh changed the title from "Release script" to "Remove python and R profiles from release script" on Dec 27, 2017

@AmplabJenkins

AmplabJenkins commented Dec 27, 2017

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/ADAM-prb/2529/
Test PASSed.

@heuermh heuermh added this to the 0.23.0 milestone Dec 27, 2017

@fnothaft fnothaft merged commit 12728ff into bigdatagenomics:master Dec 28, 2017

2 checks passed

Codacy/PR Quality Review: Good work! A positive pull request.
default: Merged build finished.
@fnothaft

Member

fnothaft commented Dec 28, 2017

Merged! Thanks @heuermh!

@heuermh heuermh deleted the heuermh:release-script branch Dec 28, 2017

@heuermh

Member

heuermh commented Dec 28, 2017

Maybe you can also update it to use Spark 2.2.1 and Scala 2.11.12?

We will update the compile-time dependency versions for 0.24.0 and will update the Jenkins build targets as well. See #1597.

@heuermh heuermh added this to Completed in Release 0.23.0 Jan 4, 2018
