
[SPARK-22167][R][BUILD] sparkr packaging issue allow zinc #19402

Conversation

@holdenk (Contributor) commented Sep 30, 2017

What changes were proposed in this pull request?

When zinc is running, the pwd might be in the root of the project. A quick solution to this is to not go a level up in case we are in the root rather than root/core/. If we are in the root everything works fine; if we are in core, add a script which goes up a level and runs from there.
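The failure mode can be sketched with a toy script (the paths below are illustrative stand-ins for Spark's layout, not the real build): a relative `../R/install-dev.sh` only resolves correctly when the working directory is `root/core/`, while a path anchored at a known base directory works regardless of pwd.

```shell
#!/usr/bin/env bash
# Toy layout mirroring the repo: $root/R/install-dev.sh and $root/core/
root="$(mktemp -d)"
mkdir -p "$root/R" "$root/core"
printf '#!/bin/sh\necho installed\n' > "$root/R/install-dev.sh"
chmod +x "$root/R/install-dev.sh"

# Relative path, as in the old pom: only works when pwd is root/core
cd "$root/core"
../R/install-dev.sh                          # resolves to $root/R/install-dev.sh

# With pwd at the root (as under a zinc-started build), ../R points above root
cd "$root"
../R/install-dev.sh 2>/dev/null || echo "relative path broke"

# Anchoring at a known base dir (what ${project.basedir} provides) works from anywhere
basedir="$root/core"
"$basedir/../R/install-dev.sh"
```

The last invocation succeeds from any working directory, which is the property the fix relies on.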

How was this patch tested?

set -x in the SparkR install scripts.

@holdenk (Contributor, Author) commented Sep 30, 2017

Note: set -x is intentionally left in so that during the build it is clear which R source is being built.
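For reference, `set -x` makes the shell echo each command (prefixed with `+`) to stderr before executing it, so the build log shows exactly which R source directory the install script operates on. A minimal illustration (the variable and path are hypothetical stand-ins):

```shell
#!/usr/bin/env bash
set -x                           # trace every command to stderr before executing it
spark_r_dir="/path/to/spark/R"   # hypothetical stand-in for the real checkout
echo "installing SparkR from $spark_r_dir"
```

With tracing on, stderr interleaves lines like `+ echo installing SparkR from /path/to/spark/R` with the script's normal output.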

@holdenk (Contributor, Author) commented Sep 30, 2017

cc @felixcheung @vanzin

core/pom.xml Outdated
@@ -499,7 +499,7 @@
</execution>
</executions>
<configuration>
-<executable>..${file.separator}R${file.separator}install-dev${script.extension}</executable>
+<executable>${file.separator}R${file.separator}install-dev${script.extension}</executable>
Member:
Would it work if this is:

<executable>${project.basedir}${file.separator}..${file.separator}R${file.separator}install-dev${script.extension}</executable>
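For context, this suggestion anchors the executable at the module's base directory, which Maven resolves to `core/` no matter where the JVM (or a zinc-started build) has its working directory. A sketch of how it would sit in the `sparkr-pkg` execution of exec-maven-plugin (the plugin coordinates and execution id are taken from the build log below; the surrounding elements are illustrative, not Spark's exact pom):

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>sparkr-pkg</id>
      <goals>
        <goal>exec</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <!-- ${project.basedir} is always core/, independent of pwd -->
    <executable>${project.basedir}${file.separator}..${file.separator}R${file.separator}install-dev${script.extension}</executable>
  </configuration>
</plugin>
```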

@holdenk (Contributor, Author) commented Sep 30, 2017

That should work more simply, thanks. I'll try that later on tonight.

@SparkQA commented Sep 30, 2017

Test build #82355 has finished for PR 19402 at commit 8d59d54.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@holdenk (Contributor, Author) commented Sep 30, 2017

@felixcheung can you trigger the R tests for this?

@SparkQA commented Sep 30, 2017

Test build #82356 has finished for PR 19402 at commit aea4ccf.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@felixcheung (Member) commented:
I'm not sure why AppVeyor was not triggered, but I thought this is more of a Java/Scala change. Since Jenkins passes, and if you have verified that release-build.sh works with this with zinc on, then we should be good to merge.

@felixcheung (Member) commented:
building your change here #19403

@felixcheung (Member) commented Oct 1, 2017

passed
https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark/build/1804-master

[00:05:24] [INFO] --- exec-maven-plugin:1.6.0:exec (sparkr-pkg) @ spark-core_2.11 ---
[00:05:25] * installing *source* package 'SparkR' ...
[00:05:25] Warning in as.POSIXlt.POSIXct(x, tz) :
[00:05:25]   unable to identify current timezone 'C':
[00:05:25] please set environment variable 'TZ'
[00:05:25] ** R
[00:05:25] ** inst
[00:05:25] ** preparing package for lazy loading
[00:05:25] Creating a new generic function for 'as.data.frame' in package 'SparkR'
[00:05:25] Creating a new generic function for 'colnames' in package 'SparkR'
[00:05:25] Creating a new generic function for 'colnames<-' in package 'SparkR'
[00:05:25] Creating a new generic function for 'cov' in package 'SparkR'
[00:05:25] Creating a new generic function for 'drop' in package 'SparkR'
[00:05:25] Creating a new generic function for 'na.omit' in package 'SparkR'
[00:05:25] Creating a new generic function for 'filter' in package 'SparkR'
[00:05:25] Creating a new generic function for 'intersect' in package 'SparkR'
[00:05:25] Creating a new generic function for 'sample' in package 'SparkR'
[00:05:25] Creating a new generic function for 'transform' in package 'SparkR'
[00:05:25] Creating a new generic function for 'subset' in package 'SparkR'
[00:05:25] Creating a new generic function for 'summary' in package 'SparkR'
[00:05:25] Creating a new generic function for 'union' in package 'SparkR'
[00:05:25] Creating a new generic function for 'endsWith' in package 'SparkR'
[00:05:25] Creating a new generic function for 'startsWith' in package 'SparkR'
[00:05:25] Creating a new generic function for 'lag' in package 'SparkR'
[00:05:25] Creating a new generic function for 'rank' in package 'SparkR'
[00:05:25] Creating a new generic function for 'sd' in package 'SparkR'
[00:05:25] Creating a new generic function for 'var' in package 'SparkR'
[00:05:25] Creating a new generic function for 'window' in package 'SparkR'
[00:05:25] Creating a new generic function for 'predict' in package 'SparkR'
[00:05:25] Creating a new generic function for 'rbind' in package 'SparkR'
[00:05:26] Creating a generic function for 'substr' from package 'base' in package 'SparkR'
[00:05:26] Creating a generic function for '%in%' from package 'base' in package 'SparkR'
[00:05:26] Creating a generic function for 'lapply' from package 'base' in package 'SparkR'
[00:05:26] Creating a generic function for 'Filter' from package 'base' in package 'SparkR'
[00:05:26] Creating a generic function for 'nrow' from package 'base' in package 'SparkR'
[00:05:26] Creating a generic function for 'ncol' from package 'base' in package 'SparkR'
[00:05:26] Creating a generic function for 'factorial' from package 'base' in package 'SparkR'
[00:05:26] Creating a generic function for 'atan2' from package 'base' in package 'SparkR'
[00:05:26] Creating a generic function for 'ifelse' from package 'base' in package 'SparkR'
[00:05:27] ** help
[00:05:27] No man pages found in package  'SparkR' 
[00:05:27] *** installing help indices
[00:05:27] ** building package indices
[00:05:27] ** installing vignettes
[00:05:27] ** testing if installed package can be loaded
[00:05:27] *** arch - i386
[00:05:28] *** arch - x64
[00:05:29] * DONE (SparkR)

@holdenk (Contributor, Author) commented Oct 1, 2017

Sounds good :)

core/pom.xml Outdated
@@ -499,7 +499,7 @@
</execution>
</executions>
<configuration>
-<executable>..${file.separator}R${file.separator}install-dev${script.extension}</executable>
+<executable>${project.basedir}${file.separator}..${file.separator}R${file.separator}install-dev${script.extension}</executable>
Member:

Looks like this tab was inserted mistakenly, BTW.

asfgit pushed a commit that referenced this pull request Oct 2, 2017
## What changes were proposed in this pull request?

When zinc is running, the pwd might be in the root of the project. A quick solution to this is to not go a level up in case we are in the root rather than root/core/. If we are in the root everything works fine; if we are in core, add a script which goes up a level and runs from there.

## How was this patch tested?

set -x in the SparkR install scripts.

Author: Holden Karau <holden@us.ibm.com>

Closes #19402 from holdenk/SPARK-22167-sparkr-packaging-issue-allow-zinc.

(cherry picked from commit 8fab799)
Signed-off-by: Holden Karau <holden@us.ibm.com>
asfgit pushed a commit that referenced this pull request Oct 2, 2017 (same commit message; cherry picked from commit 8fab799, signed off by Holden Karau).
@asfgit asfgit closed this in 8fab799 Oct 2, 2017
@holdenk (Contributor, Author) commented Oct 2, 2017

Merged to master, branch-2.2, and branch-2.1.

@SparkQA commented Oct 2, 2017

Test build #82393 has finished for PR 19402 at commit 40a7f6c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

MatthewRBruce pushed a commit to Shopify/spark that referenced this pull request Jul 31, 2018 (same commit message; Closes apache#19402, cherry picked from commit 8fab799).