Java version detected but couldn't parse version from: java version "10" 2018-03-20 #1383

Closed
grantog opened this issue Mar 31, 2018 · 13 comments · Fixed by #3150

@grantog

grantog commented Mar 31, 2018

Reporting an Issue with sparklyr

I continue to get this error despite trying different versions of Spark and Java. Any help would be appreciated.

library(sparklyr)
library(dplyr)


options("java.home"="/Library/Java/JavaVirtualMachines/jdk-10.jdk/Contents/Home/lib")
Sys.setenv(LD_LIBRARY_PATH='$JAVA_HOME/server')
dyn.load('/Library/Java/JavaVirtualMachines/jdk-10.jdk/Contents/Home/lib/server/libjvm.dylib')
library(rJava)
sc <- spark_connect(master = "local")

Returns:

* Using Spark: 2.2.0
Error in validate_java_version_line(master, version) : 
  Java version detected but couldn't parse version from: java version "10" 2018-03-20

Output of utils::sessionInfo():

> utils::sessionInfo()
R version 3.4.2 (2017-09-28)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS High Sierra 10.13.4

Matrix products: default
BLAS: /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/3.4/Resources/lib/libRlapack.dylib

locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] rJava_0.9-9         sparklyr_0.7.0-9014

loaded via a namespace (and not attached):
 [1] Rcpp_0.12.15     dbplyr_1.2.1     compiler_3.4.2   pillar_1.1.0     plyr_1.8.4       bindr_0.1        base64enc_0.1-3  tools_3.4.2      digest_0.6.15   
[10] memoise_1.1.0    jsonlite_1.5     tibble_1.4.2     nlme_3.1-131     lattice_0.20-35  pkgconfig_2.0.1  rlang_0.2.0      psych_1.8.3.3    shiny_1.0.5     
[19] DBI_0.8          rstudioapi_0.7   yaml_2.1.16      parallel_3.4.2   bindrcpp_0.2     withr_2.1.2      dplyr_0.7.4      httr_1.3.1       stringr_1.3.0   
[28] rappdirs_0.3.1   devtools_1.13.4  rprojroot_1.3-2  grid_3.4.2       glue_1.2.0       R6_2.2.2         foreign_0.8-69   tidyr_0.8.0      purrr_0.2.4     
[37] reshape2_1.4.3   magrittr_1.5     backports_1.1.2  htmltools_0.3.6  assertthat_0.2.0 mnormt_1.5-5     mime_0.5         xtable_1.8-2     httpuv_1.3.5    
[46] config_0.3       stringi_1.1.7    lazyeval_0.2.1   broom_0.4.4  
@kevinykuo
Collaborator

Try installing Java 8.

If that doesn't just work, set the JAVA_HOME environment variable.

@grantog
Author

grantog commented Apr 2, 2018

@kevinykuo what would I set the JAVA_HOME variable to? You're referring to the ~/.profile file, right?

@bkottmann

bkottmann commented Apr 2, 2018

Uninstall Java 10 and (re)install Java 8. You should be able to invoke Spark after that.

Uninstall from console:
$ brew cask remove java

@kevinykuo
Collaborator

@grantog Spark only supports Java 8 so you'll need to install that. If sparklyr doesn't find it after it's been installed, you'll need to Sys.setenv(JAVA_HOME = "/path/to/java/installation").
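For example, on macOS (a sketch; the jdk1.8.0_172 path below is just an example of what the command prints and will differ per machine):

# ask macOS for the home directory of the installed Java 8, if any
system("/usr/libexec/java_home -v 1.8", intern = TRUE)
[1] "/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home"
# then point sparklyr at it before connecting
Sys.setenv(JAVA_HOME = "/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home")
sc <- spark_connect(master = "local")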

@kevinykuo
Collaborator

Actually, we can probably do a better error message here. Reopening to track.

kevinykuo reopened this Apr 13, 2018
kevinykuo added this to the 0.9.0 milestone Apr 13, 2018
@ChuliangXiao

I had a similar problem on Mac.

> utils::sessionInfo()
R version 3.4.4 (2018-03-15)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS High Sierra 10.13.4

Matrix products: default
BLAS: /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/3.4/Resources/lib/libRlapack.dylib

Somehow I had installed three Java versions.

$ ls -1 /Library/Java/JavaVirtualMachines/
1.6.0.jdk
jdk-10.0.1.jdk
jdk1.8.0_172.jdk

After I specified JAVA_HOME for Java 8, sparklyr worked. Thank you @kevinykuo.

Sys.setenv(JAVA_HOME = "/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home")

@benjoaquin

This doesn't help the package development, but for folks struggling to use sparklyr on a Mac, I was able to quickly resolve this Java dependency by letting brew do the work. You might try:

brew tap caskroom/versions
brew cask install java8

@Ni-Ar

Ni-Ar commented Feb 17, 2019

Thanks for explaining!
I collected your answers in a Stack Overflow answer that I hope will be useful in the future:
https://stackoverflow.com/a/54737965/9938003

@leynu

leynu commented Aug 25, 2019

It took me a couple of hours to get sparklyr working.

brew cask install java8 no longer works:
Cask 'java8' is unavailable: No Cask with this name exists.

As [Marcelo Xavier](https://stackoverflow.com/questions/24342886/how-to-install-java-8-on-mac) pointed out in his comment:

It is not possible to install Java8 using Brew anymore because of Oracle license changes. Java8 is not public anymore for download. – Marcelo Xavier Apr 24 at 13:05

The workaround, suggested in a comment by Sean Breckenridge (Apr 19 at 21:31), was:
brew cask install homebrew/cask-versions/adoptopenjdk8

In addition, I needed to specify JAVA_HOME for Java 8 in R:
Sys.setenv(JAVA_HOME = "/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home")

Finally, I got sparklyr working.

@jlucasmckay

I followed a similar path to the above, but the environment variables were still not being picked up correctly by R.
I called:
/usr/libexec/java_home -V
to identify the JDK 8 location, and then called the following in RStudio:

Sys.setenv(JAVA_HOME="/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home")
sc <- spark_connect(master = "local")

@nassuphis

I'm getting the same error.

> version<-system2(spark_get_java(), "-version", stderr = TRUE, stdout = TRUE)
> version
[1] "java version \"16\" 2021-03-16"                                           
[2] "Java(TM) SE Runtime Environment (build 16+36-2231)"                       
[3] "Java HotSpot(TM) 64-Bit Server VM (build 16+36-2231, mixed mode, sharing)"

The code in the package producing the error is:

> sparklyr:::validate_java_version_line("local", version)
Error in sparklyr:::validate_java_version_line("local", version) : 
  Java version detected but couldn't parse version from: java version "16" 2021-03-16
> 

and validate_java_version_line() cannot parse the three-line output of java -version.

@bpvgoncalves
Contributor

bpvgoncalves commented Jul 16, 2021

Hi all.

I also got the same error with Java 15.

$ java -version
java version "15.0.2" 2021-01-19
Java(TM) SE Runtime Environment (build 15.0.2+7-27)
Java HotSpot(TM) 64-Bit Server VM (build 15.0.2+7-27, mixed mode, sharing)

As far as I could determine after some testing, the problem is not that validate_java_version_line() cannot parse the 3-line output from java -version. It successfully identifies the first line as the relevant one and tries to parse it.

The problem seems to be that, at some point between versions 8 and 10, java -version changed its behavior: it now returns a date (the build date?) together with the actual version number, and it reports versions as 'x.' instead of '1.x.'. I'm not sure whether both changes occurred in the same version.
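To make the change concrete, here is the first line of java -version in both formats (the Java 10 line is the one from this issue's title):

# Java 8: '1.x' numbering, no date
java version "1.8.0_172"
# Java 10: bare major version, plus a build date appended
java version "10" 2018-03-20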
While the latter doesn't seem to be a problem, the inclusion of the date gets us into trouble. Under some circumstances the regex used since commit 7029541 (matching '9' OR 'x.y.z') matches both tokens (version and date); the validation function then has two different values to interpret as the version and, not knowing how to proceed, fails the validation. In my case the date is matched because of the day 19 in the string, but it seems any 9 in the date triggers this behavior.

For instance:

versionLine <- 'java version "15.0.2" 2021-01-19'
splat <- strsplit(versionLine, "\\s+", perl = TRUE)[[1]]
splat[grepl("9|[0-9]+\\.[0-9]+\\.[0-9]+", splat)]
[1] "\"15.0.2\"" "2021-01-19"

which fails because length(splatVersion) != 1, but:

versionLine <- 'java version "15.0.2" 2021-01-18'
splat <- strsplit(versionLine, "\\s+", perl = TRUE)[[1]]
splat[grepl("9|[0-9]+\\.[0-9]+\\.[0-9]+", splat)]
[1] "\"15.0.2\""

works as expected.
In the other direction, when a major GA version is present, it is reported as just 'x' instead of 'x.0.0'; if there is also no 9 in the date, no match is found at all!

By tweaking the regex I got the package to correctly identify the Java version. The package then passed all but 2 tests (which seem unrelated to the Java version), compiled successfully, and appears to work with Spark 3.0.0/3.1.1 (both with Hadoop 3.2).
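For illustration, a minimal sketch of one possible tweak (not necessarily the exact regex in the submitted patch): match only the quoted token, which accepts "1.8.0_172", "15.0.2", and bare majors like "16", while rejecting dates:

versionLine <- 'java version "16" 2021-03-16'
splat <- strsplit(versionLine, "\\s+", perl = TRUE)[[1]]
# keep only the quoted version token
splat[grepl('^"[0-9]+(\\.[0-9]+)*(_[0-9]+)?"$', splat)]
[1] "\"16\""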

I'm not sure whether Java 15, or anything other than 8 and 11, is officially supported; the Spark documentation only mentions Java 8 and 11.

bpvgoncalves added a commit to bpvgoncalves/sparklyr that referenced this issue Jul 16, 2021
Making sure validation function can parse version when only major version is present and when dates are part of 'java -version'

Issue sparklyr#1383
yitao-li linked a pull request Jul 16, 2021 that will close this issue
@yitao-li
Contributor

At the moment only Java 8 and 11 are officially supported for Spark, AFAIK. However, I guess your change to ignore the date string after the version should be fine, so I'll accept it once all existing sparklyr tests pass.

yitao-li pushed a commit that referenced this issue Jul 16, 2021
* Making sure validation function can parse version when only major version is present and when dates are part of 'java -version'

Issue #1383

* Undo automatic indentation changes