Java version detected but couldn't parse version from: java version "10" 2018-03-20 #1383
Comments
Try installing Java 8. If that doesn't just work, set the JAVA_HOME environment variable.
@kevinykuo what would I set the JAVA_HOME variable to?
Uninstall Java 10 and (re)install Java 8; you should be able to invoke Spark after that. The uninstall can be done from the console.
@grantog Spark only supports Java 8, so you'll need to install that. If sparklyr doesn't find it after it's been installed, you'll need to set JAVA_HOME yourself.
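As a rough illustration of the advice above, here is a hypothetical Python sketch of what "setting JAVA_HOME" amounts to: exporting the variable before the process that launches Spark starts. The install path is an assumption and varies by machine and operating system.

```python
import os

# Hypothetical sketch: point JAVA_HOME at a Java 8 install before starting
# anything Spark-related. The path below is an example only; it differs
# between machines and operating systems.
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"

# A launcher that honors JAVA_HOME would resolve the JVM binary like this:
java_bin = os.path.join(os.environ["JAVA_HOME"], "bin", "java")
print(java_bin)
```

In R the equivalent would be setting the variable with Sys.setenv() before calling spark_connect().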
Actually, we can probably do a better error message here. Reopening to track.
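The "better error message" idea might look something like this hypothetical sketch; the function name, the wording, and the supported set {8, 11} (which follows the Java 8/11 guidance later in the thread) are all assumptions, not sparklyr's actual code.

```python
# Hypothetical sketch: after parsing the major Java version, compare it
# against the versions Spark supports and report something actionable
# instead of a bare parse failure.
SUPPORTED_JAVA_VERSIONS = {8, 11}

def check_java(major):
    if major is None:
        raise RuntimeError(
            "Java was detected but its version could not be parsed; "
            "please set JAVA_HOME to a supported Java installation."
        )
    if major not in SUPPORTED_JAVA_VERSIONS:
        raise RuntimeError(
            f"Java {major} is not supported by Spark; "
            "please install Java 8 or 11."
        )

check_java(8)  # a supported version passes silently
```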
I had a similar problem on Mac; somehow I had installed three Java versions. After I specified the JAVA_HOME path, it worked.
This doesn't help the package development, but it may help folks struggling to use sparklyr with an unsupported Java version.
Thanks for explaining!
It took me a couple of hours to get sparklyr working. As Marcelo Xavier pointed out in his comment on Stack Overflow (https://stackoverflow.com/questions/24342886/how-to-install-java-8-on-mac), the workaround was to install Java 8. In addition, I needed to specify the JAVA_HOME of the Java 8 installation. Finally, I got sparklyr working.
I followed a similar path to the above, but the environment variables were still not meshing with R correctly.
I'm getting the same error. The code in the package producing the error is the Java version check.
Hi all. I also got the same error with Java 15.
As far as I could determine after some testing, the problem is not that the Java version itself is too new; the problem seems to be that the format of the 'java -version' output changed at some point between versions 8 and 10. For instance, output like java version "10" 2018-03-20 fails because the parsing regex does not expect a bare major version followed by a release date, while the Java 8 style java version "1.8.0_202" works as expected. By tweaking the regex string I could get the package to correctly identify the Java version. The package then passed all but 2 tests (which seem unrelated to the Java version), compiled successfully, and it seems to be working with Spark 3.0.0/3.1.1 (both with Hadoop 3.2). I'm not sure whether Java 15, or anything other than 8 and 11, is officially supported.
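The tweak described above can be sketched outside the package. This is a hypothetical Python illustration (the function name and regex are assumptions, not sparklyr's actual code) of a parser that accepts both the legacy 1.x style and the newer major-version-plus-date style from the issue title:

```python
import re

# Hypothetical parser: accept both the legacy 'java version "1.8.0_202"'
# form and the newer 'java version "10" 2018-03-20' form, where only the
# major version is present and a release date follows the quoted string.
def parse_java_version(first_line):
    m = re.search(r'version "(\d+)(?:\.(\d+))?[^"]*"', first_line)
    if m is None:
        return None
    major, minor = int(m.group(1)), int(m.group(2) or 0)
    # Before Java 9, the major version was reported as 1.x (e.g. 1.8 -> 8).
    return minor if major == 1 else major

print(parse_java_version('java version "1.8.0_202"'))             # 8
print(parse_java_version('java version "10" 2018-03-20'))         # 10
print(parse_java_version('openjdk version "15.0.2" 2021-01-19'))  # 15
```

The key point is that the quoted version string and the trailing date have to be matched separately, so that the date is ignored rather than breaking the match.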
Making sure validation function can parse version when only major version is present and when dates are part of 'java -version' (issue sparklyr#1383)
At the moment only Java 8 and 11 are officially supported for Spark, AFAIK. However, I guess your change to ignore the date string after the version number should be fine. So, I'll accept it once all existing tests pass.
* Making sure validation function can parse version when only major version is present and when dates are part of 'java -version' (issue #1383)
* Undo automatic indentation changes
Reporting an Issue with sparklyr
I continue to get this error despite trying different versions of Spark and Java. Any help would be appreciated.
Returns:
Output of utils::sessionInfo() results in: