
[SPARK-28756][R][FOLLOW-UP] Specify minimum and maximum Java versions #25490

Closed
wants to merge 1 commit

Conversation

HyukjinKwon
Member

@HyukjinKwon HyukjinKwon commented Aug 19, 2019

What changes were proposed in this pull request?

This PR proposes to specify both a minimum and a maximum Java version for the package (see https://cran.r-project.org/doc/manuals/r-release/R-exts.html#Writing-portable-packages).

There seems to be no standard way to specify both, judging from the documentation and other packages (see https://gist.github.com/glin/bd36cf1eb0c7f8b1f511e70e2fb20f8d).

I found two ways from existing packages on CRAN.

Package (<= 1 & > 2)
Package (<= 1, > 2)

The latter seems closer to other standard notations such as R (>= 2.14.0) and R (>= r56550), so I chose it.
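As a hedged illustration of the chosen comma-separated notation (the field name and exact bounds below are assumptions based on this discussion, not a quote of Spark's actual DESCRIPTION file), the resulting line would look something like:

```
SystemRequirements: Java (>= 8, < 12)
```

CRAN's tooling treats `SystemRequirements` as free text, so following an already-established notation like this helps reviewers and users parse the constraint consistently.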

Why are the changes needed?

It seems the package might otherwise be rejected by CRAN. See #25472 (comment)

Does this PR introduce any user-facing change?

No.

How was this patch tested?

JDK 8

./build/mvn -DskipTests -Psparkr clean package
./R/run-tests.sh

...
basic tests for CRAN: .............
...

JDK 11

./build/mvn -DskipTests -Psparkr -Phadoop-3.2 clean package
./R/run-tests.sh

...
basic tests for CRAN: .............
...

@dongjoon-hyun
Member

dongjoon-hyun commented Aug 19, 2019

Oh, thank you for the quick follow-up, @HyukjinKwon !

Could you review this, @felixcheung ?

@felixcheung
Member

felixcheung commented Aug 19, 2019 via email

@SparkQA

SparkQA commented Aug 19, 2019

Test build #109315 has finished for PR 25490 at commit cbd0441.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon
Member Author

retest this please

@HyukjinKwon HyukjinKwon reopened this Aug 19, 2019
@SparkQA

SparkQA commented Aug 19, 2019

Test build #109323 has finished for PR 25490 at commit cbd0441.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon
Member Author

The AppVeyor failure seems unrelated; it has already failed with the same error in other builds.

@HyukjinKwon
Member Author

Let me merge this. The failure is being investigated at #25493 and the root cause is orthogonal. We don't run CRAN checks on Windows anyway.

@HyukjinKwon
Member Author

Merged to master.

@felixcheung
Member

felixcheung commented Aug 19, 2019 via email

@HyukjinKwon
Member Author

HyukjinKwon commented Aug 19, 2019

The issue seems to be related to the scala-maven-plugin version. When we jump up to 4.0.0, the build on Windows fails. The only radical changes I see between 3.4.6 and 4.0.0 are related to incremental compilation and zinc. That might be the cause.

I'm running AppVeyor builds at #25497. The current master is now fixed because I reverted the scala-maven-plugin version bump.

Member

@srowen srowen left a comment


If for some reason we find that JDK 11 + SparkR or CRAN is having trouble, we can just tell CRAN to test vs 8 for its purposes. +1

@felixcheung
Member

felixcheung commented Aug 20, 2019 via email

@felixcheung
Member

If for some reason we find that JDK 11 + SparkR or CRAN is having trouble, we can just tell CRAN to test vs 8 for its purposes. +1

believe me, we couldn't "tell" them to do stuff... very long story.

@srowen
Member

srowen commented Sep 1, 2019

@felixcheung Oh yes, I just meant specifying in the CRAN config or wherever that it uses Java 8 only, if that causes it to test against Java 8.

@felixcheung
Member

felixcheung commented Sep 1, 2019 via email

@HyukjinKwon HyukjinKwon deleted the SPARK-28756 branch March 3, 2020 01:18
@Bidek56
Contributor

Bidek56 commented Oct 22, 2021

I think Java needs to be bumped again in DESCRIPTION

@srowen
Member

srowen commented Oct 22, 2021

It's currently >= 8 and < 12, which seems accurate? (though it should actually 99% work with Java 17 already in 3.2.0)

@Bidek56
Contributor

Bidek56 commented Oct 22, 2021

PySpark will work with 17 but not SparkR, because of this
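Bidek56's point suggests the DESCRIPTION bound is enforced at runtime, not just read by CRAN. A minimal shell sketch of how a launcher could extract the Java major version and compare it against bounds like Java (>= 8, < 12) (the function names and parsing logic here are illustrative assumptions, not Spark's actual code):

```shell
# Illustrative only: extract the Java major version from a version string
# like "1.8.0_222" (pre-9 scheme) or "11.0.2" (post-9 scheme).
java_major() {
  ver="$1"
  major="${ver%%.*}"            # text before the first dot
  if [ "$major" = "1" ]; then   # old 1.x scheme: major is the second field
    rest="${ver#1.}"
    major="${rest%%.*}"
  fi
  echo "$major"
}

# Check against DESCRIPTION-style bounds Java (>= 8, < 12).
in_range() {
  m="$(java_major "$1")"
  [ "$m" -ge 8 ] && [ "$m" -lt 12 ]
}
```

Under these bounds, a version string such as "11.0.2" passes while "17.0.1" fails, which matches the behavior being reported for Java 17.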

@dongjoon-hyun
Member

Sounds like a good suggestion. Could you make a PR, @Bidek56 ?

@dongjoon-hyun
Member

You can create a subtask JIRA under https://issues.apache.org/jira/browse/SPARK-33772.

@Bidek56
Contributor

Bidek56 commented Oct 22, 2021

PR has been created for this issue already.

@dongjoon-hyun
Member

Thanks, @Bidek56 .

@HyukjinKwon
Member Author

@Bidek56 why does the DESCRIPTION matter? It's just metadata. Also, JDK 17 isn't fully supported yet.
