
[SPARK-21601][BUILD] Modify the pom.xml file, increase the maven compiler jdk attribute #18807

Closed
wants to merge 2 commits

Conversation

highfei2011

What changes were proposed in this pull request?

When using Maven to compile Spark, I want to add a modifiable JDK property. This is user-friendly.

How was this patch tested?

mvn test


Author: highfei2011 <highfei2011@126.com>
@AmplabJenkins

Can one of the admins verify this patch?

@markhamstra
Contributor

These are maven-compiler-plugin configurations. We don't use maven-compiler-plugin to compile Java code: 74cda94

@highfei2011
Author

OK, thanks, @markhamstra.

@highfei2011 highfei2011 closed this Aug 2, 2017
@srowen
Member

srowen commented Aug 2, 2017

@highfei2011 @markhamstra I think this is legit actually but for a different reason: https://issues.apache.org/jira/browse/SPARK-21605

@markhamstra
Contributor

Hmmm... that's arguably broken behavior on the part of IntelliJ or something to be worked around in IntelliJ configuration, not by hacking our POM. Without the POM hack, though, it is definitely something that deserves explanation in the Building Spark docs.

@srowen
Member

srowen commented Aug 2, 2017

The Maven build is the build of reference, and we rely on IDEs to parse the details of the build from the pom.xml file. So it's right for the IDE to need to find information like JDK level (among a hundred other things) in the POM file. The maven-compiler-plugin is a standard built-in plugin whose config it consults to figure this out. It's not at all a hack. At least: we've always had this (and other superfluous config) for the compiler plugin in the build. This just cropped up because we trimmed it, and trimmed two lines too far.

@markhamstra
Contributor

Maven is the build of reference, true; but maven itself doesn't need the JDK version to be specified both in the scala plugin configuration and in the compiler plugin configuration. While I understand that IntelliJ is used by many Spark developers, it isn't some sort of officially preferred IDE of reference. That's why I have some qualms about embedding workarounds into our POM just to satisfy the idiosyncrasies of a particular IDE.
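For context, the duplication under discussion looks roughly like the following in the POM. This is a sketch, not the exact Spark pom.xml: the surrounding plugin configuration, versions, and argument lists in the real build may differ.

```xml
<!-- Sketch only: the JDK level specified in two places.
     Property and plugin names follow common Maven/Spark conventions. -->
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <configuration>
    <!-- The Scala plugin already passes the JDK level to javac,
         so the Maven build itself does not need it anywhere else. -->
    <javacArgs>
      <javacArg>-source</javacArg>
      <javacArg>${java.version}</javacArg>
      <javacArg>-target</javacArg>
      <javacArg>${java.version}</javacArg>
    </javacArgs>
  </configuration>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <!-- Redundant for the build, but IDEs that parse the POM
         read the project JDK level from here. -->
    <source>${java.version}</source>
    <target>${java.version}</target>
  </configuration>
</plugin>
```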

@srowen
Member

srowen commented Aug 2, 2017

Sure, it really doesn't hurt, because we already define and maintain ${java.version} anyway. It's two lines of standard config that were already there. This is truly not an issue.

@markhamstra
Contributor

Yes, it doesn't really hurt anything except to be confusing cruft that has a tendency to accumulate in POMs. If we're going to put those lines back, I suggest that they be accompanied by a comment that they are known to be a useful assist for certain versions of IntelliJ and are not needed for the correctness of the build itself.
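Something along these lines would match that suggestion (a sketch, assuming the two lines go back into the maven-compiler-plugin configuration with ${java.version} as already defined in the POM):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <!-- These two lines are not needed for the correctness of the Maven
         build itself (the scala-maven-plugin already sets the JDK level
         for javac), but certain versions of IntelliJ read the project
         JDK level from this plugin's configuration. -->
    <source>${java.version}</source>
    <target>${java.version}</target>
  </configuration>
</plugin>
```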

@highfei2011
Author

Hi @srowen @markhamstra, this problem has been solved in https://github.com/apache/spark/pull/18808.
