[SPARK-33441][BUILD] Add unused-imports compilation check and remove all unused-imports #30351
Conversation
@LuciferYang, since you're here, can you see if
Kubernetes integration test starting
Kubernetes integration test status success
@HyukjinKwon Let me have a try ~
@HyukjinKwon Looks like After add
After add
Yeah, we can add it into
Test build #130988 has finished for PR 30351 at commit
Address 986ffe5 add
spark/core/src/main/scala/org/apache/spark/resource/ResourceAllocator.scala Lines 23 to 44 in a3d2954
Seems GitHub Action compile with
Kubernetes integration test starting
@HyukjinKwon There are total 505
Yes, the GitHub Actions build uses SBT with
You'll probably have to specify
OK ~ It doesn't look like a minor change anymore. It's a little late today in my timezone and I will try to finish the work tomorrow :)
Test build #131006 has finished for PR 30351 at commit
Kubernetes integration test status success
Test build #131042 has finished for PR 30351 at commit
OK ~
Test build #131279 has finished for PR 30351 at commit
Kubernetes integration test starting
Kubernetes integration test status success
Test build #131294 has finished for PR 30351 at commit
Okay, this gets conflicted easily. Let's merge this in. @LuciferYang can you resolve the conflicts?
Test build #131311 has finished for PR 30351 at commit
Kubernetes integration test starting
@HyukjinKwon done ~ Address 3498654 resolve the conflicts and Address ef2ff08 fix new added
Kubernetes integration test status failure
Kubernetes integration test starting
Kubernetes integration test status success
Merged to master.
Test build #131315 has finished for PR 30351 at commit
Thanks for your review ~ @HyukjinKwon @srowen
Is dev/scalastyle supposed to catch unused imports?
$ ./dev/scalastyle
Scalastyle checks passed.
$ build/sbt -Phadoop-3.2 -Phive-2.3 -Pyarn -Phadoop-cloud -Phive-thriftserver -Pkubernetes -Pmesos -Phive -Pkinesis-asl -Pspark-ganglia-lgpl test:package streaming-kinesis-asl-assembly/assembly
[error] /Users/maximgekk/proj/show-partitions-exec-v2-test/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2Strategy.scala:23:105: Unused import
[error] import org.apache.spark.sql.catalyst.analysis.{ResolvedNamespace, ResolvedPartitionSpec, ResolvedTable, ResolvedView}
[error] ^
[error] /Users/maximgekk/proj/show-partitions-exec-v2-test/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/ShowPartitionsExec.scala:20:29: Unused import
[error] import org.apache.spark.sql.AnalysisException
[error] ^
[error] two errors found
[error] (sql / Compile / compileIncremental) Compilation failed
[error] Total time: 32 s, completed Nov 19, 2020 6:30:02 PM
@MaxGekk use
@MaxGekk seems there is no corresponding rule in scalastyle-rules
Just in case, Maven doesn't detect unused imports either.
@MaxGekk
@HyukjinKwon @srowen @MaxGekk do you know how to add maven:
sbt:
seems need add
And only add
Doing it on the SBT side only is fine. The purpose of doing this is to catch unused imports in PRs.
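As a rough sketch of what an SBT-only check could look like (illustrative settings only, not Spark's actual SparkBuild.scala), the flag can be selected per Scala version and the warning escalated to an error; for 2.12, `-Xfatal-warnings` is one common way to fail the build, though Spark may use a different mechanism:

```scala
// Hypothetical sbt settings fragment: promote unused-import warnings to errors.
// The scalac flag names are real; the wiring below is illustrative only.
scalacOptions ++= {
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 13)) =>
      // Scala 2.13: -Wconf can escalate the unused-imports category to an error directly.
      Seq("-Wconf:cat=unused-imports:e")
    case _ =>
      // Scala 2.12: emit the warning, then promote warnings to errors.
      Seq("-Ywarn-unused-import", "-Xfatal-warnings")
  }
}
```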
@@ -164,6 +164,7 @@
 <commons.collections.version>3.2.2</commons.collections.version>
 <scala.version>2.12.10</scala.version>
 <scala.binary.version>2.12</scala.binary.version>
+<scalac.arg.unused-imports>-Ywarn-unused-import</scalac.arg.unused-imports>
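For context, a Maven property like the one added above is typically threaded into the Scala compiler via scala-maven-plugin's `args` configuration. A hedged sketch of that wiring (the plugin coordinates are real; the exact placement in Spark's pom.xml may differ):

```xml
<!-- Illustrative wiring, not the exact Spark pom.xml: pass the property
     defined above to scala-maven-plugin as an extra scalac argument. -->
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <configuration>
    <args>
      <arg>${scalac.arg.unused-imports}</arg>
    </args>
  </configuration>
</plugin>
```

Note that on its own this only produces warnings under Scala 2.12's -Ywarn-unused-import; it does not fail the Maven build, which is what the discussion below is about.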
Shall we just move it to SparkBuild.scala and let Maven not care about it?
There should only be warnings in the Maven build now; it seems @MaxGekk wants Maven to check it as an error too?
Is my guess right? @MaxGekk
If it's easy to make both the Maven and SBT builds throw an error, that's fine. If that's difficult, let's move it to SparkBuild.scala and make it SBT-specific.
Shall we just move it to SparkBuild.scala and let Maven not care about it?
It may be useful for Maven too; it can also help Maven users check this, although it only produces warnings in the Maven build.
@HyukjinKwon SPARK-33560 added :)
Thanks @LuciferYang
@HyukjinKwon Is it necessary for us to add further compiler checks, like unused-locals?
We can give it a shot. Are there a lot of instances of unused-locals to fix? If there are too many, I think we should collect some more feedback from other committers, because it makes the code more difficult to maintain (e.g., reverting and backporting).
OK ~ I'll collect the details of issues like unused-locals first, then file a new JIRA for tracking and discussion. Thanks @HyukjinKwon
What changes were proposed in this pull request?
This PR adds new Scala compile args to pom.xml to defend against new unused imports:

- -Ywarn-unused-import for Scala 2.12
- -Wconf:cat=unused-imports:e for Scala 2.13

The other file changes remove all unused imports in the Spark code.
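As a tiny illustration of what the new check rejects (a hypothetical file, not from the Spark tree):

```scala
// Hypothetical example: with the unused-imports check promoted to an error,
// this file fails to compile because mutable is imported but never used.
import scala.collection.mutable  // flagged: Unused import

object UnusedImportDemo {
  def main(args: Array[String]): Unit = {
    // Only an immutable List is used, so the import above is dead code.
    println(List(1, 2, 3).sum)
  }
}
```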
Why are the changes needed?
Clean up the code and add a guarantee to defend against new unused imports.
Does this PR introduce any user-facing change?
No
How was this patch tested?
Pass the Jenkins or GitHub Actions tests.