[SPARK-45629][CORE][SQL][CONNECT][ML][STREAMING][BUILD][EXAMPLES] Fix `Implicit definition should have explicit type`
#43526
Conversation
connector/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/JsonUtils.scala
Hi, brother, this is the compile error:

[error] /Users/yangjie01/SourceCode/git/spark-mine-sbt/core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala:343:16: Implicit definition should have explicit type (inferred org.json4s.DefaultFormats.type) [quickfixable]
[error] Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=other-implicit-type, site=org.apache.spark.deploy.TestMasterInfo.formats
[error] implicit val formats = org.json4s.DefaultFormats

spark/project/SparkBuild.scala, line 251 in e1bc48b

@laglangyue You can delete the above line and then compile. If no further compilation errors occur, we can consider the task completed.
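The fix pattern discussed here can be sketched in isolation (toy names, not Spark code): give the implicit an explicit type annotation so the lint no longer fires, while call sites resolve it exactly as before.

```scala
object ImplicitTypeExample {
  // Before (triggers the lint): implicit val sep = ","  -- type inferred as String.
  // After: the implicit carries an explicit type annotation.
  implicit val sep: String = ","

  // A call site resolving the implicit behaves the same either way.
  def join(xs: Seq[Int])(implicit sep: String): String = xs.mkString(sep)

  // Resolves `sep` from the enclosing scope.
  def joined: String = join(Seq(1, 2, 3))
}
```

`ImplicitTypeExample.joined` yields `"1,2,3"`; only the declared type of the implicit changed, not its behavior.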
Implicit definition should have explicit type
Great, the build succeeded; I have finished.
Can you update the PR title to add the corresponding module labels? Can you update the PR description to specifically explain why this work needs to be done (I don't think "subtask of SPARK-45314" is a rigorous reason)? Also, is this still a compile warning in Scala 3, or will it become a compile error?
core/src/main/scala/org/apache/spark/resource/ResourceInformation.scala
core/src/main/scala/org/apache/spark/executor/CoarseGrainedExecutorBackend.scala
@@ -340,7 +341,7 @@ private object FaultToleranceTest extends App with Logging {
 private class TestMasterInfo(val ip: String, val dockerId: DockerId, val logFile: File)
`FaultToleranceTest` is a strange file: it's a test class, but it's in the production directory (so it has probably never been run by a GA task). Can we just delete this file? @srowen @dongjoon-hyun @HyukjinKwon
connector/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/JsonUtils.scala
...ftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/HiveThriftServer2Suites.scala
...e-managers/yarn/src/main/scala/org/apache/spark/scheduler/cluster/YarnSchedulerBackend.scala
@laglangyue Additionally, GA's tests have not yet passed; we need to make them all pass.
@LuciferYang there is a CI error, I don't know why.
I think we can just retry the failed task.
Is this one ready to review?
yes. |
@@ -23,7 +23,7 @@ import java.nio.file.Files
 import scala.collection.mutable
 import scala.util.control.NonFatal

-import org.json4s.{DefaultFormats, Extraction}
+import org.json4s._
Suggested change:
-import org.json4s._
+import org.json4s.{DefaultFormats, Extraction, Formats}
hmm... it seems this has been reverted again?
 /** Needed to serialize type T into JSON when using Jackson */
+@nowarn
Suggested change:
-@nowarn
+@scala.annotation.nowarn
When this annotation is only needed in one place in this class, it is customary to use the fully qualified `@scala.annotation.nowarn` directly rather than adding an import.
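A sketch of that convention (hypothetical names, not code from this PR): the annotation is written fully qualified at its single use site, with the warning category taken from the build log quoted earlier in this thread.

```scala
object NowarnInline {
  // Needed only once in this object, so no `import scala.annotation.nowarn`;
  // the fully qualified form is applied directly at the definition.
  @scala.annotation.nowarn("cat=other-implicit-type")
  implicit val greeting = "hello" // type deliberately left to inference

  def greet(implicit g: String): String = g

  // Resolves `greeting` from the enclosing scope.
  def greetDefault: String = greet
}
```

`NowarnInline.greetDefault` returns `"hello"`; the annotation only silences the lint, it does not change implicit resolution.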
Thanks, got it.
@@ -36,8 +33,8 @@ class FileStreamSourceLog(
     path: String)
   extends CompactibleFileStreamLog[FileEntry](metadataLogVersion, sparkSession, path) {

-  import CompactibleFileStreamLog._
   import FileStreamSourceLog._
+  import org.apache.spark.sql.execution.streaming.CompactibleFileStreamLog._
Why is it necessary to change to a fully qualified import here?
Maybe it was changed by IDEA. I have reverted it.
 /** Needed to serialize type T into JSON when using Jackson */
+@nowarn
Suggested change:
-@nowarn
+@scala.annotation.nowarn
+@nowarn
Suggested change:
-@nowarn
+@scala.annotation.nowarn
done
+1, LGTM, pending tests.
@laglangyue Could you revise the description of this PR again? In addition to fixing a compilation warning in Scala 2.13, you may need to summarize the key points in #43526 (comment) and add them to the PR description, so that the changes in this PR can be considered necessary. Note, this is just an example; if you have a more solid reason, feel free to describe that instead.
Sorry for being late for this PR. Feel free to proceed to merge, @LuciferYang .
Merged into master for Spark 4.0. Thanks @laglangyue and @dongjoon-hyun ~
… `Implicit definition should have explicit type`

Closes apache#43526 from laglangyue/SPARK-45629.

Lead-authored-by: tangjiafu <jiafu.tang@qq.com>
Co-authored-by: tangjiafu <tangjiafu@corp.netease.com>
Signed-off-by: yangjie01 <yangjie01@baidu.com>
What changes were proposed in this pull request?
This PR aims to fix `Implicit definition should have explicit type` in Scala 2.13. This PR includes:
1. Declaring explicit types for implicit global variables
2. Adding scala.annotation.nowarn where the inferred type is impractical to spell out
Why are the changes needed?
For implicit global variables without an explicit type declaration, the compiler reports:
warning: Implicit definition should have explicit type (inferred String) [quickfixable]
No modifications are required for local variables. Additionally, to handle cases involving reflection-related types like ClassTag in implicit variables, the @scala.annotation.nowarn annotation is used to suppress the warning.
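A toy illustration of that ClassTag case (names are hypothetical, not from the PR): the implicit's inferred type involves reflection machinery, and the lint is suppressed at the definition rather than spelling the type out.

```scala
import scala.reflect.{classTag, ClassTag}

object ClassTagCase {
  // Inferred type is ClassTag[String]; for more involved reflective types,
  // the annotation avoids having to write the full type out.
  @scala.annotation.nowarn
  implicit val stringTag = classTag[String]

  // The implicit ClassTag lets us build arrays generically.
  def newArray[T](n: Int)(implicit ct: ClassTag[T]): Array[T] = ct.newArray(n)

  // Resolves `stringTag` from the enclosing scope.
  def twoStrings: Array[String] = newArray[String](2)
}
```

`ClassTagCase.twoStrings` is an `Array[String]` of length 2; the suppression leaves resolution behavior untouched.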
Furthermore, warnings generated in Spark will be treated as errors:
[error] ... Implicit definition should have explicit type (inferred org.json4s.DefaultFormats.type) [quickfixable]
...
[error] implicit val formats = org.json4s.DefaultFormats
Jira link: SPARK-45629: https://issues.apache.org/jira/browse/SPARK-45629
Related issue link about `implicit`: scala/bug#5265
Does this PR introduce any user-facing change?
no
How was this patch tested?
Most of the testing is completed through CI, and the example module is locally compiled and tested in IDEA. Additionally, some stylistic changes are verified through demo code.
Was this patch authored or co-authored using generative AI tooling?
no