[SPARK-12653][SQL] Re-enable test "SPARK-8489: MissingRequirementError during reflection" #11744
Conversation
…ing reflection"

Current error:

```
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.$lessinit$greater$default$6()Lscala/collection/Map;
```

Scala 2.11 error with test.jar built by Scala 2.10.5:

```
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
```
Test build #53246 has finished for PR 11744 at commit
```scala
// This test uses a pre-built jar to test SPARK-8489. In a nutshell, this test creates
// a HiveContext and uses it to create a data frame from an RDD using reflection.
// Before the fix in SPARK-8470, this results in a MissingRequirementError because
// the HiveContext code mistakenly overrides the class loader that contains user classes.
// For more detail, see sql/hive/src/test/resources/regression-test-SPARK-8489/*scala.
val testJar = "sql/hive/src/test/resources/regression-test-SPARK-8489/test.jar"
import Properties.versionString
val version = versionString.substring(versionString.indexOf(" ") + 1,
```
it might be better if you explicitly match 2.10 and 2.11, and throw exceptions for other things. I think the error message then would be more obvious when we introduce support for 2.12.
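A minimal sketch of the reviewer's suggestion, assuming a hypothetical helper (`TestJarVersion.suffix` is not the actual PR code, which does the matching inline in `HiveSparkSubmitSuite`): match the Scala binary version explicitly and throw on anything else, so that a future 2.12 build fails with an obvious message instead of a `NoSuchMethodError`.

```scala
// Hypothetical helper illustrating exact Scala-version matching for
// selecting the pre-built regression jar (test-2.10.jar / test-2.11.jar).
object TestJarVersion {
  def suffix(versionNumberString: String): String = versionNumberString match {
    case v if v.startsWith("2.10") => "2.10"
    case v if v.startsWith("2.11") => "2.11"
    // Fail loudly for unsupported versions, e.g. a future Scala 2.12.
    case v => throw new RuntimeException(s"Unsupported Scala version: $v")
  }
}
```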
Thank you. I see. I will change soon.
LGTM other than that one comment.
I updated the code to use exact matching and tested on both 2.10 and 2.11 again.
LGTM pending Jenkins.
Test build #53300 has finished for PR 11744 at commit
Merged to master. We'll watch the 2.10 builds too.
Thank you, @srowen .
…r during reflection"

## What changes were proposed in this pull request?

The purpose of [SPARK-12653](https://issues.apache.org/jira/browse/SPARK-12653) is to re-enable a regression test. Historically, the target regression test was added by [SPARK-8498](apache@093c348) but temporarily disabled by [SPARK-12615](apache@8ce645d) due to a binary compatibility error. The following is the current error message when submitting a Spark job with the pre-built `test.jar` file in the target regression test:

```
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.$lessinit$greater$default$6()Lscala/collection/Map;
```

Simply rebuilding `test.jar` cannot restore the purpose of the test case, since we need to support both Scala 2.10 and 2.11 for a while. For example, we will face the following Scala 2.11 error if we use a `test.jar` built by Scala 2.10:

```
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
```

This PR replaces the existing `test.jar` with `test-2.10.jar` and `test-2.11.jar`, and improves the regression test to use the suitable jar file.

## How was this patch tested?

Pass the existing Jenkins test.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes apache#11744 from dongjoon-hyun/SPARK-12653.
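The `NoSuchMethodError` on `JavaUniverse.runtimeMirror` arises because that reflection entry point is not binary compatible between the Scala 2.10 and 2.11 runtimes, which is why a single pre-built jar cannot serve both. A sketch of the kind of reflection call the pre-built jar exercises (this is an illustration, not the actual regression-test source):

```scala
import scala.reflect.runtime.{universe => ru}

object ReflectionProbe {
  // Obtain a runtime mirror for the current class loader. Per the errors
  // quoted above, a jar compiled against one Scala minor version fails with
  // NoSuchMethodError on this call when run on the other -- hence the
  // separate test-2.10.jar and test-2.11.jar artifacts.
  def mirrorClassLoader: ClassLoader = {
    val mirror = ru.runtimeMirror(getClass.getClassLoader)
    mirror.classLoader
  }
}
```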
## What changes were proposed in this pull request?

Introduced by #21320 and #11744:

```
$ sbt
> ++2.12.6
> project sql
> compile
...
[error] [warn] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/ProjectionOverSchema.scala:41: match may not be exhaustive.
[error] It would fail on the following inputs: (_, ArrayType(_, _)), (_, _)
[error] [warn]     getProjection(a.child).map(p => (p, p.dataType)).map {
[error] [warn]
[error] [warn] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/ProjectionOverSchema.scala:52: match may not be exhaustive.
[error] It would fail on the following input: (_, _)
[error] [warn]     getProjection(child).map(p => (p, p.dataType)).map {
[error] [warn]
...
```

And:

```
$ sbt
> ++2.12.6
> project hive
> testOnly *ParquetMetastoreSuite
...
[error] /Users/rendong/wdi/spark/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveSparkSubmitSuite.scala:22: object tools is not a member of package scala
[error] import scala.tools.nsc.Properties
[error]        ^
[error] /Users/rendong/wdi/spark/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveSparkSubmitSuite.scala:146: not found: value Properties
[error]     val version = Properties.versionNumberString match {
[error]                   ^
[error] two errors found
...
```

## How was this patch tested?

Existing tests.

Closes #22260 from sadhen/fix_exhaustive_match.

Authored-by: 忍冬 <rendong@wacai.com>
Signed-off-by: hyukjinkwon <gurwls223@apache.org>
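The "match may not be exhaustive" warnings flagged by the Scala 2.12 compiler are typically fixed by covering the remaining input shapes with a catch-all case. A simplified sketch of the pattern (hypothetical types, not the actual `ProjectionOverSchema` code):

```scala
// Simplified illustration of making a (name, dataType) match exhaustive.
// Matching only specific pairs triggers the exhaustivity warning on inputs
// like (_, _); a catch-all case silences it with defined behavior.
sealed trait DataType
case object IntType extends DataType
final case class ArrayType(element: DataType) extends DataType

def describe(pair: (String, DataType)): String = pair match {
  case (name, IntType)      => s"$name: int"
  case (name, ArrayType(_)) => s"$name: array"
  case (name, _)            => s"$name: other" // catch-all keeps the match exhaustive
}
```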