[SPARK-12653][SQL] Re-enable test "SPARK-8489: MissingRequirementError during reflection" #11744

Closed
wants to merge 2 commits into apache:master from dongjoon-hyun:SPARK-12653

Conversation

dongjoon-hyun
Member

What changes were proposed in this pull request?

The purpose of SPARK-12653 is to re-enable a regression test.
Historically, the target regression test was added by SPARK-8498 but temporarily disabled by SPARK-12615 due to a binary compatibility error.

The following is the current error message when submitting a Spark job with the pre-built test.jar file used by the target regression test.

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.$lessinit$greater$default$6()Lscala/collection/Map;

Simply rebuilding test.jar cannot restore the purpose of the test case, since we need to support both Scala 2.10 and 2.11 for a while. For example, we will face the following Scala 2.11 error if we use a test.jar built with Scala 2.10.

Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;

This PR replaces the existing test.jar with test-2.10.jar and test-2.11.jar, and improves the regression test to use the suitable jar file.
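
Because a single pre-built jar cannot serve both Scala lines, one jar per binary version has to be produced and checked in. The PR does not prescribe a build recipe; the following is only a hedged sketch of one way to produce such jars with sbt cross-building (the project name and the 2.11 patch version are illustrative assumptions):

```scala
// build.sbt (illustrative, not part of the PR): cross-build the small test
// project so that `sbt +package` emits one jar per Scala binary version,
// e.g. target/scala-2.10/*.jar and target/scala-2.11/*.jar, which can then
// be checked in as test-2.10.jar and test-2.11.jar.
name := "regression-test-SPARK-8489"
scalaVersion := "2.11.8"
crossScalaVersions := Seq("2.10.5", "2.11.8")
```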

How was this patch tested?

Pass the existing Jenkins test.

…ing reflection"

Current Error:
```
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.$lessinit$greater$default$6()Lscala/collection/Map;
```

Scala 2.11 Error with test.jar built by Scala 2.10.5:
```
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
```
@dongjoon-hyun dongjoon-hyun changed the title [SPARK-12653] Re-enable test "SPARK-8489: MissingRequirementError during reflection" [SPARK-12653][SQL] Re-enable test "SPARK-8489: MissingRequirementError during reflection" Mar 16, 2016
@SparkQA

SparkQA commented Mar 16, 2016

Test build #53246 has finished for PR 11744 at commit 95f740f.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

// This test uses a pre-built jar to test SPARK-8489. In a nutshell, this test creates
// a HiveContext and uses it to create a data frame from an RDD using reflection.
// Before the fix in SPARK-8470, this results in a MissingRequirementError because
// the HiveContext code mistakenly overrides the class loader that contains user classes.
// For more detail, see sql/hive/src/test/resources/regression-test-SPARK-8489/*scala.
val testJar = "sql/hive/src/test/resources/regression-test-SPARK-8489/test.jar"
import Properties.versionString
val version = versionString.substring(versionString.indexOf(" ") + 1,
Contributor

it might be better if you explicitly match 2.10 and 2.11, and throw exceptions for other things. I think the error message then would be more obvious when we introduce support for 2.12.

Member Author

Thank you. I see. I will change soon.

@rxin
Contributor

rxin commented Mar 16, 2016

LGTM other than that one comment.

@dongjoon-hyun
Member Author

I updated the code to use exact matching and tested on both 2.10 and 2.11 again.
Now it is much more robust, as you said.
Thank you again, @rxin.
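
For reference, a minimal sketch of what such exact matching can look like inside the test, assuming the version is read from `scala.util.Properties.versionNumberString` (the merged suite imports the compiler's `Properties`, as the Scala 2.12 build errors quoted further down show; variable names here are illustrative):

```scala
import scala.util.Properties

// Pick the pre-built regression jar that matches the running Scala binary version.
// Only 2.10 and 2.11 jars exist, so any other version should fail loudly here
// instead of surfacing later as a confusing NoSuchMethodError at runtime.
val scalaBinaryVersion = Properties.versionNumberString match {
  case v if v.startsWith("2.10") => "2.10"
  case v if v.startsWith("2.11") => "2.11"
  case v => throw new Exception(s"Unsupported Scala version: $v")
}
val testJar =
  s"sql/hive/src/test/resources/regression-test-SPARK-8489/test-$scalaBinaryVersion.jar"
```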

@rxin
Contributor

rxin commented Mar 16, 2016

LGTM pending Jenkins

@SparkQA

SparkQA commented Mar 16, 2016

Test build #53300 has finished for PR 11744 at commit 38b9752.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen
Member

srowen commented Mar 16, 2016

Merged to master. We'll watch the 2.10 builds too.

@asfgit asfgit closed this in 431a3d0 Mar 16, 2016
@dongjoon-hyun
Member Author

Thank you, @srowen .

roygao94 pushed a commit to roygao94/spark that referenced this pull request Mar 22, 2016
…r during reflection"

## What changes were proposed in this pull request?

The purpose of [SPARK-12653](https://issues.apache.org/jira/browse/SPARK-12653) is to re-enable a regression test.
Historically, the target regression test was added by [SPARK-8498](apache@093c348) but temporarily disabled by [SPARK-12615](apache@8ce645d) due to a binary compatibility error.

The following is the current error message when submitting a Spark job with the pre-built `test.jar` file used by the target regression test.
```
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.$lessinit$greater$default$6()Lscala/collection/Map;
```

Simply rebuilding `test.jar` cannot restore the purpose of the test case, since we need to support both Scala 2.10 and 2.11 for a while. For example, we will face the following Scala 2.11 error if we use a `test.jar` built with Scala 2.10.
```
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
```

This PR replaces the existing `test.jar` with `test-2.10.jar` and `test-2.11.jar`, and improves the regression test to use the suitable jar file.

## How was this patch tested?

Pass the existing Jenkins test.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes apache#11744 from dongjoon-hyun/SPARK-12653.
@dongjoon-hyun dongjoon-hyun deleted the SPARK-12653 branch March 29, 2016 05:28
asfgit pushed a commit that referenced this pull request Aug 30, 2018
## What changes were proposed in this pull request?
Introduced by #21320 and #11744

```
$ sbt
> ++2.12.6
> project sql
> compile
...
[error] [warn] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/ProjectionOverSchema.scala:41: match may not be exhaustive.
[error] It would fail on the following inputs: (_, ArrayType(_, _)), (_, _)
[error] [warn]         getProjection(a.child).map(p => (p, p.dataType)).map {
[error] [warn]
[error] [warn] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/ProjectionOverSchema.scala:52: match may not be exhaustive.
[error] It would fail on the following input: (_, _)
[error] [warn]         getProjection(child).map(p => (p, p.dataType)).map {
[error] [warn]
...
```

And

```
$ sbt
> ++2.12.6
> project hive
> testOnly *ParquetMetastoreSuite
...
[error] /Users/rendong/wdi/spark/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveSparkSubmitSuite.scala:22: object tools is not a member of package scala
[error] import scala.tools.nsc.Properties
[error]              ^
[error] /Users/rendong/wdi/spark/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveSparkSubmitSuite.scala:146: not found: value Properties
[error]     val version = Properties.versionNumberString match {
[error]                   ^
[error] two errors found
...
```
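
For context, a hedged sketch of the two straightforward remedies these errors point at; it is illustrative only, not the literal diff from the follow-up PR. A non-exhaustive match is fixed by covering the remaining input shapes, and the missing `scala.tools.nsc` package can be avoided by reading the version from the standard library:

```scala
import scala.util.Properties

// (1) A non-exhaustive match warning goes away once every shape is covered,
//     for example with a trailing catch-all case. The types here are
//     illustrative, not the actual ProjectionOverSchema expressions.
def label(pair: (String, Any)): String = pair match {
  case (name, xs: Seq[_]) => s"$name: sequence of ${xs.length} elements"
  case (name, other)      => s"$name: ${other.getClass.getSimpleName}"
}

// (2) scala.tools.nsc.Properties lives in the compiler jar, which is not on
//     the 2.12 test classpath; the standard library exposes the same version
//     string via scala.util.Properties.
val scalaVersion = Properties.versionNumberString // e.g. "2.12.6"
```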

## How was this patch tested?
Existing tests.

Closes #22260 from sadhen/fix_exhaustive_match.

Authored-by: 忍冬 <rendong@wacai.com>
Signed-off-by: hyukjinkwon <gurwls223@apache.org>
fjh100456 pushed a commit to fjh100456/spark that referenced this pull request Aug 31, 2018