
[SPARK-28201][SQL][TEST][FOLLOWUP] Fix Integration test suite according to the new exception message #25165

Closed
wants to merge 2 commits

Conversation

dongjoon-hyun
Member

@dongjoon-hyun dongjoon-hyun commented Jul 15, 2019

What changes were proposed in this pull request?

#25010 broke the integration test suite by changing the user-facing exception, as shown below. This PR fixes the integration test suite.

-    require(
-      decimalVal.precision <= precision,
-      s"Decimal precision ${decimalVal.precision} exceeds max precision $precision")
+    if (decimalVal.precision > precision) {
+      throw new ArithmeticException(
+        s"Decimal precision ${decimalVal.precision} exceeds max precision $precision")
+    }
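The before/after behavior in the diff above can be sketched in a standalone snippet (this is an illustration, not Spark's actual `Decimal` class; the object and method names are hypothetical). The key difference is that `require` throws an `IllegalArgumentException` whose message is prefixed with `"requirement failed: "`, while the new code throws a plain `ArithmeticException` with the bare message:

```scala
// Sketch of the old vs. new precision check; maxPrecision mirrors Spark's
// DecimalType limit of 38, and the values used are illustrative.
object DecimalCheckSketch {
  val maxPrecision = 38

  // Old behavior: require throws IllegalArgumentException and prepends
  // "requirement failed: " to the message.
  def oldCheck(precision: Int): Unit =
    require(precision <= maxPrecision,
      s"Decimal precision $precision exceeds max precision $maxPrecision")

  // New behavior: a plain ArithmeticException with the bare message.
  def newCheck(precision: Int): Unit =
    if (precision > maxPrecision) {
      throw new ArithmeticException(
        s"Decimal precision $precision exceeds max precision $maxPrecision")
    }

  def main(args: Array[String]): Unit = {
    val oldMsg =
      try { oldCheck(39); "" }
      catch { case e: IllegalArgumentException => e.getMessage }
    val newMsg =
      try { newCheck(39); "" }
      catch { case e: ArithmeticException => e.getMessage }
    println(oldMsg) // starts with "requirement failed: "
    println(newMsg) // bare message, no prefix
  }
}
```

This prefix difference is exactly why the old test assertion on `"requirement failed: ..."` stopped matching.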

How was this patch tested?

Manual test.

$ build/mvn install -DskipTests
$ build/mvn -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.12 test

@dongjoon-hyun
Member Author

cc @mgaido91 and @cloud-fan

@dongjoon-hyun
Member Author

dongjoon-hyun commented Jul 15, 2019

For reviewers, I'm using deepdiver/docker-oracle-xe-11g for master and branch-2.4, but it's an unofficial image. So, I didn't include it in the PR description.

Contributor

@mgaido91 mgaido91 left a comment


LGTM, just left a personal consideration

@@ -376,8 +376,7 @@ class OracleIntegrationSuite extends DockerJDBCIntegrationSuite with SharedSQLCo
val e = intercept[org.apache.spark.SparkException] {
spark.read.jdbc(jdbcUrl, "tableWithCustomSchema", new Properties()).collect()
}
-    assert(e.getMessage.contains(
-      "requirement failed: Decimal precision 39 exceeds max precision 38"))
+    assert(e.getMessage.contains("Decimal precision 39 exceeds max precision 38"))
Contributor


As a very minor nit, I'd rather check the exception type, which I think is more important than the exact message. Now we should be coherent across the whole codebase and always throw an ArithmeticException, whereas previously we sometimes threw RuntimeException or others for the same case.

Member Author


Thank you for review, @mgaido91 .

Sure, we can additionally check the underlying exception type via e.getCause on the SparkException. I'll add that.

BTW, message checking is a more fine-grained verification. As you know, ArithmeticException and ParseException are not specific; for example, an ArithmeticException can also be caused by division by zero. We should always check the error message as well.
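The approach discussed here, checking both the unwrapped cause's type and its message, could be sketched as follows. This is a hedged illustration: the real suite uses `org.apache.spark.SparkException` and ScalaTest's `intercept`, which are simulated below with a plain wrapper class and a try/catch:

```scala
// FakeSparkException stands in for org.apache.spark.SparkException, which
// wraps the task failure and exposes the original error via getCause.
class FakeSparkException(msg: String, cause: Throwable)
  extends RuntimeException(msg, cause)

object CauseCheckSketch {
  // Simulates a collect() that fails with a wrapped ArithmeticException.
  def failingCollect(): Unit =
    throw new FakeSparkException("Job aborted",
      new ArithmeticException("Decimal precision 39 exceeds max precision 38"))

  def main(args: Array[String]): Unit = {
    val e =
      try { failingCollect(); null }
      catch { case ex: FakeSparkException => ex }
    // Type check: the underlying cause is an ArithmeticException ...
    assert(e.getCause.isInstanceOf[ArithmeticException])
    // ... and the message check pins down the exact failure, since an
    // ArithmeticException alone could also mean e.g. division by zero.
    assert(e.getCause.getMessage
      .contains("Decimal precision 39 exceeds max precision 38"))
    println("ok")
  }
}
```

Checking both gives the best of the two positions in the thread: the type check enforces codebase-wide coherence, while the message check rules out unrelated arithmetic errors.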

@SparkQA

SparkQA commented Jul 15, 2019

Test build #107698 has finished for PR 25165 at commit 0de5843.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jul 15, 2019

Test build #107702 has finished for PR 25165 at commit a822c71.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@dongjoon-hyun dongjoon-hyun changed the title [SPARK-28201][SQL][TEST][FOLLOWUP] Update Integration test suite according to the new exception message [SPARK-28201][SQL][TEST][FOLLOWUP] Fix Integration test suite according to the new exception message Jul 15, 2019
@cloud-fan
Contributor

thanks, merging to master!

@cloud-fan cloud-fan closed this in 9a7f01d Jul 16, 2019
@dongjoon-hyun
Member Author

Thank you, @cloud-fan and @mgaido91 !

@dongjoon-hyun dongjoon-hyun deleted the SPARK-28201 branch July 16, 2019 15:41
vinodkc pushed a commit to vinodkc/spark that referenced this pull request Jul 18, 2019
Closes apache#25165 from dongjoon-hyun/SPARK-28201.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>