[SPARK-28201][SQL][TEST][FOLLOWUP] Fix Integration test suite according to the new exception message

## What changes were proposed in this pull request?

#25010 changed the user-facing exception as shown below, which broke the integration test suite. This PR fixes the integration test suite accordingly.

```scala
-    require(
-      decimalVal.precision <= precision,
-      s"Decimal precision ${decimalVal.precision} exceeds max precision $precision")
+    if (decimalVal.precision > precision) {
+      throw new ArithmeticException(
+        s"Decimal precision ${decimalVal.precision} exceeds max precision $precision")
+    }
```
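The behavioral difference matters for the test: `require` throws an `IllegalArgumentException` whose message is prefixed with `"requirement failed: "`, while the new code throws a plain `ArithmeticException` with the bare message. A minimal standalone sketch (not Spark's actual `Decimal` class; names are illustrative) contrasting the two failure modes:

```scala
// Sketch of the old vs. new precision check, outside of Spark.
object PrecisionCheck {
  // Old style: require throws IllegalArgumentException and
  // prepends "requirement failed: " to the message.
  def oldCheck(actual: Int, max: Int): Unit =
    require(actual <= max, s"Decimal precision $actual exceeds max precision $max")

  // New style: an explicit ArithmeticException with the bare message.
  def newCheck(actual: Int, max: Int): Unit =
    if (actual > max) {
      throw new ArithmeticException(
        s"Decimal precision $actual exceeds max precision $max")
    }

  def main(args: Array[String]): Unit = {
    val oldMsg =
      try { oldCheck(39, 38); "" }
      catch { case e: IllegalArgumentException => e.getMessage }
    val newMsg =
      try { newCheck(39, 38); "" }
      catch { case e: ArithmeticException => e.getMessage }
    println(oldMsg) // requirement failed: Decimal precision 39 exceeds max precision 38
    println(newMsg) // Decimal precision 39 exceeds max precision 38
  }
}
```

This prefix change is why the old test assertion on `"requirement failed: ..."` no longer matches.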

## How was this patch tested?

Manual test.
```
$ build/mvn install -DskipTests
$ build/mvn -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.12 test
```

Closes #25165 from dongjoon-hyun/SPARK-28201.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
dongjoon-hyun authored and cloud-fan committed Jul 16, 2019
1 parent 6926849 commit 9a7f01d
Showing 1 changed file with 2 additions and 2 deletions.
```scala
@@ -376,8 +376,8 @@ class OracleIntegrationSuite extends DockerJDBCIntegrationSuite with SharedSQLCo
     val e = intercept[org.apache.spark.SparkException] {
       spark.read.jdbc(jdbcUrl, "tableWithCustomSchema", new Properties()).collect()
     }
-    assert(e.getMessage.contains(
-      "requirement failed: Decimal precision 39 exceeds max precision 38"))
+    assert(e.getCause().isInstanceOf[ArithmeticException])
+    assert(e.getMessage.contains("Decimal precision 39 exceeds max precision 38"))

     // custom schema can read data
     val props = new Properties()
```
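The updated test asserts on the cause of the wrapping exception rather than on the old message prefix. A hypothetical standalone sketch of that pattern, with `RuntimeException` standing in for Spark's `SparkException` (which wraps the executor-side `ArithmeticException` as its cause):

```scala
// Sketch of asserting on a wrapped cause, mirroring the updated test.
object CauseAssertionSketch {
  // Stand-in for a Spark action failing on the driver: the original
  // ArithmeticException is attached as the cause of the outer exception.
  def failingAction(): Unit =
    throw new RuntimeException(
      "Job aborted: Decimal precision 39 exceeds max precision 38",
      new ArithmeticException("Decimal precision 39 exceeds max precision 38"))

  def main(args: Array[String]): Unit = {
    val e =
      try { failingAction(); null }
      catch { case ex: RuntimeException => ex }
    // Check the cause's type and the message text, without relying on
    // the removed "requirement failed: " prefix.
    assert(e.getCause.isInstanceOf[ArithmeticException])
    assert(e.getMessage.contains("Decimal precision 39 exceeds max precision 38"))
    println("assertions passed")
  }
}
```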
