[SPARK-28201][SQL][TEST][FOLLOWUP] Fix Integration test suite according to the new exception message #25165
Conversation
cc @mgaido91 and @cloud-fan

For reviewers, I'm using …
LGTM, just left a personal consideration
```diff
@@ -376,8 +376,7 @@ class OracleIntegrationSuite extends DockerJDBCIntegrationSuite with SharedSQLCo
     val e = intercept[org.apache.spark.SparkException] {
       spark.read.jdbc(jdbcUrl, "tableWithCustomSchema", new Properties()).collect()
     }
-    assert(e.getMessage.contains(
-      "requirement failed: Decimal precision 39 exceeds max precision 38"))
+    assert(e.getMessage.contains("Decimal precision 39 exceeds max precision 38"))
```
As a very minor nit, I'd rather check the exception type, which I think is more important than the exact message. Now we should be coherent across the whole codebase and always throw an `ArithmeticException`, while previously we were sometimes throwing a `RuntimeException` or others for the same case.
Thank you for the review, @mgaido91.

Sure, of course, we can additionally check the underlying exception type via `e.getCause` on the `SparkException`. I'll add that.

BTW, message checking is a more fine-grained verification. As you know, `ArithmeticException` and `ParseException` are not specific; for example, an `ArithmeticException` can also be caused by a divide by zero. We should always check the error message.
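A minimal sketch of the combined assertion this thread converges on, written as it might appear inside the integration suite (so `spark` and `jdbcUrl` come from the test fixture) and assuming the failing task's error is surfaced as the cause of the `SparkException`; this is an illustration, not a quote of the final test code:

```scala
val e = intercept[org.apache.spark.SparkException] {
  spark.read.jdbc(jdbcUrl, "tableWithCustomSchema", new Properties()).collect()
}
// Check the wrapped exception type, as suggested above...
assert(e.getCause.isInstanceOf[ArithmeticException])
// ...and the fine-grained message, since an ArithmeticException alone
// could also come from an unrelated cause such as a divide by zero.
assert(e.getMessage.contains("Decimal precision 39 exceeds max precision 38"))
```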
Test build #107698 has finished for PR 25165 at commit …

Test build #107702 has finished for PR 25165 at commit …
thanks, merging to master!

Thank you, @cloud-fan and @mgaido91!
## What changes were proposed in this pull request?

#25010 breaks the integration test suite by changing the user-facing exception as follows. This PR fixes the integration test suite accordingly.

```scala
-    require(
-      decimalVal.precision <= precision,
-      s"Decimal precision ${decimalVal.precision} exceeds max precision $precision")
+    if (decimalVal.precision > precision) {
+      throw new ArithmeticException(
+        s"Decimal precision ${decimalVal.precision} exceeds max precision $precision")
+    }
```

## How was this patch tested?

Manual test.

```
$ build/mvn install -DskipTests
$ build/mvn -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.12 test
```

Closes #25165 from dongjoon-hyun/SPARK-28201.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>