[SPARK-37945][SQL][CORE] Use error classes in the execution errors of arithmetic ops #38273
khalidmammadov wants to merge 13 commits into apache:master from khalidmammadov/error_class2
Conversation
@MaxGekk Please review
Does the error class have at least one test? If not, please add one. The same question about other new error classes.
I am pretty sure there is one already: _LEGACY_ERROR_TEMP_2118.
Please double check before adding more new error messages.
@amaliujia Apart from INTEGER_OVERFLOW, the rest of the changes merely replace the name of an existing error class.
@MaxGekk All of these are already unit tested within the relevant use cases via intercept[], but I can work on changing those cases to additionally assert the class name/message using checkError(... errorClass = "CLASS_NAME").
I can work on changing those cases to assert additionally the class name/msg using checkError ...
Please do that. The purpose is to make the tests independent of error messages (checking only the valuable message parameters), so that tech editors can edit an error message in error-classes.json without worrying about internal Spark tests.
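The idea behind this testing style can be sketched without Spark's test utilities. Below is a minimal, hypothetical stand-in (these class and method names are illustrative, not Spark's actual API): the exception carries an error class and a parameter map, and the assertion pins those rather than the rendered English text, so editing the message template cannot break the test.

```scala
// Minimal stand-in for Spark's SparkThrowable: an exception that carries
// an error class and message parameters instead of a hardcoded message.
class ClassedException(
    val errorClass: String,
    val parameters: Map[String, String]) extends RuntimeException

// A checkError-style assertion: tests pin the class and parameters,
// never the rendered message text.
def checkError(
    exception: ClassedException,
    errorClass: String,
    parameters: Map[String, String]): Unit = {
  assert(exception.errorClass == errorClass)
  assert(exception.parameters == parameters)
}

val e = new ClassedException(
  "INTEGER_OVERFLOW",
  Map("message" -> "long overflow"))

checkError(e,
  errorClass = "INTEGER_OVERFLOW",
  parameters = Map("message" -> "long overflow"))
```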
sure, working on it
@MaxGekk Can you please check if all good or not?
Can one of the admins verify this patch?
What's the difference with ARITHMETIC_OVERFLOW? Can't you re-use the last one?
nit: Please upper-case the first letter to be consistent w/ other error messages.
ANSIEnabled -> ansiConfig, see other error classes.
The method integerOverflowError is invoked from only 2 places. Let's introduce specific error classes for both cases, and not pass an arbitrary message.
Imagine that we output errors according to the locale, in a local language. In that case, we would translate the entire error-classes.json into that language, but some text would still be passed in English from the source code. That looks inconsistent.
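The consistency argument can be illustrated with a toy template formatter (the template files and parameter names below are made up for illustration, not real error-classes.json content): if the whole message lives in the JSON, swapping in a translated file changes the whole message, but any English prose passed in from source code as a parameter leaks through untranslated.

```scala
// Toy versions of two error-classes.json files, English and French.
val english = Map(
  "INTEGER_OVERFLOW" -> "Integer overflow: <message>.")
val french = Map(
  "INTEGER_OVERFLOW" -> "Dépassement d'entier : <message>.")

// Substitute <param> placeholders from a parameter map.
def format(
    templates: Map[String, String],
    errorClass: String,
    params: Map[String, String]): String =
  params.foldLeft(templates(errorClass)) {
    case (msg, (k, v)) => msg.replace(s"<$k>", v)
  }

// If the caller passes arbitrary English prose as a parameter, it stays
// English inside the translated message -- the inconsistency described above:
println(format(french, "INTEGER_OVERFLOW", Map("message" -> "long overflow")))
// prints: Dépassement d'entier : long overflow.
```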
Could you re-use the existing exception: SparkRuntimeException, please. We don't need to introduce additional things as we already have error classes that users can use to distinguish errors.
Could you fix indentation here. See https://github.com/databricks/scala-style-guide#spacing-and-indentation
Wrap the config by toSQLConf(), see example in the file.
@khalidmammadov Could you re-trigger tests/builds by merging the recent master, please.
Co-authored-by: Maxim Gekk <max.gekk@gmail.com>
Force-pushed d574362 to f20713a
+1, LGTM. Merging to master.
@MaxGekk thanks for the reviews and the merge!
Closes apache#38273 from khalidmammadov/error_class2.
Lead-authored-by: Khalid Mammadov <khalidmammadov9@gmail.com>
Co-authored-by: khalidmammadov <khalidmammadov9@gmail.com>
Signed-off-by: Max Gekk <max.gekk@gmail.com>
What changes were proposed in this pull request?
Migrate the following errors in QueryExecutionErrors onto error classes:
unscaledValueTooLargeForPrecisionError -> UNSCALED_VALUE_TOO_LARGE_FOR_PRECISION
decimalPrecisionExceedsMaxPrecisionError -> DECIMAL_PRECISION_EXCEEDS_MAX_PRECISION
integerOverflowError -> INTEGER_OVERFLOW
outOfDecimalTypeRangeError -> OUT_OF_DECIMAL_TYPE_RANGE
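As a rough illustration of what the INTEGER_OVERFLOW migration means at a call site (a self-contained sketch, not Spark's actual code -- the exception class here is a hypothetical stand-in for SparkArithmeticException): overflow is still detected via java.lang.Math's exact arithmetic, but the raised exception now carries an error class rather than only a free-form message.

```scala
// Hypothetical stand-in for a Spark exception carrying an error class.
class ArithmeticOverflow(
    val errorClass: String,
    val parameters: Map[String, String])
  extends ArithmeticException(errorClass)

// ANSI-style add: Math.addExact throws on overflow, and we re-raise it
// as a classed error instead of an anonymous ArithmeticException.
def ansiAdd(a: Long, b: Long): Long =
  try Math.addExact(a, b)
  catch {
    case _: ArithmeticException =>
      throw new ArithmeticOverflow(
        "INTEGER_OVERFLOW",
        Map("message" -> "long overflow"))
  }

// ansiAdd(Long.MaxValue, 1L) now fails with error class INTEGER_OVERFLOW.
```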
Why are the changes needed?
Porting ArithmeticExceptions to the new error framework
Does this PR introduce any user-facing change?
Yes, the errors will indicate that they are controlled Spark exceptions.
How was this patch tested?
./build/sbt "catalyst/testOnly org.apache.spark.sql.types.DecimalSuite"
./build/sbt "sql/testOnly org.apache.spark.sql.execution.streaming.sources.RateStreamProviderSuite"
./build/sbt "core/testOnly org.apache.spark.SparkThrowableSuite"