
[WIP][SQL] Replace require() by an internal error in catalyst#38896

Closed
MaxGekk wants to merge 9 commits into apache:master from MaxGekk:require-error-class

Conversation

@MaxGekk (Member) commented Dec 3, 2022

What changes were proposed in this pull request?

Why are the changes needed?

Does this PR introduce any user-facing change?

How was this patch tested?

left: Expression,
right: Expression,
override val dataType: DataType) extends BinaryOperator {
require(dataType.isInstanceOf[DecimalType])
@MaxGekk (Member, Author):

This is not needed since we already require the type below in the `inputType()` method.
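For context, here is a hedged, simplified sketch of the pattern being described (the class and method names below are illustrative stand-ins, not the actual catalyst classes): once the operator declares a decimal `inputType`, the framework's type check already rejects non-decimal inputs, so a constructor-level `require(dataType.isInstanceOf[DecimalType])` duplicates that check.

```scala
// Simplified, hypothetical model of the redundancy described above.
sealed trait DataType
case object DecimalType extends DataType
case object IntegerType extends DataType

abstract class BinaryOperatorSketch {
  def inputType: DataType
  def dataType: DataType
  // Stand-in for the framework's type-checking hook: it validates the
  // operator's type against the declared inputType, so a separate
  // require() in the constructor adds nothing.
  def typeCheckPasses: Boolean = dataType == inputType
}

class DecimalOpSketch(override val dataType: DataType) extends BinaryOperatorSketch {
  override def inputType: DataType = DecimalType
}
```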

* Tests a condition, throwing a `SparkException` with
* the error class `INTERNAL_ERROR` if it is false.
*/
def checkInternalError(condition: Boolean, msg: String): Unit = {
A reviewer (Member):

Is this tiny utility method worth it? Not sure.

@MaxGekk (Member, Author):

I want to stay as close as possible to the existing `require` and not introduce unnecessary boilerplate code.
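A minimal, self-contained sketch of what the utility under discussion could look like (the `SparkException` case class here is a stand-in for Spark's real one; only the `internalError`/`checkInternalError` shape mirrors the snippet and stack trace in this thread):

```scala
// Hypothetical sketch: mirrors Predef.require, but raises a
// SparkException tagged INTERNAL_ERROR instead of an
// IllegalArgumentException.
object SparkExceptionSketch {
  // Simplified stand-in for org.apache.spark.SparkException.
  case class SparkException(errorClass: String, msg: String)
      extends Exception(s"[$errorClass] $msg")

  def internalError(msg: String): SparkException =
    SparkException("INTERNAL_ERROR", msg)

  // Tests a condition, throwing a SparkException with the error
  // class INTERNAL_ERROR if it is false.
  def checkInternalError(condition: Boolean, msg: String): Unit = {
    if (!condition) throw internalError(msg)
  }
}
```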

*/
class ArrayBasedMapData(val keyArray: ArrayData, val valueArray: ArrayData) extends MapData {
require(keyArray.numElements() == valueArray.numElements())
checkInternalError(
@MaxGekk (Member, Author):

This one is actually a user-facing error. See the failed test:

JsonFunctionsSuite.SPARK-33134: return partial results only for root JSON objects
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 186.0 failed 1 times, most recent failure: Lost task 0.0 in stage 186.0 (TID 227) (localhost executor driver): org.apache.spark.SparkException: [INTERNAL_ERROR] The number of key elements 1 and value elements 0 must be the same.
	at org.apache.spark.SparkException$.internalError(SparkException.scala:77)
	at org.apache.spark.SparkException$.internalError(SparkException.scala:81)
	at org.apache.spark.SparkException$.checkInternalError(SparkException.scala:97)

* are different.
*/
class ArrayBasedMapData(val keyArray: ArrayData, val valueArray: ArrayData) extends MapData {
require(keyArray.numElements() == valueArray.numElements())
@MaxGekk (Member, Author):

I have to leave it as is because the JSON datasource expects an `IllegalArgumentException`; otherwise, a `SparkException` with `INTERNAL_ERROR` propagates to users.
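A hedged sketch of the behavior difference that forces keeping `require()` here (simplified names, not the real `ArrayBasedMapData`): `require` throws `IllegalArgumentException` on failure, which a caller such as the JSON parser's partial-result path can catch, whereas a `SparkException` tagged `INTERNAL_ERROR` would escape as a user-visible internal error.

```scala
// Simplified stand-in for the ArrayBasedMapData size check.
object MapSizeCheckSketch {
  // require throws IllegalArgumentException when the condition fails,
  // which callers expecting that exception type can catch. Replacing
  // it with a SparkException(INTERNAL_ERROR) would change the
  // exception type and leak an internal error to users.
  def buildMap(numKeys: Int, numValues: Int): Unit =
    require(numKeys == numValues,
      s"The number of key elements $numKeys and value elements " +
        s"$numValues must be the same.")
}
```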

github-actions bot commented Apr 7, 2023

We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable.
If you'd like to revive this PR, please reopen it and ask a committer to remove the Stale tag!

@github-actions github-actions bot added the Stale label Apr 7, 2023
@github-actions github-actions bot closed this Apr 8, 2023
