
[SPARK-50119] Add user facing error for lambda functions outside of higher order functions#48658

Closed
dusantism-db wants to merge 3 commits into apache:master from dusantism-db:lambda-unresolved-exception

Conversation

Contributor

dusantism-db commented Oct 25, 2024

What changes were proposed in this pull request?

This PR replaces an `UnresolvedException` throw with a user-facing error. The error message shows a `?` in place of the name of the class containing the lambda, because that information is not available at this point in execution.

Why are the changes needed?

See https://issues.apache.org/jira/browse/SPARK-50119

Does this PR introduce any user-facing change?

Yes, a different error is thrown when a lambda function is misused outside of a higher-order function.

How was this patch tested?

Added unit test.

Was this patch authored or co-authored using generative AI tooling?

No

github-actions bot added the SQL label Oct 25, 2024
MaxGekk (Member) left a comment


Consider providing a more relevant error condition.

```scala
override def exprId: ExprId = throw new UnresolvedException("exprId")
override def dataType: DataType = throw new UnresolvedException("dataType")
override def dataType: DataType = throw new SparkRuntimeException(
  errorClass = "INVALID_LAMBDA_FUNCTION_CALL.NON_HIGHER_ORDER_FUNCTION",
```
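The pattern under review can be sketched in isolation. The classes below are simplified stand-ins invented for illustration, not Spark's real `UnresolvedException` or `SparkRuntimeException`; only the error-condition name is taken from the diff, and the `"?"` placeholder is the one described in the PR summary.

```scala
// Hypothetical, simplified stand-ins for Spark's exception types, to show
// the pattern this PR proposes: an internal unresolved-object error is
// replaced by an error carrying a named error condition and message
// parameters that users can act on.
class UnresolvedException(field: String)
    extends RuntimeException(s"Invalid call to $field on unresolved object")

class SparkRuntimeExceptionSketch(
    val errorClass: String,
    val messageParameters: Map[String, String])
    extends RuntimeException(
      s"[$errorClass] " +
        messageParameters.map { case (k, v) => s"$k=$v" }.mkString(", "))

// Before: an internal error with no stable, user-facing error condition.
def dataTypeBefore: Nothing = throw new UnresolvedException("dataType")

// After: a user-facing error condition. The enclosing class name is not
// known at this point in execution, hence the "?" placeholder.
def dataTypeAfter: Nothing = throw new SparkRuntimeExceptionSketch(
  errorClass = "INVALID_LAMBDA_FUNCTION_CALL.NON_HIGHER_ORDER_FUNCTION",
  messageParameters = Map("class" -> "?"))

try { dataTypeAfter }
catch {
  case e: SparkRuntimeExceptionSketch =>
    // prints: [INVALID_LAMBDA_FUNCTION_CALL.NON_HIGHER_ORDER_FUNCTION] class=?
    println(e.getMessage)
}
```

The key design point the reviewers debate below is not this mechanism but which error condition name fits, and whether a runtime exception is the right vehicle at all.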


Could you please clarify why you propose this error condition? IMHO, it relates to the case where Spark expects a higher-order function but got something else.

Looking at the test, you are trying to fix a different situation, where a higher-order function is not expected at all.


```scala
override def exprId: ExprId = throw new UnresolvedException("exprId")
override def dataType: DataType = throw new UnresolvedException("dataType")
override def dataType: DataType = throw new SparkRuntimeException(
```


Sorry, why a runtime exception?

