[SPARK-38604][SQL] Keep ceil and floor with only a single argument the same as before #35913

Status: Closed · wants to merge 2 commits (changes shown from 1 commit)
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Jump to
Jump to file
Failed to load files.
Diff view
Diff view
8 changes: 6 additions & 2 deletions sql/core/src/main/scala/org/apache/spark/sql/functions.scala
@@ -1783,7 +1783,9 @@ object functions {
    * @group math_funcs
    * @since 1.4.0
    */
-  def ceil(e: Column): Column = ceil(e, lit(0))
+  def ceil(e: Column): Column = withExpr {
+    UnresolvedFunction(Seq("ceil"), Seq(e.expr), isDistinct = false)
Contributor:
Hmm, is this just a code cleanup, or does it fix a bug?

Contributor:
To confirm, the problem you hit is: ceil(input) and ceil(input, 0) return the same result, but use different expressions, which breaks some custom Catalyst rules?

If that's the case, I'd suggest we also fix CeilFloorExpressionBuilderBase and call buildWithOneParam if the scale is 0, to make ceil(input) and ceil(input, 0) exactly the same.
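For illustration, a minimal sketch of that idea as a hypothetical stand-in trait (not the actual CeilFloorExpressionBuilderBase; the method names mirror the ones mentioned above, and the error handling is simplified):

import org.apache.spark.sql.catalyst.expressions.Expression
import org.apache.spark.sql.types.IntegerType

// Hypothetical sketch: when the scale argument is a foldable int literal equal to 0,
// build the same expression as the one-argument form so that ceil(x) and ceil(x, 0)
// resolve to identical expression trees.
trait CeilFloorBuilderSketch {
  protected def buildWithOneParam(param: Expression): Expression
  protected def buildWithTwoParams(param1: Expression, param2: Expression): Expression

  def build(funcName: String, expressions: Seq[Expression]): Expression = expressions match {
    case Seq(input) =>
      buildWithOneParam(input)
    case Seq(input, scale) if scale.foldable && scale.dataType == IntegerType && scale.eval() == 0 =>
      buildWithOneParam(input)           // literal scale 0: same as the one-argument form
    case Seq(input, scale) =>
      buildWithTwoParams(input, scale)   // any other scale: keep the current behaviour
    case _ =>
      // placeholder; the real builder raises a proper Spark analysis error here
      throw new IllegalArgumentException(s"$funcName expects 1 or 2 arguments")
  }
}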

Contributor:
@cloud-fan I agree, as the current change does not take into account that ceil(input) and ceil(input, 0) still return different types even though they mean the same thing (which could confuse end users).

Contributor:

I think we can call ceil(input)/floor(input) when scale = 0, but we need a test case to validate the modifications. @awdavidson @revans2 could you please provide a test case so we can be sure? Thanks.

Contributor (author):
It is a bug.

scala> spark.range(1).selectExpr("id", "ceil(id) as one_arg_sql", "ceil(id, 0) as two_arg_sql").select(col("*"), ceil(col("id")).alias("one_arg_func"), ceil(col("id"), lit(0)).alias("two_arg_func")).printSchema
root
 |-- id: long (nullable = false)
 |-- one_arg_sql: long (nullable = true)
 |-- two_arg_sql: decimal(20,0) (nullable = true)
 |-- one_arg_func: decimal(20,0) (nullable = true)
 |-- two_arg_func: decimal(20,0) (nullable = true)
 

scala> spark.range(1).selectExpr("cast(id as double) as id").selectExpr("id", "ceil(id) as one_arg_sql", "ceil(id, 0) as two_arg_sql").select(col("*"), ceil(col("id")).alias("one_arg_func"), ceil(col("id"), lit(0)).alias("two_arg_func")).printSchema
root
 |-- id: double (nullable = false)
 |-- one_arg_sql: long (nullable = true)
 |-- two_arg_sql: decimal(30,0) (nullable = true)
 |-- one_arg_func: decimal(30,0) (nullable = true)
 |-- two_arg_func: decimal(30,0) (nullable = true) 

Without this patch the SQL and Scala APIs produce different result types, and the Scala API produces a result that differs from what it was in Spark 3.2.

I documented this in the JIRA https://issues.apache.org/jira/browse/SPARK-38604

After this patch the single argument version behaves like it did in 3.2 and is also consistent with the SQL API.
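For reference, after the patch the same kind of check should show the one-argument Scala call resolving like the SQL one-argument form (expected schema inferred from the behaviour described above, not captured output):

scala> spark.range(1).selectExpr("id", "ceil(id) as one_arg_sql").select(col("*"), ceil(col("id")).alias("one_arg_func")).printSchema
root
 |-- id: long (nullable = false)
 |-- one_arg_sql: long (nullable = true)
 |-- one_arg_func: long (nullable = true)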

Contributor (author):
I added test cases that explicitly check the result type.

From a consistency standpoint, if the return type is going to depend on the scale, then the scale can only ever be a literal value. If we want to break backwards compatibility, then I would suggest we also fix the overflow issue https://issues.apache.org/jira/browse/SPARK-28135 with a double being rounded to a long. That technically also applies to a double being cast to a decimal type and then rounded.
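As a sketch of the kind of result-type check described here (illustrative only; the test name, suite helpers, and exact assertions are assumptions, not the tests added in this PR):

import org.apache.spark.sql.functions._
import org.apache.spark.sql.types.{DecimalType, LongType}

test("SPARK-38604: one-argument ceil/floor keep the pre-3.3 result type") {
  val df = spark.range(1).toDF("id")  // id: LongType
  // One-argument forms should resolve to the same long type as the SQL ceil(id)/floor(id).
  assert(df.select(ceil(col("id"))).schema.head.dataType === LongType)
  assert(df.select(floor(col("id"))).schema.head.dataType === LongType)
  // The explicit-scale form keeps its decimal result type, matching the SQL ceil(id, 0).
  assert(df.select(ceil(col("id"), lit(0))).schema.head.dataType === DecimalType(20, 0))
}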

+  }

/**
* Computes the ceiling of the given value of `e` to 0 decimal places.
@@ -1913,7 +1915,9 @@ object functions {
    * @group math_funcs
    * @since 1.4.0
    */
-  def floor(e: Column): Column = floor(e, lit(0))
+  def floor(e: Column): Column = withExpr {
+    UnresolvedFunction(Seq("floor"), Seq(e.expr), isDistinct = false)
+  }

/**
* Computes the floor of the given column value to 0 decimal places.