Conversation

@hsiang-c (Owner) commented on Nov 13, 2025

Which issue does this PR close?

  • Closes #.

Rationale for this change

  • Support the ANSI mode of the Spark-compatible `abs` math function

What changes are included in this PR?

Tasks breakdown

| Non-ANSI mode | ANSI mode | ANSI Interval Types |
| - | - | - |
| apache#18205 | This PR | TODO |

Are these changes tested?

  • unit tests

Are there any user-facing changes?

Yes, the `abs` function can be used in SQL.

  • An arithmetic overflow error will be thrown when the result overflows (see the sketch below).
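
A minimal sketch of the intended ANSI-mode behavior, using plain Rust standard-library integer methods as a stand-in for the actual DataFusion/Comet kernel; the error message format is illustrative only, not the real one:

```rust
/// Illustrative only: ANSI-mode `abs` semantics sketched with std integer
/// methods. The real kernel operates on Arrow arrays, and the error text
/// here is made up for the example.
fn ansi_abs_i32(v: i32) -> Result<i32, String> {
    // `checked_abs` returns `None` only for `i32::MIN`, the one overflowing input.
    v.checked_abs()
        .ok_or_else(|| format!("ARITHMETIC_OVERFLOW: abs({v}) overflows Int32"))
}

fn main() {
    assert_eq!(ansi_abs_i32(-5), Ok(5));
    // In ANSI mode the overflowing input surfaces an error instead of wrapping.
    assert!(ansi_abs_i32(i32::MIN).is_err());
}
```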

@hsiang-c changed the title from "Support ANSI mode" to "feat: support Spark-compatible abs math function part 2 - ANSI mode" on Nov 13, 2025
github-merge-queue bot pushed a commit to apache/datafusion that referenced this pull request on Nov 19, 2025: "…mode (#18205)"

## Which issue does this PR close?


- Part of #15914

## Rationale for this change

- Apache Spark's `abs()` behaves differently from DataFusion's.
- Apache Spark's [ANSI-compliant](https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html#ansi-compliance) dialect is toggled by the SparkConf `spark.sql.ansi.enabled`. When ANSI mode is off, arithmetic overflow does not throw an exception the way DataFusion does; see the sketch below.
- DataFusion Comet can leverage this at apache/datafusion-comet#2595.
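
To make the toggle concrete, here is a minimal sketch of the two behaviors using plain Rust standard-library integer methods (not the DataFusion implementation): non-ANSI Spark wraps on overflow, while ANSI Spark and default DataFusion report an error.

```rust
fn main() {
    let v: i32 = i32::MIN; // the only i32 whose absolute value overflows

    // spark.sql.ansi.enabled=false: two's-complement wrap, no error;
    // abs(i32::MIN) silently stays i32::MIN.
    assert_eq!(v.wrapping_abs(), i32::MIN);

    // spark.sql.ansi.enabled=true (and DataFusion's default behavior): the
    // overflow is detected and reported as an error instead of a wrapped value.
    assert!(v.checked_abs().is_none());

    println!("non-ANSI wraps, ANSI reports overflow");
}
```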

## What changes are included in this PR?

- This is the first PR toward a Spark-compatible `abs` math function; it covers non-ANSI mode.
- It mimics Apache Spark `v4.0.1`'s [abs expression](https://github.com/apache/spark/blob/v4.0.1/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala#L148) for numeric types only, in non-ANSI mode, i.e. `spark.sql.ansi.enabled=false`.

### Tasks breakdown

| Non-ANSI mode | ANSI mode | ANSI Interval Types |
| - | - | - |
| this PR | hsiang-c#1 (will change base branch) | TODO |

## Are these changes tested?

- unit tests (an illustrative sketch of the expected overflow behavior follows below)
- sqllogictest: `test_files/spark/math/abs.slt`
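
As an illustration of the expectation those tests encode (this is not the actual test code from the PR), a wrap-on-overflow check in plain Rust might look like:

```rust
/// Illustrative only: the kind of non-ANSI overflow expectation the unit tests
/// exercise, sketched with std integer methods rather than the Arrow kernels.
#[cfg(test)]
mod abs_non_ansi_sketch {
    #[test]
    fn min_values_wrap_instead_of_erroring() {
        // In non-ANSI mode, abs(MIN) wraps back to MIN for every signed width.
        assert_eq!(i8::MIN.wrapping_abs(), i8::MIN);
        assert_eq!(i16::MIN.wrapping_abs(), i16::MIN);
        assert_eq!(i32::MIN.wrapping_abs(), i32::MIN);
        assert_eq!(i64::MIN.wrapping_abs(), i64::MIN);

        // Non-overflowing inputs behave like ordinary abs.
        assert_eq!((-7i32).wrapping_abs(), 7);
    }
}
```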

## Are there any user-facing changes?


Yes, the `abs` function can be used in SQL.

- An arithmetic overflow error will NOT be thrown on overflow; the value wraps instead.


---------

Co-authored-by: Oleks V <comphead@users.noreply.github.com>
