[SPARK-30252][SQL] Disallow negative scale of Decimal #26881
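For context, a decimal's scale is the number of digits to the right of the decimal point, so a negative scale means the unscaled value is multiplied by a power of ten. A minimal, Spark-independent sketch with plain `java.math.BigDecimal` (the object name is only for illustration) showing the kind of value this PR disallows by default:

```scala
import java.math.BigDecimal

// Illustration of a "negative scale": 1E+10 is stored as
// unscaled value 1 with scale -10, i.e. 1 * 10^10.
object NegativeScaleExample {
  def main(args: Array[String]): Unit = {
    val d = new BigDecimal("1E+10")
    println(d.unscaledValue())  // 1
    println(d.scale())          // -10
    println(d.precision())      // 1
    println(d.toPlainString())  // 10000000000
  }
}
```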
Closed
Changes from all commits (26, all authored by Ngone51):

- e88a6ac disallow negative scale under ansi mode
- fbb9db1 attach JIRA id
- caafaa6 improve test
- 80758eb Merge branch 'master' of github.com:apache/spark into nonnegative-scale
- 9d41ea2 consider precision <= scale
- 7bd8478 create type from decimal
- f3f34f1 use BigDecimal to create integeral decimal
- 2c2df5b fix test
- 3f2abae fix decimal type for values less than 1.0
- d08789a use legacy config
- c0589c9 update error message
- 925e390 add migration guide
- 48042ff fix test suites
- b5bbd4d fix query
- 410f6e6 add comment
- ab11de7 fix
- a3262fb fix tests
- 898a8d9 fix python test
- feb0a6f merge master
- c1bd853 revert sql.out for [-1.0,1.0] decimal
- 1ba3067 add check for max precision
- a76930f update with conf
- 64704dd fix python indent
- 156c31f revert precision check
- 603aed0 fix test
- 563853b fix python failed doc
```diff
@@ -24,6 +24,7 @@ import scala.reflect.runtime.universe.typeTag
 import org.apache.spark.annotation.Stable
 import org.apache.spark.sql.AnalysisException
 import org.apache.spark.sql.catalyst.expressions.{Expression, Literal}
+import org.apache.spark.sql.internal.SQLConf
 
 /**
  * The data type representing `java.math.BigDecimal` values.
@@ -41,6 +42,8 @@ import org.apache.spark.sql.catalyst.expressions.{Expression, Literal}
 @Stable
 case class DecimalType(precision: Int, scale: Int) extends FractionalType {
 
+  DecimalType.checkNegativeScale(scale)
+
   if (scale > precision) {
     throw new AnalysisException(
       s"Decimal scale ($scale) cannot be greater than precision ($precision).")
@@ -141,20 +144,26 @@ object DecimalType extends AbstractDataType {
   }
 
   private[sql] def fromLiteral(literal: Literal): DecimalType = literal.value match {
-    case v: Short => fromBigDecimal(BigDecimal(v))
-    case v: Int => fromBigDecimal(BigDecimal(v))
-    case v: Long => fromBigDecimal(BigDecimal(v))
+    case v: Short => fromDecimal(Decimal(BigDecimal(v)))
+    case v: Int => fromDecimal(Decimal(BigDecimal(v)))
+    case v: Long => fromDecimal(Decimal(BigDecimal(v)))
     case _ => forType(literal.dataType)
   }
 
-  private[sql] def fromBigDecimal(d: BigDecimal): DecimalType = {
-    DecimalType(Math.max(d.precision, d.scale), d.scale)
-  }
+  private[sql] def fromDecimal(d: Decimal): DecimalType = DecimalType(d.precision, d.scale)
 
   private[sql] def bounded(precision: Int, scale: Int): DecimalType = {
     DecimalType(min(precision, MAX_PRECISION), min(scale, MAX_SCALE))
   }
 
+  private[sql] def checkNegativeScale(scale: Int): Unit = {
+    if (scale < 0 && !SQLConf.get.allowNegativeScaleOfDecimalEnabled) {
+      throw new AnalysisException(s"Negative scale is not allowed: $scale. " +
+        s"You can use spark.sql.legacy.allowNegativeScaleOfDecimal.enabled=true " +
+        s"to enable legacy mode to allow it.")
+    }
+  }
+
   /**
    * Scale adjustment implementation is based on Hive's one, which is itself inspired to
    * SQLServer's one. In particular, when a result precision is greater than
@@ -164,7 +173,8 @@ object DecimalType extends AbstractDataType {
    * This method is used only when `spark.sql.decimalOperations.allowPrecisionLoss` is set to true.
    */
  private[sql] def adjustPrecisionScale(precision: Int, scale: Int): DecimalType = {
-    // Assumption:
+    // Assumptions:
+    checkNegativeScale(scale)
     assert(precision >= scale)
 
     if (precision <= MAX_PRECISION) {
```

Review comment on lines +162 to +163: nit: no need `s""`.
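For reference, a rough spark-shell style sketch of the behaviour this diff introduces. The exception type and the config key are taken from the error message above; the session setup and the example type `DecimalType(1, -10)` are illustrative, and it assumes the constructor runs on the driver with an active session so that `SQLConf.get` resolves to that session's configuration:

```scala
import org.apache.spark.sql.{AnalysisException, SparkSession}
import org.apache.spark.sql.types.DecimalType

val spark = SparkSession.builder().master("local[1]").appName("negative-scale-demo").getOrCreate()

// Default behaviour after this change: a negative scale is rejected at construction time.
try {
  DecimalType(1, -10)
} catch {
  case e: AnalysisException =>
    println(e.getMessage) // "Negative scale is not allowed: -10. ..."
}

// Opting back into the old behaviour via the legacy flag named in the error message.
spark.conf.set("spark.sql.legacy.allowNegativeScaleOfDecimal.enabled", "true")
val legacyType = DecimalType(1, -10) // accepted again under legacy mode
println(legacyType) // e.g. DecimalType(1,-10)
```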