[CALCITE-4240] SqlTypeUtil#getMaxPrecisionScaleDecimal returns a decimal with the same precision and scale (Jiatao Tao) #2161
Conversation
Still have some unit tests failing; will fix them, along with the review comments.
Remove the word 'fix' from the commit message. Change the message so that it is clear what the bug is. Add a space before '(' in the commit message. Why would
Hi @julianhyde, roger that; the message is irregular, will refine later. From my understanding, precision represents the total number of digits the column can represent, so it must be >= scale. If you execute Also in Spark/Hive:
Force-pushed from b589272 to d510178
```java
// means we can only have decimal places.
while (maxScale >= maxPrecision) {
  maxScale = maxScale / 2;
}
```
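For illustration, the loop under review can be isolated into a small standalone method (the name `clampScale` is hypothetical, not a Calcite method): it silently halves `maxScale` until it is strictly less than `maxPrecision`.

```java
// Hypothetical standalone sketch of the loop under review; clampScale is
// not part of Calcite. It silently halves maxScale until it is strictly
// less than maxPrecision, e.g. (19, 19) becomes (19, 9).
public class ScaleClamp {
    static int clampScale(int maxPrecision, int maxScale) {
        while (maxScale >= maxPrecision) {
            maxScale = maxScale / 2;
        }
        return maxScale;
    }

    public static void main(String[] args) {
        System.out.println(clampScale(19, 19)); // prints 9
        System.out.println(clampScale(38, 38)); // prints 19
    }
}
```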
I'm afraid not all the SQL engines have the rule that maxScale = maxScale / 2; we should add a validity check instead of modifying the value silently. And I'm inclined to fix the default max precision/scale for the Calcite type system.
This is also a proposal. I'll take a look at Hive/Spark decimal's max scale/precision, but I'm afraid they may be the same, or they don't define a max but only a default?
Spark
- Default: (10, 0)
- Max: precision can be up to 38; scale can also be up to 38 (less than or equal to precision).
- https://spark.apache.org/docs/latest/api/java/org/apache/spark/sql/types/DecimalType.html

Hive
- Default: (10, 0)
- Max: precision allowed range [1, 38]; scale allowed range [0, 38].
- https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Types
The name "getMaxPrecisionScaleDecimal" suggests it should return a maximal decimal type, but Decimal(max, max) does not mean that.
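To see why, representability can be checked with java.math.BigDecimal (an illustration, not Calcite code): a DECIMAL(p, s) value has p significant digits in total, s of them after the decimal point, so when p == s there is no room for integer digits and only values strictly between -1 and 1 fit.

```java
import java.math.BigDecimal;

// Illustration only (not Calcite code): a value fits DECIMAL(precision,
// scale) if, rescaled to that scale, its total digit count does not
// exceed the precision. With precision == scale, nothing >= 1 fits.
public class DecimalFit {
    static boolean fits(BigDecimal v, int precision, int scale) {
        // setScale throws ArithmeticException if the value would need
        // more fractional digits than the target scale allows.
        return v.setScale(scale).precision() <= precision;
    }

    public static void main(String[] args) {
        System.out.println(fits(new BigDecimal("0.5"), 19, 19)); // true
        System.out.println(fits(new BigDecimal("1.5"), 19, 19)); // false
    }
}
```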
I got the impression that many systems support the max precision/scale as decimal(38, 18).
Hi @danny0405, IMO the specific number here is not so important (like 18 or 19); we just need to return a big decimal here, so can we finalize a number such as Decimal(38, 18)?
cc @julianhyde
Let's just make the valid max scale half of the max precision, i.e. max scale = max precision / 2.
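A minimal sketch of the agreed behavior (the method name mirrors SqlTypeUtil, but this is a simplification, not the actual Calcite implementation): return the type system's max precision together with half of it as the scale, so the result is always a valid decimal.

```java
// Hedged sketch, not the actual Calcite implementation: pair the max
// precision with half of it as the scale, so scale < precision always
// holds and the resulting decimal type is valid.
public class MaxDecimal {
    static int[] maxPrecisionScaleDecimal(int maxPrecision) {
        int maxScale = maxPrecision / 2; // e.g. 19 -> 9, 38 -> 19
        return new int[] {maxPrecision, maxScale};
    }

    public static void main(String[] args) {
        int[] t = maxPrecisionScaleDecimal(19);
        System.out.println(t[0] + ", " + t[1]); // 19, 9
    }
}
```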
OK
Thanks, can you fix the conflicts?
Force-pushed from 857dcc6 to 0ef386c
The tests still fail.
I'm working on this 😂
Force-pushed from 0ef386c to fe24721
```diff
 expr("'12.3' * '5'")
-    .columnType("DECIMAL(19, 19) NOT NULL");
+    .columnType("DECIMAL(19, 18) NOT NULL");
```
```diff
 expr("12.3/'5.1'")
-    .columnType("DECIMAL(19, 0) NOT NULL");
+    .columnType("DECIMAL(19, 8) NOT NULL");
```
Force-pushed from fe24721 to 0c9ec90
…mal that with same precision and scale (Jiatao Tao)
Force-pushed from 0c9ec90 to b0fdb37
Hi @danny0405, the unit tests passed; would you take a look?
+1
…mal that with same precision and scale (Jiatao Tao) SqlTypeUtil#getMaxPrecisionScaleDecimal now returns a decimal type with the max precision and a scale of half that. Previously it returned DECIMAL(19, 19), which is invalid. close apache#2161
Hi @danny0405, can you take a look at this PR?
"maxScale" should not be greater than "maxPrecision". If they are equal, e.g. Decimal(19, 19), we can only have decimal places.