[SPARK-42251][SQL] Forbid decimal type if precision less than 1 #39822
ulysses-you wants to merge 2 commits into apache:master
Conversation
I'm not sure this is worth a legacy config. cc @cloud-fan @viirya @gengliangwang
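For context, here is a hedged sketch of what such a legacy flag would look like if one were added, following the usual pattern inside object SQLConf; the config name and doc text are hypothetical:

```scala
// Hypothetical entry inside org.apache.spark.sql.internal.SQLConf;
// sketched only to weigh what "a legacy config" would entail.
val LEGACY_ALLOW_ZERO_PRECISION_DECIMAL =
  buildConf("spark.sql.legacy.allowZeroPrecisionDecimal")
    .internal()
    .doc("When true, restores the old behavior of accepting decimal " +
      "types with precision less than 1.")
    .booleanConf
    .createWithDefault(false)
```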
how does
Another place to check is the user-specified schema when reading data sources, e.g.
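To make this concrete, here is a hedged sketch of the read path being referred to; the column name and input path are made up for illustration:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// A user-specified schema can smuggle in an invalid precision. With
// this change, decimal(0, 0) should be rejected here up front as well.
val df = spark.read
  .schema("c DECIMAL(0, 0)") // precision 0 is what the PR forbids
  .json("/tmp/input.json")   // hypothetical input path
```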
It's wrapped in Decimal first, which internally uses longVal to represent the decimal value, so it won't fail. The changePrecision method only fails for decimal(0, 0) if the value is backed by decimalVal (a BigDecimal). That's the tricky part: it may hide bugs, since some operators do not fail.
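A minimal sketch of the two storage paths described above, assuming Spark's internal org.apache.spark.sql.types.Decimal API; the results in the comments are the expected behavior, not captured output:

```scala
import org.apache.spark.sql.types.Decimal

// Compact path: a small integral value is stored in longVal, and the
// bounds check in changePrecision(0, 0) lets 0 through.
val compact = Decimal(0L)
println(compact.changePrecision(0, 0)) // expected: true -- no failure

// BigDecimal path: a value backed by decimalVal goes through
// java.math.BigDecimal, whose precision is always >= 1, so a target
// precision of 0 is rejected.
val backed = Decimal(BigDecimal("0.5"))
println(backed.changePrecision(0, 0)) // expected: false -- rejected
```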
This case should be covered. It's similar to
We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable.
What changes were proposed in this pull request?
Throw an exception if the decimal precision is less than 1.
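Concretely, the natural home for such a check is DecimalType construction; this is a hedged sketch of the shape of the guard, with the exception type and message invented for illustration:

```scala
// Sketch only: a guard at DecimalType construction time. The actual
// patch's error class and message may differ.
case class DecimalType(precision: Int, scale: Int) {
  if (precision < 1) {
    throw new IllegalArgumentException(
      s"Decimal precision ($precision) must be at least 1.")
  }
}
```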
Why are the changes needed?
Spark does not actually support decimal types with 0 precision, e.g.
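The example here was truncated; as a hedged reconstruction, the kind of query at issue looks like this (assuming a local SparkSession):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// Before this change the cast is accepted, because the literal 0 rides
// on Decimal's compact longVal path (see the discussion above); paths
// that fall back to BigDecimal can then fail later, hiding the bug.
spark.sql("SELECT cast(0 as decimal(0, 0))").show()
```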
Does this PR introduce any user-facing change?
yes, one main behavior change is that the following query now fails with an exception instead of being accepted:
SELECT cast(0 as decimal(0, 0))
How was this patch tested?
add test
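For reference, a hedged sketch of what such a test could look like against Spark's SQL test harness; the suite name, test name, and exception handling are assumptions, not the actual patch:

```scala
import org.apache.spark.sql.QueryTest
import org.apache.spark.sql.test.SharedSparkSession

class ForbidZeroPrecisionDecimalSuite extends QueryTest with SharedSparkSession {
  test("SPARK-42251: decimal type with precision < 1 is forbidden") {
    // The exact exception type is an assumption; the point is that the
    // invalid type is now rejected instead of silently accepted.
    val e = intercept[Exception] {
      sql("SELECT cast(0 as decimal(0, 0))").collect()
    }
    assert(e.getMessage.toLowerCase.contains("decimal"))
  }
}
```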