
[SPARK-39226][DOCS][FOLLOWUP] Update the migration guide after fixing the precision of the return type of round-like functions #36821

Closed
wants to merge 2 commits

Conversation

wangyum
Member

@wangyum wangyum commented Jun 9, 2022

What changes were proposed in this pull request?

Update the migration guide after fixing the precision of the return type of round-like functions.

How to reproduce this issue:

```sql
-- Spark 3.2
CREATE TABLE t1(CURNCY_AMT DECIMAL(18,6)) USING parquet;
CREATE VIEW v1 AS SELECT BROUND(CURNCY_AMT, 6) AS CURNCY_AMT FROM t1;
```
```sql
-- Spark 3.3
SELECT * FROM v1;
org.apache.spark.sql.AnalysisException: [CANNOT_UP_CAST_DATATYPE] Cannot up cast CURNCY_AMT from "DECIMAL(19,6)" to "DECIMAL(18,6)".
```
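As the migration-guide entry in this PR suggests, affected views need to be recreated on the newer Spark version. A minimal remediation sketch (not part of this patch; it assumes the `t1`/`v1` objects from the reproduce case above) would be:

```sql
-- Spark 3.3: recreate the view so its stored schema picks up the
-- corrected return type of BROUND (DECIMAL(19,6) instead of DECIMAL(18,6))
CREATE OR REPLACE VIEW v1 AS SELECT BROUND(CURNCY_AMT, 6) AS CURNCY_AMT FROM t1;

-- alternatively, keep the view and redefine its query:
-- ALTER VIEW v1 AS SELECT BROUND(CURNCY_AMT, 6) AS CURNCY_AMT FROM t1;

SELECT * FROM v1;  -- no longer fails analysis with CANNOT_UP_CAST_DATATYPE
```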

Why are the changes needed?

Update the migration guide.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

N/A

@github-actions github-actions bot added the DOCS label Jun 9, 2022
@wangyum
Member Author

wangyum commented Jun 9, 2022

cc @cloud-fan @MaxGekk

@wangyum wangyum changed the title [SPARK-39226][SQL][FOLLOWUP] Update the migration guide after fixing the precision of the return type of round-like functions [SPARK-39226][DOCS][FOLLOWUP] Update the migration guide after fixing the precision of the return type of round-like functions Jun 9, 2022
@@ -65,6 +65,8 @@ license: |
- Since Spark 3.3, when reading values from a JSON attribute defined as `FloatType` or `DoubleType`, the strings `"+Infinity"`, `"+INF"`, and `"-INF"` are now parsed to the appropriate values, in addition to the already supported `"Infinity"` and `"-Infinity"` variations. This change was made to improve consistency with Jackson's parsing of the unquoted versions of these values. Also, the `allowNonNumericNumbers` option is now respected so these strings will now be considered invalid if this option is disabled.

- Since Spark 3.3, Spark will try to use built-in data source writer instead of Hive serde in `INSERT OVERWRITE DIRECTORY`. This behavior is effective only if `spark.sql.hive.convertMetastoreParquet` or `spark.sql.hive.convertMetastoreOrc` is enabled respectively for Parquet and ORC formats. To restore the behavior before Spark 3.3, you can set `spark.sql.hive.convertMetastoreInsertDir` to `false`.

- Since Spark 3.3, the precision of the return type of round-like functions has been fixed. This may cause Spark to throw `AnalysisException` of the `CANNOT_UP_CAST_DATATYPE` error class when using views created by prior versions. In such cases, you need to recreate the views using `ALTER VIEW AS` or `CREATE OR REPLACE VIEW AS` with newer Spark versions.
nit: more precisely: ... Spark throw AnalysisException of the CANNOT_UP_CAST_DATATYPE error class

fixed.

@MaxGekk
Member

MaxGekk commented Jun 9, 2022

Failed with:

```
[info] Main Java API documentation to /__w/spark/spark/target/javaunidoc...
[error] /__w/spark/spark/connector/avro/target/java/org/apache/spark/sql/avro/AvroDataToCatalyst.java:3:1:  error: illegal combination of modifiers: abstract and static
[error]   static public abstract  R apply (T1 v1, T2 v2, T3 v3)  ;
[error]                             ^
```

Is this related?

@wangyum
Member Author

wangyum commented Jun 9, 2022

> Failed with:
>
> ```
> [info] Main Java API documentation to /__w/spark/spark/target/javaunidoc...
> [error] /__w/spark/spark/connector/avro/target/java/org/apache/spark/sql/avro/AvroDataToCatalyst.java:3:1:  error: illegal combination of modifiers: abstract and static
> [error]   static public abstract  R apply (T1 v1, T2 v2, T3 v3)  ;
> [error]                             ^
> ```
>
> Is this related?

I do not think it is related.

@MaxGekk
Member

MaxGekk commented Jun 9, 2022

+1, LGTM. Merging to master/3.3.
Thank you, @wangyum and @yaooqinn for review.

MaxGekk pushed a commit that referenced this pull request Jun 9, 2022
… the precision of the return type of round-like functions

### What changes were proposed in this pull request?

Update the migration guide after fixing the precision of the return type of round-like functions.

How to reproduce this issue:
```sql
-- Spark 3.2
CREATE TABLE t1(CURNCY_AMT DECIMAL(18,6)) using parquet;
CREATE VIEW v1 AS SELECT BROUND(CURNCY_AMT, 6) AS CURNCY_AMT FROM t1;
```
```sql
-- Spark 3.3
 SELECT * FROM v1;
org.apache.spark.sql.AnalysisException: [CANNOT_UP_CAST_DATATYPE] Cannot up cast CURNCY_AMT from "DECIMAL(19,6)" to "DECIMAL(18,6)".
```

### Why are the changes needed?

Update the migration guide.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

N/A

Closes #36821 from wangyum/SPARK-39226.

Authored-by: Yuming Wang <yumwang@ebay.com>
Signed-off-by: Max Gekk <max.gekk@gmail.com>
(cherry picked from commit 1053794)
Signed-off-by: Max Gekk <max.gekk@gmail.com>
@MaxGekk MaxGekk closed this in 1053794 Jun 9, 2022
@wangyum wangyum deleted the SPARK-39226 branch June 9, 2022 13:48