Description
Search before asking
- I searched in the issues and found nothing similar.
Motivation
In Spark, inserting a nullable value into a non-null column (a primary key) raises an error:
Cannot write nullable values to non-null column 'pkey'
But we can use nvl or coalesce to work around it. We should add documentation at https://paimon.apache.org/docs/master/how-to/writing-tables/#applying-recordschanges-to-tables to demonstrate this.
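For the docs, a minimal sketch of the failing insert and the workaround might look like this (the table names `pk_table` and `source_table` here are illustrative, not from the issue):

```sql
-- Fails: pk_table.a is NOT NULL (primary key), but source_table.a is nullable
INSERT INTO pk_table SELECT a, b FROM source_table;

-- Works: nvl/coalesce substitutes a default for NULLs, so the projected
-- column is non-null and Spark accepts the write
INSERT INTO pk_table SELECT COALESCE(a, 0), b FROM source_table;
```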
We also need a test case in SparkWriteITCase to cover this, like:

```java
@Test
public void testNonnull() {
    spark.sql(
            "CREATE TABLE T (a INT NOT NULL, b INT, c STRING) TBLPROPERTIES"
                    + " ('primary-key'='a', 'file.format'='avro')");
    spark.sql(
            "CREATE TABLE T2 (a INT, b INT, c STRING) TBLPROPERTIES"
                    + " ('file.format'='avro')");
    spark.sql("INSERT INTO T2 VALUES (1, 11, '111'), (null, 22, '222')").collectAsList();
    Assertions.assertDoesNotThrow(
            () -> spark.sql("INSERT INTO T SELECT nvl(a, 1), b, c FROM T2").collectAsList());
}
```
Solution
No response
Anything else?
No response
Are you willing to submit a PR?
- I'm willing to submit a PR!