lucienoz changed the title from "[SUPPORT] spark sql cow how to set payload class?" to "[SUPPORT] spark sql cow hoodie.datasource.write.payload.class = 'org.apache.hudi.common.model.DefaultHoodieRecordPayload' not work" on Aug 3, 2023.
@lucienoz Sorry for the delay here. I was able to reproduce this issue with Hudi 0.12.3, but it was fixed in the 0.13.1 release. Can you please upgrade to Hudi 0.13.1?
Also, the upcoming 0.14.0 release includes quite a few improvements along the same lines. INSERT INTO will by default behave as the insert operation type and allow all duplicates to flow in. If the INSERT INTO behaviour is configured as upsert, records are combined correctly based on the payload class.
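For reference, a minimal Spark SQL sketch of what is being discussed: setting the payload class on a COW table via table properties (a documented way to pass Hudi write configs from Spark SQL). The table name, columns, and values below are hypothetical; whether INSERT INTO runs as insert (duplicates kept) or upsert (payload class decides the surviving record) depends on the Hudi version and the insert-mode settings, as described above.

```sql
-- Hypothetical COW table; the payload class is set through TBLPROPERTIES.
CREATE TABLE hudi_payload_demo (
  id INT,
  name STRING,
  ts BIGINT
) USING hudi
TBLPROPERTIES (
  type = 'cow',
  primaryKey = 'id',
  preCombineField = 'ts',
  'hoodie.datasource.write.payload.class' = 'org.apache.hudi.common.model.DefaultHoodieRecordPayload'
);

-- When the write is combined as an upsert, DefaultHoodieRecordPayload is
-- expected to keep the record with the larger preCombineField value (ts),
-- so the second insert below should not overwrite the first.
INSERT INTO hudi_payload_demo VALUES (1, 'new', 200);
INSERT INTO hudi_payload_demo VALUES (1, 'old', 100);

SELECT id, name, ts FROM hudi_payload_demo WHERE id = 1;
```

On 0.12.3 this is the kind of scenario where the configured payload class was not honored; on 0.13.1 and later the expected record should survive.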
Tips before filing an issue
Have you gone through our FAQs?
Join the mailing list to engage in conversations and get faster support at dev-subscribe@hudi.apache.org.
If you have triaged this as a bug, then file an issue directly.
Describe the problem you faced
A clear and concise description of the problem.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
A clear and concise description of what you expected to happen.
Environment Description
Hudi version: 0.12.3
Spark version: 3.3.1
Hive version: 3.1.2
Hadoop version: 3.3.5
Storage (HDFS/S3/GCS..): HDFS
Running on Docker? (yes/no): no
Additional context
Add any other context about the problem here.
Stacktrace
Add the stacktrace of the error.