[SUPPORT] Custom HoodieRecordPayload for use in flink sql #7100
Thanks @complone. Tried with that also, but no luck:

CREATE TABLE t1(
uuid VARCHAR(20) PRIMARY KEY NOT ENFORCED,
name VARCHAR(10),
age INT,
ts TIMESTAMP(3),
`partition` VARCHAR(20)
)
PARTITIONED BY (`partition`)
WITH (
'connector' = 'hudi',
'path' = '/tmp/hudi',
'hoodie.compaction.payload.class' = 'gsHudiPoc.Poc', -- My custom class
'write.payload.class' = 'gsHudiPoc.Poc', -- My custom class
'payload.class' = 'gsHudiPoc.Poc', -- My custom class
'hoodie.datasource.write.payload.class' = 'gsHudiPoc.Poc', -- My custom class
'table.type' = 'COPY_ON_WRITE'
);

Let me try looking into the code of FlinkOptions.java.
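A note on the four option keys tried above: the Flink connector only reads the option keys defined in FlinkOptions.java, not the Spark-style `hoodie.datasource.write.*` keys. A minimal sketch of the same table, assuming a Hudi version whose FlinkOptions.java defines the payload option under the key `payload.class` (constant name and key may differ across releases, so check the source for your version) and that the jar containing gsHudiPoc.Poc is on the SQL client classpath:

```sql
-- Sketch: same table, using only the Flink connector's own option key.
-- 'payload.class' is an assumption based on FlinkOptions.java; verify
-- the exact key for your Hudi release.
CREATE TABLE t1(
  uuid VARCHAR(20) PRIMARY KEY NOT ENFORCED,
  name VARCHAR(10),
  age INT,
  ts TIMESTAMP(3),
  `partition` VARCHAR(20)
)
PARTITIONED BY (`partition`)
WITH (
  'connector' = 'hudi',
  'path' = '/tmp/hudi',
  'payload.class' = 'gsHudiPoc.Poc',  -- custom HoodieRecordPayload
  'table.type' = 'COPY_ON_WRITE'
);
```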
@yuzhaojing: can you assist here, please?
Did you try to use the suggestion above? Feel free to re-open this issue if you still have a problem here.
Thanks. I was able to get it to work with the DataStream API.
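For anyone landing here later: the DataStream route the comment above refers to can be wired through Hudi's HoodiePipeline builder (available in recent Hudi releases). This is a hedged sketch, not the commenter's actual code — the class gsHudiPoc.Poc, the table name, and the path are taken from the thread, and the option constant (FlinkOptions.PAYLOAD_CLASS_NAME) should be verified against your Hudi version:

```java
// Sketch: sinking a DataStream<RowData> into a Hudi COW table with a
// custom HoodieRecordPayload. Verify constant names against the
// FlinkOptions.java of your Hudi release.
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.table.data.RowData;
import org.apache.hudi.common.model.HoodieTableType;
import org.apache.hudi.configuration.FlinkOptions;
import org.apache.hudi.util.HoodiePipeline;

import java.util.HashMap;
import java.util.Map;

public class PayloadPipeline {
  public static void sink(DataStream<RowData> input) {
    Map<String, String> options = new HashMap<>();
    options.put(FlinkOptions.PATH.key(), "/tmp/hudi");
    options.put(FlinkOptions.TABLE_TYPE.key(), HoodieTableType.COPY_ON_WRITE.name());
    // Custom payload class; the jar must be on the Flink cluster classpath.
    options.put(FlinkOptions.PAYLOAD_CLASS_NAME.key(), "gsHudiPoc.Poc");

    HoodiePipeline.Builder builder = HoodiePipeline.builder("t1")
        .column("uuid VARCHAR(20)")
        .column("name VARCHAR(10)")
        .column("age INT")
        .column("ts TIMESTAMP(3)")
        .column("`partition` VARCHAR(20)")
        .pk("uuid")
        .partition("partition")
        .options(options);

    builder.sink(input, false); // false: unbounded (streaming) input
  }
}
```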
For a Spark SQL copy-on-write table, how do I set payload.class?
/etc/hudi/conf/hudi-default.conf
I am also passing my custom jar when starting the Flink SQL client.
Any idea what I am doing wrong?
See the listing in the question.