supplying kafka key in toConfluentAvro #6
Comments
Hi there. For now you're right: we currently support only the value column. Key support will land in about two weeks. We'll follow up here as soon as it's committed. Thanks for the question. |
Hi there, pinging to let you know that the key column can now be preserved after consuming from Kafka. News about using it to send messages to Kafka is coming soon. |
Hi there, full support is now provided for Avro serde on keys and values of DataFrames retrieved from Kafka: https://github.com/AbsaOSS/ABRiS#writingreading-keys-and-values-as-avro-from-kafka |
Thanks. Appreciate it a lot. |
Do you mean having plain key and Avro payload in Confluent format? |
Exactly this. |
Not yet, but since you're asking about it, we now have a use case. Will add it and let you know. Thanks for the help. |
Hi @pateusz , pinging to let you know about the plain key feature. You can check more about it here: https://github.com/AbsaOSS/ABRiS#writingreading-values-as-avro-and-plain-keys-as-string-tofrom-kafka Regards. |
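For readers landing here later, the shape of the feature is simple: Spark's Kafka sink writes whatever sits in the `key` and `value` binary/string columns, so a plain key only needs to ride alongside the serialized value. Below is a minimal sketch using Spark's own `to_avro` (Spark 3.x import path) rather than ABRiS — note that Spark's built-in writes plain Avro, not the Confluent wire format ABRiS produces, and the topic, column names, and data here are made up for illustration:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.avro.functions.to_avro  // Spark 3.x package
import org.apache.spark.sql.functions.{col, struct}

val spark = SparkSession.builder().appName("plain-key-avro-value").getOrCreate()
import spark.implicits._

// Hypothetical input: a plain-text key plus fields to serialize as the value.
val input = Seq(("user-1", 42.0), ("user-2", 7.5)).toDF("userId", "amount")

input
  .select(
    col("userId").cast("string").as("key"),                   // plain string key
    to_avro(struct(col("userId"), col("amount"))).as("value") // Avro-encoded value
  )
  .write
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("topic", "payments")                                // illustrative topic
  .save()
```

The Kafka sink picks the `key` and `value` columns up by name, which is why no extra API surface is needed once the key column survives the conversion.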
Rather than adding new methods for each combination of key and value types (for example, say I have Avro keys and integer values), what about a UDF that you can apply at will, like
Similar to the existing |
Hi @Cricket007 , sorry for the late reply. Yep, that is definitely coming since Spark itself had these methods added. We'll be updating the whole API to comply with that standard. |
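The column-expression style referred to above is Spark's `from_avro`/`to_avro`, added in Spark 2.4 and living in `org.apache.spark.sql.avro.functions` as of Spark 3.x. Because they operate per column rather than on the whole DataFrame, any key/value combination falls out for free. A hedged sketch (the schema, topic, and servers are illustrative, and these built-ins do not understand the Confluent magic-byte framing that ABRiS handles):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.avro.functions.from_avro
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().appName("per-column-serde").getOrCreate()

// Illustrative Avro schema for the value column only.
val valueSchema =
  """{"type":"record","name":"Payment","fields":[
    |  {"name":"id","type":"string"},
    |  {"name":"amount","type":"double"}]}""".stripMargin

val decoded = spark.read
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "payments")
  .load()
  .select(
    col("key").cast("string").as("key"),             // leave the key plain
    from_avro(col("value"), valueSchema).as("value") // decode only the value
  )
```

Each column gets its own serde decision, so Avro keys with integer values (or any other mix) need no dedicated method.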
@felipemmelo Would you like me to create a new issue for tracking that request? |
Hi @Cricket007 , if you can, please, so that we keep it documented. Thanks a lot! |
@felipemmelo Done! #16 |
Hey.
Is there any way to specify the Kafka key when using the toConfluentAvro method?
As far as I can see, this method converts the DataFrame to one with only a 'value' column, which makes it impossible to set the Kafka key from one of the input columns. The same seems to happen in fromConfluentAvro: the key column isn't preserved.
Is there any workaround for this?