supplying kafka key in toConfluentAvro #6

Closed

pateusz opened this issue Jun 29, 2018 · 13 comments

pateusz commented Jun 29, 2018

Hey.
Is there any way to specify the Kafka key when using the toConfluentAvro method?
As far as I can see, this method converts the DataFrame into one with only a 'value' column, which makes it impossible to set the Kafka key from one of the input columns. The same happens in fromConfluentAvro - the key column isn't preserved.
Is there any workaround for this?
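For context, a minimal sketch of the write being asked about, using only the plain Spark Kafka sink (broker address, topic, and column names are made up for illustration). Spark's Kafka sink takes the message key and value from DataFrame columns named "key" and "value", so a conversion step that returns a value-only DataFrame leaves no way to set the key:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().appName("kafka-key-sketch").getOrCreate()
import spark.implicits._

// Hypothetical input: one column we'd like to use as the Kafka key, one as the payload.
val df = Seq(("user-1", "payload-1")).toDF("id", "payload")

df.select(
    col("id").cast("string").as("key"),       // becomes the Kafka message key
    col("payload").cast("string").as("value") // becomes the Kafka message value
  )
  .write
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("topic", "my-topic")
  .save()
```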

felipemmelo (Collaborator) commented

Hi there. You're right, for now we only support the value column. Key support should be added in about 2 weeks; we'll follow up here as soon as it's committed. Thanks for the question.

felipemmelo (Collaborator) commented

Hi there, pinging to let you know that the key column can now be preserved after consuming from Kafka. News about using it to send messages to Kafka is coming soon.

felipemmelo (Collaborator) commented

Hi there, full support is now provided for Avro serde on keys and values for DataFrames retrieved from Kafka.

https://github.com/AbsaOSS/ABRiS#writingreading-keys-and-values-as-avro-from-kafka
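For anyone landing here, a rough sketch of the consumer side this refers to (broker and topic names are assumptions): the Kafka source exposes both key and value as binary columns, and the ABRiS serde described in the README section linked above is what decodes each of them from Confluent Avro.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().appName("kafka-avro-read-sketch").getOrCreate()

// Both 'key' and 'value' arrive as Array[Byte]; the ABRiS methods from the
// linked README section are applied on top of this to decode the Avro payloads.
val kafkaDf = spark.read
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "my-topic")
  .load()
  .select(col("key"), col("value"))
```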

pateusz (Author) commented Jul 26, 2018

Thanks. Appreciate it a lot.
One more question: as far as I can tell, I'm able to read a Confluent Avro value + plain key. Is it possible to write a Confluent Avro record in the same way?

felipemmelo (Collaborator) commented

Do you mean having a plain key and an Avro payload in Confluent format?

pateusz (Author) commented Jul 26, 2018 via email

felipemmelo (Collaborator) commented

Not yet, but since you're asking about it we now have a use case. I'll add it and let you know. Thanks for the help.

felipemmelo (Collaborator) commented

Hi @pateusz, pinging to let you know that the plain key feature is now available. You can read more about it here: https://github.com/AbsaOSS/ABRiS#writingreading-values-as-avro-and-plain-keys-as-string-tofrom-kafka

Regards.
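A rough sketch of what the linked section describes on the write side (column names are assumptions, and the Avro encoding of the value would be done by ABRiS as per the README; here it is represented by an already-serialized binary column): the key goes out as a plain string while the value carries the Avro bytes.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().appName("plain-key-avro-value-sketch").getOrCreate()
import spark.implicits._

// Hypothetical input: a plain-text key column plus a value column that stands in
// for a payload already serialized to Confluent Avro by ABRiS.
val df = Seq(("user-1", Array[Byte](0, 1, 2))).toDF("userId", "avroPayload")

df.select(
    col("userId").cast("string").as("key"), // plain string Kafka key
    col("avroPayload").as("value")          // Avro-encoded Kafka value
  )
  .write
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("topic", "my-topic")
  .save()
```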

OneCricketeer commented

Rather than making new methods for each combination of key and value types (for example, say I have Avro keys and integer values), what about a UDF that you can apply at will, like

df.select(from_confluent_avro(col("key")), col("value").cast("int"))

similar to the existing from_json function?
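For comparison, this is the shape of the existing from_json expression the suggestion is modeled on (broker, topic, and the JSON key schema are assumptions; from_confluent_avro above is the proposed function, not an existing one):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, from_json}
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

val spark = SparkSession.builder().appName("per-column-decode-sketch").getOrCreate()

// Assumed topic/broker; the point is that each column gets its own decoder expression.
val kafkaDf = spark.read
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "my-topic")
  .load()

// Assumed schema for a JSON-encoded key; the proposed from_confluent_avro
// would slot into the same position as a per-column expression.
val keySchema = StructType(Seq(
  StructField("id", StringType),
  StructField("score", IntegerType)
))

val decoded = kafkaDf.select(
  from_json(col("key").cast("string"), keySchema).as("key"),
  col("value").cast("string").cast("int").as("value") // binary -> string -> int
)
```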

felipemmelo (Collaborator) commented

Hi @Cricket007, sorry for the late reply. Yep, that is definitely coming, since Spark itself has had these methods added. We'll be updating the whole API to comply with that standard.

OneCricketeer commented

@felipemmelo Would you like me to create a new issue for tracking that request?

felipemmelo (Collaborator) commented

Hi @Cricket007, if you can, please do, so that we keep it documented. Thanks a lot!

OneCricketeer commented

@felipemmelo Done! #16

This issue was closed.