[FLINK-31049] [flink-connector-kafka]Add support for Kafka record headers to KafkaSink #22228
What is the purpose of the change

The default org.apache.flink.connector.kafka.sink.KafkaSink does not support adding Kafka record headers when using KafkaRecordSerializationSchemaBuilder, which is the most convenient way to create a Kafka sink. This PR adds support for Kafka headers to KafkaRecordSerializationSchemaBuilder.

Brief change log
- Added a HeaderProducer interface that allows creating Headers from the input element.
- Extended KafkaRecordSerializationSchemaBuilder to allow setting a HeaderProducer.
- Added a HeaderProducer constructor argument to KafkaRecordSerializationSchemaWrapper, which now uses a ProducerRecord constructor that includes the headers.

Verifying this change
Please make sure both new and modified tests in this PR follow the conventions defined in our code quality guide: https://flink.apache.org/contributing/code-style-and-quality-common.html#testing

This change added tests and can be verified as follows:

- Added tests to KafkaRecordSerializationSchemaBuilderTest.
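To illustrate the idea behind the change, here is a minimal, self-contained sketch of a header-producing hook. The HeaderProducer name is taken from this PR's description; the Header and ProducerRecord types below are simplified stand-ins (not the real Kafka classes), and the serialize helper is hypothetical, so the sketch runs without any Flink or Kafka dependency:

```java
import java.nio.charset.StandardCharsets;
import java.util.List;

public class HeaderProducerSketch {

    /** Simplified stand-in for Kafka's Header: a key plus raw bytes. */
    record Header(String key, byte[] value) {}

    /** Derives per-record headers from the input element, as the PR's HeaderProducer does. */
    @FunctionalInterface
    interface HeaderProducer<IN> {
        List<Header> produceHeaders(IN element);
    }

    /** Simplified stand-in for the ProducerRecord constructor variant that accepts headers. */
    record ProducerRecord(String topic, byte[] value, List<Header> headers) {}

    /**
     * Hypothetical helper mirroring what KafkaRecordSerializationSchemaWrapper now does:
     * serialize the value, then attach the headers produced from the element.
     */
    static <IN> ProducerRecord serialize(String topic, IN element,
                                         HeaderProducer<IN> headerProducer) {
        byte[] value = element.toString().getBytes(StandardCharsets.UTF_8);
        return new ProducerRecord(topic, value, headerProducer.produceHeaders(element));
    }

    public static void main(String[] args) {
        // Example: tag each record with a header derived from the element itself.
        HeaderProducer<Integer> producer = e ->
                List.of(new Header("element-class",
                        e.getClass().getSimpleName().getBytes(StandardCharsets.UTF_8)));
        ProducerRecord rec = serialize("topic", 42, producer);
        System.out.println(rec.headers().get(0).key()); // prints "element-class"
    }
}
```

The key design point is that the header callback receives the input element, so headers can carry per-record metadata (type tags, trace IDs, and so on) rather than only static values.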
Does this pull request potentially affect one of the following parts:

- The public API, i.e., is any changed class annotated with @Public(Evolving): yes

Documentation