
Add support for Avro messages encoded with Confluent's (de)serializer (and other alternative (de)serializers) #55

Closed
neptoon opened this issue Oct 9, 2016

neptoon commented Oct 9, 2016

As described in http://stackoverflow.com/questions/39941497/spring-cloud-stream-kafka-consuming-avro-messages-from-confluent-rest-proxy, the Kafka binder fails to deserialize Avro messages that have been serialized with Confluent's serializer. The issue is that the Confluent serializer prepends a small header to each message (a magic byte followed by a 4-byte schema ID), which is used to retrieve the schema from the Schema Registry.
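
For reference, here is a minimal sketch of that framing, assuming the standard Confluent wire format (one 0x0 magic byte, then a big-endian 4-byte schema ID, then the Avro payload); the class and method names are illustrative only:

```java
import java.nio.ByteBuffer;

// Illustrative only: reads the Confluent wire format described above
// (1 magic byte of 0x0, a 4-byte schema ID, then the Avro-encoded payload).
public final class ConfluentWireFormat {

    private static final int HEADER_SIZE = 5;

    // Extracts the Schema Registry ID from a Confluent-framed message.
    public static int schemaId(byte[] message) {
        ByteBuffer buffer = ByteBuffer.wrap(message);
        byte magic = buffer.get();
        if (magic != 0x0) {
            throw new IllegalArgumentException("Unknown magic byte: " + magic);
        }
        return buffer.getInt();
    }

    // Returns the raw Avro payload that follows the 5-byte header.
    public static byte[] avroPayload(byte[] message) {
        byte[] payload = new byte[message.length - HEADER_SIZE];
        System.arraycopy(message, HEADER_SIZE, payload, 0, payload.length);
        return payload;
    }
}
```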

There seems to be no easy way to configure the bindings to use alternative (de)serializers. From what I understand, the binder hard-codes the use of Kafka's ByteArrayDeserializer (https://github.com/spring-cloud/spring-cloud-stream-binder-kafka/blob/master/spring-cloud-stream-binder-kafka/src/main/java/org/springframework/cloud/stream/binder/kafka/KafkaMessageChannelBinder.java#L254).
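
For comparison, this is roughly how a plain Kafka consumer can plug in Confluent's deserializer directly, which is the kind of configuration the binding does not currently allow; the broker address, group id, registry URL, and topic name below are placeholders:

```java
import java.util.Collections;
import java.util.Properties;

import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ConfluentAvroConsumerSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // placeholder broker
        props.put("group.id", "avro-consumer");                    // placeholder group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        // Confluent's deserializer resolves the writer schema via the registry;
        // this is the kind of setting the binder currently does not let you supply.
        props.put("value.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder registry

        try (KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("sensors")); // placeholder topic
            ConsumerRecords<String, GenericRecord> records = consumer.poll(1000);
            records.forEach(record -> System.out.println(record.value()));
        }
    }
}
```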

mbogoevici added this to the 1.1.1.RELEASE milestone Oct 10, 2016
ilayaperumalg self-assigned this Oct 11, 2016
ilayaperumalg added a commit to ilayaperumalg/spring-cloud-stream-binder-kafka that referenced this issue Oct 11, 2016
 - When binding the consumer, the Kafka consumer should not be hard-coded to use `ByteArrayDeserializer` for both the key and value deserializers. Instead, `ByteArrayDeserializer` should only be the default: any key/value deserializer supplied via the extended consumer properties should override it (see the sketch after this commit message).

 - Add test

This resolves spring-cloud#55
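
A minimal sketch of that default-plus-override idea, not the binder's actual implementation; the helper name and map-based approach are illustrative:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

// Illustrative helper (not the binder's actual code): ByteArrayDeserializer is
// only a default and is overridden by any user-supplied consumer configuration.
public final class DeserializerDefaults {

    public static Map<String, Object> withDefaults(Map<String, Object> userConfig) {
        Map<String, Object> config = new HashMap<>();
        // Defaults first...
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class);
        // ...then user-supplied properties (e.g. value.deserializer =
        // io.confluent.kafka.serializers.KafkaAvroDeserializer) win over them.
        config.putAll(userConfig);
        return config;
    }
}
```
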
ilayaperumalg added a commit to ilayaperumalg/spring-cloud-stream-binder-kafka that referenced this issue Nov 7, 2016

Add tests for custom/native serialization

 - Test using built-in serialization without Kafka native serialization (i.e. both the serializer and deserializer are left as ByteArraySerializer/ByteArrayDeserializer)
 - Test using a custom serializer/deserializer by explicitly setting the value serializer/deserializer in both the Kafka producer and consumer properties
 - Test Avro message conversion and the Kafka Avro serializer using the Confluent Schema Registry (see the producer sketch below)
     - Since pre-release registry versions have bugs that block this testing and `3.0.1` requires Kafka 0.10, this test is included only in the Kafka 0.10 binder
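
For reference, here is a rough sketch of how a producer is wired to Confluent's Avro serializer and Schema Registry in a scenario like the one tested above; the schema, topic, broker, and registry URL are placeholders:

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ConfluentAvroProducerSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // placeholder broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's serializer registers the schema and writes the
        // magic-byte/schema-ID header in front of the Avro payload.
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder registry

        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Sensor\",\"fields\":"
                + "[{\"name\":\"id\",\"type\":\"string\"}]}");      // placeholder schema
        GenericRecord record = new GenericData.Record(schema);
        record.put("id", "sensor-1");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("sensors", record)); // placeholder topic
        }
    }
}
```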

Update schema-based custom serializer/deserializer tests

Fix import