
"Unknown magic byte!" when deserializing avro message with TopicRecordNameStrategy #231

Closed
aeneasb opened this issue Nov 23, 2020 · 2 comments


@aeneasb

aeneasb commented Nov 23, 2020

Hi guys,
I'm trying to deserialize an Avro message using a custom message format (see screenshot for the configuration). To use multiple Avro schemas per Kafka topic, I use the "TopicRecordNameStrategy" as the subject name strategy. Hence, my subjects in the schema registry are named [topic-name]-[record-name].
[screenshot: message format configuration]
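For reference, this is roughly how the producer side is configured when using this strategy with the Confluent serializer (a minimal sketch; the broker and registry URLs are placeholders, and the example subject name is derived from my topic and record names):

```java
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import io.confluent.kafka.serializers.subject.TopicRecordNameStrategy;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class AvroProducerConfig {
    static Properties withTopicRecordNameStrategy() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");      // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName());
        props.put("schema.registry.url", "http://localhost:8081");                 // placeholder registry
        // Subjects are registered as "<topic>-<fully qualified record name>",
        // e.g. "usermanagement.user.avro.v2-UserManagement.UserRegistered".
        props.put("value.subject.name.strategy", TopicRecordNameStrategy.class.getName());
        return props;
    }
}
```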
However, when I open a view in the webview console I get the following error:
Error org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition usermanagement.user.avro.v2-20 at offset 0. If needed, please seek past the record to continue consumption.
java.util.concurrent.CompletionException thrown at java.util.concurrent.CompletableFuture::encodeThrowable ->
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition usermanagement.user.avro.v2-20 at offset 0. If needed, please seek past the record to continue consumption. ->
org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1 ->
org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
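As far as I understand, "Unknown magic byte!" means the Confluent deserializer did not find its wire-format header (a single 0x00 magic byte followed by a 4-byte schema ID) at the start of the record bytes. A minimal sketch for inspecting the raw bytes at the failing offset (broker address is a placeholder; topic, partition, and offset are taken from the error above):

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

import java.nio.ByteBuffer;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class MagicByteCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");                       // placeholder broker
        props.put("key.deserializer", ByteArrayDeserializer.class.getName());
        props.put("value.deserializer", ByteArrayDeserializer.class.getName());

        // Partition 20, offset 0, as reported in the error message.
        TopicPartition tp = new TopicPartition("usermanagement.user.avro.v2", 20);
        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.assign(Collections.singletonList(tp));
            consumer.seek(tp, 0L);
            ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<byte[], byte[]> record : records) {
                byte[] value = record.value();
                // Confluent wire format: 0x00 magic byte, 4-byte schema ID, then the Avro payload.
                if (value != null && value.length >= 5 && value[0] == 0x00) {
                    int schemaId = ByteBuffer.wrap(value, 1, 4).getInt();
                    System.out.println("Looks like Confluent-framed Avro, schema id " + schemaId);
                } else {
                    System.out.println("Not Confluent-framed Avro (no 0x00 magic byte)");
                }
                break; // only inspect the first record at that offset
            }
        }
    }
}
```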

This is what my Avro schema looks like:
{ "namespace": "UserManagement", "type": "record", "name": "UserRegistered", "fields": [ { "name": "Metadata", "doc": "Mandatory metadata for all event types", "type": { "namespace": "Kafka.Schemas.Common", "name": "Metadata", "type": "record", "fields": [ { "name": "AggregateId", "type": "string", "doc": "Identifies the aggregate taking into account that this aggregate could be emitted / created from different producers. Therefore it is essential that producers use unique id's." }, { "name": "TraceId", "type": "string", "doc": "UUID according to ISO/IEC 11578:1996. The TraceId identifies the event uniquely and must be logged and transported to all processing entities in the context of this event. Format pattern: Pattern.compile('([a-f0-9]{8}(-[a-f0-9]{4}){3}-[a-f0-9]{12})'" }, { "doc": "Timestamp of message on the global timeline, independent of a particular time zone or calendar, with a precision of one microseconds.", "name": "Timestamp", "type": { "type": "long", "logicalType": "timestamp-micros" } } ] } }, { "doc": "Subject Identifier. A locally unique and never reassigned identifier within the Issuer for the End-User, which is intended to be consumed by the Client", "name": "Sub", "type": "string" }, { "doc": "Shorthand name by which the End-User wishes to be referred to at the RP, such as janedoe or j.doe. This value MAY be any valid JSON string including special characters such as @, /, or whitespace", "name": "PreferredUserName", "type": "string" }, { "doc": "Verifiable Identifier for an Issuer. An Issuer Identifier is a case sensitive URL using the https scheme that contains scheme, host, and optionally, port number and path components and no query or fragment components", "name": "IssuerIdentifier", "type": "string" }, { "doc": "End-Users preferred e-mail address", "name": "Email", "type": "string" }, { "doc": "Given name(s) or first name(s) of the End-User", "name": "GivenName", "type": [ "string", "null" ] }, { "doc": "Surname(s) or last name(s) of the End-User", "name": "FamilyName", "type": [ "string", "null" ] }, { "doc": "End-Users preferred telephone number", "name": "PhoneNumber", "type": [ "string", "null" ] } ] }

@Crim
Collaborator

Crim commented Nov 24, 2020

Unfortunately I'm not much of an Avro or Schema Registry expert. This error comes up in Google searches; it might be good to verify that the record in Kafka is actually a serialized Avro record.

Have you verified that consuming this offset with the kafka-console-consumer works OK?

@aeneasb aeneasb closed this as completed Nov 27, 2020
@aeneasb
Author

aeneasb commented Nov 27, 2020

The problem was that I had set the view's "Message Format for Keys" to my custom message format instead of string. Now it works.
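For anyone hitting the same thing: the keys on the topic are plain strings and only the values are Confluent-framed Avro, so keys and values need different formats. The equivalent configuration in a plain Java consumer would look roughly like this (a sketch; the broker and registry URLs are placeholders):

```java
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.util.Properties;

public class AvroConsumerConfig {
    // Keys are plain strings, values are Confluent-framed Avro.
    static Properties mixedKeyValueFormats() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "example-group");             // placeholder group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class.getName());
        props.put("schema.registry.url", "http://localhost:8081");              // placeholder registry
        return props;
    }
}
```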
