
Leverage schema registry when deserializing protobuf messages for viewing #172

Closed
atoom opened this issue Feb 5, 2021 · 6 comments
Labels: backend, feature (New feature or request)


atoom commented Feb 5, 2021

To deserialize protobuf messages, we currently have to add a reference to a remote Git repository and add explicit mappings between a topic and a .proto file.

The latest version of Confluent's Kafka Schema Registry supports Protobuf in addition to Avro. It would be nice if Kowl could automatically try to fetch the .proto files from the schema registry, via the message's schema ID, when viewing protobuf-encoded messages.
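For context, messages produced through Confluent's schema-registry-aware serializers carry a small framing header that contains the schema ID the request above refers to: a zero magic byte followed by the schema ID as a big-endian uint32, then the format-specific payload. A minimal Go sketch of extracting that ID (parseSchemaID is a hypothetical helper name, and the sample bytes are made up for illustration):

```go
package main

import (
	"encoding/binary"
	"errors"
	"fmt"
)

// parseSchemaID extracts the schema registry ID from a Kafka record value,
// assuming the standard Confluent wire format:
//   byte 0      magic byte (always 0)
//   bytes 1..4  schema ID, big-endian uint32
//   bytes 5..   serialized payload (for protobuf, preceded by message indexes)
func parseSchemaID(record []byte) (uint32, error) {
	if len(record) < 5 {
		return 0, errors.New("record too short for schema-registry framing")
	}
	if record[0] != 0 {
		return 0, errors.New("unknown magic byte; not schema-registry framed")
	}
	return binary.BigEndian.Uint32(record[1:5]), nil
}

func main() {
	record := []byte{0, 0, 0, 0, 42, 0xde, 0xad} // schema ID 42, then payload
	id, err := parseSchemaID(record)
	fmt.Println(id, err) // prints "42 <nil>"
}
```

With the ID in hand, a viewer like Kowl could look up the registered .proto schema and decode the remaining payload, which is what this issue asks for.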

atoom changed the title from "Leverage schema registry when deserializing messages for viewing" to "Leverage schema registry when deserializing protobuf messages for viewing" on Feb 5, 2021
weeco added the backend and feature (New feature or request) labels on Feb 6, 2021

sneko commented Feb 17, 2021

@weeco I'm also interested in this. Is it on your roadmap?

Also, about the mapping: what do you recommend when a single topic carries multiple types of messages (multiple schemas)? In my code I can distinguish them thanks to a subproperty of the key (e.g. a key.type property), but I don't know whether Kowl can do this on the fly?

Thank you,


sneko commented Feb 17, 2021

@weeco according to https://www.confluent.fr/blog/multiple-event-types-in-the-same-kafka-topic/#json-schema-and-protobuf-with-schema-references, one idea would be to have a "master schema" per topic that wraps the individual schemas. This master schema would look like:

message UserEvent {
  oneof oneof_type {
    UserCreated created = 1;
    UserUpdated updated = 2;
    UserRemoved removed = 3;
  }
}

That way I could map topics 1-to-1 in Kowl, and Kowl would deal with the oneof on its own?

Does it make sense to you?

EDIT: I'm not sure this would work, since in my Kafka the schemas are registered directly as UserCreated, UserUpdated, etc. for the message value, not under the "master schema". I don't think Kowl can deal with that 😢?

weeco (Contributor) commented Feb 17, 2021

Adding Protobuf support using the schema registry is planned, but I'm not sure which release we'll be able to ship it in. Multiple/dynamic types for a key/value in a single topic sounds trickier to solve, and I'm not sure we want to go that route with Kowl. Waiting for more feedback here.


saryeHaddadi commented Apr 17, 2021

Our company uses C# as its main development language, so we definitely can't use Avro (its C# support is bad/buggy). We are planning to use Protobuf as our serialization format. I am looking for developer tooling and found Kowl interesting (nice UI, and it seems to have all the basic features for development), but before opening a feature request myself, I searched for this issue.

Indeed, in terms of developer and Day-1 experience, having Kowl work out of the box with Protobuf, by leveraging the schema registry and without any further maintenance, is desirable. As far as I'm concerned, the only feature I'm still searching for today is an easy way for our developers to inspect protobuf messages.

Then, if needed, we can always address tooling for administrators separately from developer tooling.

weeco (Contributor) commented Apr 25, 2021

Hey everyone,
I looked at the Java code and what it does for the serialization and deserialization part. Unfortunately, this implementation does not yet exist in any Go client, but I'd be happy to implement it in Kowl. To validate the deserialization results, I'd like to test against a few serialized messages that use the schema registry and were produced with one of the Java clients.

Is someone able to help out with this? Ideally hit me up via Discord, that would be awesome :)

EDIT: sarye helped out, thank you! :)

For reference:
Deserialize method: https://github.com/confluentinc/schema-registry/blob/c15fd4a1b7b641fe3072ad04e2d17931ffc61ee5/protobuf-serializer/src/main/java/io/confluent/kafka/serializers/protobuf/AbstractKafkaProtobufDeserializer.java#L106-L178
Serialize method: https://github.com/confluentinc/schema-registry/blob/c15fd4a1b7b641fe3072ad04e2d17931ffc61ee5/protobuf-serializer/src/main/java/io/confluent/kafka/serializers/protobuf/AbstractKafkaProtobufSerializer.java#L101-L108
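One detail the linked Java deserializer handles that a plain schema-ID lookup does not: after the 5-byte header, the protobuf serializer writes a message-index array identifying which message type inside the .proto file was used. A hedged Go sketch of that decoding step, assuming the indexes are written as zigzag varints (as the Java ByteUtils.writeVarint used by the linked code appears to do, which matches Go's binary.Varint), with the common [0] case optimized to a single zero byte; decodeMessageIndexes is a hypothetical helper name:

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// decodeMessageIndexes reads the message-index array that the Confluent
// protobuf serializer writes after the magic byte and schema ID. Assumed
// layout: a zigzag-varint count followed by that many zigzag-varint
// indexes; an array of exactly [0] is encoded as a single 0 byte.
// It returns the indexes and the number of bytes consumed.
func decodeMessageIndexes(buf []byte) ([]int64, int) {
	count, n := binary.Varint(buf) // zigzag-encoded signed varint
	if count == 0 {
		return []int64{0}, n // optimized single-byte case for the first message type
	}
	indexes := make([]int64, 0, count)
	off := n
	for i := int64(0); i < count; i++ {
		v, m := binary.Varint(buf[off:])
		indexes = append(indexes, v)
		off += m
	}
	return indexes, off
}

func main() {
	fmt.Println(decodeMessageIndexes([]byte{0}))       // prints "[0] 1"
	fmt.Println(decodeMessageIndexes([]byte{4, 2, 6})) // prints "[1 3] 3"
}
```

The indexes walk the nesting of message declarations in the registered .proto file (e.g. [1] means "the second top-level message"), which is how the deserializer picks the concrete type before decoding the remaining bytes.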

weeco (Contributor) commented Apr 29, 2021

Fixed via PR #204

weeco closed this as completed on Apr 29, 2021