Interoperability with confluent kafka + schema registry + container header #22
Comments
I'm not sure I follow what you're asking. You can generate container files using a …

How are you talking to Kafka? Are you using the REST proxy?
Thanks for your reply, …
I see. I'm not familiar with …

Edit: Could you share the code you have so far?
I might not be explaining myself. The error is on the receiving end. If you …
The Confluent stack adds a header to every single Avro message:
You can get an idea of how deserialization works here. As far as I know, avsc is only about Avro, and if you need to integrate with Confluent's stack, you need to handle the specific header yourself. I may be mistaken, but I think Avro's "Object Container Files" are another topic.
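Concretely, the header discussed here is a single magic byte (0) followed by a 4-byte big-endian schema ID, with the raw Avro binary payload after it. A minimal framing sketch using only Node's `Buffer` (the helper names `frameMessage` and `parseMessage` are illustrative, not part of avsc or Confluent's libraries):

```javascript
// Wrap an already-encoded Avro payload in Confluent's wire format:
// [magic byte 0x00][4-byte big-endian schema ID][Avro binary payload].
function frameMessage(payload, schemaId) {
  var buf = new Buffer(5 + payload.length);
  buf[0] = 0; // Magic byte.
  buf.writeInt32BE(schemaId, 1);
  payload.copy(buf, 5);
  return buf;
}

// Split a framed message back into its schema ID and Avro payload.
function parseMessage(buf) {
  if (buf[0] !== 0) {
    throw new Error('unknown magic byte: ' + buf[0]);
  }
  return {schemaId: buf.readInt32BE(1), payload: buf.slice(5)};
}
```

The payload here is whatever `type.toBuffer(val)` (or `type.encode`) produced; the framing itself knows nothing about Avro.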
Awesome, thanks!
Thanks @gabrielnau, this makes sense. @cromestant - you should then be able to generate a valid message, for example as follows:

```js
/**
 * Encode an Avro value into a message, as expected by Confluent's Kafka Avro
 * deserializer.
 *
 * @param val {...} The Avro value to encode.
 * @param type {Type} Your value's Avro type.
 * @param schemaId {Integer} Your schema's ID (inside the registry).
 * @param length {Integer} Optional initial buffer length. Set it high enough
 * to avoid having to resize. Defaults to 1024.
 */
function toMessageBuffer(val, type, schemaId, length) {
  length = length || 1024;
  var buf = new Buffer(length);
  buf[0] = 0; // Magic byte.
  buf.writeInt32BE(schemaId, 1);
  var pos = type.encode(val, buf, 5);
  if (pos < 0) {
    // The buffer was too short, we need to resize (`-pos` is the number of
    // missing bytes).
    return toMessageBuffer(val, type, schemaId, length - pos);
  }
  return buf.slice(0, pos);
}
```

Sample usage:

```js
var avsc = require('avsc');

var type = avsc.parse('string');
var buf = toMessageBuffer('hello', type, 1); // Assuming 1 is your schema's ID.
```
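For the receiving side, the same 5-byte header just needs to be stripped before decoding. A hedged sketch (`fromMessageBuffer` is not part of avsc; it assumes `type.decode(buf, pos)` returning a `{value, offset}` object, as avsc's `Type#decode` does):

```javascript
// Decode a Confluent-framed message: check the magic byte, read the schema
// ID, then hand the rest of the buffer to the Avro type's decoder.
function fromMessageBuffer(buf, type) {
  if (buf[0] !== 0) {
    throw new Error('not a Confluent-framed message');
  }
  var schemaId = buf.readInt32BE(1);
  var res = type.decode(buf, 5); // Assumed to return {value, offset}.
  return {schemaId: schemaId, value: res.value};
}
```

In practice you would look the type up in a cache keyed by `schemaId` (fetching the schema from the registry on a miss) before calling this.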
You guys are awesome. I'll test all of this in January when back from …
Found this while investigating the hassle of switching to Avro, and it made me smile - very nice work 👍 Our Java guys are already using Avro and they love it, especially together with the schema registry. Would be nice to hear from @cromestant how it worked out - maybe you want to share some insights?
Sorry. The truth is that we have not been able to pick it up yet. …
I've finally been able to test this, and it works like a charm.
Marking this as closed, as it is working.
Hello, I'm working with a Confluent Kafka deployment, using avsc as the publisher in many cases. However, I am running into some problems when trying to use schema validation and the schema registry.
I am wondering whether I have not found the correct way to do this in avsc, or whether it is not yet compatible. In essence, if you look at this, you will see that along with the serialized data they include the schema ID in the payload, so that the client can then query the schema registry for the schema it needs to deserialize.
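(For context: the lookup the consumer performs against Confluent's Schema Registry is a plain HTTP `GET /schemas/ids/{id}`. A tiny helper to build that URL - the helper name, and the localhost address in the usage note, are illustrative:)

```javascript
// Build the Schema Registry URL that returns the schema for a given ID
// (Confluent's REST API: GET /schemas/ids/{id}).
function schemaUrl(registryUrl, schemaId) {
  return registryUrl.replace(/\/$/, '') + '/schemas/ids/' + schemaId;
}
```

For example, `schemaUrl('http://localhost:8081', 42)` yields `http://localhost:8081/schemas/ids/42`; the response body contains the schema as a JSON-escaped string.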
Also, looking at this, it mentions that: …
If I run the Java samples included in the distribution, I get no problems; however, when serializing with avsc I only get the serialized payload and not the header (thus getting a "magic byte missing" error).
Is this something avsc is compatible with? Or is there a way to have avsc include the schema ID in the header, as described here and here?
Thank you in advance for your reply.