Hi - I need to serialize data via the Apicurio serdes but then deserialize it with a different client library (I'm serializing in Java but deserializing in node.js).
I know that when deserializing I need to account for the magic byte and the global ID, but it seems that the actual serialized data is missing bytes that other Avro libraries expect.
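For context, this is roughly how I strip that header before decoding; shown here as a Java sketch for clarity, assuming Apicurio's default framing of a single 0x0 magic byte followed by an 8-byte big-endian globalId (the class and method names are mine):

import java.nio.ByteBuffer;

public class ApicurioFraming {
    // Strip Apicurio's default payload framing (1 magic byte + 8-byte
    // big-endian globalId) and return the raw Avro binary that follows.
    public static byte[] stripHeader(byte[] payload) {
        ByteBuffer buf = ByteBuffer.wrap(payload);
        byte magic = buf.get();          // expected to be 0x0
        if (magic != 0x0) {
            throw new IllegalArgumentException("Unexpected magic byte: " + magic);
        }
        long globalId = buf.getLong();   // the schema's globalId in the registry
        byte[] avroBytes = new byte[buf.remaining()];
        buf.get(avroBytes);              // remaining bytes: raw Avro data
        return avroBytes;
    }
}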
For example, this is a data object serialized with another Java Avro library: [2,2,8,84,101,115,116,22,72,101,108,108,111,32,87,111,114,108,100,0,2,44,77,121,32,80,114,111,100,117,99,116,32,68,101,115,99,114,105,112,116,105,111,110,2,26,77,121,32,80,114,111,100,117,99,116,32,73,68,2,2,6,69,85,82]
The same schema/data serialized with a node.js Avro library: [2,2,8,84,101,115,116,22,72,101,108,108,111,32,87,111,114,108,100,0,2,44,77,121,32,80,114,111,100,117,99,116,32,68,101,115,99,114,105,112,116,105,111,110,2,26,77,121,32,80,114,111,100,117,99,116,32,73,68,2,2,6,69,85,82]
Now using AvroKafkaSerializer with the same schema/data, and parsing out the magic byte and global id, I get: [2,8,84,101,115,116,22,72,101,108,108,111,32,87,111,114,108,100,0,44,77,121,32,80,114,111,100,117,99,116,32,68,101,115,99,114,105,112,116,105,111,110,26,77,121,32,80,114,111,100,117,99,116,32,73,68,6,69,85,82]
It's missing 5 instances of the byte value '2'.
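If I read the Avro spec correctly, those missing bytes look like union branch indexes: a union is encoded as a zigzag-varint index followed by the value, so branch 1 of ["null", "string"] encodes as the single byte 2. A minimal plain-Avro check (hypothetical record/field names, just to illustrate the prefix):

import java.io.ByteArrayOutputStream;
import java.util.Arrays;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;

public class UnionPrefixDemo {
    public static void main(String[] args) throws Exception {
        // One optional-string field, like the unions in my schema below.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"R\",\"fields\":"
            + "[{\"name\":\"s\",\"type\":[\"null\",\"string\"]}]}");
        GenericRecord rec = new GenericData.Record(schema);
        rec.put("s", "EUR");

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder enc = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(rec, enc);
        enc.flush();
        // Prints [2, 6, 69, 85, 82]:
        //   2 = zigzag(1), union branch 1 ("string")
        //   6 = zigzag(3), string length; 69,85,82 = "EUR"
        System.out.println(Arrays.toString(out.toByteArray()));
    }
}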
I'm not sure I understand this fully.
So the serialized data from AvroKafkaSerializer did contain the magic byte and the globalId, but you removed them, and the resulting bytes are not equal to the output of the other libraries (I'm assuming those other libraries don't have any schema-registry-related functionality).
To analyze this better, it would be helpful if you could share a specific example or reproducer, so we can try it out and compare to find out what is going on.
Thank you @johnkbarrow
Sure, here is a reproducer. This is my schema:
{ "type" : "record", "name" : "ProductDAO", "namespace" : "com.xxx.testObjects", "fields" : [ { "name" : "customerProperties", "type" : [ "null", { "type" : "map", "values" : "string" } ] }, { "name" : "description", "type" : [ "null", "string" ] }, { "name" : "id", "type" : [ "null", "string" ] }, { "name" : "price", "type" : [ "null", { "type" : "record", "name" : "Price", "fields" : [ { "name" : "currency", "type" : [ "null", "string" ] } ] } ] } ] }
and my sample data:
{ "id": "My Product ID", "description": "My Product Description", "customerProperties": { "Test": "Hello World" }, "price": { "currency": "EUR" } }
Any idea why this is happening and what I am doing wrong?
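For what it's worth, the non-Apicurio byte arrays above came from plain Avro, roughly like this (a sketch; it assumes the schema above is saved as product.avsc, and the class name is mine). Note the schema has five union-typed fields (customerProperties, description, id, price, and Price.currency), which matches the five missing 2s:

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.util.Arrays;
import java.util.Map;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;

public class PlainAvroRepro {
    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(new File("product.avsc"));
        // The "price" field is a union ["null", Price]; branch 1 is the record.
        Schema priceSchema = schema.getField("price").schema().getTypes().get(1);

        GenericRecord price = new GenericData.Record(priceSchema);
        price.put("currency", "EUR");

        GenericRecord product = new GenericData.Record(schema);
        product.put("customerProperties", Map.of("Test", "Hello World"));
        product.put("description", "My Product Description");
        product.put("id", "My Product ID");
        product.put("price", price);

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder enc = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(product, enc);
        enc.flush();
        // Should print the same [2,2,8,84,...] array as above.
        System.out.println(Arrays.toString(out.toByteArray()));
    }
}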