Setup Avro schemas #15
Notes on schema registry integration
Avro producer example
Example of Akka Streams Avro record production (without using case classes). Here is a snippet of my Scala code that failed:
```scala
import akka.kafka.ProducerSettings
import akka.kafka.scaladsl.Producer
import akka.stream.scaladsl.Source
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient
import org.apache.avro.Schema
import org.apache.avro.generic.GenericData
import org.apache.kafka.clients.producer.ProducerRecord

// My simple Avro schema
val key = "key1"
val f1 = "str1"
val f2 = "str2"
val f3 = "int1"
val userSchema = """
  {
    "fields": [
      { "name": "str1", "type": "string" },
      { "name": "str2", "type": "string" },
      { "name": "int1", "type": "int" }
    ],
    "name": "myrecord",
    "type": "record"
  }"""
val parser = new Schema.Parser
val schema = parser.parse(userSchema)

// My Schema Registry client; note this single record instance is
// shared and mutated across all produced messages
private val avroRecord: GenericData.Record = new GenericData.Record(schema)
val client = new CachedSchemaRegistryClient("http://hostname:8181", 1000)

// My Kafka producer settings with Akka Streams Kafka
val producerSettings =
  ProducerSettings(context.system,
    new io.confluent.kafka.serializers.KafkaAvroSerializer(client),
    new io.confluent.kafka.serializers.KafkaAvroSerializer(client))
    .withBootstrapServers("sherpavm:6667")
    .withProperty("schema.registry.url", "http://sherpavm:8181")

// My producer function with Akka Streams Kafka
// (topic is defined elsewhere in the enclosing actor)
def testProducer(epoch: Int) = Source(epoch to epoch + 20)
  .map { i =>
    addRecords(avroRecord, s"st1-1-$i", s"st1-2-$i", i)
    new ProducerRecord[Object, Object](topic, key, avroRecord)
  }
  .runWith(Producer.plainSink(producerSettings))

def addRecords(avroRecord: GenericData.Record, str1: String, str2: String, int1: Int) = {
  if (str1 != null) avroRecord.put(f1, str1)
  if (str2 != null) avroRecord.put(f2, str2)
  avroRecord.put(f3, int1)
}
```
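One plausible cause of the failure (an assumption, not confirmed in this thread) is that the single `avroRecord` instance is mutated on every element while the asynchronous producer may still be serializing earlier references to it. A minimal sketch of a fix, reusing `schema`, `key`, `topic`, and `producerSettings` from the snippet above, builds a fresh record per message:

```scala
// Sketch only: build a fresh GenericData.Record per element instead of
// mutating one shared instance that in-flight sends may still reference
def testProducer(epoch: Int) = Source(epoch to epoch + 20)
  .map { i =>
    val record = new GenericData.Record(schema) // new record per message
    record.put(f1, s"st1-1-$i")
    record.put(f2, s"st1-2-$i")
    record.put(f3, i)
    new ProducerRecord[Object, Object](topic, key, record)
  }
  .runWith(Producer.plainSink(producerSettings))
```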
… for automatically generating case classes from schema
Problems: not sure how to handle reading with custom readers (projections). The functionality is provided, but it is not clear how to get it to work automatically when passing the decoder. We might have to extend the …
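For reference, a minimal sketch of how a projection works in plain Avro (the projection schema below is hypothetical, derived from the `myrecord` schema above): pass both the writer and the reader schema to a `GenericDatumReader`, and fields absent from the reader schema are skipped.

```scala
import org.apache.avro.Schema
import org.apache.avro.generic.{GenericDatumReader, GenericRecord}
import org.apache.avro.io.DecoderFactory

// Reader schema that projects only the fields we care about
// (hypothetical projection of the "myrecord" schema above)
val projectionSchema = new Schema.Parser().parse("""
  {
    "type": "record",
    "name": "myrecord",
    "fields": [ { "name": "str1", "type": "string" } ]
  }""")

// writerSchema is the schema the bytes were written with; the reader
// resolves the two schemas and drops fields missing from the projection
def readProjected(bytes: Array[Byte], writerSchema: Schema): GenericRecord = {
  val reader  = new GenericDatumReader[GenericRecord](writerSchema, projectionSchema)
  val decoder = DecoderFactory.get().binaryDecoder(bytes, null)
  reader.read(null, decoder)
}
```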
Schema evolution rules
https://docs.oracle.com/cd/E26161_02/html/GettingStartedGuide/schemaevolution.html
Aliases: it seems the Schema Registry doesn't accept aliases as being backwards compatible: https://groups.google.com/forum/#!topic/confluent-platform/0NlrxFD5FHk
Example of writing our own deserializer
http://stackoverflow.com/questions/36697350/avro-with-kafka-deserializing-with-changing-schema. We might use this to get a GenericRecord instead of an IndexedRecord or something like that.
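A sketch of such a deserializer, assuming we wrap Confluent's `KafkaAvroDeserializer` and simply narrow its `Object` result to `GenericRecord` (the class name is ours, not from the thread):

```scala
import java.util
import io.confluent.kafka.serializers.KafkaAvroDeserializer
import org.apache.avro.generic.GenericRecord
import org.apache.kafka.common.serialization.Deserializer

// Sketch: delegate to Confluent's deserializer, which looks up the writer
// schema in the registry, and expose the result as GenericRecord
class GenericRecordDeserializer extends Deserializer[GenericRecord] {
  private val inner = new KafkaAvroDeserializer()

  override def configure(configs: util.Map[String, _], isKey: Boolean): Unit =
    inner.configure(configs, isKey)

  override def deserialize(topic: String, data: Array[Byte]): GenericRecord =
    inner.deserialize(topic, data).asInstanceOf[GenericRecord]

  override def close(): Unit = inner.close()
}
```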
Compilation error bug
It seems the class ImageRequestDeserializer on the UI backend is breaking compilation for some reason; we need to remove it and investigate.
…g deserialized with old schema, we want to upgrade it to the new schema
…schema, refactored processor code
…e64 encoding to json image content pushed back to client
We want:
- Frontend:
- UI-Backend:
- Image Processor:
This will always use V2 of the schema to read the data, with Avro schema evolution filling the gap for V1 produced data.
So what we will test initially is the case of "old producers": producers pushing data into the stream with an older version of the schema, while consumers can still read the data properly. A sketch of that test follows.
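A minimal sketch of that test with plain Avro, using hypothetical V1/V2 schemas where V2 adds a field with a default (the mechanism that lets a V2 reader consume V1-produced data):

```scala
import java.io.ByteArrayOutputStream
import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericDatumReader, GenericDatumWriter, GenericRecord}
import org.apache.avro.io.{DecoderFactory, EncoderFactory}

// Hypothetical V1 and V2 of a record; V2 adds "tag" with a default
val v1 = new Schema.Parser().parse(
  """{ "type": "record", "name": "myrecord",
       "fields": [ { "name": "str1", "type": "string" } ] }""")
val v2 = new Schema.Parser().parse(
  """{ "type": "record", "name": "myrecord",
       "fields": [ { "name": "str1", "type": "string" },
                   { "name": "tag",  "type": "string", "default": "none" } ] }""")

// Old producer: serialize with V1
val rec = new GenericData.Record(v1)
rec.put("str1", "hello")
val out = new ByteArrayOutputStream()
val encoder = EncoderFactory.get().binaryEncoder(out, null)
new GenericDatumWriter[GenericRecord](v1).write(rec, encoder)
encoder.flush()

// New consumer: read with V2 as the reader schema; "tag" gets its default
val decoder = DecoderFactory.get().binaryDecoder(out.toByteArray, null)
val evolved = new GenericDatumReader[GenericRecord](v1, v2).read(null, decoder)
assert(evolved.get("tag").toString == "none")
```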
TODO