Sample compute node implementation for processing Kafka messages that were serialized using Apache Avro schemas from a schema registry.
It converts JSON-encoded or binary-encoded Avro data into a JSON object that can be processed using standard built-in App Connect transformation nodes. It can be used with messages that carry schema IDs in message headers, or with schema IDs embedded in the message payload.
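For messages that carry the schema ID in the payload, one common framing (used by Confluent-compatible serializers) prefixes the Avro body with a magic byte followed by a four-byte big-endian schema ID. A minimal sketch of unpacking that framing, with illustrative class and method names that are not part of the sample:

```java
import java.nio.ByteBuffer;

public class SchemaIdExtractor {
    // One common wire format: byte 0 is a magic byte (0x0), bytes 1-4 are
    // the schema ID in big-endian order, and the Avro-encoded record follows.
    static final byte MAGIC_BYTE = 0x0;

    static int extractSchemaId(byte[] message) {
        if (message.length < 5 || message[0] != MAGIC_BYTE) {
            throw new IllegalArgumentException("Not a schema-registry framed message");
        }
        return ByteBuffer.wrap(message, 1, 4).getInt();
    }

    static byte[] extractBody(byte[] message) {
        byte[] body = new byte[message.length - 5];
        System.arraycopy(message, 5, body, 0, body.length);
        return body;
    }

    public static void main(String[] args) {
        byte[] framed = { 0x0, 0x0, 0x0, 0x0, 0x2A, 0x10, 0x20 };
        System.out.println(SchemaIdExtractor.extractSchemaId(framed)); // 42
    }
}
```

Messages with the schema ID in a header skip this framing; the ID is read from the header instead and the whole payload is the Avro body.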
- `AvroDeserialize.java` - implementation of the Java compute node
- `sample-policy.policyxml` - example of a policy needed to configure the compute node with details of the schema registry to use
| input terminal | format | details |
| --- | --- | --- |
| input | BLOB | serialized message data retrieved by a KafkaConsumer node |
| output terminal | format | details |
| --- | --- | --- |
| out | JSON | JSON object deserialized using an Apache Avro schema |
| alt | BLOB | messages that could not be deserialized* |
\* Deserialization can fail if, for example:
- the schema has been deleted from the schema registry since the message was produced to the Kafka topic
- the schema registry is not currently available
- invalid schema registry credentials were provided in the config policy
The compute node has a run-time dependency on a policy that provides configuration information about the Avro schema registry to use.
A sample policy is provided and must be deployed with any message flow that uses this compute node.
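As a rough illustration of what such a policy looks like, the sketch below shows the general shape of an App Connect user-defined policy. The property names (`endpoint`, `username`, `password`) are hypothetical stand-ins; see `sample-policy.policyxml` for the properties the compute node actually reads.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<policies>
  <!-- Property names below are illustrative, not taken from the sample -->
  <policy policyType="UserDefined" policyName="SchemaRegistry" policyTemplate="UserDefined">
    <endpoint>https://my-schema-registry.example.com</endpoint>
    <username>registry-user</username>
    <password>registry-password</password>
  </policy>
</policies>
```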
The compute node implementation has compile-time and run-time dependencies on the following jars. See the IBM App Connect Enterprise documentation on Adding Java code dependencies for guidance on how to do this.
- `avro-1.11.3.jar`
- `slf4j-api-1.7.25.jar`
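One common way to satisfy these dependencies, per the App Connect documentation, is to place the jars in the integration server's `shared-classes` directory so the Java compute node can load them at run time. The paths below are illustrative, and the `touch` lines merely create empty placeholders standing in for the real jars:

```shell
WORK_DIR=./ace-server                          # hypothetical work directory
mkdir -p "$WORK_DIR/shared-classes"
touch avro-1.11.3.jar slf4j-api-1.7.25.jar     # placeholders for the real jars
cp avro-1.11.3.jar slf4j-api-1.7.25.jar "$WORK_DIR/shared-classes/"
ls "$WORK_DIR/shared-classes"
```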
The compute node implementation is based on schemas from an Apicurio Schema Registry, such as the registry included with IBM Event Streams or run as a stand-alone open source registry.
It can easily be modified to support other schema registries; comments in the Java compute node code identify where these changes are needed.
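The registry-specific part is essentially the URL used to fetch a schema by ID. A minimal sketch, assuming Apicurio Registry's v2 REST path (`/apis/registry/v2/ids/globalIds/{id}`) and a hypothetical `RegistrySchemaFetcher` class; in the sample itself the base URL and credentials come from the policy, and other registries would use a different path:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegistrySchemaFetcher {
    // Hypothetical field; the sample reads this from its config policy.
    private final String baseUrl;

    RegistrySchemaFetcher(String baseUrl) {
        this.baseUrl = baseUrl;
    }

    // Apicurio Registry v2 serves schemas by global ID at this path.
    // Supporting another registry mainly means changing this URL scheme.
    String schemaUrlFor(long globalId) {
        return baseUrl + "/apis/registry/v2/ids/globalIds/" + globalId;
    }

    // Fetches the Avro schema definition (a JSON string) over HTTP.
    String fetchSchema(long globalId) throws Exception {
        HttpRequest request = HttpRequest.newBuilder(URI.create(schemaUrlFor(globalId)))
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();
    }
}
```

The returned schema string would then be parsed with Avro's `Schema.Parser` and used to drive a `GenericDatumReader` over the message body.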