Error: truncated buffer / TypeError: this.buf.utf8Slice is not a function #59

When I try to read an event received from Kafka with the following schema, I first get the Error: truncated buffer exception, and when I adjust the JSON schema I get a TypeError: this.buf.utf8Slice is not a function exception instead! I think something is wrong with the JSON schema. I've provided both the Avro and JSON schemas for easier debugging.

My Avro schema when producing events to Kafka:

My JSON schema when trying to read the produced events:

Comments
Your schema looks fine. What environment are you running (i.e. Node/browser version)? Could you attach the code you are using to read the message?
The environment is Node, and I was trying to read messages using the following code:
Thanks, I see. The problem is that your consumer hands you each message's value as a string rather than a buffer, and the message also starts with a 5-byte prefix (a magic byte followed by the 4-byte schema ID). The best way to handle this would probably be to implement some kind of schema resolution logic (retrieving the schema that corresponds to the ID in the prefix), but a quick workaround is to read the value back into a buffer and skip the prefix:

```js
// ...
consumer.on('message', function (message) {
  var buf = new Buffer(message.value, 'binary'); // Read string into a buffer.
  var decodedMessage = type.fromBuffer(buf.slice(5)); // Skip prefix.
  console.log(decodedMessage);
});
```

You might also want to take a look at #22 for more information on Kafka messages.
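To make the schema resolution idea concrete, here is a minimal sketch assuming the Confluent wire format (byte 0 is a magic byte, bytes 1–4 a big-endian schema ID) and a hypothetical registry endpoint; the `fetchType` helper, the registry URL, and the cache are illustrative, not part of avsc or kafka-node:

```js
var avro = require('avsc');
var http = require('http');

var typesById = {}; // Cache parsed types so each schema is only fetched once.

// Hypothetical registry lookup: GET /schemas/ids/<id> returns {"schema": "..."}.
function fetchType(id, cb) {
  if (typesById[id]) {
    return cb(null, typesById[id]);
  }
  http.get('http://registry:8081/schemas/ids/' + id, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
      var type = avro.parse(JSON.parse(body).schema);
      typesById[id] = type;
      cb(null, type);
    });
  }).on('error', cb);
}

consumer.on('message', function (message) {
  var buf = new Buffer(message.value, 'binary');
  var schemaId = buf.readInt32BE(1); // Bytes 1-4 hold the writer schema's ID.
  fetchType(schemaId, function (err, type) {
    if (err) { return console.error(err); }
    console.log(type.fromBuffer(buf.slice(5))); // Payload starts after the 5-byte prefix.
  });
});
```

This way each message is decoded with the exact schema it was written with, instead of hard-coding a single reader type.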
Oh! Thanks.
I also had this problem with kafka-node. I solved it by setting the encoding on the consumer to 'buffer'. For example: `let consumer = new kafka.Consumer(client, [], {encoding: 'buffer'});`
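For completeness, here is a minimal sketch of a consumer configured this way, assuming the older kafka-node Client API, a placeholder ZooKeeper address and topic, and a `type` parsed from your schema as in the snippet above; with encoding 'buffer' the value arrives as a Buffer, so no string round trip is needed (keep the slice(5) only if your producer prepends the 5-byte prefix):

```js
var kafka = require('kafka-node');

var client = new kafka.Client('localhost:2181'); // Placeholder ZooKeeper address.
var consumer = new kafka.Consumer(
  client,
  [{topic: 'my-topic'}],  // Placeholder topic.
  {encoding: 'buffer'}    // Deliver message values as Buffers, not UTF-8 strings.
);

consumer.on('message', function (message) {
  // message.value is already a Buffer here.
  console.log(type.fromBuffer(message.value.slice(5))); // Skip the 5-byte prefix.
});
```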
Just for future reference: I ran into this error when I didn't use a checksum whilst encoding/decoding with a custom compression codec. Once I added the checksum, this exact error disappeared.