Consume failed in two cases: when using the public/default username AND when using the "correct" username (the tenant/namespace of the topic) #1564
Description
The bug description
While testing KoP as a Kafka protocol handler for Pulsar, I succeeded in producing messages to a partitioned topic but failed to consume them.
case 1
According to the KoP docs, the username needs to be the tenant/namespace of the topic.
But when I used this username, the consumer kept getting restarted, and eventually the session was closed.
case 2
I tried using public/default as the username; the consumer joined the group successfully but did not consume any messages.
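For clarity, the only difference between the two cases is the SASL username handed to kafkajs. A minimal sketch (the broker addresses and token are placeholders; the full script is attached below):

// Case 1: tenant/namespace of the topic, as documented by KoP.
// Case 2: the default tenant/namespace.
const { Kafka } = require('kafkajs');

const kafkaFor = username => new Kafka({
  brokers: ['broker1:9092', 'broker2:9092', 'broker3:9092'],
  sasl: {
    mechanism: 'PLAIN',
    username,                  // 'ohad-test/kop' (case 1) or 'public/default' (case 2)
    password: 'token:<JWT>',   // placeholder; KoP takes the JWT prefixed with "token:"
  },
});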
Bug conclusions
I assume there is a compatibility problem with Kafka's group coordinator and group.id handling when running over Pulsar.
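To narrow down where group coordination breaks, it may help to hook the consumer's instrumentation events; the event names below are part of the kafkajs API, and the sketch assumes the KOP_CONN and kopUser constants from the attached code:

// Sketch: log consumer lifecycle events to see why the consumer keeps restarting.
const consumer = KOP_CONN.consumer({ groupId: kopUser });
const { CONNECT, GROUP_JOIN, CRASH, DISCONNECT } = consumer.events;

consumer.on(CONNECT, () => console.log('consumer connected'));
consumer.on(GROUP_JOIN, e => console.log('joined group', e.payload.groupId, 'as', e.payload.memberId));
consumer.on(CRASH, e => console.error('consumer crashed:', e.payload.error));
consumer.on(DISCONNECT, () => console.log('consumer disconnected'));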
To Reproduce these cases
- Building the cluster: 3 brokers, 3 bookies, 3 local ZooKeepers, and 3 global ZooKeepers; each instance runs on a different machine.
- Configuring KoP:
  messagingProtocols=kafka
  protocolHandlerDirectory=./protocols
  kafkaListeners=SASL_PLAINTEXT://<broker_hostname>:9092
  saslAllowedMechanisms=PLAIN
  allowAutoTopicCreationType=partitioned
- Creating the tenant, namespace, partitioned topic, subscription, and JWT, and granting the produce/consume roles.
- Writing the attached code using kafkajs (a connectivity-check sketch follows this list).
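Before running the producer and consumer, it may be worth confirming that the partitioned topic is visible over the Kafka protocol at all. A minimal sketch using the kafkajs admin client, assuming the same KOP_CONN connection that the attached code creates:

// Sketch: verify topic metadata is reachable through KoP before producing/consuming.
const admin = KOP_CONN.admin();

const checkTopic = async () => {
  await admin.connect();
  console.log('topics visible over KoP:', await admin.listTopics());
  const metadata = await admin.fetchTopicMetadata({
    topics: ['persistent://ohad-test/kop/kop_test_partition'],
  });
  console.log(JSON.stringify(metadata, null, 2)); // expect one entry listing the topic's partitions
  await admin.disconnect();
};

checkTopic().catch(console.error);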
Expected behavior
Giving the tenant/namespace of the topic as the consumer username, the consumer successfully connects to the broker, joins the group, and starts consuming the messages in the topic.
Broker logs in case 1
ERROR io.streamnative.pulsar.handlers.kop.KafkaRequestHandler - Caught error in the handler, closing channel
Logs / Screenshots
kafkajs logs in case 1 (screenshot): the same messages keep repeating.
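To capture the restart loop as text rather than a screenshot, the kafkajs client can be created with a higher log level; logLevel is part of the kafkajs API, and the rest mirrors the attached code with a placeholder token:

// Sketch: raise the kafkajs log level so the rebalance/restart loop shows up in plain logs.
const { Kafka, logLevel } = require('kafkajs');

const kafka = new Kafka({
  logLevel: logLevel.DEBUG,   // default is INFO; DEBUG is much more verbose
  brokers: ['broker1:9092', 'broker2:9092', 'broker3:9092'],
  sasl: {
    mechanism: 'PLAIN',
    username: 'ohad-test/kop',
    password: 'token:<JWT>',  // placeholder
  },
});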
The code
const { Kafka } = require('kafkajs');

const MESSAGE_EVERY = 1000;  // produce interval in ms
const MESSAGE_SIZE = 10;     // number of characters in each message
const TOPIC_NAME = 'persistent://ohad-test/kop/kop_test_partition';

// const kopUser = 'public/default';  // case 2
const kopUser = 'ohad-test/kop';      // case 1: tenant/namespace of the topic
const kopPassword = 'token:eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJKb2UifQ.ipevRNuRP6HflG8cFKnmUPtypruRC4fb1DWtoLL62SY';

const KOP_CONN = new Kafka({
  brokers: ['broker1:9092', 'broker2:9092', 'broker3:9092'],
  sasl: {
    mechanism: 'PLAIN',
    username: kopUser,
    password: kopPassword
  }
});

// Produce a fixed-size message every MESSAGE_EVERY ms.
const produce = async (producer, topicName, i) => {
  await producer.connect();
  async function prodos() {
    await producer.send({
      topic: topicName,
      messages: [
        { key: null, value: 'a'.repeat(MESSAGE_SIZE) }
      ]
    });
    console.log(`Produce ${topicName} - Hello ${i}`);
    i += 1;
  }
  setInterval(prodos, MESSAGE_EVERY);
};

// Subscribe from the beginning of the topic and log every message received.
const consume = async (consumer, topicName) => {
  await consumer.connect();
  await consumer.subscribe({ topic: topicName, fromBeginning: true });
  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log({
        topic,
        value: message.value.toString(),
      });
    },
  });
};

const run = async () => {
  const KOP_PRODUCER = KOP_CONN.producer();
  await produce(KOP_PRODUCER, TOPIC_NAME, 0);

  const KOP_CONSUMER = KOP_CONN.consumer({ groupId: kopUser });
  await consume(KOP_CONSUMER, TOPIC_NAME);
};

run();
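After the consumer has been left running for a while, the group state and topic offsets can also be inspected over the same connection; describeGroups and fetchTopicOffsets are kafkajs admin calls, and the constants are the ones defined in the code above:

// Sketch: inspect the consumer group and the topic's offsets through KoP.
const inspect = async () => {
  const admin = KOP_CONN.admin();
  await admin.connect();

  const groups = await admin.describeGroups([kopUser]);   // groupId === kopUser above
  console.log(JSON.stringify(groups, null, 2));

  const offsets = await admin.fetchTopicOffsets(TOPIC_NAME);
  console.log(offsets);                                    // per-partition high/low watermarks

  await admin.disconnect();
};

inspect().catch(console.error);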