
help request: Pubsub with Kafka ready for production? #10947

Open
arnauddeman opened this issue Feb 19, 2024 · 5 comments

arnauddeman commented Feb 19, 2024

Description

Hello, I would like to know the status of the PubSub-with-Kafka implementation in APISIX. Is it an experimental feature, or is it ready for production? The underlying Kafka client, lua-resty-kafka, is still flagged as experimental.

My second question is about the implementation of the pub/sub pattern in APISIX: new messages are not pushed to the client; only the messages published before the PubSubQuery are returned. Is this the expected behaviour?

Environment

APISIX docker version, image apache/apisix:3.6.0-debian

arnauddeman changed the title from "Pubsub with Kafka ready for production?" to "help request: Pubsub with Kafka ready for production?" on Feb 20, 2024
shreemaan-abhishek (Contributor) commented:

Push will not occur (due to the complexity of Nginx workers and long-lived connections). The existing mode is request/response, where the client tells the gateway from which offset position to start sending subsequent messages.

The input parameters of a fetch request are topic/partition/offset; the gateway sends messages starting from that offset (if there are too many messages and they exceed the maximum byte limit, only as many messages as fit within the response body size are returned).

After the client receives the response, it can iterate over it to retrieve all messages; the last message carries its timestamp and offset, which the client can use to retrieve newer messages in the next fetch request.
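
To make the loop concrete, here is a minimal sketch of the same fetch-by-offset pattern, written against the plain Kafka Java consumer as an analogy rather than APISIX's actual WebSocket pub/sub protocol; the broker address, topic name, and partition are assumptions for illustration.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class FetchByOffsetLoop {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // The client, not the gateway, decides where to read from: topic/partition/offset.
        TopicPartition tp = new TopicPartition("notifications", 0); // assumed topic/partition
        long offset = 0L;                                           // starting offset chosen by the client

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.assign(Collections.singletonList(tp));

            while (true) {
                // One "fetch": position at the requested offset, then pull whatever fits in a batch.
                consumer.seek(tp, offset);
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));

                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d timestamp=%d value=%s%n",
                            record.offset(), record.timestamp(), record.value());
                    // The last record seen tells us where the next fetch should start.
                    offset = record.offset() + 1;
                }
                // If nothing was returned, the next iteration re-fetches from the same offset.
            }
        }
    }
}
```

The point is the control flow: the client drives the loop and carries the offset forward itself, instead of the gateway pushing new messages down the connection.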

arnauddeman (Author) commented Feb 27, 2024

Thanks for your answer.
I am in a reactive-programming context, so I think that will not fit my needs. I also need to be able to filter the Kafka messages.
Another experiment around this topic seems to be more successful:

    Apisix <--- WebSocket + OIDC plugin ---> API (Spring Kafka) <---> Kafka
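
For reference, a minimal sketch of the intermediate API in that diagram, assuming Spring Boot with spring-kafka and spring-websocket: a Kafka listener fans incoming records out to the connected WebSocket sessions. The topic name and the /ws/notifications path are assumptions, the consumer configuration is left to application properties, and per-user filtering is omitted; APISIX would sit in front of this endpoint with the OIDC plugin.

```java
import java.io.IOException;
import java.util.Set;
import java.util.concurrent.CopyOnWriteArraySet;

import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;
import org.springframework.web.socket.CloseStatus;
import org.springframework.web.socket.TextMessage;
import org.springframework.web.socket.WebSocketSession;
import org.springframework.web.socket.config.annotation.EnableWebSocket;
import org.springframework.web.socket.config.annotation.WebSocketConfigurer;
import org.springframework.web.socket.config.annotation.WebSocketHandlerRegistry;
import org.springframework.web.socket.handler.TextWebSocketHandler;

// Tracks connected clients and pushes each Kafka record to them as a WebSocket text message.
@Component
class NotificationSocketHandler extends TextWebSocketHandler {

    private final Set<WebSocketSession> sessions = new CopyOnWriteArraySet<>();

    @Override
    public void afterConnectionEstablished(WebSocketSession session) {
        sessions.add(session);
    }

    @Override
    public void afterConnectionClosed(WebSocketSession session, CloseStatus status) {
        sessions.remove(session);
    }

    // Consumes the (assumed) "notifications" topic; broker settings come from application properties.
    @KafkaListener(topics = "notifications")
    public void onKafkaMessage(String payload) throws IOException {
        for (WebSocketSession session : sessions) {
            if (session.isOpen()) {
                session.sendMessage(new TextMessage(payload));
            }
        }
    }
}

// Exposes the handler at /ws/notifications; the APISIX route (WebSocket + OIDC plugin) proxies to it.
@Configuration
@EnableWebSocket
class WebSocketConfig implements WebSocketConfigurer {

    private final NotificationSocketHandler handler;

    WebSocketConfig(NotificationSocketHandler handler) {
        this.handler = handler;
    }

    @Override
    public void registerWebSocketHandlers(WebSocketHandlerRegistry registry) {
        registry.addHandler(handler, "/ws/notifications");
    }
}
```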

juzhiyuan (Member) commented Mar 21, 2024

Hi @arnauddeman, I have checked with a few of the maintainers.

Pub/Sub is still experimental, requiring the client to use WebSocket rather than HTTP or Kafka Protocol.

I have also checked the issues you raised at https://github.com/apache/apisix/issues/created_by/arnauddeman, but I would like to understand your requirements and scenarios. Can you share the details? (What kind of problem do you want to resolve? e.g., proxying Kafka behind APISIX, or configuring OIDC with Kafka?)

This will help us better understand your needs and allow us to provide a suggestion.

arnauddeman (Author) commented:

Hi @juzhiyuan,
Thanks for your answer.
The objective is to implement a notification system, and the first idea was to proxy Kafka behind Apisix with the OIDC plugin. This doesn't seem to be the right approach, because we need the messages to be pushed and filtered.

As you suggested, I have tested the use of a WebSocket route connected to a locally developed API that uses a Kafka client. The associated APISIX route uses the OIDC plugin, and the local API can retrieve the token to determine the user and filter the Kafka messages.
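
To illustrate the filtering part, a hedged sketch of how the local API could pick up the caller's identity during the WebSocket handshake: depending on its configuration, the openid-connect plugin can forward identity information to the upstream as request headers (the X-Userinfo header name below is an assumption to check against your plugin settings). A Spring HandshakeInterceptor copies it into the session attributes so the Kafka forwarder can decide which messages a given session should receive.

```java
import java.util.Map;

import org.springframework.http.server.ServerHttpRequest;
import org.springframework.http.server.ServerHttpResponse;
import org.springframework.web.socket.WebSocketHandler;
import org.springframework.web.socket.server.HandshakeInterceptor;

// Captures the identity forwarded by the gateway during the WebSocket handshake.
public class UserHandshakeInterceptor implements HandshakeInterceptor {

    // Header name is an assumption; verify what your openid-connect plugin configuration forwards.
    private static final String USERINFO_HEADER = "X-Userinfo";

    @Override
    public boolean beforeHandshake(ServerHttpRequest request, ServerHttpResponse response,
                                   WebSocketHandler wsHandler, Map<String, Object> attributes) {
        String userinfo = request.getHeaders().getFirst(USERINFO_HEADER);
        if (userinfo == null) {
            return false; // reject handshakes that did not come through the gateway
        }
        // Make the identity available on the WebSocketSession (session.getAttributes())
        // so the Kafka forwarder can filter messages per user.
        attributes.put("userinfo", userinfo);
        return true;
    }

    @Override
    public void afterHandshake(ServerHttpRequest request, ServerHttpResponse response,
                               WebSocketHandler wsHandler, Exception exception) {
        // nothing to do after the handshake
    }
}
```

The interceptor would be attached where the handler is registered, e.g. registry.addHandler(handler, "/ws/notifications").addInterceptors(new UserHandshakeInterceptor()).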

juzhiyuan (Member) commented:

Hi @arnauddeman,

I have checked with @bzp2010 and @moonming; we need some time to discuss this. Will keep you updated.
