Support precreated Kafka topics on locked-down brokers #8378
Comments
Note that there are two places in code where topic creation happens. As part of this task, we should refactor things a bit so this logic can live in one place. One option might be having the sink_connector use our kafka-utils.
We should probably have the same semantics for this as we end up implementing for the SQS part of S3 sources with SQS notifications. That is: similar syntax for using existing vs. creating new topics, similar defaults, and similar error and cleanup behavior.
Simplify the places we create Kafka topics to just one. I think this might also fix a latent bug: the way we created sink topics (and their corresponding consistency topics) may in some cases create the topic with the wrong number of partitions (see the comment in `::kafka_util::admin::create_topic_helper` explaining automatic topic creation).

Motivation: seemed like a decent first step for #8378.
Heroku Kafka also works this way, so even when topics are pre-created by the user, attempting to create sinks to Heroku Kafka fails at step (1) with the same authorization error. Not too sure about the popularity of Kafka on Heroku, but just flagging it here.
Note: it’s possible that the metadata check for precreated topics should differ from the metadata query when mz creates them (for example, does the sink actually have a dependency on knowing the partition count for data topics, or can it just use whatever topic is present, however it’s configured?).
Retiring old issue.
Some customers would like to run materialized in a configuration where topics are precreated in Kafka, the broker disallows materialized from creating new topics, and Materialize's sinks are created with `reuse_topic`. This is currently unsupported.

Our logic today is:
1. Attempt to create the topic.
2. On a `TopicAlreadyExists` error, consider creation successful.

In a scenario where the Kafka broker has disallowed topic creation, step 1 will result in an authorization-denied error, even if the topic exists.
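The failure mode above can be sketched against a mocked broker. This is an illustrative Rust sketch, not Materialize's actual code; `MockBroker`, `CreateError`, and `ensure_topic_today` are hypothetical names invented for the example.

```rust
// Models today's create-then-tolerate-AlreadyExists logic.
#[derive(Debug, PartialEq)]
enum CreateError {
    TopicAlreadyExists,
    AuthorizationFailed,
}

struct MockBroker {
    existing_topics: Vec<String>,
    creation_allowed: bool,
}

impl MockBroker {
    fn create_topic(&mut self, name: &str) -> Result<(), CreateError> {
        if !self.creation_allowed {
            // A locked-down broker denies the request before ever
            // reporting that the topic already exists.
            return Err(CreateError::AuthorizationFailed);
        }
        if self.existing_topics.iter().any(|t| t.as_str() == name) {
            return Err(CreateError::TopicAlreadyExists);
        }
        self.existing_topics.push(name.to_string());
        Ok(())
    }
}

/// Today's logic: attempt creation, treat TopicAlreadyExists as success.
fn ensure_topic_today(broker: &mut MockBroker, name: &str) -> Result<(), CreateError> {
    match broker.create_topic(name) {
        Ok(()) | Err(CreateError::TopicAlreadyExists) => Ok(()),
        Err(e) => Err(e),
    }
}

fn main() {
    let mut locked = MockBroker {
        existing_topics: vec!["precreated".to_string()],
        creation_allowed: false,
    };
    // Fails even though the topic exists, because the create call itself
    // is denied before the broker can say "already exists".
    assert_eq!(
        ensure_topic_today(&mut locked, "precreated"),
        Err(CreateError::AuthorizationFailed)
    );
}
```

On a broker that permits creation, the same function succeeds whether or not the topic exists, which is why the current logic is fine everywhere except locked-down configurations.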
Example:
Logic will need to be adjusted to something that both handles the topic-creation race condition and supports locked-down brokers, such as:
1. Query the broker's metadata to see whether the topic already exists.
2. If it does not exist, attempt to create it.
3. On a `TopicAlreadyExists` error, consider creation successful.

It will be important to test this in a plausible user configuration to make sure that whatever we put in as that initial metadata query isn't also going to be ACL denied.
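Assuming metadata queries remain permitted on a locked-down broker (the open question flagged just above), the adjusted flow might look like this sketch. `MockBroker` and `ensure_topic_adjusted` are hypothetical names for illustration, not Materialize's API.

```rust
// Adjusted logic: metadata check first, creation only when absent,
// TopicAlreadyExists still tolerated to handle the creation race.
#[derive(Debug, PartialEq)]
enum TopicError {
    TopicAlreadyExists,
    AuthorizationFailed,
}

struct MockBroker {
    existing_topics: Vec<String>,
    creation_allowed: bool,
}

impl MockBroker {
    /// Assumed to be permitted even on a locked-down broker; the issue
    /// notes this assumption must be verified against real ACL setups.
    fn topic_exists(&self, name: &str) -> bool {
        self.existing_topics.iter().any(|t| t.as_str() == name)
    }

    fn create_topic(&mut self, name: &str) -> Result<(), TopicError> {
        if !self.creation_allowed {
            return Err(TopicError::AuthorizationFailed);
        }
        if self.topic_exists(name) {
            return Err(TopicError::TopicAlreadyExists);
        }
        self.existing_topics.push(name.to_string());
        Ok(())
    }
}

fn ensure_topic_adjusted(broker: &mut MockBroker, name: &str) -> Result<(), TopicError> {
    // Step 1: check metadata before attempting creation.
    if broker.topic_exists(name) {
        return Ok(());
    }
    // Step 2: topic is absent, so try to create it.
    match broker.create_topic(name) {
        // Step 3: another client may have created the topic between the
        // metadata check and the create call; treat that race as success.
        Ok(()) | Err(TopicError::TopicAlreadyExists) => Ok(()),
        Err(e) => Err(e),
    }
}

fn main() {
    // A precreated topic on a locked-down broker now succeeds, because
    // the create call is never issued for an existing topic.
    let mut locked = MockBroker {
        existing_topics: vec!["precreated".to_string()],
        creation_allowed: false,
    };
    assert_eq!(ensure_topic_adjusted(&mut locked, "precreated"), Ok(()));
}
```

A genuinely missing topic on a locked-down broker still fails with the authorization error, which is the correct outcome: the sink cannot proceed without the topic, and the error now accurately reflects why.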