
Enable Broker Authentication using SASL mechanism #43

Merged: 14 commits into main on Jul 11, 2023

Conversation

@rpelisse (Collaborator) commented Jul 5, 2023

Supersedes #35

@rpelisse (Collaborator, Author) commented Jul 6, 2023

@rmarting I've looked into this. The issue is that Connect can't connect to zk and the brokers:

[2023-07-06 07:45:32,750] ERROR Stopping due to error (org.apache.kafka.connect.cli.ConnectStandalone:126)
org.apache.kafka.connect.errors.ConnectException: Failed to connect to and describe Kafka cluster. Check worker's broker connection and security properties.
        at org.apache.kafka.connect.util.ConnectUtils.lookupKafkaClusterId(ConnectUtils.java:77)
        at org.apache.kafka.connect.util.ConnectUtils.lookupKafkaClusterId(ConnectUtils.java:58)
        at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:82)
Caused by: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.IllegalSaslStateException: Unexpected handshake request with client mechanism SCRAM-SHA-512, enabled mechanisms are []
        at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
        at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1999)
        at org.apache.kafka.common.internals.KafkaFutureImpl.get(KafkaFutureImpl.java:165)
        at org.apache.kafka.connect.util.ConnectUtils.lookupKafkaClusterId(ConnectUtils.java:71)
        ... 2 more
Caused by: org.apache.kafka.common.errors.IllegalSaslStateException: Unexpected handshake request with client mechanism SCRAM-SHA-512, enabled mechanisms are []

Do you have an idea on what's going wrong here?

@rmarting (Collaborator) commented Jul 6, 2023

I think it is related to the authentication mechanism used by Kafka Connect against the Kafka broker. If I am not wrong, the issue is that Kafka Connect is using sasl.mechanism=SCRAM-SHA-512 to authenticate, while the Kafka broker is not configured with that mechanism.

Please review how you are configuring the Kafka broker; Kafka Connect must then use the same configuration.

If the property amq_streams_connect_broker_auth_scram_enabled is enabled, Kafka Connect will use SCRAM-SHA-512 to authenticate against the broker, otherwise PLAIN will be used.
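
For reference, the "enabled mechanisms are []" part of the stack trace means the broker listener is not advertising any SASL mechanism at all. A broker that accepts this SCRAM-SHA-512 handshake would need something along these lines in its server.properties (a minimal sketch assuming a single SASL_PLAINTEXT listener, not the configuration generated by the collection):

# Listener that speaks SASL over plaintext
listeners=SASL_PLAINTEXT://:9092
advertised.listeners=SASL_PLAINTEXT://localhost:9092
# SASL mechanisms the broker accepts from clients such as Kafka Connect
sasl.enabled.mechanisms=SCRAM-SHA-512
# Inter-broker traffic can reuse the same listener and mechanism
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-512

The SCRAM credentials themselves (here the broker user referenced in connect-standalone.properties) are provisioned separately, for example with kafka-configs.sh, and the client JAAS settings have to match them.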

Can you share your playbook to deploy the Kafka broker (with authentication) and the Kafka Connect cluster?

@rpelisse (Collaborator, Author) commented Jul 6, 2023

@rmarting I'm testing your PR! So it's your setup :)

It's running playbooks/playbook.yml and using the defaults from the two other roles (zk and brokers) to set up the rest of the cluster.

@rmarting (Collaborator) commented Jul 6, 2023

Ok! Let me review my original PR with the default playbook, as it seems Kafka Connect is failing authentication... so maybe there is something wrong! Thanks for the heads up!

@rmarting (Collaborator) commented Jul 6, 2023

Last question: could you share here the content of the connect-standalone.properties of the deployed Kafka Connect cluster? Thanks.

@rpelisse (Collaborator, Author) commented Jul 6, 2023

It's the one generated from the template in the collection: roles/amq_streams_connect/templates/connect-standalone.properties.j2

I'm running the PR locally, so I can post the resulting file here.

@rpelisse (Collaborator, Author) commented Jul 6, 2023

And here is the resulting conf:

# Ansible managed

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# These are defaults. This file just demonstrates how to override some settings.
bootstrap.servers=localhost:9092

security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
                 username=broker \
                 password=PLEASE_CHANGEME_IAMNOTGOOD_FOR_PRODUCTION;

# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=true
value.converter.schemas.enable=true

offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000

# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
# (connectors, converters, transformations). The list should consist of top level directories that include 
# any combination of: 
# a) directories immediately containing jars with plugins and their dependencies
# b) uber-jars with plugins and their dependencies
# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
# Note: symlinks will be followed to discover dependencies or plugins.
# Examples: 
# plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,
plugin.path=/opt/kafka_2.13-3.3.2/libs/connect-file-3.3.2.jar

@rmarting (Collaborator) commented Jul 6, 2023

By default the variable amq_streams_connect_broker_auth_enabled is false, so the following block should not be included in that file:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
                 username=broker \
                 password=PLEASE_CHANGEME_IAMNOTGOOD_FOR_PRODUCTION;

So, I think the issue is that the if clause is not declared correctly to skip that section. I added the following commit to my original PR, for your review and testing with that branch.

e946b68

The file must not include anything related to authentication, as the default values are false.
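
In other words, the whole SASL block in connect-standalone.properties.j2 should be wrapped in a guard on amq_streams_connect_broker_auth_enabled, roughly like the following sketch (illustrative only, built from the variable names mentioned in this thread rather than the template's actual content):

{% if amq_streams_connect_broker_auth_enabled | bool %}
security.protocol=SASL_PLAINTEXT
{% if amq_streams_connect_broker_auth_scram_enabled | bool %}
sasl.mechanism=SCRAM-SHA-512
{% else %}
sasl.mechanism=PLAIN
{% endif %}
# JAAS credentials (ScramLoginModule or PlainLoginModule) go here and must
# match the users provisioned on the broker side.
sasl.jaas.config=...
{% endif %}

With the default amq_streams_connect_broker_auth_enabled=false, none of these lines are rendered and the client falls back to the default PLAINTEXT connection, which is what the default playbook expects.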

@rpelisse merged commit 9d6cb90 into main on Jul 11, 2023 (1 check passed).
@rpelisse added the major_changes label (major changes mean the user can choose to make a change when they update, but does not have to) on Jul 11, 2023.
@rmarting (Collaborator) commented

Thanks @rpelisse for your review and extended contributions to close this amazing PR. It is a huge step for the collection towards covering more complex scenarios.

@rpelisse deleted the rmarting_feat-bk-auth branch on July 24, 2023 at 11:48.