
Quarkus 1.7.0 + Camel-Quarkus Kafka: Failed to configure SaslClientAuthenticator #11411

Closed
bframke opened this issue Aug 17, 2020 · 24 comments
Labels: area/camel, kind/bug (Something isn't working)


bframke commented Aug 17, 2020

Describe the bug
After updating from Quarkus 1.5.2 to Quarkus 1.6.0+ (currently tested with 1.7.0, same behavior), we now receive an error message when we try to secure the Kafka connection in the app while also using the camel-quarkus Kafka component. We are not using Kafka Streams. We are also not using Kerberos, only plain configuration values.

Error

Caused by: org.apache.kafka.common.errors.SaslAuthenticationException: Failed to configure SaslClientAuthenticator
Caused by: org.apache.kafka.common.KafkaException: Principal could not be determined from Subject, this may be a transient failure due to Kerberos re-login
        at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.firstPrincipal(SaslClientAuthenticator.java:579)
        at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.<init>(SaslClientAuthenticator.java:171)
        at org.apache.kafka.common.network.SaslChannelBuilder.buildClientAuthenticator(SaslChannelBuilder.java:274)
        at org.apache.kafka.common.network.SaslChannelBuilder.lambda$buildChannel$1(SaslChannelBuilder.java:216)
        at org.apache.kafka.common.network.KafkaChannel.<init>(KafkaChannel.java:143)
        at org.apache.kafka.common.network.SaslChannelBuilder.buildChannel(SaslChannelBuilder.java:224)
        at org.apache.kafka.common.network.Selector.buildAndAttachKafkaChannel(Selector.java:338)
        at org.apache.kafka.common.network.Selector.registerChannel(Selector.java:329)
        at org.apache.kafka.common.network.Selector.connect(Selector.java:256)
        at org.apache.kafka.clients.NetworkClient.initiateConnect(NetworkClient.java:957)
        at org.apache.kafka.clients.NetworkClient.access$600(NetworkClient.java:73)
        at org.apache.kafka.clients.NetworkClient$DefaultMetadataUpdater.maybeUpdate(NetworkClient.java:1128)
        at org.apache.kafka.clients.NetworkClient$DefaultMetadataUpdater.maybeUpdate(NetworkClient.java:1016)
        at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:547)
        at org.apache.kafka.clients.producer.internals.Sender.runOnce(Sender.java:324)
        at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:239)
        at java.base/java.lang.Thread.run(Thread.java:834)

Application.properties

camel.component.kafka.worker-pool-core-size=20
camel.component.kafka.worker-pool-max-size=200
camel.component.kafka.enable-idempotence=true
camel.component.kafka.max-in-flight-request=1
camel.component.kafka.retries=1
camel.component.kafka.request-required-acks=all
camel.component.kafka.max-request-size=52428800
camel.component.kafka.reconnect-backoff-max-ms=1000

camel.component.kafka.sasl-jaas-config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="admin" password="admin123" \
  user_admin="admin123";
camel.component.kafka.security-protocol=SASL_PLAINTEXT
camel.component.kafka.sasl-mechanism=SCRAM-SHA-512
kafka.security-protocol=SASL_PLAINTEXT

Jaas-Config for Command

KafkaClient {
  org.apache.kafka.common.security.scram.ScramLoginModule  required
  serviceName="kafka"
  principal="admin"
  username="admin"
  password="admin123"
  user_admin="admin123";
};

Jaas-Config for Kafka

KafkaServer {
  org.apache.kafka.common.security.scram.ScramLoginModule required
  username="admin"
  password="admin123"
  user_admin="admin123";
};
Client {};

Run Gradle Task

gradle -Djava.security.auth.login.config=$HOME/tgit/regressions/kafka/kafka_client_jaas.conf producer:quarkusd

Docker-Compose Kafka

version: '2'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
    environment:
      ZOOKEEPER_TICK_TIME: 2000
  kafka:
    image: wurstmeister/kafka
    hostname: kafka
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ADVERTISED_HOST_NAME: kafka
      KAFKA_ADVERTISED_PORT: 9092
      KAFKA_PORT: 9094
      KAFKA_OPTS: "-Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf"

      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT, OUTSIDE:PLAINTEXT
      KAFKA_LISTENERS: INSIDE://:9094, OUTSIDE://kafka:9092
      KAFKA_ADVERTISED_LISTENERS: INSIDE://:9094, OUTSIDE://kafka:9092
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE

      KAFKA_SASL_ENABLED_MECHANISMS: SCRAM-SHA-256, SCRAM-SHA-512
      KAFKA_SASL_MECHANISM_INTER_BROKER_PROTOCOL: SCRAM-SHA-512

      #KAFKA_SSL_KEYSTORE_LOCATION: /etc/kafka/kafka.keystore.jks
      #KAFKA_SSL_KEYSTORE_PASSWORD: test123
      #KAFKA_SSL_KEY_PASSWORD: test123
      #KAFKA_SSL_TRUSTSTORE_LOCATION: /etc/kafka/kafka.truststore.jks
      #KAFKA_SSL_TRUSTSTORE_PASSWORD: test123
      #KAFKA_SSL_ENABLED_PROTOCOLS: TLSv1.2
      #KAFKA_SSL_KEYSTORE_TYPE: JKS
      #KAFKA_SSL_TRUSTSTORE_TYPE: JKS
      #KAFKA_SSL_CLIENT_AUTH: none

      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_CREATE_TOPICS: "my-topic:1:1"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./kafka_server_jaas.conf:/etc/kafka/kafka_server_jaas.conf
      - ./kafka.keystore.jks:/etc/kafka/kafka.keystore.jks
      - ./kafka.truststore.jks:/etc/kafka/kafka.truststore.jks
    networks:
      - default
  kafdrop:
      image: obsidiandynamics/kafdrop
      depends_on:
        - kafka
      restart: "no"
      ports:
        - "9000:9000"
      environment:
        KAFKA_BROKERCONNECT: "kafka:9092"
        JVM_OPTS: "-Xms16M -Xmx48M -Xss180K -XX:-TieredCompilation -XX:+UseStringDeduplication -noverify"
        #KAFKA_PROPERTIES: ${KAFKA_DROP_PROPERTIES_BASE64}
        #KAFKA_TRUSTSTORE: ${KAFKA_DROP_TRUSTSTORE_BASE64}
        #KAFKA_KEYSTORE: ${KAFKA_DROP_KEYSTORE_BASE64}
  proxy:
    image: defreitas/dns-proxy-server
    hostname: proxy
    ports:
      - "5380:5380"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - /etc/resolv.conf:/etc/resolv.conf

Kafka with Compose

docker-compose down && docker-compose up -d
docker exec -i kafka_kafka_1  kafka-configs.sh --zookeeper kafka_zookeeper_1:2181 --alter --add-config 'SCRAM-SHA-256=[password=admin123],SCRAM-SHA-512=[password=admin123]' --entity-type users --entity-name admin

Expected behavior
The producer can build up the Camel route, which uses the Camel-Quarkus Kafka consumer.

Actual behavior
The producer can't create the route because it tries to obtain a Principal for the SCRAM mechanism but can't get one.

To Reproduce
Steps to reproduce the behavior:

  1. Create a Quarkus project with Camel-Quarkus Kafka and a Wurstmeister Kafka up and running with the security config.
  2. Use the given config params to configure the minimum.
  3. Use the given command with the path to the jaas.conf to provide the missing login.
  4. Try to create a Camel route that works with Kafka (see the sketch below).
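
For reference, step 4 amounts to something like the following minimal route. This is an illustrative sketch only, not the actual reproducer code; the topic name and log endpoint are placeholders, and the SASL settings are expected to come from the camel.component.kafka.* properties listed above.

import org.apache.camel.builder.RouteBuilder;

// Sketch of a route that consumes from the secured broker.
// "my-topic" and the log endpoint are placeholder names.
public class KafkaSaslRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("kafka:my-topic")      // consume records from Kafka
            .to("log:records");     // log each record that arrives
    }
}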

Configuration

# Add your application.properties here, if applicable.
camel.component.kafka.worker-pool-core-size=20
camel.component.kafka.worker-pool-max-size=200
camel.component.kafka.enable-idempotence=true
camel.component.kafka.max-in-flight-request=1
camel.component.kafka.retries=1
camel.component.kafka.request-required-acks=all
camel.component.kafka.max-request-size=52428800
camel.component.kafka.reconnect-backoff-max-ms=1000

camel.component.kafka.sasl-jaas-config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="admin" password="admin123" \
  user_admin="admin123";
camel.component.kafka.security-protocol=SASL_PLAINTEXT
camel.component.kafka.sasl-mechanism=SCRAM-SHA-512
kafka.security-protocol=SASL_PLAINTEXT

Screenshots

Environment (please complete the following information):

  • Output of uname -a or ver:
    Linux bfr-pc 5.7.9-1-MANJARO #1 SMP PREEMPT Thu Jul 16 08:20:05 UTC 2020 x86_64 GNU/Linux

  • Output of java -version:
    openjdk version "11.0.7" 2020-04-14
    OpenJDK Runtime Environment (build 11.0.7+10)
    OpenJDK 64-Bit Server VM (build 11.0.7+10, mixed mode)

  • GraalVM version (if different from Java): 20.1

  • Quarkus version or git rev: 1.7.0

  • Build tool (ie. output of mvnw --version or gradlew --version):


Gradle 6.5.1

Build time: 2020-07-15 13:08:58 UTC
Revision:

Kotlin: 1.3.72
Groovy: 2.5.11
Ant: Apache Ant(TM) version 1.10.7 compiled on September 1 2019
JVM: 11.0.7 (Oracle Corporation 11.0.7+10)
OS: Linux 5.7.9-1-MANJARO amd64

bframke added the kind/bug (Something isn't working) label Aug 17, 2020

bframke commented Aug 25, 2020

Is no one interested in this? Or is it something that's totally out of scope here?

@lburgazzoli

@bframke do you have a reproducer for that ?


bframke commented Sep 4, 2020

Now I've created one, which gets the error directly on startup.
https://github.com/bframke/quarkus-examples/tree/master/camel-kafka-sasl

Tested and build with Quarkus 1.7.1

@lburgazzoli

Can you add some instructions on how to run it?


bframke commented Sep 4, 2020

There is a start.sh in the project, which starts the project with the command I also mentioned in the first part of the ticket.

By default, Kafka with SASL (an example of how to deactivate it is also in the first part of the ticket) can be started with the restart-kafka.sh in the kafka directory.

The application will try to connect to Kafka via SASL; this can be changed in the application.properties by simply commenting out the security part of the application. What needs to be commented out is also in the first part of the ticket (see the sketch below).
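
Concretely, the security part referred to here is the block from the first post; disabling SASL amounts to commenting it out, roughly like this (a sketch based on the properties shown above, with the JAAS value elided):

#camel.component.kafka.sasl-jaas-config=...
#camel.component.kafka.security-protocol=SASL_PLAINTEXT
#camel.component.kafka.sasl-mechanism=SCRAM-SHA-512
#kafka.security-protocol=SASL_PLAINTEXT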


lburgazzoli commented Sep 4, 2020

It does not seem to be able to connect; what I get is:

{"timestamp":"2020.09.04T13:59:34.478+0200", "message":"[Producer clientId=producer-1] Bootstrap broker localhost:9092 (id: -1 rack: null) disconnected", "level": "WARN", "source": "mars:org.apache.kafka.clients.NetworkClient:kafka-producer-network-thread | producer-1","X-Correlation-ID":"", "X-B3-TraceId":"", "X-B3-SpanId":""}

The reason is:

kafdrop_1    | 2020-09-04 12:16:24.359 ERROR 1 [           main] o.s.b.SpringApplication                  : Application run failed
kafdrop_1    | 
kafdrop_1    | org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'aclController' defined in URL [jar:file:/kafdrop-3.27.0/kafdrop-3.27.0.jar!/BOOT-INF/classes!/kafdrop/controller/AclController.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'kafkaMonitorImpl' defined in URL [jar:file:/kafdrop-3.27.0/kafdrop-3.27.0.jar!/BOOT-INF/classes!/kafdrop/service/KafkaMonitorImpl.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'kafkaHighLevelConsumer': Invocation of init method failed; nested exception is org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
kafdrop_1    | 	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:769)
kafdrop_1    | 	at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:218)
kafdrop_1    | 	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1341)
kafdrop_1    | 	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1187)
kafdrop_1    | 	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555)
kafdrop_1    | 	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
kafdrop_1    | 	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320)
kafdrop_1    | 	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
kafdrop_1    | 	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318)
kafdrop_1    | 	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
kafdrop_1    | 	at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:845)
kafdrop_1    | 	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:877)
kafdrop_1    | 	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:549)
kafdrop_1    | 	at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:141)
kafdrop_1    | 	at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:744)
kafdrop_1    | 	at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:391)
kafdrop_1    | 	at org.springframework.boot.SpringApplication.run(SpringApplication.java:312)
kafdrop_1    | 	at org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:140)
kafdrop_1    | 	at kafdrop.Kafdrop.main(Kafdrop.java:53)
kafdrop_1    | 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
kafdrop_1    | 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
kafdrop_1    | 	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
kafdrop_1    | 	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
kafdrop_1    | 	at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
kafdrop_1    | 	at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
kafdrop_1    | 	at org.springframework.boot.loader.Launcher.launch(Launcher.java:51)
kafdrop_1    | 	at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:52)


bframke commented Sep 4, 2020

Yes, you have to start Kafka first, then the application. The default URI to reach Kafka is localhost:9092; if that is not the same on your system, there is a Kafka broker list in the application.properties where you can change it.
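
A sketch of the relevant entry (the exact key used in the reproducer's application.properties may differ):

# hypothetical example; adjust host and port to your environment
camel.component.kafka.brokers=localhost:9092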

@lburgazzoli

The reason kafka does not start is:

kafka_1      | [2020-09-04 14:26:25,569] WARN SASL configuration failed: javax.security.auth.login.LoginException: No JAAS configuration section named 'Client' was found in specified JAAS configuration file: '/etc/kafka/kafka_server_jaas.conf'. Will continue connection to Zookeeper server without SASL authentication, if Zookeeper server allows it. (org.apache.zookeeper.ClientCnxn)
kafka_1      | [2020-09-04 14:26:25,569] INFO Opening socket connection to server zookeeper/172.21.0.2:2181 (org.apache.zookeeper.ClientCnxn)
kafka_1      | [2020-09-04 14:26:25,569] ERROR [ZooKeeperClient Kafka server] Auth failed. (kafka.zookeeper.ZooKeeperClient)
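
For context, the -Djava.security.auth.login.config passed via KAFKA_OPTS applies to the whole broker JVM, so the broker's ZooKeeper client looks for a Client section in that same JAAS file; an empty or missing section only works if ZooKeeper allows unauthenticated clients, as the warning says. If ZooKeeper-side SASL were actually wanted, the section would typically use ZooKeeper's digest login module, roughly like this (illustrative credentials, not taken from the reproducer):

Client {
  org.apache.zookeeper.server.auth.DigestLoginModule required
  username="zkclient"
  password="zkclient123";
};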


bframke commented Sep 4, 2020

Okay, normally you just need docker-compose: use restart-kafka.sh and that's it. We did not modify the image in any way that would require rebuilding it. When docker-compose is executed in the same folder where the files are, it normally gets them into the container because we add them as Docker volumes.

@lburgazzoli

I'm using that script, but it does not seem to work (on Fedora).


bframke commented Sep 4, 2020

Okay, on Arch Linux and Ubuntu it did work, hm. It could be that Fedora does not mount the Docker volumes in the same way as the other two OSes do, or maybe something needs to be changed there; I'm not sure, I've never worked with Fedora. Otherwise we would have to create a new Docker image with the files baked into the image itself.

@lburgazzoli

I did try to use plain Docker, so I have:

docker network inspect quarkus >/dev/null 2>&1 || docker network create quarkus

docker run \
    --rm \
    -ti \
    --name zookeeper \
    --network quarkus \
    --env ZOOKEEPER_TICK_TIME=2000 \
    --publish 2181:2181 \
    wurstmeister/zookeeper

and

#!/usr/bin/env bash

docker network inspect quarkus >/dev/null 2>&1 || docker network create quarkus

docker run \
    --rm \
    -ti \
    --name kafka \
    --network quarkus \
    --env KAFKA_BROKER_ID="1" \
    --env KAFKA_ADVERTISED_HOST_NAME="kafka" \
    --env KAFKA_PORT="9094" \
    --env KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf" \
    --env KAFKA_LISTENER_SECURITY_PROTOCOL_MAP="INSIDE:PLAINTEXT, OUTSIDE:PLAINTEXT" \
    --env KAFKA_LISTENERS="INSIDE://:9094, OUTSIDE://kafka:9092" \
    --env KAFKA_ADVERTISED_LISTENERS="INSIDE://:9094, OUTSIDE://kafka:9092" \
    --env KAFKA_INTER_BROKER_LISTENER_NAME="INSIDE" \
    --env KAFKA_SASL_ENABLED_MECHANISMS="SCRAM-SHA-256, SCRAM-SHA-512" \
    --env KAFKA_SASL_MECHANISM_INTER_BROKER_PROTOCOL="SCRAM-SHA-512" \
    --env KAFKA_SSL_KEYSTORE_LOCATION="/etc/kafka/kafka.keystore.jks" \
    --env KAFKA_SSL_KEYSTORE_PASSWORD="test123" \
    --env KAFKA_SSL_KEY_PASSWORD="test123" \
    --env KAFKA_SSL_TRUSTSTORE_LOCATION="/etc/kafka/kafka.truststore.jks" \
    --env KAFKA_SSL_TRUSTSTORE_PASSWORD="test123" \
    --env KAFKA_SSL_ENABLED_PROTOCOLS="TLSv1.2" \
    --env KAFKA_SSL_KEYSTORE_TYPE="JKS" \
    --env KAFKA_SSL_TRUSTSTORE_TYPE="JKS" \
    --env KAFKA_SSL_CLIENT_AUTH="none" \
    --env KAFKA_ZOOKEEPER_CONNECT="zookeeper:2181" \
    --env KAFKA_CREATE_TOPICS="my-topic:1:1" \
    --volume $PWD/kafka_server_jaas.conf:/etc/kafka/kafka_server_jaas.conf:Z \
    --volume $PWD/kafka.keystore.jks:/etc/kafka/kafka.keystore.jks:Z \
    --volume $PWD/kafka.truststore.jks:/etc/kafka/kafka.truststore.jks:Z \
    --publish 9092:9092 \
    wurstmeister/kafka

When starting kafka I keep getting:

[2020-09-04 16:49:16,586] WARN SASL configuration failed: javax.security.auth.login.LoginException: No JAAS configuration section named 'Client' was found in specified JAAS configuration file: '/etc/kafka/kafka_server_jaas.conf'. Will continue connection to Zookeeper server without SASL authentication, if Zookeeper server allows it. (org.apache.zookeeper.ClientCnxn)
[2020-09-04 16:49:16,587] INFO Opening socket connection to server zookeeper/172.20.0.2:2181 (org.apache.zookeeper.ClientCnxn)
[2020-09-04 16:49:16,589] ERROR [ZooKeeperClient Kafka server] Auth failed. (kafka.zookeeper.ZooKeeperClient)
[2020-09-04 16:49:16,594] INFO Socket error occurred: zookeeper/172.20.0.2:2181: Host is unreachable (org.apache.zookeeper.ClientCnxn)

I did check the content of the files on the container and they are properly mounted.

@lburgazzoli

ok, now it is better but I'm getting:

[2020-09-04 17:04:54,145] WARN [SocketServer brokerId=1] Unexpected error from /192.168.178.122; closing connection (org.apache.kafka.common.network.Selector)
org.apache.kafka.common.network.InvalidReceiveException: Invalid receive (size = 369296128 larger than 104857600)
	at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:105)
	at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:447)
	at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:397)
	at org.apache.kafka.common.network.Selector.attemptRead(Selector.java:678)
	at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:580)
	at org.apache.kafka.common.network.Selector.poll(Selector.java:485)
	at kafka.network.Processor.poll(SocketServer.scala:913)
	at kafka.network.Processor.run(SocketServer.scala:816)
	at java.lang.Thread.run(Thread.java:748)


bframke commented Sep 4, 2020

Yeah, I got this error with the Bitnami Kafka in Kubernetes. I increased socketRequestMaxBytes from 104857600 to 209715200. I guess in your case it should be 419430400 bytes; hopefully that helps here.
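
For the wurstmeister image used here, the underlying broker property is socket.request.max.bytes, which the image derives from a KAFKA_-prefixed environment variable. A sketch of the corresponding docker-compose addition for the kafka service (the value is only an example):

      KAFKA_SOCKET_REQUEST_MAX_BYTES: 419430400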


lburgazzoli commented Sep 9, 2020

I've finally been able to get everything running and it works. I think the only missing part was that, with the latest version of camel-quarkus, you need to explicitly add camel-quarkus-main if you want to use properties like camel.component.kafka.* (see https://camel.apache.org/camel-quarkus/latest/user-guide/bootstrap.html).
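
For a Gradle build like the reproducer's, that roughly means declaring the extension alongside the Kafka one. This is only a sketch and assumes the versions are managed by the Quarkus/Camel Quarkus BOM already applied in the project:

dependencies {
    // hypothetical snippet; versions assumed to come from the applied BOM
    implementation "org.apache.camel.quarkus:camel-quarkus-main"
    implementation "org.apache.camel.quarkus:camel-quarkus-kafka"
}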


bframke commented Sep 9, 2020

Ohh, yeah, there was a change in camel-quarkus; good to know, thanks. I thought everything I put in there was enough, but I guess in my case it was pulled from my local cache.

@lburgazzoli

@bframke is the issue fixed by adding camel-main?


bframke commented Sep 11, 2020

No, this still happens in the main project that I copied from, where camel-main is already included because of OpenTracing.


lburgazzoli commented Sep 11, 2020

So I guess I need some more help to get a working reproducer :)
The best would be to send a PR with a failing integration test to the camel-quarkus project.


bframke commented Sep 11, 2020

Yeah, I'm currently checking the reproducer again; I had some new errors because of some tests I did a few days ago.


bframke commented Sep 11, 2020

Okay, I added the missing camel-main to the reproducer and, just in case, upgraded to 1.7.2 (also tested with 1.7.1), and now it terminates during auth, so we are past the SASL error from the beginning.

So it looks like it works with this; maybe we have a bigger error in our project, or it was already resolved and we didn't know.

@lburgazzoli

Cool, I'll try to create a reproducer on camel-quarkus in any case so we can spot similar issues.

@lburgazzoli

@bframke can we close this issue ?


bframke commented Sep 11, 2020

Yes, we have moved away from the original problem. If it occurs again I will reopen this issue; otherwise there will be a new one :D

bframke closed this as completed Sep 11, 2020
gsmet added this to the 1.7.1.Final milestone Oct 1, 2020