Installing connector plugins
Running in a "--no-prompt" mode
Implicit acceptance of the license below:
Apache 2.0
https://github.com/debezium/debezium/blob/master/LICENSE.txt
Implicit confirmation of the question: You are about to install 'debezium-connector-mysql' from Debezium Community, as published on Confluent Hub.
Downloading component Debezium MySQL CDC Connector 1.1.0, provided by Debezium Community from Confluent Hub and installing into /usr/share/confluent-hub-components
Adding installation directory to plugin path in the following files:
/etc/kafka/connect-distributed.properties
/etc/kafka/connect-standalone.properties
/etc/schema-registry/connect-avro-distributed.properties
/etc/schema-registry/connect-avro-standalone.properties
Completed
Running in a "--no-prompt" mode
Implicit acceptance of the license below:
Apache License 2.0
https://www.apache.org/licenses/LICENSE-2.0
Downloading component Kafka Connect Datagen 0.3.1, provided by Confluent, Inc. from Confluent Hub and installing into /usr/share/confluent-hub-components
Adding installation directory to plugin path in the following files:
/etc/kafka/connect-distributed.properties
/etc/kafka/connect-standalone.properties
/etc/schema-registry/connect-avro-distributed.properties
/etc/schema-registry/connect-avro-standalone.properties
Completed
Launching Kafka Connect worker
Waiting for Kafka Connect to start listening on localhost ⏳
===> ENV Variables ...
ALLOW_UNSIGNED=false
COMPONENT=kafka-connect
CONFLUENT_DEB_VERSION=1
CONFLUENT_PLATFORM_LABEL=
CONFLUENT_VERSION=5.5.0
CONNECT_BOOTSTRAP_SERVERS=kafka:29092
CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR=1
CONNECT_CONFIG_STORAGE_TOPIC=_kafka-connect-01-configs
CONNECT_GROUP_ID=kafka-connect-01
CONNECT_INTERNAL_KEY_CONVERTER=org.apache.kafka.connect.json.JsonConverter
CONNECT_INTERNAL_VALUE_CONVERTER=org.apache.kafka.connect.json.JsonConverter
CONNECT_KEY_CONVERTER=io.confluent.connect.avro.AvroConverter
CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL=http://schema-registry:8081
CONNECT_LOG4J_APPENDER_STDOUT_LAYOUT_CONVERSIONPATTERN=[%d] %p %X{connector.context}%m (%c:%L)%n
CONNECT_LOG4J_LOGGERS=org.apache.kafka.connect.runtime.rest=WARN,org.reflections=ERROR
CONNECT_LOG4J_ROOT_LOGLEVEL=INFO
CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR=1
CONNECT_OFFSET_STORAGE_TOPIC=_kafka-connect-01-offsets
CONNECT_PLUGIN_PATH=/usr/share/java,/usr/share/confluent-hub-components/,/connectors/
CONNECT_REST_ADVERTISED_HOST_NAME=kafka-connect-01
CONNECT_REST_PORT=8083
CONNECT_STATUS_STORAGE_REPLICATION_FACTOR=1
CONNECT_STATUS_STORAGE_TOPIC=_kafka-connect-01-status
CONNECT_VALUE_CONVERTER=io.confluent.connect.avro.AvroConverter
CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL=http://schema-registry:8081
CUB_CLASSPATH=/etc/confluent/docker/docker-utils.jar
HOME=/root
HOSTNAME=2ded14f9fd40
KAFKA_ADVERTISED_LISTENERS=
KAFKA_VERSION=
KAFKA_ZOOKEEPER_CONNECT=
LANG=C.UTF-8
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
PWD=/
PYTHON_PIP_VERSION=8.1.2
PYTHON_VERSION=2.7.9-1
SCALA_VERSION=2.12
SHLVL=2
ZULU_OPENJDK_VERSION=8=8.38.0.13
_=/usr/bin/env
===> User
uid=0(root) gid=0(root) groups=0(root)
===> Configuring ...
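The output above is what the Confluent Hub client prints when run non-interactively. As a minimal sketch (not taken from this environment's actual startup script), the two components could have been installed with commands along these lines, with the component coordinates inferred from the names and versions shown in the log:

# Hypothetical re-creation of the installs logged above; coordinates inferred
# from "Debezium MySQL CDC Connector 1.1.0" and "Kafka Connect Datagen 0.3.1".
# --no-prompt accepts the license and skips the confirmation question,
# matching the "--no-prompt" mode lines in the output.
confluent-hub install --no-prompt debezium/debezium-connector-mysql:1.1.0
confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:0.3.1

Both installs land in /usr/share/confluent-hub-components, which is already included in the CONNECT_PLUGIN_PATH shown in the environment variables above, so the worker can pick them up on startup. The worker log then continues below.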
Tue Jul 14 16:48:58 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) Tue Jul 14 16:49:03 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) Tue Jul 14 16:49:08 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) Tue Jul 14 16:49:13 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) Tue Jul 14 16:49:18 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) ===> Running preflight checks ... ===> Check if Kafka is healthy ... ===> Launching ... ===> Launching kafka-connect ... Tue Jul 14 16:49:23 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) [2020-07-14 16:49:25,174] INFO WorkerInfo values: jvm.args = -Xms256M, -Xmx2G, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -XX:MaxInlineLevel=15, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote=true, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dkafka.logs.dir=/var/log/kafka, -Dlog4j.configuration=file:/etc/kafka/connect-log4j.properties jvm.spec = Azul Systems, Inc., OpenJDK 64-Bit Server VM, 1.8.0_212, 25.212-b04 jvm.classpath = /etc/kafka-connect/jars/*:/usr/share/java/kafka/connect-basic-auth-extension-5.5.0-ccs.jar:/usr/share/java/kafka/jetty-http-9.4.24.v20191120.jar:/usr/share/java/kafka/jetty-io-9.4.24.v20191120.jar:/usr/share/java/kafka/commons-cli-1.4.jar:/usr/share/java/kafka/jakarta.annotation-api-1.3.4.jar:/usr/share/java/kafka/osgi-resource-locator-1.0.1.jar:/usr/share/java/kafka/jetty-server-9.4.24.v20191120.jar:/usr/share/java/kafka/netty-resolver-4.1.45.Final.jar:/usr/share/java/kafka/jackson-module-jaxb-annotations-2.10.2.jar:/usr/share/java/kafka/jackson-jaxrs-json-provider-2.10.2.jar:/usr/share/java/kafka/zstd-jni-1.4.4-7.jar:/usr/share/java/kafka/netty-transport-4.1.45.Final.jar:/usr/share/java/kafka/kafka_2.12-5.5.0-ccs-scaladoc.jar:/usr/share/java/kafka/jakarta.ws.rs-api-2.1.5.jar:/usr/share/java/kafka/jetty-client-9.4.24.v20191120.jar:/usr/share/java/kafka/validation-api-2.0.1.Final.jar:/usr/share/java/kafka/connect-file-5.5.0-ccs.jar:/usr/share/java/kafka/jakarta.xml.bind-api-2.3.2.jar:/usr/share/java/kafka/jersey-container-servlet-2.28.jar:/usr/share/java/kafka/slf4j-api-1.7.30.jar:/usr/share/java/kafka/maven-artifact-3.6.3.jar:/usr/share/java/kafka/aopalliance-repackaged-2.5.0.jar:/usr/share/java/kafka/netty-transport-native-epoll-4.1.45.Final.jar:/usr/share/java/kafka/slf4j-log4j12-1.7.30.jar:/usr/share/java/kafka/kafka.jar:/usr/share/java/kafka/kafka_2.12-5.5.0-ccs-javadoc.jar:/usr/share/java/kafka/rocksdbjni-5.18.3.jar:/usr/share/java/kafka/connect-transforms-5.5.0-ccs.jar:/usr/share/java/kafka/commons-compress-1.19.jar:/usr/share/java/kafka/reflections-0.9.12.jar:/usr/share/java/kafka/jackson-jaxrs-base-2.10.2.jar:/usr/share/java/kafka/jackson-module-scala_2.12-2.10.2.jar:/usr/share/java/kafka/audience-annotations-0.5.0.jar:/usr/share/java/kafka/support-metrics-client-5.5.0-ccs.jar:/usr/share/java/kafka/kafka-streams-scala_2.12-5.5.0-ccs.jar:/usr/share/java/kafka/javassist-3.22.0-CR2.jar:/usr/share/java/kafka/connect-mirror-5.5.0-ccs.jar:/usr/share/java/kafka/jersey-container-servlet-core-2.28.jar:/usr/share/java/kafka/scala-java8-compat_2.12-0.9.0.jar:/usr/share/java/kafka/httpclient-4.5.11.jar:/usr/share/java/kafka/jersey-server-2.28.jar:/usr/share/java/kafka/jakarta.inject-2.5.0.jar:/usr/share/java/kafka/kafka_2.12-5.5.0-ccs-sources.jar:/usr/share/java/kafka/zookeeper-jute-3.5.7.jar:/usr/share/java/kafka/java
x.servlet-api-3.1.0.jar:/usr/share/java/kafka/commons-lang3-3.8.1.jar:/usr/share/java/kafka/netty-handler-4.1.45.Final.jar:/usr/share/java/kafka/commons-logging-1.2.jar:/usr/share/java/kafka/support-metrics-common-5.5.0-ccs.jar:/usr/share/java/kafka/jackson-dataformat-csv-2.10.2.jar:/usr/share/java/kafka/jersey-common-2.28.jar:/usr/share/java/kafka/connect-json-5.5.0-ccs.jar:/usr/share/java/kafka/kafka_2.12-5.5.0-ccs-test-sources.jar:/usr/share/java/kafka/jetty-servlet-9.4.24.v20191120.jar:/usr/share/java/kafka/kafka_2.12-5.5.0-ccs-test.jar:/usr/share/java/kafka/httpcore-4.4.13.jar:/usr/share/java/kafka/jackson-core-2.10.2.jar:/usr/share/java/kafka/jackson-datatype-jdk8-2.10.2.jar:/usr/share/java/kafka/jetty-continuation-9.4.24.v20191120.jar:/usr/share/java/kafka/jopt-simple-5.0.4.jar:/usr/share/java/kafka/zookeeper-3.5.7.jar:/usr/share/java/kafka/javassist-3.26.0-GA.jar:/usr/share/java/kafka/hk2-api-2.5.0.jar:/usr/share/java/kafka/jetty-servlets-9.4.24.v20191120.jar:/usr/share/java/kafka/jakarta.activation-api-1.2.1.jar:/usr/share/java/kafka/paranamer-2.8.jar:/usr/share/java/kafka/scala-collection-compat_2.12-2.1.3.jar:/usr/share/java/kafka/metrics-core-2.2.0.jar:/usr/share/java/kafka/kafka-streams-examples-5.5.0-ccs.jar:/usr/share/java/kafka/kafka-clients-5.5.0-ccs.jar:/usr/share/java/kafka/scala-library-2.12.10.jar:/usr/share/java/kafka/avro-1.9.2.jar:/usr/share/java/kafka/connect-runtime-5.5.0-ccs.jar:/usr/share/java/kafka/kafka-streams-5.5.0-ccs.jar:/usr/share/java/kafka/javax.ws.rs-api-2.1.1.jar:/usr/share/java/kafka/scala-reflect-2.12.10.jar:/usr/share/java/kafka/jaxb-api-2.3.0.jar:/usr/share/java/kafka/jackson-module-paranamer-2.10.2.jar:/usr/share/java/kafka/hk2-utils-2.5.0.jar:/usr/share/java/kafka/jetty-security-9.4.24.v20191120.jar:/usr/share/java/kafka/netty-transport-native-unix-common-4.1.45.Final.jar:/usr/share/java/kafka/jersey-client-2.28.jar:/usr/share/java/kafka/netty-common-4.1.45.Final.jar:/usr/share/java/kafka/jackson-annotations-2.10.2.jar:/usr/share/java/kafka/jersey-media-jaxb-2.28.jar:/usr/share/java/kafka/argparse4j-0.7.0.jar:/usr/share/java/kafka/jackson-databind-2.10.2.jar:/usr/share/java/kafka/commons-codec-1.11.jar:/usr/share/java/kafka/kafka-tools-5.5.0-ccs.jar:/usr/share/java/kafka/lz4-java-1.7.1.jar:/usr/share/java/kafka/netty-codec-4.1.45.Final.jar:/usr/share/java/kafka/connect-mirror-client-5.5.0-ccs.jar:/usr/share/java/kafka/activation-1.1.1.jar:/usr/share/java/kafka/connect-api-5.5.0-ccs.jar:/usr/share/java/kafka/scala-logging_2.12-3.9.2.jar:/usr/share/java/kafka/httpmime-4.5.11.jar:/usr/share/java/kafka/plexus-utils-3.2.1.jar:/usr/share/java/kafka/netty-buffer-4.1.45.Final.jar:/usr/share/java/kafka/log4j-1.2.17.jar:/usr/share/java/kafka/jetty-util-9.4.24.v20191120.jar:/usr/share/java/kafka/jersey-hk2-2.28.jar:/usr/share/java/kafka/kafka_2.12-5.5.0-ccs.jar:/usr/share/java/kafka/kafka-streams-test-utils-5.5.0-ccs.jar:/usr/share/java/kafka/kafka-log4j-appender-5.5.0-ccs.jar:/usr/share/java/kafka/hk2-locator-2.5.0.jar:/usr/share/java/kafka/snappy-java-1.1.7.3.jar:/usr/share/java/kafka/confluent-metrics-5.5.0-ce.jar:/usr/share/java/confluent-common/common-utils-5.5.0.jar:/usr/share/java/confluent-common/slf4j-api-1.7.26.jar:/usr/share/java/confluent-common/common-config-5.5.0.jar:/usr/share/java/confluent-common/common-metrics-5.5.0.jar:/usr/share/java/confluent-common/build-tools-5.5.0.jar:/usr/share/java/kafka-serde-tools/kafka-streams-avro-serde-5.5.0.jar:/usr/share/java/kafka-serde-tools/swagger-models-1.5.3.jar:/usr/share/java/kafka-serde-tools/re2j-1
.3.jar:/usr/share/java/kafka-serde-tools/kotlin-scripting-compiler-embeddable-1.3.50.jar:/usr/share/java/kafka-serde-tools/jackson-datatype-jsr310-2.10.2.jar:/usr/share/java/kafka-serde-tools/kafka-streams-protobuf-serde-5.5.0.jar:/usr/share/java/kafka-serde-tools/kotlinx-coroutines-core-1.1.1.jar:/usr/share/java/kafka-serde-tools/commons-digester-1.8.1.jar:/usr/share/java/kafka-serde-tools/protobuf-java-util-3.11.4.jar:/usr/share/java/kafka-serde-tools/kafka-protobuf-serializer-5.5.0.jar:/usr/share/java/kafka-serde-tools/classmate-1.3.4.jar:/usr/share/java/kafka-serde-tools/kotlin-reflect-1.3.50.jar:/usr/share/java/kafka-serde-tools/validation-api-2.0.1.Final.jar:/usr/share/java/kafka-serde-tools/wire-schema-3.1.0.jar:/usr/share/java/kafka-serde-tools/animal-sniffer-annotations-1.18.jar:/usr/share/java/kafka-serde-tools/snakeyaml-1.24.jar:/usr/share/java/kafka-serde-tools/joda-time-2.10.2.jar:/usr/share/java/kafka-serde-tools/handy-uri-templates-2.1.8.jar:/usr/share/java/kafka-serde-tools/jackson-datatype-joda-2.10.2.jar:/usr/share/java/kafka-serde-tools/kotlin-scripting-compiler-impl-embeddable-1.3.50.jar:/usr/share/java/kafka-serde-tools/j2objc-annotations-1.3.jar:/usr/share/java/kafka-serde-tools/kafka-connect-protobuf-converter-5.5.0.jar:/usr/share/java/kafka-serde-tools/jakarta.ws.rs-api-2.1.6.jar:/usr/share/java/kafka-serde-tools/rocksdbjni-5.18.3.jar:/usr/share/java/kafka-serde-tools/gson-2.8.5.jar:/usr/share/java/kafka-serde-tools/commons-compress-1.19.jar:/usr/share/java/kafka-serde-tools/kafka-schema-serializer-5.5.0.jar:/usr/share/java/kafka-serde-tools/error_prone_annotations-2.3.4.jar:/usr/share/java/kafka-serde-tools/annotations-13.0.jar:/usr/share/java/kafka-serde-tools/kotlin-scripting-jvm-1.3.50.jar:/usr/share/java/kafka-serde-tools/kafka-schema-registry-client-5.5.0.jar:/usr/share/java/kafka-serde-tools/failureaccess-1.0.1.jar:/usr/share/java/kafka-serde-tools/jackson-dataformat-yaml-2.10.2.jar:/usr/share/java/kafka-serde-tools/hibernate-validator-6.0.17.Final.jar:/usr/share/java/kafka-serde-tools/checker-compat-qual-2.5.5.jar:/usr/share/java/kafka-serde-tools/jersey-server-2.30.jar:/usr/share/java/kafka-serde-tools/json-20190722.jar:/usr/share/java/kafka-serde-tools/commons-lang3-3.8.1.jar:/usr/share/java/kafka-serde-tools/commons-logging-1.2.jar:/usr/share/java/kafka-serde-tools/kotlin-stdlib-common-1.3.61.jar:/usr/share/java/kafka-serde-tools/kafka-avro-serializer-5.5.0.jar:/usr/share/java/kafka-serde-tools/jackson-core-2.10.2.jar:/usr/share/java/kafka-serde-tools/jackson-datatype-jdk8-2.10.2.jar:/usr/share/java/kafka-serde-tools/kafka-connect-avro-data-5.5.0.jar:/usr/share/java/kafka-serde-tools/kafka-connect-avro-converter-5.5.0.jar:/usr/share/java/kafka-serde-tools/jakarta.el-api-3.0.3.jar:/usr/share/java/kafka-serde-tools/jersey-common-2.30.jar:/usr/share/java/kafka-serde-tools/osgi-resource-locator-1.0.3.jar:/usr/share/java/kafka-serde-tools/kotlin-stdlib-jdk7-1.3.61.jar:/usr/share/java/kafka-serde-tools/scala-library-2.12.10.jar:/usr/share/java/kafka-serde-tools/kotlin-stdlib-1.3.61.jar:/usr/share/java/kafka-serde-tools/swagger-core-1.5.3.jar:/usr/share/java/kafka-serde-tools/avro-1.9.2.jar:/usr/share/java/kafka-serde-tools/kafka-protobuf-provider-5.5.0.jar:/usr/share/java/kafka-serde-tools/jackson-module-parameter-names-2.10.2.jar:/usr/share/java/kafka-serde-tools/kotlin-scripting-common-1.3.50.jar:/usr/share/java/kafka-serde-tools/kafka-streams-5.5.0-ccs.jar:/usr/share/java/kafka-serde-tools/jakarta.el-3.0.2.jar:/usr/share/java/kafka-serde-tools/kafka-connect-j
son-schema-converter-5.5.0.jar:/usr/share/java/kafka-serde-tools/kafka-streams-json-schema-serde-5.5.0.jar:/usr/share/java/kafka-serde-tools/jakarta.annotation-api-1.3.5.jar:/usr/share/java/kafka-serde-tools/okio-2.4.3.jar:/usr/share/java/kafka-serde-tools/kotlinx-coroutines-core-common-1.1.1.jar:/usr/share/java/kafka-serde-tools/jersey-client-2.30.jar:/usr/share/java/kafka-serde-tools/jackson-annotations-2.10.2.jar:/usr/share/java/kafka-serde-tools/jsr305-3.0.2.jar:/usr/share/java/kafka-serde-tools/jersey-bean-validation-2.30.jar:/usr/share/java/kafka-serde-tools/kafka-json-schema-serializer-5.5.0.jar:/usr/share/java/kafka-serde-tools/kafka-json-schema-provider-5.5.0.jar:/usr/share/java/kafka-serde-tools/mbknor-jackson-jsonschema_2.12-1.0.36.jar:/usr/share/java/kafka-serde-tools/jackson-databind-2.10.2.jar:/usr/share/java/kafka-serde-tools/commons-validator-1.6.jar:/usr/share/java/kafka-serde-tools/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/usr/share/java/kafka-serde-tools/jakarta.inject-2.6.1.jar:/usr/share/java/kafka-serde-tools/swagger-annotations-1.5.22.jar:/usr/share/java/kafka-serde-tools/guava-28.1-android.jar:/usr/share/java/kafka-serde-tools/org.everit.json.schema-1.12.1.jar:/usr/share/java/kafka-serde-tools/jersey-media-jaxb-2.30.jar:/usr/share/java/kafka-serde-tools/jakarta.validation-api-2.0.2.jar:/usr/share/java/kafka-serde-tools/kotlin-stdlib-jdk8-1.3.61.jar:/usr/share/java/kafka-serde-tools/classgraph-4.8.21.jar:/usr/share/java/kafka-serde-tools/protobuf-java-3.11.4.jar:/usr/share/java/kafka-serde-tools/jboss-logging-3.3.2.Final.jar:/usr/share/java/kafka-serde-tools/jackson-datatype-guava-2.10.2.jar:/usr/share/java/kafka-serde-tools/kafka-json-serializer-5.5.0.jar:/usr/share/java/kafka-serde-tools/wire-runtime-3.1.0.jar:/usr/share/java/kafka-serde-tools/commons-collections-3.2.2.jar:/usr/share/java/kafka-serde-tools/kotlin-script-runtime-1.3.50.jar:/usr/share/java/monitoring-interceptors/monitoring-interceptors-5.5.0.jar:/usr/bin/../share/java/kafka/connect-basic-auth-extension-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/jetty-http-9.4.24.v20191120.jar:/usr/bin/../share/java/kafka/jetty-io-9.4.24.v20191120.jar:/usr/bin/../share/java/kafka/commons-cli-1.4.jar:/usr/bin/../share/java/kafka/jakarta.annotation-api-1.3.4.jar:/usr/bin/../share/java/kafka/osgi-resource-locator-1.0.1.jar:/usr/bin/../share/java/kafka/jetty-server-9.4.24.v20191120.jar:/usr/bin/../share/java/kafka/netty-resolver-4.1.45.Final.jar:/usr/bin/../share/java/kafka/jackson-module-jaxb-annotations-2.10.2.jar:/usr/bin/../share/java/kafka/jackson-jaxrs-json-provider-2.10.2.jar:/usr/bin/../share/java/kafka/zstd-jni-1.4.4-7.jar:/usr/bin/../share/java/kafka/netty-transport-4.1.45.Final.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.5.0-ccs-scaladoc.jar:/usr/bin/../share/java/kafka/jakarta.ws.rs-api-2.1.5.jar:/usr/bin/../share/java/kafka/jetty-client-9.4.24.v20191120.jar:/usr/bin/../share/java/kafka/validation-api-2.0.1.Final.jar:/usr/bin/../share/java/kafka/connect-file-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/jakarta.xml.bind-api-2.3.2.jar:/usr/bin/../share/java/kafka/jersey-container-servlet-2.28.jar:/usr/bin/../share/java/kafka/slf4j-api-1.7.30.jar:/usr/bin/../share/java/kafka/maven-artifact-3.6.3.jar:/usr/bin/../share/java/kafka/aopalliance-repackaged-2.5.0.jar:/usr/bin/../share/java/kafka/netty-transport-native-epoll-4.1.45.Final.jar:/usr/bin/../share/java/kafka/slf4j-log4j12-1.7.30.jar:/usr/bin/../share/java/kafka/kafka.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.5.0-ccs-javadoc.jar:/usr/
bin/../share/java/kafka/rocksdbjni-5.18.3.jar:/usr/bin/../share/java/kafka/connect-transforms-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/commons-compress-1.19.jar:/usr/bin/../share/java/kafka/reflections-0.9.12.jar:/usr/bin/../share/java/kafka/jackson-jaxrs-base-2.10.2.jar:/usr/bin/../share/java/kafka/jackson-module-scala_2.12-2.10.2.jar:/usr/bin/../share/java/kafka/audience-annotations-0.5.0.jar:/usr/bin/../share/java/kafka/support-metrics-client-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/kafka-streams-scala_2.12-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/javassist-3.22.0-CR2.jar:/usr/bin/../share/java/kafka/connect-mirror-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/jersey-container-servlet-core-2.28.jar:/usr/bin/../share/java/kafka/scala-java8-compat_2.12-0.9.0.jar:/usr/bin/../share/java/kafka/httpclient-4.5.11.jar:/usr/bin/../share/java/kafka/jersey-server-2.28.jar:/usr/bin/../share/java/kafka/jakarta.inject-2.5.0.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.5.0-ccs-sources.jar:/usr/bin/../share/java/kafka/zookeeper-jute-3.5.7.jar:/usr/bin/../share/java/kafka/javax.servlet-api-3.1.0.jar:/usr/bin/../share/java/kafka/commons-lang3-3.8.1.jar:/usr/bin/../share/java/kafka/netty-handler-4.1.45.Final.jar:/usr/bin/../share/java/kafka/commons-logging-1.2.jar:/usr/bin/../share/java/kafka/support-metrics-common-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/jackson-dataformat-csv-2.10.2.jar:/usr/bin/../share/java/kafka/jersey-common-2.28.jar:/usr/bin/../share/java/kafka/connect-json-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.5.0-ccs-test-sources.jar:/usr/bin/../share/java/kafka/jetty-servlet-9.4.24.v20191120.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.5.0-ccs-test.jar:/usr/bin/../share/java/kafka/httpcore-4.4.13.jar:/usr/bin/../share/java/kafka/jackson-core-2.10.2.jar:/usr/bin/../share/java/kafka/jackson-datatype-jdk8-2.10.2.jar:/usr/bin/../share/java/kafka/jetty-continuation-9.4.24.v20191120.jar:/usr/bin/../share/java/kafka/jopt-simple-5.0.4.jar:/usr/bin/../share/java/kafka/zookeeper-3.5.7.jar:/usr/bin/../share/java/kafka/javassist-3.26.0-GA.jar:/usr/bin/../share/java/kafka/hk2-api-2.5.0.jar:/usr/bin/../share/java/kafka/jetty-servlets-9.4.24.v20191120.jar:/usr/bin/../share/java/kafka/jakarta.activation-api-1.2.1.jar:/usr/bin/../share/java/kafka/paranamer-2.8.jar:/usr/bin/../share/java/kafka/scala-collection-compat_2.12-2.1.3.jar:/usr/bin/../share/java/kafka/metrics-core-2.2.0.jar:/usr/bin/../share/java/kafka/kafka-streams-examples-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/kafka-clients-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/scala-library-2.12.10.jar:/usr/bin/../share/java/kafka/avro-1.9.2.jar:/usr/bin/../share/java/kafka/connect-runtime-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/kafka-streams-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/javax.ws.rs-api-2.1.1.jar:/usr/bin/../share/java/kafka/scala-reflect-2.12.10.jar:/usr/bin/../share/java/kafka/jaxb-api-2.3.0.jar:/usr/bin/../share/java/kafka/jackson-module-paranamer-2.10.2.jar:/usr/bin/../share/java/kafka/hk2-utils-2.5.0.jar:/usr/bin/../share/java/kafka/jetty-security-9.4.24.v20191120.jar:/usr/bin/../share/java/kafka/netty-transport-native-unix-common-4.1.45.Final.jar:/usr/bin/../share/java/kafka/jersey-client-2.28.jar:/usr/bin/../share/java/kafka/netty-common-4.1.45.Final.jar:/usr/bin/../share/java/kafka/jackson-annotations-2.10.2.jar:/usr/bin/../share/java/kafka/jersey-media-jaxb-2.28.jar:/usr/bin/../share/java/kafka/argparse4j-0.7.0.jar:/usr/bin/../share/java/kafka/jackson-databind-2.10.2.jar:/usr/bin/../share/java/kafka/commons-c
odec-1.11.jar:/usr/bin/../share/java/kafka/kafka-tools-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/lz4-java-1.7.1.jar:/usr/bin/../share/java/kafka/netty-codec-4.1.45.Final.jar:/usr/bin/../share/java/kafka/connect-mirror-client-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/activation-1.1.1.jar:/usr/bin/../share/java/kafka/connect-api-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/scala-logging_2.12-3.9.2.jar:/usr/bin/../share/java/kafka/httpmime-4.5.11.jar:/usr/bin/../share/java/kafka/plexus-utils-3.2.1.jar:/usr/bin/../share/java/kafka/netty-buffer-4.1.45.Final.jar:/usr/bin/../share/java/kafka/log4j-1.2.17.jar:/usr/bin/../share/java/kafka/jetty-util-9.4.24.v20191120.jar:/usr/bin/../share/java/kafka/jersey-hk2-2.28.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/kafka-streams-test-utils-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/kafka-log4j-appender-5.5.0-ccs.jar:/usr/bin/../share/java/kafka/hk2-locator-2.5.0.jar:/usr/bin/../share/java/kafka/snappy-java-1.1.7.3.jar:/usr/bin/../share/java/kafka/confluent-metrics-5.5.0-ce.jar:/usr/bin/../support-metrics-client/build/dependant-libs-2.12/*:/usr/bin/../support-metrics-client/build/libs/*:/usr/share/java/support-metrics-client/* os.spec = Linux, amd64, 4.19.76-linuxkit os.vcpus = 2 (org.apache.kafka.connect.runtime.WorkerInfo:71) [2020-07-14 16:49:25,253] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectDistributed:90) [2020-07-14 16:49:25,362] INFO Loading plugin from: /usr/share/java/kafka-connect-elasticsearch (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:49:27,041] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-connect-elasticsearch/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:27,045] INFO Added plugin 'io.confluent.connect.elasticsearch.ElasticsearchSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:27,052] INFO Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:27,052] INFO Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:27,052] INFO Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:27,055] INFO Loading plugin from: /usr/share/java/kafka-connect-ibmmq (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:49:28,530] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-connect-ibmmq/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:28,531] INFO Added plugin 'io.confluent.connect.ibm.mq.IbmMQSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:28,534] INFO Loading plugin from: /usr/share/java/kafka-connect-jms (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) Tue Jul 14 16:49:28 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) [2020-07-14 16:49:32,133] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-connect-jms/} 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:32,141] INFO Added plugin 'io.confluent.connect.jms.JmsSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:32,142] INFO Loading plugin from: /usr/share/java/kafka-connect-jdbc (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:49:32,566] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-connect-jdbc/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:32,573] INFO Added plugin 'io.confluent.connect.jdbc.JdbcSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:32,573] INFO Added plugin 'io.confluent.connect.jdbc.JdbcSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:32,606] INFO Loading plugin from: /usr/share/java/kafka-connect-s3 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) Tue Jul 14 16:49:33 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) Tue Jul 14 16:49:38 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) [2020-07-14 16:49:40,748] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-connect-s3/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:40,748] INFO Added plugin 'io.confluent.connect.s3.S3SinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:40,749] INFO Added plugin 'io.confluent.connect.storage.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:40,749] INFO Added plugin 'io.confluent.connect.avro.AvroConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:40,749] INFO Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:40,750] INFO Loading plugin from: /usr/share/java/kafka-connect-activemq (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:49:41,339] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-connect-activemq/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:41,339] INFO Added plugin 'io.confluent.connect.activemq.ActiveMQSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:41,342] INFO Loading plugin from: /usr/share/java/kafka-connect-storage-common (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:49:43,415] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-connect-storage-common/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:43,416] INFO Loading plugin from: /usr/share/java/confluent-hub-client (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) Tue Jul 14 16:49:43 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) [2020-07-14 16:49:44,146] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-hub-client/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:44,148] INFO Loading plugin from: /usr/share/java/acl 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:49:48,074] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/acl/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:48,076] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,076] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,077] INFO Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,077] INFO Added plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,077] INFO Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,078] INFO Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,078] INFO Added plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,078] INFO Added plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,079] INFO Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,080] INFO Added plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,081] INFO Added plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,081] INFO Added plugin 'io.confluent.connect.json.JsonSchemaConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,081] INFO Added plugin 'io.confluent.connect.protobuf.ProtobufConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,082] INFO Added plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,082] INFO Added plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,082] INFO Added plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,083] INFO Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,083] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,083] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,083] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,084] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,084] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,084] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,084] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,085] INFO Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,085] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,085] INFO Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,085] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,086] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,086] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,093] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,093] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,094] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,096] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,096] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,097] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,100] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,100] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,102] INFO Added plugin 'io.confluent.kafka.secretregistry.client.config.provider.SecretConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,103] INFO Added plugin 'io.confluent.connect.security.ConnectSecurityExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:48,103] INFO Loading plugin from: 
/usr/share/java/confluent-control-center (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) Tue Jul 14 16:49:49 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) Tue Jul 14 16:49:54 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) [2020-07-14 16:49:54,843] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-control-center/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:54,849] INFO Loading plugin from: /usr/share/java/monitoring-interceptors (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:49:55,156] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/monitoring-interceptors/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:55,157] INFO Loading plugin from: /usr/share/java/kafka (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:49:58,044] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:58,053] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:58,053] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:58,053] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:58,055] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:58,055] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:58,056] INFO Added plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:49:58,056] INFO Loading plugin from: /usr/share/java/kafka-serde-tools (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) Tue Jul 14 16:49:59 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) [2020-07-14 16:49:59,753] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-serde-tools/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:49:59,755] INFO Loading plugin from: /usr/share/java/confluent-rebalancer (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:50:01,251] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-rebalancer/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:50:01,252] INFO Loading plugin from: /usr/share/java/confluent-common (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:50:01,259] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-common/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:50:01,259] INFO Loading plugin from: /usr/share/java/schema-registry (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 
16:50:02,481] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/schema-registry/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:50:02,482] INFO Loading plugin from: /usr/share/java/rest-utils (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:50:03,105] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/rest-utils/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:50:03,106] INFO Loading plugin from: /usr/share/confluent-hub-components/confluentinc-kafka-connect-gcs (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) Tue Jul 14 16:50:04 UTC 2020 Kafka Connect listener HTTP state: 000 (waiting for 200) [2020-07-14 16:50:04,568] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/confluent-hub-components/confluentinc-kafka-connect-gcs/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:50:04,568] INFO Added plugin 'io.confluent.connect.gcs.GcsSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:50:04,569] INFO Loading plugin from: /usr/share/confluent-hub-components/confluentinc-kafka-connect-datagen (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:50:04,678] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/confluent-hub-components/confluentinc-kafka-connect-datagen/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:50:04,679] INFO Added plugin 'io.confluent.kafka.connect.datagen.DatagenConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:50:04,679] INFO Loading plugin from: /usr/share/confluent-hub-components/debezium-debezium-connector-mysql (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239) [2020-07-14 16:50:04,941] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/confluent-hub-components/debezium-debezium-connector-mysql/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:50:04,942] INFO Added plugin 'io.debezium.connector.mysql.MySqlConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:50:04,942] INFO Added plugin 'io.debezium.converters.ByteBufferConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:50:04,942] INFO Added plugin 'io.debezium.converters.CloudEventsConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:50:04,942] INFO Added plugin 'io.debezium.transforms.outbox.EventRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:50:04,943] INFO Added plugin 'io.debezium.transforms.ExtractNewRecordState' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:50:04,943] INFO Added plugin 'io.debezium.transforms.UnwrapFromEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:50:04,943] INFO Added plugin 'io.debezium.transforms.ByLogicalTableRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191) [2020-07-14 16:50:08,608] INFO Registered loader: sun.misc.Launcher$AppClassLoader@764c12b6 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262) [2020-07-14 16:50:08,609] INFO Added aliases 
'ActiveMQSourceConnector' and 'ActiveMQSource' to plugin 'io.confluent.connect.activemq.ActiveMQSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,609] INFO Added aliases 'ElasticsearchSinkConnector' and 'ElasticsearchSink' to plugin 'io.confluent.connect.elasticsearch.ElasticsearchSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,609] INFO Added aliases 'GcsSinkConnector' and 'GcsSink' to plugin 'io.confluent.connect.gcs.GcsSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,609] INFO Added aliases 'IbmMQSourceConnector' and 'IbmMQSource' to plugin 'io.confluent.connect.ibm.mq.IbmMQSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,612] INFO Added aliases 'JdbcSinkConnector' and 'JdbcSink' to plugin 'io.confluent.connect.jdbc.JdbcSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,612] INFO Added aliases 'JdbcSourceConnector' and 'JdbcSource' to plugin 'io.confluent.connect.jdbc.JdbcSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,613] INFO Added aliases 'JmsSourceConnector' and 'JmsSource' to plugin 'io.confluent.connect.jms.JmsSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,613] INFO Added aliases 'S3SinkConnector' and 'S3Sink' to plugin 'io.confluent.connect.s3.S3SinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,613] INFO Added aliases 'DatagenConnector' and 'Datagen' to plugin 'io.confluent.kafka.connect.datagen.DatagenConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,613] INFO Added aliases 'MySqlConnector' and 'MySql' to plugin 'io.debezium.connector.mysql.MySqlConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,613] INFO Added aliases 'FileStreamSinkConnector' and 'FileStreamSink' to plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,613] INFO Added aliases 'FileStreamSourceConnector' and 'FileStreamSource' to plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,614] INFO Added aliases 'MirrorCheckpointConnector' and 'MirrorCheckpoint' to plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,614] INFO Added aliases 'MirrorHeartbeatConnector' and 'MirrorHeartbeat' to plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,614] INFO Added aliases 'MirrorSourceConnector' and 'MirrorSource' to plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,615] INFO Added aliases 'MockConnector' and 'Mock' to plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,615] INFO Added aliases 'MockSinkConnector' and 'MockSink' to plugin 
'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,616] INFO Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,616] INFO Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'AvroConverter' and 'Avro' to plugin 'io.confluent.connect.avro.AvroConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'JsonSchemaConverter' and 'JsonSchema' to plugin 'io.confluent.connect.json.JsonSchemaConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'ProtobufConverter' and 'Protobuf' to plugin 'io.confluent.connect.protobuf.ProtobufConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'ByteBufferConverter' and 'ByteBuffer' to plugin 'io.debezium.converters.ByteBufferConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'CloudEventsConverter' and 'CloudEvents' to plugin 'io.debezium.converters.CloudEventsConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,617] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO 
Added aliases 'ByteBufferConverter' and 'ByteBuffer' to plugin 'io.debezium.converters.ByteBufferConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO Added alias 'SimpleHeaderConverter' to plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416) [2020-07-14 16:50:08,618] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,618] INFO Added alias 'ByLogicalTableRouter' to plugin 'io.debezium.transforms.ByLogicalTableRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416) [2020-07-14 16:50:08,618] INFO Added alias 'ExtractNewRecordState' to plugin 'io.debezium.transforms.ExtractNewRecordState' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416) [2020-07-14 16:50:08,619] INFO Added alias 'UnwrapFromEnvelope' to plugin 'io.debezium.transforms.UnwrapFromEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416) [2020-07-14 16:50:08,619] INFO Added alias 'EventRouter' to plugin 'io.debezium.transforms.outbox.EventRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416) [2020-07-14 16:50:08,619] INFO Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416) [2020-07-14 16:50:08,619] INFO Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416) [2020-07-14 16:50:08,620] INFO Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416) [2020-07-14 16:50:08,620] INFO Added alias 'ConnectSecurityExtension' to plugin 'io.confluent.connect.security.ConnectSecurityExtension' 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416) [2020-07-14 16:50:08,623] INFO Added alias 'BasicAuthSecurityRestExtension' to plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416) [2020-07-14 16:50:08,623] INFO Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,624] INFO Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,624] INFO Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and 'Principal' to plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419) [2020-07-14 16:50:08,699] INFO DistributedConfig values: access.control.allow.methods = access.control.allow.origin = admin.listeners = null bootstrap.servers = [kafka:29092] client.dns.lookup = default client.id = config.providers = [] config.storage.replication.factor = 1 config.storage.topic = _kafka-connect-01-configs connect.protocol = sessioned connections.max.idle.ms = 540000 connector.client.config.override.policy = None group.id = kafka-connect-01 header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter heartbeat.interval.ms = 3000 inter.worker.key.generation.algorithm = HmacSHA256 inter.worker.key.size = null inter.worker.key.ttl.ms = 3600000 inter.worker.signature.algorithm = HmacSHA256 inter.worker.verification.algorithms = [HmacSHA256] internal.key.converter = class org.apache.kafka.connect.json.JsonConverter internal.value.converter = class org.apache.kafka.connect.json.JsonConverter key.converter = class io.confluent.connect.avro.AvroConverter listeners = null metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 offset.flush.interval.ms = 60000 offset.flush.timeout.ms = 5000 offset.storage.partitions = 25 offset.storage.replication.factor = 1 offset.storage.topic = _kafka-connect-01-offsets plugin.path = [/usr/share/java, /usr/share/confluent-hub-components/, /connectors/] rebalance.timeout.ms = 60000 receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 40000 rest.advertised.host.name = kafka-connect-01 rest.advertised.listener = null rest.advertised.port = null rest.extension.classes = [] rest.host.name = null rest.port = 8083 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI scheduled.rebalance.max.delay.ms = 300000 security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null 
ssl.client.auth = none ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS status.storage.partitions = 5 status.storage.replication.factor = 1 status.storage.topic = _kafka-connect-01-status task.shutdown.graceful.timeout.ms = 5000 topic.tracking.allow.reset = true topic.tracking.enable = true value.converter = class io.confluent.connect.avro.AvroConverter worker.sync.timeout.ms = 3000 worker.unsync.backoff.ms = 300000 (org.apache.kafka.connect.runtime.distributed.DistributedConfig:347) [2020-07-14 16:50:08,699] INFO Worker configuration property 'internal.key.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration. (org.apache.kafka.connect.runtime.WorkerConfig:357) [2020-07-14 16:50:08,699] INFO Worker configuration property 'internal.key.converter.schemas.enable' (along with all configuration for 'internal.key.converter') is deprecated and may be removed in an upcoming release. The specified value 'false' matches the default, so this property can be safely removed from the worker configuration. (org.apache.kafka.connect.runtime.WorkerConfig:357) [2020-07-14 16:50:08,700] INFO Worker configuration property 'internal.value.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration. (org.apache.kafka.connect.runtime.WorkerConfig:357) [2020-07-14 16:50:08,700] INFO Worker configuration property 'internal.value.converter.schemas.enable' (along with all configuration for 'internal.value.converter') is deprecated and may be removed in an upcoming release. The specified value 'false' matches the default, so this property can be safely removed from the worker configuration. 
(org.apache.kafka.connect.runtime.WorkerConfig:357) [2020-07-14 16:50:08,704] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:43) [2020-07-14 16:50:08,712] INFO AdminClientConfig values: bootstrap.servers = [kafka:29092] client.dns.lookup = default client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS (org.apache.kafka.clients.admin.AdminClientConfig:347) [2020-07-14 16:50:08,802] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,802] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,803] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,803] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,803] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,803] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,803] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,804] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,804] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,804] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,804] WARN The configuration 'value.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,805] WARN The configuration 'log4j.appender.stdout.layout.conversionpattern' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,805] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,805] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,805] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,806] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,806] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,806] WARN The configuration 'log4j.root.loglevel' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,806] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,807] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,807] WARN The configuration 'key.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:08,808] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:08,808] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:08,808] INFO Kafka startTimeMs: 1594745408807 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:09,084] INFO Kafka cluster ID: d88xyIRzT9ahbzImZzreyA (org.apache.kafka.connect.util.ConnectUtils:59) [2020-07-14 16:50:09,110] INFO Logging initialized @45719ms to org.eclipse.jetty.util.log.Slf4jLog (org.eclipse.jetty.util.log:169) [2020-07-14 16:50:09,182] INFO jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 1.8.0_212-b04 (org.eclipse.jetty.server.Server:359) [2020-07-14 16:50:09,212] INFO Started http_8083@16e557d4{HTTP/1.1,[http/1.1]}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:330) [2020-07-14 16:50:09,212] INFO Started @45821ms (org.eclipse.jetty.server.Server:399) [2020-07-14 16:50:09,238] INFO Setting up None Policy for ConnectorClientConfigOverride. 
This will disallow any client configuration to be overridden (org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy:45) [2020-07-14 16:50:09,247] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:09,247] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:09,248] INFO Kafka startTimeMs: 1594745409247 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:09,408] INFO JVM Runtime does not support Modules (org.eclipse.jetty.util.TypeUtil:201) [2020-07-14 16:50:09,436] INFO JsonConverterConfig values: converter.type = key decimal.format = BASE64 schemas.cache.size = 1000 schemas.enable = false (org.apache.kafka.connect.json.JsonConverterConfig:347) [2020-07-14 16:50:09,437] INFO JsonConverterConfig values: converter.type = value decimal.format = BASE64 schemas.cache.size = 1000 schemas.enable = false (org.apache.kafka.connect.json.JsonConverterConfig:347) Tue Jul 14 16:50:09 UTC 2020 Kafka Connect listener HTTP state: 404 (waiting for 200) [2020-07-14 16:50:09,544] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:09,545] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:09,545] INFO Kafka startTimeMs: 1594745409544 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:09,549] INFO Kafka Connect distributed worker initialization took 44295ms (org.apache.kafka.connect.cli.ConnectDistributed:128) [2020-07-14 16:50:09,550] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect:51) [2020-07-14 16:50:09,559] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Herder starting (org.apache.kafka.connect.runtime.distributed.DistributedHerder:282) [2020-07-14 16:50:09,559] INFO Worker starting (org.apache.kafka.connect.runtime.Worker:184) [2020-07-14 16:50:09,559] INFO Starting KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:108) [2020-07-14 16:50:09,560] INFO Starting KafkaBasedLog with topic _kafka-connect-01-offsets (org.apache.kafka.connect.util.KafkaBasedLog:126) [2020-07-14 16:50:09,561] INFO AdminClientConfig values: bootstrap.servers = [kafka:29092] client.dns.lookup = default client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null 
ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS (org.apache.kafka.clients.admin.AdminClientConfig:347) [2020-07-14 16:50:09,576] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,576] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,578] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,578] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,578] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,578] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,579] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,579] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,579] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,580] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,580] WARN The configuration 'value.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,582] WARN The configuration 'log4j.appender.stdout.layout.conversionpattern' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,583] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,583] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,583] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,583] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,584] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,584] WARN The configuration 'log4j.root.loglevel' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,584] WARN The configuration 'value.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,584] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,585] WARN The configuration 'key.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:09,585] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:09,586] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:09,586] INFO Kafka startTimeMs: 1594745409585 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:09,731] INFO DefaultSessionIdManager workerName=node0 (org.eclipse.jetty.server.session:333) [2020-07-14 16:50:09,733] INFO No SessionScavenger set, using defaults (org.eclipse.jetty.server.session:338) [2020-07-14 16:50:09,747] INFO node0 Scavenging every 600000ms (org.eclipse.jetty.server.session:140) [2020-07-14 16:50:10,303] INFO Created topic (name=_kafka-connect-01-offsets, numPartitions=25, replicationFactor=1, replicasAssignments=null, configs={cleanup.policy=compact}) on brokers at kafka:29092 (org.apache.kafka.connect.util.TopicAdmin:230) [2020-07-14 16:50:10,322] INFO ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29092] buffer.memory = 33554432 client.dns.lookup = default client.id = producer-1 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 2147483647 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer (org.apache.kafka.clients.producer.ProducerConfig:347) [2020-07-14 16:50:10,344] WARN The configuration 
'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,344] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,344] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,344] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,345] WARN The configuration 'log4j.appender.stdout.layout.conversionpattern' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,345] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,345] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,345] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,345] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,345] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,346] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,346] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,346] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,346] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,346] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,347] WARN The configuration 'value.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,347] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,347] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,347] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,347] WARN The configuration 'log4j.root.loglevel' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,347] WARN The configuration 'key.converter.schema.registry.url' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,348] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:10,348] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:10,348] INFO Kafka startTimeMs: 1594745410348 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:10,370] INFO ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29092] check.crcs = true client.dns.lookup = default client.id = client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = kafka-connect-01 group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer (org.apache.kafka.clients.consumer.ConsumerConfig:347) [2020-07-14 16:50:10,399] INFO [Producer clientId=producer-1] Cluster ID: d88xyIRzT9ahbzImZzreyA (org.apache.kafka.clients.Metadata:280) [2020-07-14 16:50:10,423] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,427] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,427] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,427] WARN The configuration 'log4j.appender.stdout.layout.conversionpattern' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,428] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,428] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,428] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,429] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,429] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,430] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,430] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,430] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,431] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,431] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,431] WARN The configuration 'value.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,432] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,433] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,433] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,433] WARN The configuration 'log4j.root.loglevel' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,433] WARN The configuration 'key.converter.schema.registry.url' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,434] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:10,434] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:10,434] INFO Kafka startTimeMs: 1594745410434 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:10,448] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Cluster ID: d88xyIRzT9ahbzImZzreyA (org.apache.kafka.clients.Metadata:280) [2020-07-14 16:50:10,484] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Subscribed to partition(s): _kafka-connect-01-offsets-0, _kafka-connect-01-offsets-5, _kafka-connect-01-offsets-10, _kafka-connect-01-offsets-20, _kafka-connect-01-offsets-15, _kafka-connect-01-offsets-9, _kafka-connect-01-offsets-11, _kafka-connect-01-offsets-4, _kafka-connect-01-offsets-16, _kafka-connect-01-offsets-17, _kafka-connect-01-offsets-3, _kafka-connect-01-offsets-24, _kafka-connect-01-offsets-23, _kafka-connect-01-offsets-13, _kafka-connect-01-offsets-18, _kafka-connect-01-offsets-22, _kafka-connect-01-offsets-2, _kafka-connect-01-offsets-8, _kafka-connect-01-offsets-12, _kafka-connect-01-offsets-19, _kafka-connect-01-offsets-14, _kafka-connect-01-offsets-1, _kafka-connect-01-offsets-6, _kafka-connect-01-offsets-7, _kafka-connect-01-offsets-21 (org.apache.kafka.clients.consumer.KafkaConsumer:1128) [2020-07-14 16:50:10,487] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-0 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,488] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-5 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,488] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-10 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,488] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-20 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,489] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-15 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,489] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-9 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,489] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-11 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,489] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-4 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,489] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST 
offset of partition _kafka-connect-01-offsets-16 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,490] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-17 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,490] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-3 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,490] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-24 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,490] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-23 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,491] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-13 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,491] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-18 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,491] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-22 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,491] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-2 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,491] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-8 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,492] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-12 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,492] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-19 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,492] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-14 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,492] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-1 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,492] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-6 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,493] INFO [Consumer 
clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-7 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,493] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-offsets-21 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,577] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-21 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,579] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-19 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,580] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-17 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,580] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-15 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,581] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-23 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,581] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-5 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,581] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-3 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,581] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-1 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,582] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-13 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,582] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-11 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,582] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-9 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,583] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-7 to offset 0. 
(org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,583] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-22 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,583] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-20 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,583] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-18 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,584] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-16 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,584] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-24 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,584] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-6 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,584] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-4 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,586] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-2 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,587] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-0 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,587] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-14 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,587] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-12 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,587] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-10 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,588] INFO [Consumer clientId=consumer-kafka-connect-01-1, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-offsets-8 to offset 0. 
(org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,588] INFO Finished reading KafkaBasedLog for topic _kafka-connect-01-offsets (org.apache.kafka.connect.util.KafkaBasedLog:159) [2020-07-14 16:50:10,589] INFO Started KafkaBasedLog for topic _kafka-connect-01-offsets (org.apache.kafka.connect.util.KafkaBasedLog:161) [2020-07-14 16:50:10,589] INFO Finished reading offsets topic and starting KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:110) [2020-07-14 16:50:10,591] INFO Worker started (org.apache.kafka.connect.runtime.Worker:191) [2020-07-14 16:50:10,591] INFO Starting KafkaBasedLog with topic _kafka-connect-01-status (org.apache.kafka.connect.util.KafkaBasedLog:126) [2020-07-14 16:50:10,591] INFO AdminClientConfig values: bootstrap.servers = [kafka:29092] client.dns.lookup = default client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS (org.apache.kafka.clients.admin.AdminClientConfig:347) [2020-07-14 16:50:10,594] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,594] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,600] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,600] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,600] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,600] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,601] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,601] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,601] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,601] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,602] WARN The configuration 'value.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,605] WARN The configuration 'log4j.appender.stdout.layout.conversionpattern' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,606] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,606] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,606] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,607] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,611] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,611] WARN The configuration 'log4j.root.loglevel' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,611] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,612] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,612] WARN The configuration 'key.converter.schema.registry.url' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,612] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:10,612] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:10,613] INFO Kafka startTimeMs: 1594745410612 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:10,853] INFO Created topic (name=_kafka-connect-01-status, numPartitions=5, replicationFactor=1, replicasAssignments=null, configs={cleanup.policy=compact}) on brokers at kafka:29092 (org.apache.kafka.connect.util.TopicAdmin:230) [2020-07-14 16:50:10,856] INFO ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29092] buffer.memory = 33554432 client.dns.lookup = default client.id = producer-2 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer (org.apache.kafka.clients.producer.ProducerConfig:347) [2020-07-14 16:50:10,872] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,873] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,873] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,873] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,873] WARN The configuration 'log4j.appender.stdout.layout.conversionpattern' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,874] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,874] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,874] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,874] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,874] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,875] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,875] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,875] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,875] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,876] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,876] WARN The configuration 'value.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,876] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,876] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,876] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,877] WARN The configuration 'log4j.root.loglevel' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,877] WARN The configuration 'key.converter.schema.registry.url' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:10,877] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:10,877] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:10,878] INFO Kafka startTimeMs: 1594745410877 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:10,878] INFO ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29092] check.crcs = true client.dns.lookup = default client.id = client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = kafka-connect-01 group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer (org.apache.kafka.clients.consumer.ConsumerConfig:347) [2020-07-14 16:50:10,901] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,901] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,907] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,908] WARN The configuration 'log4j.appender.stdout.layout.conversionpattern' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,908] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,908] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,909] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,909] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,909] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,909] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,910] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,910] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,911] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,911] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,911] WARN The configuration 'value.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,911] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,911] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,912] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,912] WARN The configuration 'log4j.root.loglevel' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,913] WARN The configuration 'key.converter.schema.registry.url' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:10,913] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:10,913] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:10,912] INFO [Producer clientId=producer-2] Cluster ID: d88xyIRzT9ahbzImZzreyA (org.apache.kafka.clients.Metadata:280) [2020-07-14 16:50:10,916] INFO Kafka startTimeMs: 1594745410913 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:10,923] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Cluster ID: d88xyIRzT9ahbzImZzreyA (org.apache.kafka.clients.Metadata:280) [2020-07-14 16:50:10,931] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Subscribed to partition(s): _kafka-connect-01-status-0, _kafka-connect-01-status-1, _kafka-connect-01-status-4, _kafka-connect-01-status-2, _kafka-connect-01-status-3 (org.apache.kafka.clients.consumer.KafkaConsumer:1128) [2020-07-14 16:50:10,931] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-status-0 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,931] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-status-1 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,932] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-status-4 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,932] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-status-2 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,932] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-status-3 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:10,945] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-status-2 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,947] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-status-1 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,948] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-status-0 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,948] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-status-4 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,948] INFO [Consumer clientId=consumer-kafka-connect-01-2, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-status-3 to offset 0. 
(org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:10,953] INFO HV000001: Hibernate Validator 6.0.17.Final (org.hibernate.validator.internal.util.Version:21) [2020-07-14 16:50:10,957] INFO Finished reading KafkaBasedLog for topic _kafka-connect-01-status (org.apache.kafka.connect.util.KafkaBasedLog:159) [2020-07-14 16:50:10,959] INFO Started KafkaBasedLog for topic _kafka-connect-01-status (org.apache.kafka.connect.util.KafkaBasedLog:161) [2020-07-14 16:50:10,964] INFO Starting KafkaConfigBackingStore (org.apache.kafka.connect.storage.KafkaConfigBackingStore:262) [2020-07-14 16:50:10,964] INFO Starting KafkaBasedLog with topic _kafka-connect-01-configs (org.apache.kafka.connect.util.KafkaBasedLog:126) [2020-07-14 16:50:10,965] INFO AdminClientConfig values: bootstrap.servers = [kafka:29092] client.dns.lookup = default client.id = connections.max.idle.ms = 300000 default.api.timeout.ms = 60000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS (org.apache.kafka.clients.admin.AdminClientConfig:347) [2020-07-14 16:50:10,967] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,968] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,968] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,968] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,973] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,973] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,979] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,979] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,980] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,980] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,980] WARN The configuration 'value.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,980] WARN The configuration 'log4j.appender.stdout.layout.conversionpattern' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,980] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,981] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,981] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,981] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,981] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,981] WARN The configuration 'log4j.root.loglevel' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,982] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,982] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,982] WARN The configuration 'key.converter.schema.registry.url' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig:355) [2020-07-14 16:50:10,982] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:10,983] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:10,983] INFO Kafka startTimeMs: 1594745410982 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:11,060] INFO Created topic (name=_kafka-connect-01-configs, numPartitions=1, replicationFactor=1, replicasAssignments=null, configs={cleanup.policy=compact}) on brokers at kafka:29092 (org.apache.kafka.connect.util.TopicAdmin:230) [2020-07-14 16:50:11,064] INFO ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29092] buffer.memory = 33554432 client.dns.lookup = default client.id = producer-3 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 2147483647 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer (org.apache.kafka.clients.producer.ProducerConfig:347) [2020-07-14 16:50:11,078] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,078] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,078] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,078] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,082] WARN The configuration 'log4j.appender.stdout.layout.conversionpattern' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,082] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,082] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,082] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,082] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,083] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,083] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,083] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,083] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,083] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,084] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,084] WARN The configuration 'value.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,084] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,084] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,084] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,084] WARN The configuration 'log4j.root.loglevel' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,084] WARN The configuration 'key.converter.schema.registry.url' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig:355) [2020-07-14 16:50:11,085] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:11,086] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:11,086] INFO Kafka startTimeMs: 1594745411085 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:11,087] INFO ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29092] check.crcs = true client.dns.lookup = default client.id = client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = kafka-connect-01 group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer (org.apache.kafka.clients.consumer.ConsumerConfig:347) [2020-07-14 16:50:11,088] INFO [Producer clientId=producer-3] Cluster ID: d88xyIRzT9ahbzImZzreyA (org.apache.kafka.clients.Metadata:280) [2020-07-14 16:50:11,094] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,097] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,097] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,097] WARN The configuration 'log4j.appender.stdout.layout.conversionpattern' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,097] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,097] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,097] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,097] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,098] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,098] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,098] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,098] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,098] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,098] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,098] WARN The configuration 'value.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,098] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,098] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,099] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,099] WARN The configuration 'log4j.root.loglevel' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,099] WARN The configuration 'key.converter.schema.registry.url' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig:355) [2020-07-14 16:50:11,099] INFO Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:11,099] INFO Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:11,099] INFO Kafka startTimeMs: 1594745411099 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:11,105] INFO [Consumer clientId=consumer-kafka-connect-01-3, groupId=kafka-connect-01] Cluster ID: d88xyIRzT9ahbzImZzreyA (org.apache.kafka.clients.Metadata:280) [2020-07-14 16:50:11,111] INFO [Consumer clientId=consumer-kafka-connect-01-3, groupId=kafka-connect-01] Subscribed to partition(s): _kafka-connect-01-configs-0 (org.apache.kafka.clients.consumer.KafkaConsumer:1128) [2020-07-14 16:50:11,112] INFO [Consumer clientId=consumer-kafka-connect-01-3, groupId=kafka-connect-01] Seeking to EARLIEST offset of partition _kafka-connect-01-configs-0 (org.apache.kafka.clients.consumer.internals.SubscriptionState:566) [2020-07-14 16:50:11,119] INFO [Consumer clientId=consumer-kafka-connect-01-3, groupId=kafka-connect-01] Resetting offset for partition _kafka-connect-01-configs-0 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 16:50:11,126] INFO Finished reading KafkaBasedLog for topic _kafka-connect-01-configs (org.apache.kafka.connect.util.KafkaBasedLog:159) [2020-07-14 16:50:11,128] INFO Started KafkaBasedLog for topic _kafka-connect-01-configs (org.apache.kafka.connect.util.KafkaBasedLog:161) [2020-07-14 16:50:11,130] INFO Started KafkaConfigBackingStore (org.apache.kafka.connect.storage.KafkaConfigBackingStore:267) [2020-07-14 16:50:11,130] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Herder started (org.apache.kafka.connect.runtime.distributed.DistributedHerder:286) [2020-07-14 16:50:11,143] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Cluster ID: d88xyIRzT9ahbzImZzreyA (org.apache.kafka.clients.Metadata:280) [2020-07-14 16:50:11,675] INFO Started o.e.j.s.ServletContextHandler@4c264aed{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:825) [2020-07-14 16:50:11,675] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:57) [2020-07-14 16:50:12,057] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 16:50:12,060] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:222) [2020-07-14 16:50:12,060] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 16:50:12,090] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Join group failed with org.apache.kafka.common.errors.MemberIdRequiredException: The group member needs to have a valid member id before actually entering a consumer group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:455) [2020-07-14 16:50:12,090] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 16:50:12,250] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Successfully joined group with generation 1 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:503) [2020-07-14 
16:50:12,252] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Joined group at generation 1 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-70fcb2fc-220b-4de1-96e9-f846d1e9df50', leaderUrl='http://kafka-connect-01:8083/', offset=-1, connectorIds=[], taskIds=[], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1549) [2020-07-14 16:50:12,252] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting connectors and tasks using config offset -1 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1111) [2020-07-14 16:50:12,253] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1132) [2020-07-14 16:50:12,329] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Session key updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1447) Tue Jul 14 16:50:14 UTC 2020 Kafka Connect listener HTTP state: 200 (waiting for 200) -- +> Creating Data Generator source [2020-07-14 16:50:14,578] INFO AbstractConfig values: (org.apache.kafka.common.config.AbstractConfig:347) [2020-07-14 16:50:14,585] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Connector source-datagen-01 config updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1403) [2020-07-14 16:50:15,094] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:222) [2020-07-14 16:50:15,094] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 16:50:15,109] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Successfully joined group with generation 2 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:503) [2020-07-14 16:50:15,110] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Joined group at generation 2 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-70fcb2fc-220b-4de1-96e9-f846d1e9df50', leaderUrl='http://kafka-connect-01:8083/', offset=2, connectorIds=[source-datagen-01], taskIds=[], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1549) [2020-07-14 16:50:15,110] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting connectors and tasks using config offset 2 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1111) [2020-07-14 16:50:15,112] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting connector source-datagen-01 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1186) [2020-07-14 16:50:15,117] INFO [source-datagen-01|worker] ConnectorConfig values: config.action.reload = restart connector.class = io.confluent.kafka.connect.datagen.DatagenConnector errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = source-datagen-01 tasks.max = 1 transforms = [] value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig:347) [2020-07-14 16:50:15,118] INFO [source-datagen-01|worker] EnrichedConnectorConfig values: config.action.reload = 
restart connector.class = io.confluent.kafka.connect.datagen.DatagenConnector errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = source-datagen-01 tasks.max = 1 transforms = [] value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347) [2020-07-14 16:50:15,118] INFO [source-datagen-01|worker] Creating connector source-datagen-01 of type io.confluent.kafka.connect.datagen.DatagenConnector (org.apache.kafka.connect.runtime.Worker:251) [2020-07-14 16:50:15,135] INFO [source-datagen-01|worker] Instantiated connector source-datagen-01 with version null of type class io.confluent.kafka.connect.datagen.DatagenConnector (org.apache.kafka.connect.runtime.Worker:254) [2020-07-14 16:50:15,137] INFO [source-datagen-01|worker] DatagenConnectorConfig values: iterations = -1 kafka.topic = ratings max.interval = 750 quickstart = ratings random.seed = null schema.filename = schema.keyfield = (io.confluent.kafka.connect.datagen.DatagenConnectorConfig:347) [2020-07-14 16:50:15,158] INFO [source-datagen-01|worker] Finished creating connector source-datagen-01 (org.apache.kafka.connect.runtime.Worker:273) [2020-07-14 16:50:15,159] INFO SourceConnectorConfig values: config.action.reload = restart connector.class = io.confluent.kafka.connect.datagen.DatagenConnector errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = source-datagen-01 tasks.max = 1 transforms = [] value.converter = null (org.apache.kafka.connect.runtime.SourceConnectorConfig:347) [2020-07-14 16:50:15,160] INFO EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.confluent.kafka.connect.datagen.DatagenConnector errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = source-datagen-01 tasks.max = 1 transforms = [] value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347) {"name":"source-datagen-01","config":{"connector.class":"io.confluent.kafka.connect.datagen.DatagenConnector","key.converter":"org.apache.kafka.connect.storage.StringConverter","kafka.topic":"ratings","max.interval":"750","quickstart":"ratings","tasks.max":"1","name":"source-datagen-01"},"tasks":[],"type":"source"}[2020-07-14 16:50:15,614] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Tasks [source-datagen-01-0] configs updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1418) [2020-07-14 16:50:16,118] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1132) [2020-07-14 16:50:16,120] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Handling task config update by restarting tasks [] (org.apache.kafka.connect.runtime.distributed.DistributedHerder:581) [2020-07-14 16:50:16,120] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:222) [2020-07-14 16:50:16,121] 
INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 16:50:16,129] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Successfully joined group with generation 3 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:503) [2020-07-14 16:50:16,130] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Joined group at generation 3 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-70fcb2fc-220b-4de1-96e9-f846d1e9df50', leaderUrl='http://kafka-connect-01:8083/', offset=4, connectorIds=[source-datagen-01], taskIds=[source-datagen-01-0], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1549) [2020-07-14 16:50:16,130] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting connectors and tasks using config offset 4 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1111) [2020-07-14 16:50:16,131] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting task source-datagen-01-0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1146) [2020-07-14 16:50:16,132] INFO [source-datagen-01|task-0] Creating task source-datagen-01-0 (org.apache.kafka.connect.runtime.Worker:419) [2020-07-14 16:50:16,133] INFO [source-datagen-01|task-0] ConnectorConfig values: config.action.reload = restart connector.class = io.confluent.kafka.connect.datagen.DatagenConnector errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = source-datagen-01 tasks.max = 1 transforms = [] value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig:347) [2020-07-14 16:50:16,134] INFO [source-datagen-01|task-0] EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.confluent.kafka.connect.datagen.DatagenConnector errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = source-datagen-01 tasks.max = 1 transforms = [] value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347) [2020-07-14 16:50:16,136] INFO [source-datagen-01|task-0] TaskConfig values: task.class = class io.confluent.kafka.connect.datagen.DatagenTask (org.apache.kafka.connect.runtime.TaskConfig:347) [2020-07-14 16:50:16,136] INFO [source-datagen-01|task-0] Instantiated task source-datagen-01-0 with version null of type io.confluent.kafka.connect.datagen.DatagenTask (org.apache.kafka.connect.runtime.Worker:434) [2020-07-14 16:50:16,138] INFO [source-datagen-01|task-0] StringConverterConfig values: converter.encoding = UTF8 converter.type = key (org.apache.kafka.connect.storage.StringConverterConfig:347) [2020-07-14 16:50:16,138] INFO [source-datagen-01|task-0] Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task source-datagen-01-0 using the connector config (org.apache.kafka.connect.runtime.Worker:449) [2020-07-14 16:50:16,145] INFO [source-datagen-01|task-0] AvroConverterConfig values: bearer.auth.token = [hidden] proxy.port = -1 schema.reflection = false auto.register.schemas = true 
max.schemas.per.subject = 1000 basic.auth.credentials.source = URL value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy schema.registry.url = [http://schema-registry:8081] basic.auth.user.info = [hidden] proxy.host = schema.registry.basic.auth.user.info = [hidden] bearer.auth.credentials.source = STATIC_TOKEN key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy (io.confluent.connect.avro.AvroConverterConfig:179) [2020-07-14 16:50:16,182] INFO [source-datagen-01|task-0] KafkaAvroSerializerConfig values: bearer.auth.token = [hidden] proxy.port = -1 schema.reflection = false auto.register.schemas = true max.schemas.per.subject = 1000 basic.auth.credentials.source = URL value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy schema.registry.url = [http://schema-registry:8081] basic.auth.user.info = [hidden] proxy.host = schema.registry.basic.auth.user.info = [hidden] bearer.auth.credentials.source = STATIC_TOKEN key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy (io.confluent.kafka.serializers.KafkaAvroSerializerConfig:179) [2020-07-14 16:50:16,185] INFO [source-datagen-01|task-0] KafkaAvroDeserializerConfig values: bearer.auth.token = [hidden] proxy.port = -1 schema.reflection = false auto.register.schemas = true max.schemas.per.subject = 1000 basic.auth.credentials.source = URL specific.avro.reader = false value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy schema.registry.url = [http://schema-registry:8081] basic.auth.user.info = [hidden] proxy.host = schema.registry.basic.auth.user.info = [hidden] bearer.auth.credentials.source = STATIC_TOKEN key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy (io.confluent.kafka.serializers.KafkaAvroDeserializerConfig:179) [2020-07-14 16:50:16,217] INFO [source-datagen-01|task-0] AvroDataConfig values: connect.meta.data = true enhanced.avro.schema.support = false schemas.cache.config = 1000 (io.confluent.connect.avro.AvroDataConfig:347) [2020-07-14 16:50:16,217] INFO [source-datagen-01|task-0] Set up the value converter class io.confluent.connect.avro.AvroConverter for task source-datagen-01-0 using the worker config (org.apache.kafka.connect.runtime.Worker:453) [2020-07-14 16:50:16,217] INFO [source-datagen-01|task-0] Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task source-datagen-01-0 using the worker config (org.apache.kafka.connect.runtime.Worker:460) [2020-07-14 16:50:16,222] INFO [source-datagen-01|task-0] Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:514) [2020-07-14 16:50:16,224] INFO [source-datagen-01|task-0] ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:29092] buffer.memory = 33554432 client.dns.lookup = default client.id = connector-producer-source-datagen-01-0 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 2147483647 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer linger.ms = 0 max.block.ms = 9223372036854775807 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 2147483647 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer (org.apache.kafka.clients.producer.ProducerConfig:347) [2020-07-14 16:50:16,229] INFO [source-datagen-01|task-0] Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 16:50:16,229] INFO [source-datagen-01|task-0] Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 16:50:16,229] INFO [source-datagen-01|task-0] Kafka startTimeMs: 1594745416229 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 16:50:16,241] INFO [source-datagen-01|task-0] [Producer clientId=connector-producer-source-datagen-01-0] Cluster ID: d88xyIRzT9ahbzImZzreyA (org.apache.kafka.clients.Metadata:280) [2020-07-14 16:50:16,241] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1132) [2020-07-14 16:50:16,251] INFO [source-datagen-01|task-0] DatagenConnectorConfig values: iterations = -1 kafka.topic = ratings max.interval = 750 quickstart = ratings random.seed = null schema.filename = schema.keyfield = (io.confluent.kafka.connect.datagen.DatagenConnectorConfig:347) [2020-07-14 16:50:16,550] INFO [source-datagen-01|task-0] AvroDataConfig values: connect.meta.data = true enhanced.avro.schema.support = false schemas.cache.config = 1 (io.confluent.connect.avro.AvroDataConfig:347) [2020-07-14 16:50:16,553] INFO [source-datagen-01|task-0] WorkerSourceTask{id=source-datagen-01-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:214) [2020-07-14 16:51:16,240] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 16:51:16,241] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 16:51:16,847] ERROR [source-datagen-01|task-0] Failed to send HTTP request to endpoint: 
http://schema-registry:8081/subjects/ratings-value/versions (io.confluent.kafka.schemaregistry.client.rest.RestService:268) java.net.SocketTimeoutException: connect timed out at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) at java.net.Socket.connect(Socket.java:589) at sun.net.NetworkClient.doConnect(NetworkClient.java:175) at sun.net.www.http.HttpClient.openServer(HttpClient.java:463) at sun.net.www.http.HttpClient.openServer(HttpClient.java:558) at sun.net.www.http.HttpClient.<init>(HttpClient.java:242) at sun.net.www.http.HttpClient.New(HttpClient.java:339) at sun.net.www.http.HttpClient.New(HttpClient.java:357) at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1220) at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1156) at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1050) at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:984) at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1334) at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1309) at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:264) at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:352) at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:495) at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:486) at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:459) at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:206) at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:268) at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:244) at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:74) at io.confluent.connect.avro.AvroConverter$Serializer.serialize(AvroConverter.java:138) at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:84) at org.apache.kafka.connect.storage.Converter.fromConnectData(Converter.java:63) at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$2(WorkerSourceTask.java:295) at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128) at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162) at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104) at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(WorkerSourceTask.java:295) at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:321) at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:245) at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:184) at 
org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:234) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) [2020-07-14 16:51:16,853] INFO [source-datagen-01|task-0] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 16:51:16,854] INFO [source-datagen-01|task-0] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 16:51:16,854] ERROR [source-datagen-01|task-0] WorkerSourceTask{id=source-datagen-01-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:186) org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178) at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104) at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(WorkerSourceTask.java:295) at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:321) at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:245) at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:184) at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:234) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: org.apache.kafka.connect.errors.DataException: Failed to serialize Avro data from topic ratings : at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:87) at org.apache.kafka.connect.storage.Converter.fromConnectData(Converter.java:63) at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$2(WorkerSourceTask.java:295) at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128) at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162) ... 
11 more Caused by: org.apache.kafka.common.errors.SerializationException: Error serializing Avro message Caused by: java.net.SocketTimeoutException: connect timed out at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) at java.net.Socket.connect(Socket.java:589) at sun.net.NetworkClient.doConnect(NetworkClient.java:175) at sun.net.www.http.HttpClient.openServer(HttpClient.java:463) at sun.net.www.http.HttpClient.openServer(HttpClient.java:558) at sun.net.www.http.HttpClient.<init>(HttpClient.java:242) at sun.net.www.http.HttpClient.New(HttpClient.java:339) at sun.net.www.http.HttpClient.New(HttpClient.java:357) at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1220) at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1156) at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1050) at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:984) at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1334) at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1309) at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:264) at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:352) at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:495) at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:486) at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:459) at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:206) at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:268) at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:244) at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:74) at io.confluent.connect.avro.AvroConverter$Serializer.serialize(AvroConverter.java:138) at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:84) at org.apache.kafka.connect.storage.Converter.fromConnectData(Converter.java:63) at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$2(WorkerSourceTask.java:295) at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128) at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162) at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104) at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(WorkerSourceTask.java:295) at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:321) at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:245) at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:184) at 
org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:234) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) [2020-07-14 16:51:16,856] ERROR [source-datagen-01|task-0] WorkerSourceTask{id=source-datagen-01-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:187) [2020-07-14 16:51:16,856] INFO [source-datagen-01|task-0] [Producer clientId=connector-producer-source-datagen-01-0] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:1182) [2020-07-14 16:52:16,241] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 16:52:16,241] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 16:53:16,243] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 16:53:16,245] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 16:54:16,250] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 16:54:16,259] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 16:55:16,262] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 16:55:16,262] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 16:56:16,264] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 16:56:16,264] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 16:57:16,265] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 16:57:16,265] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 16:58:16,268] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 16:58:16,268] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for 
offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 16:59:16,268] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 16:59:16,269] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 17:00:16,270] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 17:00:16,270] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 17:01:16,272] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 17:01:16,272] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 17:02:16,274] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 17:02:16,275] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 17:03:16,275] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 17:03:16,276] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 17:04:16,278] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 17:04:16,278] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 17:05:16,277] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 17:05:16,277] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 17:06:16,278] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 17:06:16,280] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 17:07:16,294] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 17:07:16,295] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for 
[2020-07-14 17:08:33,895] INFO AbstractConfig values: (org.apache.kafka.common.config.AbstractConfig:347)
[2020-07-14 17:08:33,943] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Connector SINK_ES_RATINGS config updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1403)
[2020-07-14 17:08:34,451] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:222)
[2020-07-14 17:08:34,452] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552)
[2020-07-14 17:08:34,484] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Successfully joined group with generation 4 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:503)
[2020-07-14 17:08:34,485] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Joined group at generation 4 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-70fcb2fc-220b-4de1-96e9-f846d1e9df50', leaderUrl='http://kafka-connect-01:8083/', offset=5, connectorIds=[SINK_ES_RATINGS, source-datagen-01], taskIds=[source-datagen-01-0], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1549)
[2020-07-14 17:08:34,492] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting connectors and tasks using config offset 5 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1111)
[2020-07-14 17:08:34,509] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting connector SINK_ES_RATINGS (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1186)
[2020-07-14 17:08:34,511] INFO [SINK_ES_RATINGS|worker] ConnectorConfig values: config.action.reload = restart connector.class = io.confluent.connect.elasticsearch.ElasticsearchSinkConnector errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = SINK_ES_RATINGS tasks.max = 1 transforms = [ExtractTimestamp] value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-07-14 17:08:34,514] INFO [SINK_ES_RATINGS|worker] EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.confluent.connect.elasticsearch.ElasticsearchSinkConnector errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = SINK_ES_RATINGS tasks.max = 1 transforms = [ExtractTimestamp] transforms.ExtractTimestamp.offset.field = null transforms.ExtractTimestamp.partition.field = null transforms.ExtractTimestamp.static.field = null transforms.ExtractTimestamp.static.value = null transforms.ExtractTimestamp.timestamp.field =
RATING_TS transforms.ExtractTimestamp.topic.field = null transforms.ExtractTimestamp.type = class org.apache.kafka.connect.transforms.InsertField$Value value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347) [2020-07-14 17:08:34,514] INFO [SINK_ES_RATINGS|worker] Creating connector SINK_ES_RATINGS of type io.confluent.connect.elasticsearch.ElasticsearchSinkConnector (org.apache.kafka.connect.runtime.Worker:251) [2020-07-14 17:08:34,521] INFO [SINK_ES_RATINGS|worker] Instantiated connector SINK_ES_RATINGS with version 5.5.0 of type class io.confluent.connect.elasticsearch.ElasticsearchSinkConnector (org.apache.kafka.connect.runtime.Worker:254) [2020-07-14 17:08:34,522] INFO [SINK_ES_RATINGS|worker] ElasticsearchSinkConnectorConfig values: auto.create.indices.at.start = true batch.size = 2000 behavior.on.malformed.documents = fail behavior.on.null.values = ignore compact.map.entries = true connection.compression = false connection.password = null connection.timeout.ms = 1000 connection.url = [http://elasticsearch:9200] connection.username = null drop.invalid.message = false elastic.https.ssl.cipher.suites = null elastic.https.ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] elastic.https.ssl.endpoint.identification.algorithm = https elastic.https.ssl.key.password = null elastic.https.ssl.keymanager.algorithm = SunX509 elastic.https.ssl.keystore.location = null elastic.https.ssl.keystore.password = null elastic.https.ssl.keystore.type = JKS elastic.https.ssl.protocol = TLS elastic.https.ssl.provider = null elastic.https.ssl.secure.random.implementation = null elastic.https.ssl.trustmanager.algorithm = PKIX elastic.https.ssl.truststore.location = null elastic.https.ssl.truststore.password = null elastic.https.ssl.truststore.type = JKS elastic.security.protocol = PLAINTEXT flush.timeout.ms = 10000 key.ignore = false linger.ms = 1 max.buffered.records = 20000 max.in.flight.requests = 5 max.retries = 5 read.timeout.ms = 3000 retry.backoff.ms = 100 schema.ignore = true topic.index.map = [] topic.key.ignore = [] topic.schema.ignore = [] type.name = _doc write.method = insert (io.confluent.connect.elasticsearch.ElasticsearchSinkConnectorConfig:347) [2020-07-14 17:08:34,529] INFO [SINK_ES_RATINGS|worker] Finished creating connector SINK_ES_RATINGS (org.apache.kafka.connect.runtime.Worker:273) [2020-07-14 17:08:34,532] INFO SinkConnectorConfig values: config.action.reload = restart connector.class = io.confluent.connect.elasticsearch.ElasticsearchSinkConnector errors.deadletterqueue.context.headers.enable = false errors.deadletterqueue.topic.name = errors.deadletterqueue.topic.replication.factor = 3 errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = SINK_ES_RATINGS tasks.max = 1 topics = [ratings] topics.regex = transforms = [ExtractTimestamp] value.converter = null (org.apache.kafka.connect.runtime.SinkConnectorConfig:347) [2020-07-14 17:08:34,532] INFO EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.confluent.connect.elasticsearch.ElasticsearchSinkConnector errors.deadletterqueue.context.headers.enable = false errors.deadletterqueue.topic.name = errors.deadletterqueue.topic.replication.factor = 3 errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 
errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = SINK_ES_RATINGS tasks.max = 1 topics = [ratings] topics.regex = transforms = [ExtractTimestamp] transforms.ExtractTimestamp.offset.field = null transforms.ExtractTimestamp.partition.field = null transforms.ExtractTimestamp.static.field = null transforms.ExtractTimestamp.static.value = null transforms.ExtractTimestamp.timestamp.field = RATING_TS transforms.ExtractTimestamp.topic.field = null transforms.ExtractTimestamp.type = class org.apache.kafka.connect.transforms.InsertField$Value value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347) [2020-07-14 17:08:35,465] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Tasks [SINK_ES_RATINGS-0] configs updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1418) [2020-07-14 17:08:35,466] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1132) [2020-07-14 17:08:35,468] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Handling task config update by restarting tasks [] (org.apache.kafka.connect.runtime.distributed.DistributedHerder:581) [2020-07-14 17:08:35,468] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:222) [2020-07-14 17:08:35,468] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 17:08:35,478] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Successfully joined group with generation 5 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:503) [2020-07-14 17:08:35,478] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Joined group at generation 5 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-70fcb2fc-220b-4de1-96e9-f846d1e9df50', leaderUrl='http://kafka-connect-01:8083/', offset=7, connectorIds=[SINK_ES_RATINGS, source-datagen-01], taskIds=[SINK_ES_RATINGS-0, source-datagen-01-0], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1549) [2020-07-14 17:08:35,479] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting connectors and tasks using config offset 7 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1111) [2020-07-14 17:08:35,480] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting task SINK_ES_RATINGS-0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1146) [2020-07-14 17:08:35,481] INFO [SINK_ES_RATINGS|task-0] Creating task SINK_ES_RATINGS-0 (org.apache.kafka.connect.runtime.Worker:419) [2020-07-14 17:08:35,481] INFO [SINK_ES_RATINGS|task-0] ConnectorConfig values: config.action.reload = restart connector.class = io.confluent.connect.elasticsearch.ElasticsearchSinkConnector errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = SINK_ES_RATINGS tasks.max = 1 transforms = [ExtractTimestamp] value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig:347) [2020-07-14 17:08:35,482] INFO [SINK_ES_RATINGS|task-0] 
EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.confluent.connect.elasticsearch.ElasticsearchSinkConnector errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = SINK_ES_RATINGS tasks.max = 1 transforms = [ExtractTimestamp] transforms.ExtractTimestamp.offset.field = null transforms.ExtractTimestamp.partition.field = null transforms.ExtractTimestamp.static.field = null transforms.ExtractTimestamp.static.value = null transforms.ExtractTimestamp.timestamp.field = RATING_TS transforms.ExtractTimestamp.topic.field = null transforms.ExtractTimestamp.type = class org.apache.kafka.connect.transforms.InsertField$Value value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347) [2020-07-14 17:08:35,485] INFO [SINK_ES_RATINGS|task-0] TaskConfig values: task.class = class io.confluent.connect.elasticsearch.ElasticsearchSinkTask (org.apache.kafka.connect.runtime.TaskConfig:347) [2020-07-14 17:08:35,485] INFO [SINK_ES_RATINGS|task-0] Instantiated task SINK_ES_RATINGS-0 with version 5.5.0 of type io.confluent.connect.elasticsearch.ElasticsearchSinkTask (org.apache.kafka.connect.runtime.Worker:434) [2020-07-14 17:08:35,487] INFO [SINK_ES_RATINGS|task-0] StringConverterConfig values: converter.encoding = UTF8 converter.type = key (org.apache.kafka.connect.storage.StringConverterConfig:347) [2020-07-14 17:08:35,487] INFO [SINK_ES_RATINGS|task-0] Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task SINK_ES_RATINGS-0 using the connector config (org.apache.kafka.connect.runtime.Worker:449) [2020-07-14 17:08:35,490] INFO [SINK_ES_RATINGS|task-0] AvroConverterConfig values: bearer.auth.token = [hidden] proxy.port = -1 schema.reflection = false auto.register.schemas = true max.schemas.per.subject = 1000 basic.auth.credentials.source = URL value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy schema.registry.url = [http://schema-registry:8081] basic.auth.user.info = [hidden] proxy.host = schema.registry.basic.auth.user.info = [hidden] bearer.auth.credentials.source = STATIC_TOKEN key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy (io.confluent.connect.avro.AvroConverterConfig:179) [2020-07-14 17:08:35,492] INFO [SINK_ES_RATINGS|task-0] KafkaAvroSerializerConfig values: bearer.auth.token = [hidden] proxy.port = -1 schema.reflection = false auto.register.schemas = true max.schemas.per.subject = 1000 basic.auth.credentials.source = URL value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy schema.registry.url = [http://schema-registry:8081] basic.auth.user.info = [hidden] proxy.host = schema.registry.basic.auth.user.info = [hidden] bearer.auth.credentials.source = STATIC_TOKEN key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy (io.confluent.kafka.serializers.KafkaAvroSerializerConfig:179) [2020-07-14 17:08:35,493] INFO [SINK_ES_RATINGS|task-0] KafkaAvroDeserializerConfig values: bearer.auth.token = [hidden] proxy.port = -1 schema.reflection = false auto.register.schemas = true max.schemas.per.subject = 1000 basic.auth.credentials.source = URL specific.avro.reader = false value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy 
schema.registry.url = [http://schema-registry:8081] basic.auth.user.info = [hidden] proxy.host = schema.registry.basic.auth.user.info = [hidden] bearer.auth.credentials.source = STATIC_TOKEN key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy (io.confluent.kafka.serializers.KafkaAvroDeserializerConfig:179) [2020-07-14 17:08:35,493] INFO [SINK_ES_RATINGS|task-0] AvroDataConfig values: connect.meta.data = true enhanced.avro.schema.support = false schemas.cache.config = 1000 (io.confluent.connect.avro.AvroDataConfig:347) [2020-07-14 17:08:35,494] INFO [SINK_ES_RATINGS|task-0] Set up the value converter class io.confluent.connect.avro.AvroConverter for task SINK_ES_RATINGS-0 using the worker config (org.apache.kafka.connect.runtime.Worker:453) [2020-07-14 17:08:35,495] INFO [SINK_ES_RATINGS|task-0] Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task SINK_ES_RATINGS-0 using the worker config (org.apache.kafka.connect.runtime.Worker:460) [2020-07-14 17:08:35,497] INFO [SINK_ES_RATINGS|task-0] Initializing: org.apache.kafka.connect.runtime.TransformationChain{org.apache.kafka.connect.transforms.InsertField$Value} (org.apache.kafka.connect.runtime.Worker:529) [2020-07-14 17:08:35,497] INFO [SINK_ES_RATINGS|task-0] SinkConnectorConfig values: config.action.reload = restart connector.class = io.confluent.connect.elasticsearch.ElasticsearchSinkConnector errors.deadletterqueue.context.headers.enable = false errors.deadletterqueue.topic.name = errors.deadletterqueue.topic.replication.factor = 3 errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = SINK_ES_RATINGS tasks.max = 1 topics = [ratings] topics.regex = transforms = [ExtractTimestamp] value.converter = null (org.apache.kafka.connect.runtime.SinkConnectorConfig:347) [2020-07-14 17:08:35,498] INFO [SINK_ES_RATINGS|task-0] EnrichedConnectorConfig values: config.action.reload = restart connector.class = io.confluent.connect.elasticsearch.ElasticsearchSinkConnector errors.deadletterqueue.context.headers.enable = false errors.deadletterqueue.topic.name = errors.deadletterqueue.topic.replication.factor = 3 errors.log.enable = false errors.log.include.messages = false errors.retry.delay.max.ms = 60000 errors.retry.timeout = 0 errors.tolerance = none header.converter = null key.converter = class org.apache.kafka.connect.storage.StringConverter name = SINK_ES_RATINGS tasks.max = 1 topics = [ratings] topics.regex = transforms = [ExtractTimestamp] transforms.ExtractTimestamp.offset.field = null transforms.ExtractTimestamp.partition.field = null transforms.ExtractTimestamp.static.field = null transforms.ExtractTimestamp.static.value = null transforms.ExtractTimestamp.timestamp.field = RATING_TS transforms.ExtractTimestamp.topic.field = null transforms.ExtractTimestamp.type = class org.apache.kafka.connect.transforms.InsertField$Value value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347) [2020-07-14 17:08:35,501] INFO [SINK_ES_RATINGS|task-0] ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [kafka:29092] check.crcs = true client.dns.lookup = default client.id = connector-consumer-SINK_ES_RATINGS-0 client.rack = connections.max.idle.ms = 540000 
default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = connect-SINK_ES_RATINGS group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer (org.apache.kafka.clients.consumer.ConsumerConfig:347) [2020-07-14 17:08:35,519] INFO [SINK_ES_RATINGS|task-0] Kafka version: 5.5.0-ccs (org.apache.kafka.common.utils.AppInfoParser:117) [2020-07-14 17:08:35,519] INFO [SINK_ES_RATINGS|task-0] Kafka commitId: 606822a624024828 (org.apache.kafka.common.utils.AppInfoParser:118) [2020-07-14 17:08:35,519] INFO [SINK_ES_RATINGS|task-0] Kafka startTimeMs: 1594746515518 (org.apache.kafka.common.utils.AppInfoParser:119) [2020-07-14 17:08:35,532] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1132) [2020-07-14 17:08:35,537] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Subscribed to topic(s): ratings (org.apache.kafka.clients.consumer.KafkaConsumer:974) [2020-07-14 17:08:35,537] INFO [SINK_ES_RATINGS|task-0] Starting ElasticsearchSinkTask (io.confluent.connect.elasticsearch.ElasticsearchSinkTask:58) [2020-07-14 17:08:35,538] INFO [SINK_ES_RATINGS|task-0] ElasticsearchSinkConnectorConfig values: auto.create.indices.at.start = true batch.size = 2000 behavior.on.malformed.documents = fail behavior.on.null.values = ignore compact.map.entries = true connection.compression = false connection.password = null connection.timeout.ms = 1000 connection.url = [http://elasticsearch:9200] connection.username = null drop.invalid.message = false 
elastic.https.ssl.cipher.suites = null elastic.https.ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] elastic.https.ssl.endpoint.identification.algorithm = https elastic.https.ssl.key.password = null elastic.https.ssl.keymanager.algorithm = SunX509 elastic.https.ssl.keystore.location = null elastic.https.ssl.keystore.password = null elastic.https.ssl.keystore.type = JKS elastic.https.ssl.protocol = TLS elastic.https.ssl.provider = null elastic.https.ssl.secure.random.implementation = null elastic.https.ssl.trustmanager.algorithm = PKIX elastic.https.ssl.truststore.location = null elastic.https.ssl.truststore.password = null elastic.https.ssl.truststore.type = JKS elastic.security.protocol = PLAINTEXT flush.timeout.ms = 10000 key.ignore = false linger.ms = 1 max.buffered.records = 20000 max.in.flight.requests = 5 max.retries = 5 read.timeout.ms = 3000 retry.backoff.ms = 100 schema.ignore = true topic.index.map = [] topic.key.ignore = [] topic.schema.ignore = [] type.name = _doc write.method = insert (io.confluent.connect.elasticsearch.ElasticsearchSinkConnectorConfig:347) [2020-07-14 17:08:35,611] INFO [SINK_ES_RATINGS|task-0] ElasticsearchSinkConnectorConfig values: auto.create.indices.at.start = true batch.size = 2000 behavior.on.malformed.documents = fail behavior.on.null.values = ignore compact.map.entries = true connection.compression = false connection.password = null connection.timeout.ms = 1000 connection.url = [http://elasticsearch:9200] connection.username = null drop.invalid.message = false elastic.https.ssl.cipher.suites = null elastic.https.ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] elastic.https.ssl.endpoint.identification.algorithm = https elastic.https.ssl.key.password = null elastic.https.ssl.keymanager.algorithm = SunX509 elastic.https.ssl.keystore.location = null elastic.https.ssl.keystore.password = null elastic.https.ssl.keystore.type = JKS elastic.https.ssl.protocol = TLS elastic.https.ssl.provider = null elastic.https.ssl.secure.random.implementation = null elastic.https.ssl.trustmanager.algorithm = PKIX elastic.https.ssl.truststore.location = null elastic.https.ssl.truststore.password = null elastic.https.ssl.truststore.type = JKS elastic.security.protocol = PLAINTEXT flush.timeout.ms = 10000 key.ignore = false linger.ms = 1 max.buffered.records = 20000 max.in.flight.requests = 5 max.retries = 5 read.timeout.ms = 3000 retry.backoff.ms = 100 schema.ignore = true topic.index.map = [] topic.key.ignore = [] topic.schema.ignore = [] type.name = _doc write.method = insert (io.confluent.connect.elasticsearch.ElasticsearchSinkConnectorConfig:347) [2020-07-14 17:08:35,614] INFO [SINK_ES_RATINGS|task-0] Using unsecured connection to [http://elasticsearch:9200] (io.confluent.connect.elasticsearch.jest.JestElasticsearchClient:207) [2020-07-14 17:08:35,956] INFO [SINK_ES_RATINGS|task-0] Setting server pool to a list of 1 servers: [http://elasticsearch:9200] (io.searchbox.client.AbstractJestClient:60) [2020-07-14 17:08:35,957] INFO [SINK_ES_RATINGS|task-0] Using multi thread/connection supporting pooling connection manager (io.searchbox.client.JestClientFactory:224) [2020-07-14 17:08:36,034] INFO [SINK_ES_RATINGS|task-0] Using default GSON instance (io.searchbox.client.JestClientFactory:68) [2020-07-14 17:08:36,035] INFO [SINK_ES_RATINGS|task-0] Node Discovery disabled... (io.searchbox.client.JestClientFactory:85) [2020-07-14 17:08:36,035] INFO [SINK_ES_RATINGS|task-0] Idle connection reaping disabled... 
(io.searchbox.client.JestClientFactory:97)
[2020-07-14 17:08:36,185] INFO [SINK_ES_RATINGS|task-0] Detected Elasticsearch version is ES_V7 (io.confluent.connect.elasticsearch.jest.JestElasticsearchClient:291)
[2020-07-14 17:08:36,188] INFO [SINK_ES_RATINGS|task-0] JsonConverterConfig values: converter.type = value decimal.format = BASE64 schemas.cache.size = 1000 schemas.enable = false (org.apache.kafka.connect.json.JsonConverterConfig:347)
[2020-07-14 17:08:36,196] INFO [SINK_ES_RATINGS|task-0] Started ElasticsearchSinkTask, will ignore records with null values ('behavior.on.null.values') (io.confluent.connect.elasticsearch.ElasticsearchSinkTask:146)
[2020-07-14 17:08:36,197] INFO [SINK_ES_RATINGS|task-0] WorkerSinkTask{id=SINK_ES_RATINGS-0} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:306)
[2020-07-14 17:08:36,281] WARN [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Error while fetching metadata with correlation id 2 : {ratings=LEADER_NOT_AVAILABLE} (org.apache.kafka.clients.NetworkClient:1077)
[2020-07-14 17:08:36,282] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Cluster ID: d88xyIRzT9ahbzImZzreyA (org.apache.kafka.clients.Metadata:280)
[2020-07-14 17:08:36,283] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797)
[2020-07-14 17:08:36,284] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552)
[2020-07-14 17:08:36,296] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Join group failed with org.apache.kafka.common.errors.MemberIdRequiredException: The group member needs to have a valid member id before actually entering a consumer group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:455)
[2020-07-14 17:08:36,296] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552)
[2020-07-14 17:08:36,386] WARN [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Error while fetching metadata with correlation id 7 : {ratings=LEADER_NOT_AVAILABLE} (org.apache.kafka.clients.NetworkClient:1077)
[2020-07-14 17:08:36,497] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Finished assignment for group at generation 1: {connector-consumer-SINK_ES_RATINGS-0-1f7cc89c-ad6e-4887-ac37-259f09845966=Assignment(partitions=[ratings-0])} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:604)
[2020-07-14 17:08:36,505] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Successfully joined group with generation 1 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:503)
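The two LEADER_NOT_AVAILABLE warnings are normally transient: they mean the ratings topic has only just been created (or is still being auto-created) and no partition leader was available yet for that first metadata fetch, and the consumer goes on to join its group and take ratings-0 immediately afterwards. If the warning were to persist, the topic can be inspected from the broker; a sketch that assumes the broker container is named kafka (kafka:29092 is the bootstrap address the worker itself is using):

docker exec kafka kafka-topics --bootstrap-server kafka:29092 --describe --topic ratings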
[2020-07-14 17:08:36,506] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Adding newly assigned partitions: ratings-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:273)
[2020-07-14 17:08:36,525] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Found no committed offset for partition ratings-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1299)
[2020-07-14 17:08:36,530] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Resetting offset for partition ratings-0 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState:383)
[2020-07-14 17:08:36,535] INFO [SINK_ES_RATINGS|task-0] Index 'ratings' not found in local cache; checking for existence (io.confluent.connect.elasticsearch.jest.JestElasticsearchClient:319)
[2020-07-14 17:08:36,544] INFO [SINK_ES_RATINGS|task-0] Index 'ratings' not found in Elasticsearch. Error message: 404 Not Found (io.confluent.connect.elasticsearch.jest.JestElasticsearchClient:327)
[2020-07-14 17:08:36,548] INFO [SINK_ES_RATINGS|task-0] Requesting Elasticsearch create index 'ratings' (io.confluent.connect.elasticsearch.jest.JestElasticsearchClient:369)
[2020-07-14 17:08:36,677] INFO [SINK_ES_RATINGS|task-0] Index 'ratings' created in Elasticsearch; adding to local cache (io.confluent.connect.elasticsearch.jest.JestElasticsearchClient:381)
[2020-07-14 17:09:16,287] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424)
[2020-07-14 17:09:16,288] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441)
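The configuration echoed through the ConnectorConfig, SinkConnectorConfig and ElasticsearchSinkConnectorConfig dumps above corresponds to a sink definition roughly like the one below. The request that actually created SINK_ES_RATINGS is not captured in this log, and it is not visible which of the echoed values were set explicitly and which are connector defaults, so treat this as a reconstruction that only reproduces the settings describing this pipeline; it assumes the worker's REST port 8083 is reachable on localhost.

curl -s -X PUT http://localhost:8083/connectors/SINK_ES_RATINGS/config \
     -H "Content-Type: application/json" \
     -d '{
           "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
           "topics": "ratings",
           "connection.url": "http://elasticsearch:9200",
           "type.name": "_doc",
           "key.ignore": "false",
           "schema.ignore": "true",
           "behavior.on.null.values": "ignore",
           "key.converter": "org.apache.kafka.connect.storage.StringConverter",
           "transforms": "ExtractTimestamp",
           "transforms.ExtractTimestamp.type": "org.apache.kafka.connect.transforms.InsertField$Value",
           "transforms.ExtractTimestamp.timestamp.field": "RATING_TS"
         }'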
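Because auto.create.indices.at.start is true, the task creates the ratings index as soon as it starts, before any records need to have arrived. The index can be checked directly against Elasticsearch; connection.url is http://elasticsearch:9200 inside the Docker network, so running this from the host assumes port 9200 is published as localhost:9200.

curl -s "http://localhost:9200/_cat/indices/ratings?v"
curl -s "http://localhost:9200/ratings/_count"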
[2020-07-14 17:26:45,995] INFO AbstractConfig values: (org.apache.kafka.common.config.AbstractConfig:347)
[2020-07-14 17:26:46,061] INFO AbstractConfig values: (org.apache.kafka.common.config.AbstractConfig:347)
[2020-07-14 17:26:46,070] INFO AbstractConfig values: (org.apache.kafka.common.config.AbstractConfig:347)
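With both connectors deployed and the log otherwise quiet, the Connect REST interface on port 8083 is the quickest way to confirm that the connectors and their tasks are in the RUNNING state (again assuming the port is published to the host):

curl -s http://localhost:8083/connectors/SINK_ES_RATINGS/status
curl -s http://localhost:8083/connectors/source-datagen-01/status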
[2020-07-14 17:50:12,300] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Session key updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1447)
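Apart from the session key rotation above, the log settles back into the once-a-minute offset commits. The sink's consumer group is connect-SINK_ES_RATINGS (visible in the consumer client IDs earlier), so whether it is keeping up with the ratings topic can be checked from the broker; this again assumes the broker container is named kafka:

docker exec kafka kafka-consumer-groups --bootstrap-server kafka:29092 --describe --group connect-SINK_ES_RATINGS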
18:25:16,461] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:25:16,462] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:26:16,463] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:26:16,463] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:27:16,466] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:27:16,469] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:28:16,470] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:28:16,471] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:29:16,473] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:29:16,475] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:30:16,476] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:30:16,477] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:31:16,478] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:31:16,479] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:32:16,480] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:32:16,481] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:33:16,476] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:33:16,476] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 
18:34:16,465] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:34:16,465] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:35:16,466] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:35:16,466] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:36:16,467] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:36:16,467] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:37:16,470] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:37:16,470] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:38:16,471] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:38:16,472] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:39:16,473] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:39:16,473] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:40:16,477] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:40:16,478] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:41:16,480] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:41:16,480] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:42:16,482] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:42:16,482] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 
18:43:16,484] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:43:16,485] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:44:16,487] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:44:16,489] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:45:16,495] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:45:16,495] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:46:16,496] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:46:16,497] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:47:16,499] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:47:16,500] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:48:16,499] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:48:16,499] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:49:16,502] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:49:16,502] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:50:12,864] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Session key updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1447) [2020-07-14 18:50:16,505] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:50:16,506] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:51:16,507] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:51:16,508] INFO 
[source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:52:16,511] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:52:16,513] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:53:16,515] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:53:16,516] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:54:16,518] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:54:16,519] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:55:16,520] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:55:16,520] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:56:16,523] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:56:16,523] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:57:16,527] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:57:16,527] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:58:16,530] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:58:16,533] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 18:59:16,534] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 18:59:16,535] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:00:16,533] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:00:16,533] INFO 
[source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:01:16,533] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:01:16,533] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:02:16,535] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:02:16,536] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:03:16,541] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:03:16,543] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:03:39,658] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 19:03:39,652] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 19:03:39,768] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 19:03:39,873] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 19:03:43,165] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Attempt to heartbeat failed for since member id connector-consumer-SINK_ES_RATINGS-0-1f7cc89c-ad6e-4887-ac37-259f09845966 is not valid. 
(org.apache.kafka.clients.consumer.internals.AbstractCoordinator:1065) [2020-07-14 19:03:43,172] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Giving away all assigned partitions as lost since generation has been reset,indicating that consumer is no longer part of the group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:669) [2020-07-14 19:03:43,174] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Lost previously assigned partitions ratings-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:311) [2020-07-14 19:03:43,183] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 19:03:43,215] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Join group failed with org.apache.kafka.common.errors.MemberIdRequiredException: The group member needs to have a valid member id before actually entering a consumer group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:455) [2020-07-14 19:03:43,216] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 19:03:43,418] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Finished assignment for group at generation 3: {connector-consumer-SINK_ES_RATINGS-0-ac22a98b-8a84-40cf-92d8-63681b849864=Assignment(partitions=[ratings-0])} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:604) [2020-07-14 19:03:43,435] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Successfully joined group with generation 3 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:503) [2020-07-14 19:03:43,437] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Adding newly assigned partitions: ratings-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:273) [2020-07-14 19:03:43,444] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Found no committed offset for partition ratings-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1299) [2020-07-14 19:03:43,454] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Resetting offset for partition ratings-0 to offset 0. 
(org.apache.kafka.clients.consumer.internals.SubscriptionState:383) [2020-07-14 19:19:19,000] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 19:19:19,002] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 19:19:19,200] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 19:19:19,235] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Member connect-1-70fcb2fc-220b-4de1-96e9-f846d1e9df50 sending LeaveGroup request to coordinator kafka:29092 (id: 2147483646 rack: null) due to consumer poll timeout has expired. This means the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time processing messages. You can address this either by increasing max.poll.interval.ms or by reducing the maximum size of batches returned in poll() with max.poll.records. (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:979) [2020-07-14 19:19:19,243] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 19:19:19,255] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:222) [2020-07-14 19:19:19,256] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 19:19:19,371] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Join group failed with org.apache.kafka.common.errors.MemberIdRequiredException: The group member needs to have a valid member id before actually entering a consumer group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:455) [2020-07-14 19:19:19,371] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 19:19:19,489] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Successfully joined group with generation 7 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:503) [2020-07-14 19:19:19,491] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Joined group at generation 7 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-f334387f-4bf8-44ee-9a96-2fffe9a75c41', leaderUrl='http://kafka-connect-01:8083/', offset=9, connectorIds=[SINK_ES_RATINGS, source-datagen-01], taskIds=[SINK_ES_RATINGS-0, source-datagen-01-0], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1549) [2020-07-14 19:19:19,495] WARN [Worker clientId=connect-1, groupId=kafka-connect-01] Catching up to assignment's config 
offset. (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1020) [2020-07-14 19:19:19,497] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Current config state offset 7 is behind group assignment 9, reading to end of config log (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1081) [2020-07-14 19:19:19,721] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Finished reading to end of log and updated config snapshot, new config log offset: 9 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1085) [2020-07-14 19:19:19,721] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting connectors and tasks using config offset 9 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1111) [2020-07-14 19:19:19,722] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1132) [2020-07-14 19:19:38,143] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:19:38,144] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:20:38,087] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:20:38,087] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:21:38,022] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:21:38,022] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:22:37,956] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:22:37,956] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:23:37,888] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:23:37,889] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:24:37,821] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:24:37,821] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:25:37,754] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:25:37,754] INFO 
[source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:26:37,687] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:26:37,687] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:27:37,620] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:27:37,621] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:28:37,554] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:28:37,555] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:29:37,488] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:29:37,488] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:30:37,421] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:30:37,422] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:31:37,356] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:31:37,356] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:32:37,290] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:32:37,290] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:33:37,222] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:33:37,223] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:34:37,157] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:34:37,157] INFO 
[source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:35:37,089] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:35:37,090] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:36:37,023] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:36:37,025] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:37:36,958] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:37:36,958] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:38:36,891] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:38:36,892] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:39:36,824] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:39:36,824] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:40:36,759] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:40:36,760] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:41:36,692] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:41:36,692] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:42:36,628] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:42:36,629] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:43:36,561] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:43:36,561] INFO 
[source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:44:36,494] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:44:36,495] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:45:36,426] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:45:36,427] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:46:36,360] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:46:36,361] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:47:36,295] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:47:36,296] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:48:36,224] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:48:36,224] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 19:49:09,137] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 19:49:09,153] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 19:49:09,160] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 19:49:09,274] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 19:49:09,567] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 19:49:09,611] INFO [SINK_ES_RATINGS|task-0] [Consumer 
clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 19:49:45,661] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 19:49:45,663] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:20:23,582] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 20:20:23,614] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 20:20:23,617] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 20:20:23,733] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 20:20:23,749] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Member connect-1-f334387f-4bf8-44ee-9a96-2fffe9a75c41 sending LeaveGroup request to coordinator kafka:29092 (id: 2147483646 rack: null) due to consumer poll timeout has expired. This means the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time processing messages. You can address this either by increasing max.poll.interval.ms or by reducing the maximum size of batches returned in poll() with max.poll.records. (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:979) [2020-07-14 20:20:23,756] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:222) [2020-07-14 20:20:23,757] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 20:20:23,846] INFO [Consumer clientId=consumer-kafka-connect-01-3, groupId=kafka-connect-01] Error sending fetch request (sessionId=1062271281, epoch=19591) to node 1: {}. (org.apache.kafka.clients.FetchSessionHandler:481) org.apache.kafka.common.errors.DisconnectException [2020-07-14 20:20:23,913] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 20:20:23,849] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Error sending fetch request (sessionId=573350713, epoch=17386) to node 1: {}. 
(org.apache.kafka.clients.FetchSessionHandler:481) org.apache.kafka.common.errors.DisconnectException [2020-07-14 20:20:24,072] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Join group failed with org.apache.kafka.common.errors.MemberIdRequiredException: The group member needs to have a valid member id before actually entering a consumer group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:455) [2020-07-14 20:20:24,075] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 20:20:24,096] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 20:20:24,321] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Successfully joined group with generation 9 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:503) [2020-07-14 20:20:24,326] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Joined group at generation 9 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-0dde1abb-5b61-4588-bdc4-1e9483f7bc7f', leaderUrl='http://kafka-connect-01:8083/', offset=9, connectorIds=[SINK_ES_RATINGS, source-datagen-01], taskIds=[SINK_ES_RATINGS-0, source-datagen-01-0], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1549) [2020-07-14 20:20:24,334] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting connectors and tasks using config offset 9 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1111) [2020-07-14 20:20:24,342] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1132) [2020-07-14 20:20:24,496] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Session key updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1447) [2020-07-14 20:21:00,089] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:21:00,092] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:22:00,035] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:22:00,040] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:22:59,972] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:22:59,974] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:28:40,028] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Group coordinator 
kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 20:28:40,048] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 20:28:40,053] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 20:28:40,053] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Group coordinator kafka:29092 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:849) [2020-07-14 20:28:40,156] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 20:28:40,173] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Member connect-1-0dde1abb-5b61-4588-bdc4-1e9483f7bc7f sending LeaveGroup request to coordinator kafka:29092 (id: 2147483646 rack: null) due to consumer poll timeout has expired. This means the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time processing messages. You can address this either by increasing max.poll.interval.ms or by reducing the maximum size of batches returned in poll() with max.poll.records. (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:979) [2020-07-14 20:28:40,180] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:222) [2020-07-14 20:28:40,181] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 20:28:40,189] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Join group failed with org.apache.kafka.common.errors.MemberIdRequiredException: The group member needs to have a valid member id before actually entering a consumer group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:455) [2020-07-14 20:28:40,189] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:552) [2020-07-14 20:28:40,308] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Successfully joined group with generation 11 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:503) [2020-07-14 20:28:40,309] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Joined group at generation 11 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-4b0aba2c-4a5c-48dd-951d-ba216981c754', leaderUrl='http://kafka-connect-01:8083/', offset=10, connectorIds=[SINK_ES_RATINGS, source-datagen-01], taskIds=[SINK_ES_RATINGS-0, source-datagen-01-0], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1549) [2020-07-14 20:28:40,309] WARN [Worker clientId=connect-1, groupId=kafka-connect-01] Catching up to assignment's config offset. 
(org.apache.kafka.connect.runtime.distributed.DistributedHerder:1020) [2020-07-14 20:28:40,309] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Current config state offset 9 is behind group assignment 10, reading to end of config log (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1081) [2020-07-14 20:28:40,538] INFO [SINK_ES_RATINGS|task-0] [Consumer clientId=connector-consumer-SINK_ES_RATINGS-0, groupId=connect-SINK_ES_RATINGS] Discovered group coordinator kafka:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:797) [2020-07-14 20:28:40,682] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Finished reading to end of log and updated config snapshot, new config log offset: 10 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1085) [2020-07-14 20:28:40,682] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Starting connectors and tasks using config offset 10 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1111) [2020-07-14 20:28:40,683] INFO [Worker clientId=connect-1, groupId=kafka-connect-01] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1132) [2020-07-14 20:29:31,125] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:29:31,126] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:30:31,057] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:30:31,058] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:31:30,991] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:31:30,992] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:32:30,923] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:32:30,923] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:33:30,854] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:33:30,854] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:34:30,786] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:34:30,787] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset 
commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:34:33,930] INFO AbstractConfig values: (org.apache.kafka.common.config.AbstractConfig:347) [2020-07-14 20:34:33,948] INFO AbstractConfig values: (org.apache.kafka.common.config.AbstractConfig:347) [2020-07-14 20:34:33,960] INFO AbstractConfig values: (org.apache.kafka.common.config.AbstractConfig:347) [2020-07-14 20:35:30,718] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:35:30,718] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:36:30,649] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:36:30,650] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:37:30,581] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:37:30,582] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:38:30,514] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:38:30,515] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:39:30,448] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:39:30,449] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:40:30,380] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:40:30,381] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:41:30,315] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:41:30,316] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:42:30,248] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:42:30,248] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 
20:43:30,180] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:43:30,181] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:44:30,113] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:44:30,113] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:45:30,044] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:45:30,045] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:46:29,977] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:46:29,978] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:47:29,911] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:47:29,912] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:48:29,844] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:48:29,844] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:49:29,777] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:49:29,777] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:50:29,708] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:50:29,709] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:51:29,641] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:51:29,641] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 
20:52:29,574] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:52:29,574] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:53:29,506] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:53:29,507] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:54:29,438] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:54:29,439] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:55:29,371] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:55:29,371] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:56:29,304] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:56:29,305] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:57:29,238] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:57:29,238] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:58:29,170] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:58:29,170] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441) [2020-07-14 20:59:29,104] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:424) [2020-07-14 20:59:29,104] INFO [source-datagen-01|task-0|offsets] WorkerSourceTask{id=source-datagen-01-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:441)
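A note on the 19:03:43 entries above: after the SINK_ES_RATINGS consumer lost and rejoined its group, it found no committed offset for ratings-0 and reset to offset 0, i.e. the sink starts reading the topic again from the beginning. To see where that consumer group is sitting (and its lag), the standard kafka-consumer-groups tool can be used; this is a minimal sketch, assuming it is run somewhere that can resolve the broker address kafka:29092 shown in the log, for example by exec'ing into the broker container (the container name 'kafka' is an assumption).

# Describe the sink connector's consumer group (group id taken from the log above);
# 'kafka' as the docker-compose service/container name is an assumption.
docker exec -it kafka kafka-consumer-groups \
  --bootstrap-server kafka:29092 \
  --group connect-SINK_ES_RATINGS \
  --describe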
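The worker also goes through several rebalances in this stretch (generations 7, 9 and 11), each time logging "Starting connectors and tasks using config offset ..." followed by "Finished starting connectors and tasks". To confirm that both SINK_ES_RATINGS and source-datagen-01 actually came back in the RUNNING state after one of these rebalances, the worker's REST interface can be queried. A minimal sketch, assuming port 8083 is reachable from wherever curl is run (the assignment's leaderUrl points at http://kafka-connect-01:8083/) and that jq is available:

# List all connectors with their connector and task states in one call
# (?expand=status is available from Kafka 2.3 / Confluent Platform 5.3 onwards).
curl -s "http://localhost:8083/connectors?expand=status" | \
  jq '.[] | {name: .status.name, connector: .status.connector.state, tasks: [.status.tasks[].state]}'

# Or check a single connector:
curl -s "http://localhost:8083/connectors/SINK_ES_RATINGS/status" | jq '.'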
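Finally, the repeated "consumer poll timeout has expired" LeaveGroup messages (19:19:19, 20:20:23, 20:28:40) come from the worker's own group membership timing out between polls; in a single-node demo like this that is often simply the container or host having been paused, rather than genuinely slow processing. If a connector's consumer (rather than the worker) were the one hitting the timeout, the remedy the log itself suggests, raising max.poll.interval.ms or lowering max.poll.records, can be applied per connector with consumer.override.* settings, provided the worker permits client overrides (connector.client.config.override.policy=All, i.e. CONNECT_CONNECTOR_CLIENT_CONFIG_OVERRIDE_POLICY=All on these Docker images). The following is a sketch of that approach, not a drop-in fix; the override values are illustrative.

# Inspect the sink connector's current configuration first:
curl -s http://localhost:8083/connectors/SINK_ES_RATINGS/config | jq '.'

# ...then PUT the full config back with the overrides added, for example:
#   "consumer.override.max.poll.interval.ms": "600000",
#   "consumer.override.max.poll.records": "100"
# A PUT to /connectors/<name>/config replaces the whole config, so keep every
# existing key and only add the two override entries.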