Excessive Kafka client logging in WebSphere Application Server with OpenTelemetry Java Agent #13833

Closed
tinnapat opened this issue May 9, 2025 · 4 comments
Labels: needs author feedback (Waiting for additional feedback from the author), stale

Comments


tinnapat commented May 9, 2025

Description

After instrumenting our application running on WebSphere Application Server with the OpenTelemetry Java Agent, we're experiencing excessive logging related to Kafka clients. These logs are generated continuously (multiple times per second), quickly filling up disk space and overwhelming other important log information in the SystemOut.log file.

The logs appear to be coming from the Kafka client instrumentation, showing repeated consumer creation, configuration, and closing cycles. This is causing:

  1. Rapid disk space consumption
  2. Difficulty in finding relevant application logs
  3. Potential performance impact due to excessive I/O operations

Environment

  • Application Server: IBM WebSphere Application Server 9.0.5.17
  • OpenTelemetry Java Agent Version: 2.15.0
  • Java Version: 1.8.0_381
  • OS: Red Hat Linux 8

Log Sample

[5/2/25 15:05:25:720 ICT] 0000021b SystemOut     O  WARN | [Consumer clientId=consumer-decision_monitoring-112429, groupId=decision_monitoring] Error while fetching metadata with correlation id 72 : {decision_monitoring=UNKNOWN_TOPIC_OR_PARTITION}
[5/2/25 15:05:25:759 ICT] 000002ee SystemOut     O  WARN | The configuration 'auto.commit.interval.ms' was supplied but isn't a known config.
[5/2/25 15:05:25:760 ICT] 000002ee SystemOut     O  INFO | Kafka version: 3.1.0
[5/2/25 15:05:25:760 ICT] 000002ee SystemOut     O  INFO | Kafka commitId: 37edeed0777bacb3
[5/2/25 15:05:25:760 ICT] 000002ee SystemOut     O  INFO | Kafka startTimeMs: 1746173125759
[5/2/25 15:05:25:760 ICT] 000002ee SystemOut     O  INFO | [Consumer clientId=consumer-null-112607, groupId=null] Subscribed to partition(s): PYBIBATCHINDEXPROCESSOR-3
[5/2/25 15:05:25:760 ICT] 000002ee SystemOut     O  INFO | [Consumer clientId=consumer-null-112607, groupId=null] Seeking to offset 0 for partition PYBIBATCHINDEXPROCESSOR-3
[5/2/25 15:05:25:760 ICT] 000002e0 SystemOut     O  INFO | Metrics scheduler closed
[5/2/25 15:05:25:760 ICT] 000002e0 SystemOut     O  INFO | Closing reporter io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter
[5/2/25 15:05:25:760 ICT] 000002e0 SystemOut     O  INFO | Closing reporter org.apache.kafka.common.metrics.JmxReporter
[5/2/25 15:05:25:761 ICT] 000002e0 SystemOut     O  INFO | Metrics reporters closed
[5/2/25 15:05:25:762 ICT] 000002da SystemOut     O  INFO | Metrics scheduler closed
[5/2/25 15:05:25:762 ICT] 000002da SystemOut     O  INFO | Closing reporter io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter
[5/2/25 15:05:25:762 ICT] 000002da SystemOut     O  INFO | Closing reporter org.apache.kafka.common.metrics.JmxReporter
[5/2/25 15:05:25:762 ICT] 000002da SystemOut     O  INFO | Metrics reporters closed
[5/2/25 15:05:25:763 ICT] 000002da SystemOut     O  INFO | App info kafka.consumer for consumer-null-112583 unregistered
[5/2/25 15:05:25:764 ICT] 000002da SystemOut     O  INFO | ConsumerConfig values:
        allow.auto.create.topics = true
        auto.commit.interval.ms = 1000
        auto.offset.reset = earliest
        bootstrap.servers = [10.225.100.77:9092, 10.225.100.76:9092]
        check.crcs = true
        client.dns.lookup = use_all_dns_ips
        client.id = consumer-null-112608
        client.rack =
        connections.max.idle.ms = 540000
        default.api.timeout.ms = 60000
        enable.auto.commit = false
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = null
        group.instance.id = null
        heartbeat.interval.ms = 3000
        interceptor.classes = []
        internal.leave.group.on.close = true
        internal.throw.on.fetch.stable.offset.unsupported = false
        isolation.level = read_uncommitted
        key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = [class io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter]
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
        receive.buffer.bytes = 65536
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.connect.timeout.ms = null
        sasl.login.read.timeout.ms = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.login.retry.backoff.max.ms = 10000
        sasl.login.retry.backoff.ms = 100
        sasl.mechanism = GSSAPI
        sasl.oauthbearer.clock.skew.seconds = 30
        sasl.oauthbearer.expected.audience = null
        sasl.oauthbearer.expected.issuer = null
        sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
        sasl.oauthbearer.jwks.endpoint.url = null
        sasl.oauthbearer.scope.claim.name = scope
        sasl.oauthbearer.sub.claim.name = sub
        sasl.oauthbearer.token.endpoint.url = null
        security.protocol = PLAINTEXT
        security.providers = null
        send.buffer.bytes = 131072
        session.timeout.ms = 30000
        socket.connection.setup.timeout.max.ms = 30000
        socket.connection.setup.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2]
        ssl.endpoint.identification.algorithm = https
        ssl.engine.factory.class = null
        ssl.key.password = null
        ssl.keymanager.algorithm = IbmX509
        ssl.keystore.certificate.chain = null
        ssl.keystore.key = null
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLSv1.2
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.certificates = null
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer

[5/2/25 15:05:25:765 ICT] 000002da SystemOut     O  WARN | The configuration 'auto.commit.interval.ms' was supplied but isn't a known config.
[5/2/25 15:05:25:766 ICT] 000002da SystemOut     O  INFO | Kafka version: 3.1.0
[5/2/25 15:05:25:766 ICT] 000002da SystemOut     O  INFO | Kafka commitId: 37edeed0777bacb3
[5/2/25 15:05:25:766 ICT] 000002da SystemOut     O  INFO | Kafka startTimeMs: 1746173125765
[5/2/25 15:05:25:766 ICT] 000002da SystemOut     O  INFO | [Consumer clientId=consumer-null-112608, groupId=null] Subscribed to partition(s): PXINTERACTIONAGGREGATOR-4
[5/2/25 15:05:25:766 ICT] 000002da SystemOut     O  INFO | [Consumer clientId=consumer-null-112608, groupId=null] Seeking to offset 0 for partition PXINTERACTIONAGGREGATOR-4
[5/2/25 15:05:25:771 ICT] 000002e0 SystemOut     O  INFO | App info kafka.consumer for consumer-null-112581 unregistered
[5/2/25 15:05:25:772 ICT] 000002e0 SystemOut     O  INFO | ConsumerConfig values:
        allow.auto.create.topics = true
        auto.commit.interval.ms = 1000
        auto.offset.reset = earliest
        bootstrap.servers = [10.225.100.77:9092, 10.225.100.76:9092]
        check.crcs = true
        client.dns.lookup = use_all_dns_ips
        client.id = consumer-null-112609
        client.rack =
        connections.max.idle.ms = 540000
        default.api.timeout.ms = 60000
        enable.auto.commit = false
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = null
        group.instance.id = null
        heartbeat.interval.ms = 3000
        interceptor.classes = []
        internal.leave.group.on.close = true
        internal.throw.on.fetch.stable.offset.unsupported = false
        isolation.level = read_uncommitted
        key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = [class io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter]
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
        receive.buffer.bytes = 65536
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.connect.timeout.ms = null
        sasl.login.read.timeout.ms = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.login.retry.backoff.max.ms = 10000
        sasl.login.retry.backoff.ms = 100
        sasl.mechanism = GSSAPI
        sasl.oauthbearer.clock.skew.seconds = 30
        sasl.oauthbearer.expected.audience = null
        sasl.oauthbearer.expected.issuer = null
        sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
        sasl.oauthbearer.jwks.endpoint.url = null
        sasl.oauthbearer.scope.claim.name = scope
        sasl.oauthbearer.sub.claim.name = sub
        sasl.oauthbearer.token.endpoint.url = null
        security.protocol = PLAINTEXT
        security.providers = null
        send.buffer.bytes = 131072
        session.timeout.ms = 30000
        socket.connection.setup.timeout.max.ms = 30000
        socket.connection.setup.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2]
        ssl.endpoint.identification.algorithm = https
        ssl.engine.factory.class = null
        ssl.key.password = null
        ssl.keymanager.algorithm = IbmX509
        ssl.keystore.certificate.chain = null
        ssl.keystore.key = null
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLSv1.2
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.certificates = null
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer

[5/2/25 15:05:25:774 ICT] 000002e0 SystemOut     O  WARN | The configuration 'auto.commit.interval.ms' was supplied but isn't a known config.
[5/2/25 15:05:25:774 ICT] 000002e0 SystemOut     O  INFO | Kafka version: 3.1.0
[5/2/25 15:05:25:774 ICT] 000002e0 SystemOut     O  INFO | Kafka commitId: 37edeed0777bacb3
[5/2/25 15:05:25:774 ICT] 000002e0 SystemOut     O  INFO | Kafka startTimeMs: 1746173125774
[5/2/25 15:05:25:775 ICT] 000002e0 SystemOut     O  INFO | [Consumer clientId=consumer-null-112609, groupId=null] Subscribed to partition(s): PYSASBATCHINDEXCLASSESPROCESSOR-1, PYSASBATCHINDEXCLASSESPROCESSOR-5
[5/2/25 15:05:25:775 ICT] 000002e0 SystemOut     O  INFO | [Consumer clientId=consumer-null-112609, groupId=null] Seeking to offset 0 for partition PYSASBATCHINDEXCLASSESPROCESSOR-1
[5/2/25 15:05:25:775 ICT] 000002e0 SystemOut     O  INFO | [Consumer clientId=consumer-null-112609, groupId=null] Seeking to offset 0 for partition PYSASBATCHINDEXCLASSESPROCESSOR-5
[5/2/25 15:05:25:804 ICT] 000002ee SystemOut     O  INFO | [Consumer clientId=consumer-null-112607, groupId=null] Cluster ID: fb3aH165RHylsZv94gcO7Q
[5/2/25 15:05:25:805 ICT] 000002e0 SystemOut     O  INFO | [Consumer clientId=consumer-null-112609, groupId=null] Cluster ID: fb3aH165RHylsZv94gcO7Q
[5/2/25 15:05:25:819 ICT] 000002e3 SystemOut     O  INFO | Metrics scheduler closed
[5/2/25 15:05:25:820 ICT] 000002e3 SystemOut     O  INFO | Closing reporter io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter
[5/2/25 15:05:25:820 ICT] 000002e3 SystemOut     O  INFO | Closing reporter org.apache.kafka.common.metrics.JmxReporter
[5/2/25 15:05:25:820 ICT] 000002e3 SystemOut     O  INFO | Metrics reporters closed
[5/2/25 15:05:25:821 ICT] 000002e3 SystemOut     O  INFO | App info kafka.consumer for consumer-null-112585 unregistered
[5/2/25 15:05:25:822 ICT] 000002e3 SystemOut     O  INFO | ConsumerConfig values:
        allow.auto.create.topics = true
        auto.commit.interval.ms = 1000
        auto.offset.reset = earliest
        bootstrap.servers = [10.225.100.77:9092, 10.225.100.76:9092]
        check.crcs = true
        client.dns.lookup = use_all_dns_ips
        client.id = consumer-null-112610
        client.rack =
        connections.max.idle.ms = 540000
        default.api.timeout.ms = 60000
        enable.auto.commit = false
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = null
        group.instance.id = null
        heartbeat.interval.ms = 3000
        interceptor.classes = []
        internal.leave.group.on.close = true
        internal.throw.on.fetch.stable.offset.unsupported = false
        isolation.level = read_uncommitted
        key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = [class io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter]
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
        receive.buffer.bytes = 65536
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.connect.timeout.ms = null
        sasl.login.read.timeout.ms = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.login.retry.backoff.max.ms = 10000
        sasl.login.retry.backoff.ms = 100
        sasl.mechanism = GSSAPI
        sasl.oauthbearer.clock.skew.seconds = 30
        sasl.oauthbearer.expected.audience = null
        sasl.oauthbearer.expected.issuer = null
        sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
        sasl.oauthbearer.jwks.endpoint.url = null
        sasl.oauthbearer.scope.claim.name = scope
        sasl.oauthbearer.sub.claim.name = sub
        sasl.oauthbearer.token.endpoint.url = null
        security.protocol = PLAINTEXT
        security.providers = null
        send.buffer.bytes = 131072
        session.timeout.ms = 30000
        socket.connection.setup.timeout.max.ms = 30000
        socket.connection.setup.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2]
        ssl.endpoint.identification.algorithm = https
        ssl.engine.factory.class = null
        ssl.key.password = null
        ssl.keymanager.algorithm = IbmX509
        ssl.keystore.certificate.chain = null
        ssl.keystore.key = null
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLSv1.2
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.certificates = null
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer

[5/2/25 15:05:25:824 ICT] 000002e3 SystemOut     O  WARN | The configuration 'auto.commit.interval.ms' was supplied but isn't a known config.
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | Kafka version: 3.1.0
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | Kafka commitId: 37edeed0777bacb3
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | Kafka startTimeMs: 1746173125824
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | [Consumer clientId=consumer-null-112610, groupId=null] Subscribed to partition(s): PYNLPREPORTINGDATAPROCESSOR-3, PYNLPREPORTINGDATAPROCESSOR-5
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | [Consumer clientId=consumer-null-112610, groupId=null] Seeking to offset 0 for partition PYNLPREPORTINGDATAPROCESSOR-3
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | [Consumer clientId=consumer-null-112610, groupId=null] Seeking to offset 0 for partition PYNLPREPORTINGDATAPROCESSOR-5
[5/2/25 15:05:25:827 ICT] 000002e1 SystemOut     O  INFO | Metrics scheduler closed
[5/2/25 15:05:25:827 ICT] 000002e1 SystemOut     O  INFO | Closing reporter io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter
[5/2/25 15:05:25:827 ICT] 000002e1 SystemOut     O  INFO | Closing reporter org.apache.kafka.common.metrics.JmxReporter
[5/2/25 15:05:25:827 ICT] 000002e1 SystemOut     O  INFO | Metrics reporters closed
[5/2/25 15:05:25:829 ICT] 000002e1 SystemOut     O  INFO | App info kafka.consumer for consumer-null-112584 unregistered
[5/2/25 15:05:25:830 ICT] 000002e1 SystemOut     O  INFO | ConsumerConfig values:
        allow.auto.create.topics = true
        auto.commit.interval.ms = 1000
        auto.offset.reset = earliest
        bootstrap.servers = [10.225.100.77:9092, 10.225.100.76:9092]
        check.crcs = true
        client.dns.lookup = use_all_dns_ips
        client.id = consumer-null-112611
        client.rack =
        connections.max.idle.ms = 540000
        default.api.timeout.ms = 60000
        enable.auto.commit = false
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = null
        group.instance.id = null
        heartbeat.interval.ms = 3000
        interceptor.classes = []
        internal.leave.group.on.close = true
        internal.throw.on.fetch.stable.offset.unsupported = false
        isolation.level = read_uncommitted
        key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = [class io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter]
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
        receive.buffer.bytes = 65536
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.connect.timeout.ms = null
        sasl.login.read.timeout.ms = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.login.retry.backoff.max.ms = 10000
        sasl.login.retry.backoff.ms = 100
        sasl.mechanism = GSSAPI
        sasl.oauthbearer.clock.skew.seconds = 30
        sasl.oauthbearer.expected.audience = null
        sasl.oauthbearer.expected.issuer = null
        sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
        sasl.oauthbearer.jwks.endpoint.url = null
        sasl.oauthbearer.scope.claim.name = scope
        sasl.oauthbearer.sub.claim.name = sub
        sasl.oauthbearer.token.endpoint.url = null
        security.protocol = PLAINTEXT
        security.providers = null
        send.buffer.bytes = 131072
        session.timeout.ms = 30000
        socket.connection.setup.timeout.max.ms = 30000
        socket.connection.setup.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2]
        ssl.endpoint.identification.algorithm = https
        ssl.engine.factory.class = null
        ssl.key.password = null
        ssl.keymanager.algorithm = IbmX509
        ssl.keystore.certificate.chain = null
        ssl.keystore.key = null
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLSv1.2
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.certificates = null
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer

[5/2/25 15:05:25:831 ICT] 000002e1 SystemOut     O  WARN | The configuration 'auto.commit.interval.ms' was supplied but isn't a known config.
[5/2/25 15:05:25:832 ICT] 000002e1 SystemOut     O  INFO | Kafka version: 3.1.0
[5/2/25 15:05:25:832 ICT] 000002e1 SystemOut     O  INFO | Kafka commitId: 37edeed0777bacb3

laurit (Contributor) commented May 9, 2025

It looks like you are creating and closing a lot of KafkaConsumers. I'd guess that some of this logging comes from enabling Kafka metrics reporting. Try disabling the instrumentation named kafka-clients-metrics, as described in https://opentelemetry.io/docs/zero-code/java/agent/disable/#suppressing-specific-agent-instrumentation. However, the large block that follows "ConsumerConfig values" is logged when new consumers are created, so I'd expect you would see that even without the OTel agent. If it only happens with the agent, you could try setting a breakpoint at https://github.com/apache/kafka/blob/58c08441d1396ec09ab924ddec5be14aa423d833/clients/src/main/java/org/apache/kafka/common/config/AbstractConfig.java#L367 and see what triggers these log lines.
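For reference, suppressing a single instrumentation module is done with a system property of the form `-Dotel.instrumentation.<name>.enabled=false`. A sketch of how that might look among the server's generic JVM arguments (the agent jar path is a placeholder, and the admin-console location may vary by WebSphere version):

```shell
# Generic JVM arguments, typically found in the WebSphere admin console under
# Java and Process Management > Process definition > Java Virtual Machine.
# Replace the agent path with your actual install location.
-javaagent:/opt/opentelemetry/opentelemetry-javaagent.jar
-Dotel.instrumentation.kafka-clients-metrics.enabled=false
```

A server restart is needed for JVM argument changes to take effect.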

laurit added the needs author feedback label on May 9, 2025
tinnapat (Author) commented May 9, 2025

@laurit Thank you for the quick feedback! I appreciate your suggestions.

I can confirm that this excessive logging occurs only when the OpenTelemetry Java Agent is enabled. Without the agent, our application logs are normal.

I'll try disabling the kafka-clients-metrics instrumentation as you suggested.

Regarding setting a breakpoint in AbstractConfig.java: I'm not sure how to approach this in our environment, since we are running a packaged Pega application and I don't think we have access to the relevant source code to set a breakpoint. Is there an alternative approach you could suggest to help diagnose what's triggering these logs?

I'll report back after testing with the kafka-clients-metrics instrumentation disabled to see if that resolves or reduces the issue.

Thanks again for your help!

github-actions bot removed the needs author feedback label on May 9, 2025
laurit (Contributor) commented May 12, 2025

> I am not sure we have access to the related source code to set breakpoint

I suggested setting a breakpoint in the Kafka client code, which is open source and readily available. The source might not even be needed; in IntelliJ, for example, you can set a method breakpoint knowing only the class and method names.
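Where attaching a debugger is impractical, similar information can be recovered by logging a stack trace from any code path you can reach near the consumer creation. A minimal, hypothetical sketch; `StackProbe` and all names in it are illustrative, not part of Kafka, Pega, or the agent:

```java
import java.io.PrintWriter;
import java.io.StringWriter;

// Hypothetical diagnostic helper: capture the current call stack as a
// string so it can be logged, revealing which code path reached this point.
public class StackProbe {

    // Render the call stack at the point of invocation, tagged with a label.
    static String currentStack(String label) {
        StringWriter sw = new StringWriter();
        new Throwable("probe: " + label).printStackTrace(new PrintWriter(sw, true));
        return sw.toString();
    }

    public static void main(String[] args) {
        // In a real investigation this call would sit in (or be triggered
        // around) whatever code instantiates KafkaConsumer.
        System.out.println(currentStack("consumer-created"));
    }
}
```

The output contains the full chain of `at Class.method(...)` frames, which serves the same purpose as pausing at a breakpoint: identifying the caller that keeps creating consumers.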

laurit added the needs author feedback label on May 12, 2025
github-actions bot commented

This has been automatically marked as stale because it has been marked as needing author feedback and has not had any activity for 7 days. It will be closed automatically if there is no response from the author within 7 additional days from this comment.

github-actions bot added the stale label on May 19, 2025
github-actions bot closed this as not planned (Won't fix, can't repro, duplicate, stale) on May 26, 2025