NullPointerException on getting message schema #1332
Hey, thanks for reaching out.
Actually, after a deeper look, I couldn't reproduce your issue. Could you please check your logs for other errors preceding this one? The best way to do that is to restart the app and review all the logs since startup.
@Haarolean I am also facing the same issue; it looks like you can reproduce this by using Amazon MSK.
Full log from the startup:
 _ __ __ _ _ _ ___
| |/ /__ _ / _| | ____ _ | | | |_ _|
| ' // _` | |_| |/ / _` |_____| | | || |
| . \ (_| | _| < (_| |_____| |_| || |
|_|\_\__,_|_| |_|\_\__,_| \___/|___|
2021-12-29 08:25:34,311 INFO [background-preinit] o.h.v.i.u.Version: HV000001: Hibernate Validator 6.2.0.Final
2021-12-29 08:25:34,346 INFO [main] c.p.k.u.KafkaUiApplication: Starting KafkaUiApplication using Java 13.0.9 on e241faf6965e with PID 1 (/kafka-ui-api.jar started by root in /)
2021-12-29 08:25:34,349 DEBUG [main] c.p.k.u.KafkaUiApplication: Running with Spring Boot v2.5.6, Spring v5.3.12
2021-12-29 08:25:34,350 INFO [main] c.p.k.u.KafkaUiApplication: No active profile set, falling back to default profiles: default
2021-12-29 08:25:36,783 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate: Bootstrapping Spring Data LDAP repositories in DEFAULT mode.
2021-12-29 08:25:36,853 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate: Finished Spring Data repository scanning in 57 ms. Found 0 LDAP repository interfaces.
2021-12-29 08:25:38,199 INFO [main] c.p.k.u.s.DeserializationService: Using SchemaRegistryAwareRecordSerDe for cluster 'local'
2021-12-29 08:25:39,110 INFO [main] o.s.b.a.s.r.ReactiveUserDetailsServiceAutoConfiguration:
Using generated security password: 44b802f3-9423-48cc-8656-5fca76f472be
2021-12-29 08:25:39,313 WARN [main] c.p.k.u.c.a.DisabledAuthSecurityConfig: Authentication is disabled. Access will be unrestricted.
2021-12-29 08:25:39,687 INFO [main] o.s.l.c.s.AbstractContextSource: Property 'userDn' not set - anonymous context will be used for read-write operations
2021-12-29 08:25:39,811 INFO [main] o.s.b.a.e.w.EndpointLinksResolver: Exposing 1 endpoint(s) beneath base path '/actuator'
2021-12-29 08:25:40,793 INFO [main] o.s.b.w.e.n.NettyWebServer: Netty started on port 8080
2021-12-29 08:25:40,830 INFO [main] c.p.k.u.KafkaUiApplication: Started KafkaUiApplication in 7.519 seconds (JVM running for 8.596)
2021-12-29 08:25:40,880 DEBUG [parallel-1] c.p.k.u.s.ClustersMetricsScheduler: Start getting metrics for kafkaCluster: local
2021-12-29 08:25:40,908 INFO [parallel-1] o.a.k.c.a.AdminClientConfig: AdminClientConfig values:
bootstrap.servers = [kafka:19092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
2021-12-29 08:25:41,034 INFO [parallel-1] o.a.k.c.u.AppInfoParser: Kafka version: 2.8.0
2021-12-29 08:25:41,034 INFO [parallel-1] o.a.k.c.u.AppInfoParser: Kafka commitId: ebb1d6e21cc92130
2021-12-29 08:25:41,034 INFO [parallel-1] o.a.k.c.u.AppInfoParser: Kafka startTimeMs: 1640766341032
2021-12-29 08:25:41,701 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:zookeeper.version=3.7.0-e3704b390a6697bfdf4b0bef79e3da7a4f6bac4b, built on 2021-03-17 09:46 UTC
2021-12-29 08:25:41,701 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:host.name=e241faf6965e
2021-12-29 08:25:41,701 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:java.version=13.0.9
2021-12-29 08:25:41,701 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:java.vendor=Alpine
2021-12-29 08:25:41,701 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:java.home=/usr/lib/jvm/java-13-openjdk
2021-12-29 08:25:41,701 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:java.class.path=kafka-ui-api.jar
2021-12-29 08:25:41,701 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:java.library.path=/usr/lib/jvm/java-13-openjdk/lib/server:/usr/lib/jvm/java-13-openjdk/lib:/usr/lib/jvm/java-13-openjdk/../lib:/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib
2021-12-29 08:25:41,701 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:java.io.tmpdir=/tmp
2021-12-29 08:25:41,701 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:java.compiler=<NA>
2021-12-29 08:25:41,702 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:os.name=Linux
2021-12-29 08:25:41,702 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:os.arch=amd64
2021-12-29 08:25:41,702 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:os.version=5.10.16.3-microsoft-standard-WSL2
2021-12-29 08:25:41,702 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:user.name=root
2021-12-29 08:25:41,702 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:user.home=/root
2021-12-29 08:25:41,702 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:user.dir=/
2021-12-29 08:25:41,702 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:os.memory.free=61MB
2021-12-29 08:25:41,702 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:os.memory.max=3166MB
2021-12-29 08:25:41,702 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Client environment:os.memory.total=93MB
2021-12-29 08:25:41,708 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ZooKeeper: Initiating client connection, connectString=zookeeper:2181 sessionTimeout=60000 watcher=com.provectus.kafka.ui.service.ZookeeperService$$Lambda$1151/0x0000000801216040@7edd3a6a
2021-12-29 08:25:41,717 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.c.X509Util: Setting -D jdk.tls.rejectClientInitiatedRenegotiation=true to disable client-initiated TLS renegotiation
2021-12-29 08:25:41,722 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ClientCnxnSocket: jute.maxbuffer value is 1048575 Bytes
2021-12-29 08:25:41,737 INFO [kafka-admin-client-thread | adminclient-1] o.a.z.ClientCnxn: zookeeper.request.timeout value is 0. feature enabled=false
2021-12-29 08:25:41,738 DEBUG [kafka-admin-client-thread | adminclient-1] c.p.k.u.s.ZookeeperService: Start getting Zookeeper metrics for kafkaCluster: local
2021-12-29 08:25:41,750 INFO [kafka-admin-client-thread | adminclient-1-SendThread(zookeeper:2181)] o.a.z.ClientCnxn: Opening socket connection to server zookeeper/172.19.0.3:2181.
2021-12-29 08:25:41,750 INFO [kafka-admin-client-thread | adminclient-1-SendThread(zookeeper:2181)] o.a.z.ClientCnxn: SASL config status: Will not attempt to authenticate using SASL (unknown error)
2021-12-29 08:25:41,752 INFO [kafka-admin-client-thread | adminclient-1-SendThread(zookeeper:2181)] o.a.z.ClientCnxn: Socket connection established, initiating session, client: /172.19.0.2:44364, server: zookeeper/172.19.0.3:2181
2021-12-29 08:25:41,762 INFO [kafka-admin-client-thread | adminclient-1-SendThread(zookeeper:2181)] o.a.z.ClientCnxn: Session establishment complete on server zookeeper/172.19.0.3:2181, session id = 0x100000b29dc0009, negotiated timeout = 40000
2021-12-29 08:25:41,856 DEBUG [kafka-admin-client-thread | adminclient-1] c.p.k.u.s.ClustersMetricsScheduler: Metrics updated for cluster: local
2021-12-29 08:26:04,583 WARN [parallel-7] c.p.k.u.e.ErrorCode: Multiple class com.provectus.kafka.ui.exception.ErrorCode values refer to code 4001
2021-12-29 08:26:10,827 DEBUG [parallel-4] c.p.k.u.s.ClustersMetricsScheduler: Start getting metrics for kafkaCluster: local
2021-12-29 08:26:10,831 DEBUG [kafka-admin-client-thread | adminclient-1] c.p.k.u.s.ZookeeperService: Start getting Zookeeper metrics for kafkaCluster: local
2021-12-29 08:26:10,862 DEBUG [kafka-admin-client-thread | adminclient-1] c.p.k.u.s.ClustersMetricsScheduler: Metrics updated for cluster: local
2021-12-29 08:26:11,482 WARN [parallel-5] o.h.v.i.p.j.JavaBeanExecutable: HV000254: Missing parameter metadata for SeekTypeDTO(String, int, String), which declares implicit or synthetic parameters. Automatic resolution of generic type information for method parameters may yield incorrect results if multiple parameters have the same erasure. To solve this, compile your code with the '-parameters' flag.
2021-12-29 08:26:11,559 WARN [parallel-5] o.h.v.i.p.j.JavaBeanExecutable: HV000254: Missing parameter metadata for SeekDirectionDTO(String, int, String), which declares implicit or synthetic parameters. Automatic resolution of generic type information for method parameters may yield incorrect results if multiple parameters have the same erasure. To solve this, compile your code with the '-parameters' flag.
2021-12-29 08:26:11,618 INFO [elastic-4] o.a.k.c.c.ConsumerConfig: ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [kafka:19092]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = kafka-ui-4dc7012a-c89f-4804-a8ed-94715418201b
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = true
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = null
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.BytesDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.BytesDeserializer
2021-12-29 08:26:11,655 INFO [elastic-4] o.a.k.c.u.AppInfoParser: Kafka version: 2.8.0
2021-12-29 08:26:11,655 INFO [elastic-4] o.a.k.c.u.AppInfoParser: Kafka commitId: ebb1d6e21cc92130
2021-12-29 08:26:11,655 INFO [elastic-4] o.a.k.c.u.AppInfoParser: Kafka startTimeMs: 1640766371655
2021-12-29 08:26:11,684 INFO [elastic-4] o.a.k.c.Metadata: [Consumer clientId=kafka-ui-4dc7012a-c89f-4804-a8ed-94715418201b, groupId=null] Cluster ID: NA2c93fwSAu943GjgeiF_Q
2021-12-29 08:26:11,687 INFO [elastic-4] c.p.k.u.u.OffsetsSeek: Positioning consumer for topic test with ConsumerPosition(seekType=OFFSET, seekTo={test-0=0}, seekDirection=FORWARD)
2021-12-29 08:26:11,717 INFO [elastic-4] o.a.k.c.c.KafkaConsumer: [Consumer clientId=kafka-ui-4dc7012a-c89f-4804-a8ed-94715418201b, groupId=null] Unsubscribed all topics or patterns and assigned partitions
2021-12-29 08:26:11,718 INFO [elastic-4] c.p.k.u.u.OffsetsSeek: Assignment: []
2021-12-29 08:26:11,723 INFO [elastic-4] c.p.k.u.e.ForwardRecordEmitter: Polling finished
2021-12-29 08:26:11,724 INFO [elastic-4] o.a.k.c.m.Metrics: Metrics scheduler closed
2021-12-29 08:26:11,724 INFO [elastic-4] o.a.k.c.m.Metrics: Closing reporter org.apache.kafka.common.metrics.JmxReporter
2021-12-29 08:26:11,725 INFO [elastic-4] o.a.k.c.m.Metrics: Metrics reporters closed
2021-12-29 08:26:11,737 INFO [elastic-4] o.a.k.c.u.AppInfoParser: App info kafka.consumer for kafka-ui-4dc7012a-c89f-4804-a8ed-94715418201b unregistered
2021-12-29 08:26:27,553 INFO [elastic-4] o.a.k.c.c.ConsumerConfig: ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [kafka:19092]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = kafka-ui-f301102a-e7cc-4773-8d9b-dd0e51d5e0f7
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = true
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = null
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.BytesDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.BytesDeserializer
2021-12-29 08:26:27,563 INFO [elastic-4] o.a.k.c.u.AppInfoParser: Kafka version: 2.8.0
2021-12-29 08:26:27,564 INFO [elastic-4] o.a.k.c.u.AppInfoParser: Kafka commitId: ebb1d6e21cc92130
2021-12-29 08:26:27,564 INFO [elastic-4] o.a.k.c.u.AppInfoParser: Kafka startTimeMs: 1640766387563
2021-12-29 08:26:27,579 INFO [elastic-4] o.a.k.c.Metadata: [Consumer clientId=kafka-ui-f301102a-e7cc-4773-8d9b-dd0e51d5e0f7, groupId=null] Cluster ID: NA2c93fwSAu943GjgeiF_Q
2021-12-29 08:26:27,582 INFO [elastic-4] c.p.k.u.u.OffsetsSeek: Positioning consumer for topic testtest with ConsumerPosition(seekType=OFFSET, seekTo={testtest-0=0}, seekDirection=FORWARD)
2021-12-29 08:26:27,600 INFO [elastic-4] o.a.k.c.c.KafkaConsumer: [Consumer clientId=kafka-ui-f301102a-e7cc-4773-8d9b-dd0e51d5e0f7, groupId=null] Unsubscribed all topics or patterns and assigned partitions
2021-12-29 08:26:27,600 INFO [elastic-4] c.p.k.u.u.OffsetsSeek: Assignment: []
2021-12-29 08:26:27,601 INFO [elastic-4] c.p.k.u.e.ForwardRecordEmitter: Polling finished
2021-12-29 08:26:27,602 INFO [elastic-4] o.a.k.c.m.Metrics: Metrics scheduler closed
2021-12-29 08:26:27,602 INFO [elastic-4] o.a.k.c.m.Metrics: Closing reporter org.apache.kafka.common.metrics.JmxReporter
2021-12-29 08:26:27,602 INFO [elastic-4] o.a.k.c.m.Metrics: Metrics reporters closed
2021-12-29 08:26:27,611 INFO [elastic-4] o.a.k.c.u.AppInfoParser: App info kafka.consumer for kafka-ui-f301102a-e7cc-4773-8d9b-dd0e51d5e0f7 unregistered
2021-12-29 08:26:28,694 ERROR [parallel-4] o.s.b.a.w.r.e.AbstractErrorWebExceptionHandler: [20cbd8dc-12] 500 Server Error for HTTP GET "/api/clusters/local/topics/testtest/messages/schema"
java.lang.NullPointerException: null
at com.provectus.kafka.ui.util.jsonschema.JsonSchema.toJson(JsonSchema.java:28)
Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Error has been observed at the following site(s):
*__checkpoint ⇢ com.provectus.kafka.ui.config.CustomWebFilter [DefaultWebFilterChain]
*__checkpoint ⇢ com.provectus.kafka.ui.config.ReadOnlyModeFilter [DefaultWebFilterChain]
*__checkpoint ⇢ org.springframework.security.web.server.authorization.AuthorizationWebFilter [DefaultWebFilterChain]
*__checkpoint ⇢ org.springframework.security.web.server.authorization.ExceptionTranslationWebFilter [DefaultWebFilterChain]
*__checkpoint ⇢ org.springframework.security.web.server.authentication.logout.LogoutWebFilter [DefaultWebFilterChain]
*__checkpoint ⇢ org.springframework.security.web.server.savedrequest.ServerRequestCacheWebFilter [DefaultWebFilterChain]
*__checkpoint ⇢ org.springframework.security.web.server.context.SecurityContextServerWebExchangeWebFilter [DefaultWebFilterChain]
*__checkpoint ⇢ org.springframework.security.web.server.context.ReactorContextWebFilter [DefaultWebFilterChain]
*__checkpoint ⇢ org.springframework.security.web.server.header.HttpHeaderWriterWebFilter [DefaultWebFilterChain]
*__checkpoint ⇢ org.springframework.security.config.web.server.ServerHttpSecurity$ServerWebExchangeReactorContextWebFilter [DefaultWebFilterChain]
*__checkpoint ⇢ org.springframework.security.web.server.WebFilterChainProxy [DefaultWebFilterChain]
*__checkpoint ⇢ org.springframework.boot.actuate.metrics.web.reactive.server.MetricsWebFilter [DefaultWebFilterChain]
*__checkpoint ⇢ HTTP GET "/api/clusters/local/topics/testtest/messages/schema" [ExceptionHandlingWebHandler]
Stack trace:
at com.provectus.kafka.ui.util.jsonschema.JsonSchema.toJson(JsonSchema.java:28)
at com.provectus.kafka.ui.serde.schemaregistry.SchemaRegistryAwareRecordSerDe.lambda$getTopicSchema$0(SchemaRegistryAwareRecordSerDe.java:196)
at java.base/java.util.Optional.orElseGet(Optional.java:362)
at com.provectus.kafka.ui.serde.schemaregistry.SchemaRegistryAwareRecordSerDe.getTopicSchema(SchemaRegistryAwareRecordSerDe.java:196)
at com.provectus.kafka.ui.service.TopicsService.getTopicSchema(TopicsService.java:392)
at com.provectus.kafka.ui.controller.MessagesController.getTopicSchema(MessagesController.java:62)
at com.provectus.kafka.ui.controller.MessagesController$$FastClassBySpringCGLIB$$8951e2d8.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:783)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:753)
at org.springframework.validation.beanvalidation.MethodValidationInterceptor.invoke(MethodValidationInterceptor.java:123)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:753)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:698)
at com.provectus.kafka.ui.controller.MessagesController$$EnhancerBySpringCGLIB$$4fc4bcd9.getTopicSchema(<generated>)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at org.springframework.web.reactive.result.method.InvocableHandlerMethod.lambda$invoke$0(InvocableHandlerMethod.java:144)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:125)
at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1816)
at reactor.core.publisher.MonoZip$ZipCoordinator.signal(MonoZip.java:251)
at reactor.core.publisher.MonoZip$ZipInner.onNext(MonoZip.java:336)
at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onNext(MonoPeekTerminal.java:180)
at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2398)
at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.request(MonoPeekTerminal.java:139)
at reactor.core.publisher.MonoZip$ZipInner.onSubscribe(MonoZip.java:325)
at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onSubscribe(MonoPeekTerminal.java:152)
at reactor.core.publisher.MonoJust.subscribe(MonoJust.java:55)
at reactor.core.publisher.Mono.subscribe(Mono.java:4399)
at reactor.core.publisher.MonoZip.subscribe(MonoZip.java:128)
at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64)
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:236)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onComplete(MonoIgnoreThen.java:203)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onComplete(MonoFlatMap.java:181)
at reactor.core.publisher.Operators.complete(Operators.java:137)
at reactor.core.publisher.MonoZip.subscribe(MonoZip.java:120)
at reactor.core.publisher.Mono.subscribe(Mono.java:4399)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:255)
at reactor.core.publisher.MonoIgnoreThen.subscribe(MonoIgnoreThen.java:51)
at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:157)
at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:74)
at reactor.core.publisher.MonoNext$NextSubscriber.onNext(MonoNext.java:82)
at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.innerNext(FluxConcatMap.java:282)
at reactor.core.publisher.FluxConcatMap$ConcatMapInner.onNext(FluxConcatMap.java:861)
at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:127)
at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onNext(MonoPeekTerminal.java:180)
at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2398)
at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.request(MonoPeekTerminal.java:139)
at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.request(FluxMapFuseable.java:169)
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.set(Operators.java:2194)
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onSubscribe(Operators.java:2068)
at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onSubscribe(FluxMapFuseable.java:96)
at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onSubscribe(MonoPeekTerminal.java:152)
at reactor.core.publisher.MonoJust.subscribe(MonoJust.java:55)
at reactor.core.publisher.Mono.subscribe(Mono.java:4399)
at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.drain(FluxConcatMap.java:449)
at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.onSubscribe(FluxConcatMap.java:219)
at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:165)
at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:87)
at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64)
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52)
at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64)
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52)
at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64)
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52)
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52)
at reactor.core.publisher.Mono.subscribe(Mono.java:4399)
at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onComplete(FluxSwitchIfEmpty.java:82)
at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onComplete(MonoPeekTerminal.java:299)
at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onComplete(MonoPeekTerminal.java:299)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:148)
at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:74)
at reactor.core.publisher.FluxFilter$FilterSubscriber.onNext(FluxFilter.java:113)
at reactor.core.publisher.FluxDefaultIfEmpty$DefaultIfEmptySubscriber.onNext(FluxDefaultIfEmpty.java:101)
at reactor.core.publisher.MonoNext$NextSubscriber.onNext(MonoNext.java:82)
at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.innerNext(FluxConcatMap.java:282)
at reactor.core.publisher.FluxConcatMap$ConcatMapInner.onNext(FluxConcatMap.java:861)
at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1816)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:151)
at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:127)
at reactor.core.publisher.FluxFilterFuseable$FilterFuseableSubscriber.onNext(FluxFilterFuseable.java:118)
at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2398)
at reactor.core.publisher.FluxFilterFuseable$FilterFuseableSubscriber.request(FluxFilterFuseable.java:191)
at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.request(FluxMapFuseable.java:169)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:110)
at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onSubscribe(FluxMapFuseable.java:96)
at reactor.core.publisher.FluxFilterFuseable$FilterFuseableSubscriber.onSubscribe(FluxFilterFuseable.java:87)
at reactor.core.publisher.MonoJust.subscribe(MonoJust.java:55)
at reactor.core.publisher.Mono.subscribe(Mono.java:4399)
at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.drain(FluxConcatMap.java:449)
at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.onSubscribe(FluxConcatMap.java:219)
at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:165)
at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:87)
at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64)
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52)
at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64)
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52)
at reactor.core.publisher.Mono.subscribe(Mono.java:4399)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:255)
at reactor.core.publisher.MonoIgnoreThen.subscribe(MonoIgnoreThen.java:51)
at reactor.core.publisher.Mono.subscribe(Mono.java:4399)
at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onComplete(FluxSwitchIfEmpty.java:82)
at reactor.core.publisher.FluxFilter$FilterSubscriber.onComplete(FluxFilter.java:166)
at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onComplete(FluxPeekFuseable.java:940)
at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onComplete(FluxSwitchIfEmpty.java:85)
at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2400)
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.set(Operators.java:2194)
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onSubscribe(Operators.java:2068)
at reactor.core.publisher.MonoJust.subscribe(MonoJust.java:55)
at reactor.core.publisher.Mono.subscribe(Mono.java:4399)
at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onComplete(FluxSwitchIfEmpty.java:82)
at reactor.core.publisher.MonoNext$NextSubscriber.onComplete(MonoNext.java:102)
at reactor.core.publisher.FluxFilter$FilterSubscriber.onComplete(FluxFilter.java:166)
at reactor.core.publisher.FluxFlatMap$FlatMapMain.checkTerminated(FluxFlatMap.java:846)
at reactor.core.publisher.FluxFlatMap$FlatMapMain.drainLoop(FluxFlatMap.java:608)
at reactor.core.publisher.FluxFlatMap$FlatMapMain.drain(FluxFlatMap.java:588)
at reactor.core.publisher.FluxFlatMap$FlatMapMain.onComplete(FluxFlatMap.java:465)
at reactor.core.publisher.FluxPeekFuseable$PeekFuseableSubscriber.onComplete(FluxPeekFuseable.java:277)
at reactor.core.publisher.FluxIterable$IterableSubscription.slowPath(FluxIterable.java:294)
at reactor.core.publisher.FluxIterable$IterableSubscription.request(FluxIterable.java:230)
at reactor.core.publisher.FluxPeekFuseable$PeekFuseableSubscriber.request(FluxPeekFuseable.java:144)
at reactor.core.publisher.FluxFlatMap$FlatMapMain.onSubscribe(FluxFlatMap.java:371)
at reactor.core.publisher.FluxPeekFuseable$PeekFuseableSubscriber.onSubscribe(FluxPeekFuseable.java:178)
at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:165)
at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:87)
at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64)
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:157)
at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1816)
at reactor.core.publisher.FluxDefaultIfEmpty$DefaultIfEmptySubscriber.onComplete(FluxDefaultIfEmpty.java:109)
at reactor.core.publisher.FluxMap$MapSubscriber.onComplete(FluxMap.java:142)
at reactor.core.publisher.FluxMap$MapSubscriber.onComplete(FluxMap.java:142)
at reactor.core.publisher.FluxFilter$FilterSubscriber.onComplete(FluxFilter.java:166)
at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onComplete(FluxMap.java:269)
at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1817)
at reactor.core.publisher.MonoCacheTime$CoordinatorSubscriber.signalCached(MonoCacheTime.java:337)
at reactor.core.publisher.MonoCacheTime$CoordinatorSubscriber.onNext(MonoCacheTime.java:354)
at reactor.core.publisher.FluxPeek$PeekSubscriber.onNext(FluxPeek.java:200)
at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:74)
at reactor.core.publisher.MonoPublishOn$PublishOnSubscriber.run(MonoPublishOn.java:181)
at reactor.core.scheduler.SchedulerTask.call(SchedulerTask.java:68)
at reactor.core.scheduler.SchedulerTask.call(SchedulerTask.java:28)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:830)
I also have this issue.
@amonsat hi, please share the details on how you use the app.
@Haarolean I have a docker-compose.yml with kafka: and kafka-ui: services. When I click "Produce Message" I get the same issue, but the rest of the functionality works well.
Thanks, I was able to reproduce the issue. |
Will be available in |
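For anyone hitting this before the fix lands: the trace points at JsonSchema.toJson dereferencing a null field when a topic has no registered schema. A purely illustrative null-safe pattern is sketched below; the class and field names are hypothetical and not kafka-ui's actual code.

```java
import java.util.Optional;

// Illustrative only: a null-safe variant of a toJson-style method.
// SchemaHolder and rawSchema are hypothetical names, not kafka-ui's classes.
final class SchemaHolder {
    private final String rawSchema; // may be null when no schema is registered

    SchemaHolder(String rawSchema) {
        this.rawSchema = rawSchema;
    }

    // Returning Optional.empty() instead of dereferencing a null field
    // avoids the kind of NullPointerException seen in the trace above.
    Optional<String> toJson() {
        return Optional.ofNullable(rawSchema)
                .map(s -> "{\"schema\":\"" + s + "\"}");
    }
}
```

Callers then decide how to handle the missing schema (e.g. return 404 instead of a 500).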
Describe the bug
It is impossible to produce a message to a topic due to a NullPointerException.
Set up
Run the latest image in Docker:
Steps to Reproduce
Steps to reproduce the behavior:
test
Expected behavior
The form for creating a message opens.
Screenshots
Additional context
Error in logs: