Install of 0.2.181 does not work #347

Closed · DSchmidtDev opened this issue Aug 3, 2023 · 9 comments · Fixed by #358
Labels
bug Something isn't working

Comments

@DSchmidtDev (Contributor) commented Aug 3, 2023

Describe the bug
Both an upgrade and a clean install of the latest release fail while running the system update job.

To Reproduce
Install helm chart version 0.2.181

Additional context
The error message is as follows:

2023-08-03 06:12:48,812 [main] ERROR c.l.d.u.s.e.steps.DataHubStartupStep:40 - DataHubStartupStep failed.
org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: java.io.IOException: No schema registered under subject!
at io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient.getLatestVersion(MockSchemaRegistryClient.java:261)
at io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient.getLatestSchemaMetadata(MockSchemaRegistryClient.java:310)
at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lookupLatestVersion(AbstractKafkaSchemaSerDe.java:181)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:77)
at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:59)
at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:62)
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:902)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:862)
at com.linkedin.metadata.dao.producer.KafkaEventProducer.produceDataHubUpgradeHistoryEvent(KafkaEventProducer.java:171)
at com.linkedin.datahub.upgrade.system.elasticsearch.steps.DataHubStartupStep.lambda$executable$0(DataHubStartupStep.java:37)
at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.executeStepInternal(DefaultUpgradeManager.java:110)
at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.executeInternal(DefaultUpgradeManager.java:68)
at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.executeInternal(DefaultUpgradeManager.java:42)
at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.execute(DefaultUpgradeManager.java:33)
at com.linkedin.datahub.upgrade.UpgradeCli.run(UpgradeCli.java:80)
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:768)
at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:752)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:314)
at org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:164)
at com.linkedin.datahub.upgrade.UpgradeCliApplication.main(UpgradeCliApplication.java:23)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:49)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:108)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:58)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:65)
2023-08-03 06:12:48,867 [main] INFO c.l.d.u.impl.DefaultUpgradeReport:16 - Failed Step 4/6: DataHubStartupStep. Failed after 3 retries.
2023-08-03 06:12:48,867 [main] INFO c.l.d.u.impl.DefaultUpgradeReport:16 - Exiting upgrade SystemUpdate with failure.
2023-08-03 06:12:48,881 [main] INFO c.l.d.u.impl.DefaultUpgradeReport:16 - Upgrade SystemUpdate completed with result FAILED. Exiting...
2023-08-03 06:12:48,992 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.AbstractNettyClient:249 - Shutdown requested
2023-08-03 06:12:48,995 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.AbstractNettyClient:252 - Shutting down

@DSchmidtDev added the bug label on Aug 3, 2023
@leszekbulawa commented Aug 4, 2023

For me it also fails on the datahub-system-update-job. Log lines with the error:

2023-08-04 13:35:17,188 [SpringApplicationShutdownHook] INFO  o.a.k.clients.producer.KafkaProducer:1182 - [Producer clientId=producer-1] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
ANTLR Tool version 4.5 used for code generation does not match the current runtime version 4.7.2
ANTLR Runtime version 4.5 used for parser compilation does not match the current runtime version 4.7.2
ANTLR Tool version 4.5 used for code generation does not match the current runtime version 4.7.2
ANTLR Runtime version 4.5 used for parser compilation does not match the current runtime version 4.7.2

@RyanHolstien (Contributor)
This error should only happen if you have configured the Internal Schema Registry. If this is not intended, you can reconfigure the schema registry implementation to be Confluent Kafka:
https://github.com/acryldata/datahub-helm/blob/master/charts/datahub/values.yaml#L491

It seems there are still issues with configuring the Internal Schema Registry if you do intend to use it; the recommendation at this time is to use the Confluent Schema Registry if that is possible for your team, until we can get this fixed.
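
For illustration, a minimal sketch of that values change in the datahub chart. The service URL is an assumption based on the default prerequisites release name that appears later in this thread; adjust it to your deployment:

kafka:
  schemaregistry:
    # Use the Confluent schema registry instead of the internal (GMS) one
    type: KAFKA
    # Assumed service name and port for a default prerequisites release
    url: "http://prerequisites-cp-schema-registry:8081"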

@leszekbulawa
It is worth mentioning that Confluent Schema Registry has to be enabled in the prerequisites chart:
https://github.com/acryldata/datahub-helm/blob/master/charts/prerequisites/values.yaml#L95
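
In the prerequisites values, that looks roughly like this (a sketch; the same keys appear in the full examples further down the thread):

cp-helm-charts:
  enabled: true
  cp-schema-registry:
    enabled: true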

@andreyolv commented Aug 15, 2023

Same problem.

I filled these values in prerequisites:

cp-helm-charts:
  enabled: true
  cp-schema-registry:
    enabled: true

and these values in datahub:

kafka:
  schemaregistry:
    type: KAFKA
    url: "http://prerequisites-cp-schema-registry:8081"

It still does not work, with the same error. So the documentation is incomplete and not sufficient for deploying on Kubernetes.

@moyun commented Aug 17, 2023

> For me it also fails on the datahub-system-update-job. Log lines with the error:

I met the same issue; have you resolved it?

2023-08-17 16:08:49,363 [main] INFO c.l.d.u.impl.DefaultUpgradeReport:16 - Executing Step 4/6: DataHubStartupStep...
2023-08-17 16:08:49,494 [main] ERROR c.l.d.u.s.e.steps.DataHubStartupStep:40 - DataHubStartupStep failed.
org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: java.io.IOException: No schema registered under subject!
at io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient.getLatestVersion(MockSchemaRegistryClient.java:261)
at io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient.getLatestSchemaMetadata(MockSchemaRegistryClient.java:310)
at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lookupLatestVersion(AbstractKafkaSchemaSerDe.java:181)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:77)
at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:59)
at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:62)
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:902)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:862)
at com.linkedin.metadata.dao.producer.KafkaEventProducer.produceDataHubUpgradeHistoryEvent(KafkaEventProducer.java:171)
at com.linkedin.datahub.upgrade.system.elasticsearch.steps.DataHubStartupStep.lambda$executable$0(DataHubStartupStep.java:37)
at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.executeStepInternal(DefaultUpgradeManager.java:110)
at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.executeInternal(DefaultUpgradeManager.java:68)
at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.executeInternal(DefaultUpgradeManager.java:42)
at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.execute(DefaultUpgradeManager.java:33)
at com.linkedin.datahub.upgrade.UpgradeCli.run(UpgradeCli.java:80)
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:768)
at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:752)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:314)
at org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:164)
at com.linkedin.datahub.upgrade.UpgradeCliApplication.main(UpgradeCliApplication.java:23)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:49)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:108)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:58)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:65)
[the identical stack trace is logged three more times, once per retry]
2023-08-17 16:08:49,495 [main] INFO c.l.d.u.impl.DefaultUpgradeReport:16 - Failed Step 4/6: DataHubStartupStep. Failed after 3 retries.
2023-08-17 16:08:49,495 [main] INFO c.l.d.u.impl.DefaultUpgradeReport:16 - Exiting upgrade SystemUpdate with failure.
2023-08-17 16:08:49,495 [main] INFO c.l.d.u.impl.DefaultUpgradeReport:16 - Upgrade SystemUpdate completed with result FAILED. Exiting...
2023-08-17 16:08:49,567 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.AbstractNettyClient:249 - Shutdown requested
2023-08-17 16:08:49,567 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.AbstractNettyClient:252 - Shutting down
2023-08-17 16:08:49,569 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-08-17 16:08:49,569 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-08-17 16:08:49,569 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-08-17 16:08:49,575 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-08-17 16:08:49,575 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-08-17 16:08:49,575 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-08-17 16:08:49,576 [R2 Nio Event Loop-3-1] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-08-17 16:08:49,576 [R2 Nio Event Loop-3-2] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-08-17 16:08:49,576 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.AbstractNettyClient:249 - Shutdown requested
2023-08-17 16:08:49,576 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.AbstractNettyClient:252 - Shutting down
2023-08-17 16:08:49,576 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-08-17 16:08:49,576 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-08-17 16:08:49,576 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-08-17 16:08:49,576 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-08-17 16:08:49,576 [R2 Nio Event Loop-3-1] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-08-17 16:08:49,576 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-08-17 16:08:49,576 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-08-17 16:08:49,577 [R2 Nio Event Loop-3-2] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-08-17 16:08:49,578 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.AbstractNettyClient:249 - Shutdown requested
2023-08-17 16:08:49,578 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.AbstractNettyClient:252 - Shutting down
2023-08-17 16:08:49,578 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-08-17 16:08:49,578 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-08-17 16:08:49,578 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-08-17 16:08:49,578 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-08-17 16:08:49,578 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-08-17 16:08:49,579 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-08-17 16:08:49,579 [R2 Nio Event Loop-1-1] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-08-17 16:08:49,579 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.AbstractNettyClient:249 - Shutdown requested
2023-08-17 16:08:49,579 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.AbstractNettyClient:252 - Shutting down
2023-08-17 16:08:49,579 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-08-17 16:08:49,579 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-08-17 16:08:49,579 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-08-17 16:08:49,579 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-08-17 16:08:49,579 [R2 Nio Event Loop-1-2] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-08-17 16:08:49,579 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-08-17 16:08:49,579 [R2 Nio Event Loop-1-1] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-08-17 16:08:49,579 [SpringApplicationShutdownHook] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-08-17 16:08:49,579 [R2 Nio Event Loop-1-2] INFO c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-08-17 16:08:49,790 [SpringApplicationShutdownHook] INFO o.a.k.clients.producer.KafkaProducer:1182 - [Producer clientId=producer-1] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
2023-08-17 16:08:49,790 [SpringApplicationShutdownHook] INFO o.a.k.clients.producer.KafkaProducer:1182 - [Producer clientId=producer-1] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
ANTLR Tool version 4.5 used for code generation does not match the current runtime version 4.7.2
ANTLR Runtime version 4.5 used for parser compilation does not match the current runtime version 4.7.2
ANTLR Tool version 4.5 used for code generation does not match the current runtime version 4.7.2
ANTLR Runtime version 4.5 used for parser compilation does not match the current runtime version 4.7.2

@daviibf commented Aug 25, 2023

I can confirm this: I am having the same issue.

To fix it, I had to change the url value of the schema registry, @andreyolv:

I filled these values in prerequisites:

cp-helm-charts:
  enabled: true
  cp-schema-registry:
    enabled: true

and these values in datahub:

kafka:
  schemaregistry:
    type: KAFKA
    url: "http://datahub-datahub-gms:8080/schema-registry/api/"

@DSchmidtDev (Contributor, Author)

> Same problem.
>
> I filled these values in prerequisites:
>
>     cp-helm-charts:
>       enabled: true
>       cp-schema-registry:
>         enabled: true
>
> and these values in datahub:
>
>     kafka:
>       schemaregistry:
>         type: KAFKA
>         url: "http://prerequisites-cp-schema-registry:8081"
>
> It still does not work, with the same error. So the documentation is incomplete and not sufficient for deploying on Kubernetes.

For me exactly this config worked, and the switch between the registries worked without any recognizable issue so far.

@david-leifker (Contributor)
I think maybe we need some better documentation here. When GMS is used as an internal schema registry, it does not support registering schemas. The expectation for INTERNAL is that there is no Confluent schema registry and the URL points to GMS; this should be configured automatically. If using Confluent, schemas can be registered as normal and the URL points to the Confluent schema registry. These settings span both the prerequisites and the main Helm charts, so there is no mechanism to sync them automatically. The original error indicates that perhaps the Confluent schema registry is enabled but points to GMS incorrectly.
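
For reference, a minimal sketch of the two consistent combinations described above. The service names are assumptions based on the release names used elsewhere in this thread and depend on your deployment:

# Option A: internal (GMS) registry; cp-schema-registry disabled in prerequisites
kafka:
  schemaregistry:
    type: INTERNAL
    # per the comment above, the URL should be configured automatically to point at GMS
---
# Option B: Confluent registry; cp-schema-registry enabled in prerequisites
kafka:
  schemaregistry:
    type: KAFKA
    url: "http://prerequisites-cp-schema-registry:8081"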

@DSchmidtDev (Contributor, Author)

Thanks for coming back to this, @david-leifker, but I need to disagree. The original error (the one I posted) results from using the INTERNAL registry as described, which was the default in the Helm chart: the Confluent registry is not deployed (disabled) in prerequisites, and kafka.schemaregistry.url points to GMS.
It can be reproduced multiple times. Or is there any other hidden setting we missed?
