Invalid SSL configs breaking connector #221
Comments
We ended up working around this by building the connector ourselves against Confluent 5.5.1, which had unintentionally removed the new SSL configs in a refactor starting in 5.5.0 (confluentinc/schema-registry#1280). However, I expect this will break again in a future release, since the configs have since been re-added (confluentinc/schema-registry#1577).
We're having this issue as well.
Are you using SSL in your Kafka brokers? I cannot reproduce this with non-SSL tests.
Yes, we are using SSL. I guess it's not terribly surprising that it doesn't occur without SSL.
We're seeing the following error when booting up our connector, which uses the SnowflakeAvroConverter with the schema registry enabled (and accessible). This only showed up after upgrading the connector from 1.2.3 to 1.4.4. We're still investigating, but currently suspect confluentinc/schema-registry#957 and possibly confluentinc/common#241 are related; those changes would have been pulled in as part of the upgrade, since the dependent Confluent version was also bumped from 5.3.0 to 5.4.0.
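For reference, the converter wiring in question looks roughly like the sketch below. The converter class name is as documented by Snowflake; the registry URL is a placeholder:

```properties
# Sketch of the relevant connector settings (placeholder registry URL)
value.converter=com.snowflake.kafka.connector.records.SnowflakeAvroConverter
value.converter.schema.registry.url=http://schema-registry:8081
```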
We are not actually using SSL for the schema registry, but the default values of the new configs seem to be the issue. We've observed this on Confluent 5.4.0 and 5.5.1 for both Kafka Connect and Schema Registry, with Kafka 2.2.1 and 2.5.0. This may be more of an upstream issue, but has anyone else seen it?
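For anyone investigating whether explicit values sidestep the problematic defaults, the configs in question are the schema registry client's `schema.registry.ssl.*` settings. Passed through a converter prefix they would look roughly like this. This is an illustrative sketch only, not a confirmed workaround; the key prefixes follow Confluent's converter config conventions, and the paths and password are placeholders:

```properties
# Illustrative only: explicitly setting the schema registry client's SSL
# configs via the converter prefix, rather than relying on the defaults
# that appear to cause the failure. Paths and password are placeholders.
value.converter.schema.registry.ssl.truststore.location=/etc/kafka/secrets/truststore.jks
value.converter.schema.registry.ssl.truststore.password=changeit
```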
Also confluentinc/schema-registry#1576 and apache/kafka#8338 are not in the versions we're using, but may be relevant in the future.