Registration of schema on the Azure Schema Registry in Kafka Connect scenario #50
Comments
You'd need to edit this line to pull the boolean from …
@OneCricketeer Have you had success loading the …
It's no different from other plugins mentioned in the Kafka documentation. Build this jar using …
Thanks @OneCricketeer. Fixed it.
Unfortunately, I can't help without seeing your code. If you use StringSerializer in a producer, you'll get a String-type Connect Schema. Try using a structured event instead.
Thanks.
The schema or its ID should be encoded within the event itself. The converter class is not intended to be used standalone, only within a Connector, such as the original question about JDBC Source. Are you trying to write your own? Perhaps you can start a new issue thread? @zhaozy93
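To illustrate what "the schema or its ID encoded within the event itself" can look like on the wire: many registry-backed serializers frame the Avro payload with a schema identifier. The sketch below uses a Confluent-style framing (magic byte plus a 4-byte big-endian schema ID) purely as an illustration; the Azure serializer uses its own framing, so the constants and layout here are assumptions, not the Azure format.

```python
import struct

MAGIC_BYTE = 0  # illustrative framing, modeled on Confluent's wire format

def frame(schema_id: int, avro_payload: bytes) -> bytes:
    """Prepend a magic byte and a 4-byte big-endian schema ID to the payload."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + avro_payload

def unframe(event: bytes) -> tuple[int, bytes]:
    """Recover the schema ID and the raw Avro payload from a framed event."""
    magic, schema_id = struct.unpack(">bI", event[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("unknown framing")
    return schema_id, event[5:]

event = frame(42, b"\x02avro-bytes")
assert unframe(event) == (42, b"\x02avro-bytes")
```

A deserializer that receives such an event can look the schema up by ID before decoding, which is why the converter only works inside a pipeline that produced events with this framing.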
Could you tell me whether it is possible to set up a source connector in Kafka Connect so that it does not automatically register schemas on the Azure Schema Registry?
I ask because I'm currently using the com.microsoft.azure.schemaregistry.kafka.avro.AvroConverter class as the value converter in my source connector. I'm using Kafka Connect to ingest data from a PostgreSQL database into a Kafka topic, and I'm storing the schemas in Azure Schema Registry.
Automatic schema registration is not a best practice, so I would like to avoid it.
As far as I can see, the auto.register.schemas option was removed from the connector configuration, so I'm wondering: is there any alternative way to prevent the connector from automatically registering schemas?
This is the configuration of my source connector:
{
  "name": "jdbc-postgresql-avro-connector-azure",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "1",
    "value.converter": "com.microsoft.azure.schemaregistry.kafka.avro.AvroConverter",
    "value.converter.schema.registry.url": {schemaRegistryUrl},
    "value.converter.schema.group": "postgreskafkaschemagroup",
    "value.converter.tenant.id": {tenantId},
    "value.converter.client.id": {clientId},
    "value.converter.client.secret": {clientSecret},
    "connection.url": {postgresConnectionUrl},
    "connection.user": {user},
    "connection.password": {password},
    "connection.attempts": 3,
    "mode": "incrementing",
    "query": "SELECT * FROM users",
    "table.types": "TABLE",
    "topic.prefix": "users",
    "incrementing.column.name": "id"
  }
}
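One possible workaround, if the converter exposes no auto.register equivalent, is to register the schema yourself before starting the connector, so the converter only ever looks up an existing schema. Below is a minimal sketch using the azure-schemaregistry and azure-identity Python packages. The namespace, group name, and the users schema fields are assumptions inferred from the config above, and the register_schema call reflects my reading of the SDK, so please verify against the current client documentation.

```python
import json

# Hypothetical Avro schema for the `SELECT * FROM users` query above;
# the actual column names and types depend on your table.
USERS_SCHEMA = json.dumps({
    "type": "record",
    "name": "users",
    "fields": [
        {"name": "id", "type": "int"},
        {"name": "name", "type": "string"},
    ],
})

def register_users_schema(namespace: str, group: str, tenant_id: str,
                          client_id: str, client_secret: str) -> None:
    # Deferred imports: requires the azure-schemaregistry and azure-identity packages.
    from azure.identity import ClientSecretCredential
    from azure.schemaregistry import SchemaRegistryClient

    credential = ClientSecretCredential(tenant_id, client_id, client_secret)
    client = SchemaRegistryClient(fully_qualified_namespace=namespace,
                                  credential=credential)
    # As I understand the service, re-registering an identical definition
    # simply returns the existing schema's ID rather than creating a new version.
    props = client.register_schema(group, "users", USERS_SCHEMA, "Avro")
    print("registered schema id:", props.id)
```

For example, calling register_users_schema with your namespace and the "postgreskafkaschemagroup" group from the config would seed the registry before the connector is deployed.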
I would appreciate any help.