[FLINK-33045] Make it possible to disable auto-registering schema in Schema Registry #26662
base: master
Conversation
@flinkbot run azure

1 similar comment

@flinkbot run azure
Force-pushed from 94811df to ba6bd2e
@@ -58,7 +77,7 @@ public ConfluentSchemaRegistryCoder(String subject, SchemaRegistryClient schemaR
     * @param schemaRegistryClient client to connect schema registry
     */
    public ConfluentSchemaRegistryCoder(SchemaRegistryClient schemaRegistryClient) {
-       this.schemaRegistryClient = schemaRegistryClient;
+       this(null, schemaRegistryClient, null);
nit: it is a bit strange to have an optional first parameter. I would expect the mandatory parameters first for all these methods, i.e. the constructors to all start with
ConfluentSchemaRegistryCoder(SchemaRegistryClient schemaRegistryClient, ...)
I realise this is existing code, so it is your call if you want to change it.
That's a valid nit, but the original commit that I took over had that changed already, and it is in this PR — see line 70 in 0b8f50b:
this(subject, schemaRegistryClient, null);
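For readers following along, the delegation pattern being discussed is the classic telescoping-constructor shape, where every overload funnels into one primary constructor. A minimal sketch with placeholder types (the real ConfluentSchemaRegistryCoder takes a SchemaRegistryClient and a properties map; the names below are illustrative only):

```java
// Sketch of the telescoping-constructor delegation discussed above.
// Object stands in for SchemaRegistryClient and Map<String, ?> so the
// sketch compiles on its own; this is not the actual Flink class.
public class CoderSketch {
    private final String subject;              // optional: may be null
    private final Object schemaRegistryClient; // placeholder for SchemaRegistryClient
    private final Object registerSchemaProps;  // placeholder for Map<String, ?>

    // Primary constructor: every overload delegates here.
    public CoderSketch(String subject, Object client, Object props) {
        this.subject = subject;
        this.schemaRegistryClient = client;
        this.registerSchemaProps = props;
    }

    // Convenience overload matching the diff: no subject, no extra properties.
    public CoderSketch(Object client) {
        this(null, client, null);
    }

    public String getSubject() {
        return subject;
    }
}
```

Funneling through one primary constructor keeps the null-defaulting logic in a single place, which is the upside of the `this(null, ...)` call in the diff even when the optional parameter comes first.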
<tr>
  <td><h5>auto.register.schemas</h5></td>
  <td>optional</td>
  <td>yes</td>
Why is this line here but not in the Chinese docs?
Because I don't know Chinese :) So I'll follow the process at https://flink.apache.org/how-to-contribute/contribute-documentation/#chinese-documentation-translation after this PR is merged.
I had assumed we would just add <td>yes</td> here, since the optional above is not translated either, as per the note in the link you sent. But it sounds like you have this in hand.
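For context on what the new option controls: Confluent's serializer clients expose auto.register.schemas as a client property, and this PR surfaces it as a table option (defaulting to yes per the docs table above). Here is a rough sketch of assembling format options as a plain map; the helper name and the exact option key/prefix Flink expects in DDL are assumptions for illustration:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical helper showing how the new option might sit alongside the
// format choice. FormatOptionsSketch is not a real Flink class; the exact
// key prefixing in Flink table DDL may differ.
public class FormatOptionsSketch {
    public static Map<String, String> avroConfluentOptions(boolean autoRegister) {
        Map<String, String> options = new HashMap<>();
        options.put("format", "avro-confluent");
        // The option introduced by this PR; "true" is the documented default.
        options.put("auto.register.schemas", String.valueOf(autoRegister));
        return options;
    }
}
```

Setting the value to "false" is the case motivating FLINK-33045: environments where schemas are governed centrally and producers must not register new schemas themselves.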
@@ -289,7 +289,7 @@ under the License.
        <!-- Indirectly accessed in pyflink_gateway_server -->
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-sql-connector-kafka</artifactId>
-       <version>3.0.0-1.17</version>
+       <version>4.0.0-2.0</version>
is this related to the fix?
It's about keeping versions in sync. Ideally the Python dependencies on connector versions should be moved out of the Flink repo and into the connector repos, but we're not there yet.
Force-pushed from 59f3a77 to 7304dde

Force-pushed from 7304dde to 0b8f50b
What is the purpose of the change

This PR is based on #25410 and aims to complete the necessary tasks. It introduces auto.register.schemas as a table option. Compared to the linked PR, it includes unit tests, a new IT case, and updated documentation.

Brief change log

- Introduced the auto.register.schemas table option
- Added AvroConfluentITCase

Verifying this change

This change added tests and can be verified as follows:

- Added AvroConfluentITCase, which writes to and reads from Kafka using avro-confluent, with the table option set to true (the default) to show that Flink can register the schema, and to false, where it relies on schema registration outside of Flink

Does this pull request potentially affect one of the following parts:

- @Public(Evolving): no

Documentation
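The true/false behavior the IT case exercises can be sketched with a fake in-memory registry. Everything below (class names, the exception, the lookup API) is illustrative only and not the actual Flink or Confluent Schema Registry code:

```java
import java.util.HashMap;
import java.util.Map;

// Fake in-memory stand-in for Schema Registry, keyed by subject.
class FakeRegistry {
    private final Map<String, String> subjects = new HashMap<>();

    void register(String subject, String schema) {
        subjects.put(subject, schema);
    }

    String lookup(String subject) {
        return subjects.get(subject);
    }
}

// Mirrors the two modes of the new table option: with auto-registration on,
// the writer registers the schema itself; with it off, the schema must
// already exist in the registry or the write fails.
class WriterSketch {
    private final boolean autoRegisterSchemas;

    WriterSketch(boolean autoRegisterSchemas) {
        this.autoRegisterSchemas = autoRegisterSchemas;
    }

    String resolveSchema(FakeRegistry registry, String subject, String schema) {
        if (autoRegisterSchemas) {
            registry.register(subject, schema);
        }
        String found = registry.lookup(subject);
        if (found == null) {
            throw new IllegalStateException("schema not pre-registered for " + subject);
        }
        return found;
    }
}
```

With the option on, writing against an empty registry succeeds because the writer registers as it goes; with it off, the same write fails fast, which is the governed-registry scenario the option exists for.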