Flink CDC monitoring a MySQL table: updating the table throws "Encountered change event for table test.student whose schema isn't known to this connector" #50
It's a MySQL database-name case-sensitivity problem. My guess is that Debezium converts the 'Test' I passed in to 'test' and then looks up the corresponding schema, which is why this exception is thrown.

Correct. When Debezium fetches all the databases from MySQL, it compares them against the database name you passed in, and during that comparison your database name is converted to lowercase, so the TableId can't be found.
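The case mismatch the two comments above describe can be illustrated with plain string logic. This is a minimal sketch of the reported behavior, not the actual Debezium code; the variable names are hypothetical:

```java
public class CaseMismatchSketch {
    public static void main(String[] args) {
        // The database name as passed to databaseList()/tableList()
        String configuredDb = "Test";

        // Per the comments above, the configured name is lowercased
        // before being compared against the databases read from MySQL
        String normalized = configuredDb.toLowerCase();

        System.out.println(normalized);                      // test
        System.out.println(normalized.equals(configuredDb)); // false

        // Because "test" no longer matches the original spelling "Test",
        // the TableId lookup misses, and later binlog events for the table
        // fail with "schema isn't known to this connector".
    }
}
```

The practical workaround suggested by the thread is to pass the database and table names in lowercase (e.g. `tableList("test.student")`) so that the comparison succeeds.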
The earliest mode of the MySQL CDC connector doesn't support table names containing underscores. When will this problem be fixed officially?
I tried the official Flink CDC example; the code is as follows:
```java
import com.alibaba.ververica.cdc.connectors.mysql.MySQLSource;
import com.alibaba.ververica.cdc.debezium.StringDebeziumDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;

import java.util.Properties;

public class MySqlBinlogSourceExample {
    public static void main(String[] args) throws Exception {
        Properties properties = new Properties();
        properties.setProperty("database.history.kafka.bootstrap.servers", "kafka1:9092");
        properties.setProperty("database.history.kafka.topic", "demo1");
        properties.setProperty("snapshot.new.tables", "parallel");
        properties.setProperty("key.converter", "org.apache.kafka.connect.json.JsonConverter");
        properties.setProperty("value.converter", "org.apache.kafka.connect.json.JsonConverter");
        properties.setProperty("decimal.handling.mode", "string");
        // properties.setProperty("inconsistent.schema.handling.mode", "warn"); // no error, but binlog events are not shown
        // properties.setProperty("snapshot.mode", "schema_only_recovery");

        SourceFunction<String> sourceFunction = MySQLSource.<String>builder()
                .hostname("localhost") // hostname/port were missing from the pasted snippet; values here are placeholders
                .port(3306)
                // .databaseList("Test")
                .tableList("Test.student") // monitor the Test.student table
                .username("root")
                .password("123")
                .debeziumProperties(properties)
                .deserializer(new StringDebeziumDeserializationSchema()) // converts SourceRecord to String
                .build();

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.addSource(sourceFunction).print();
        env.execute();
    }
}
```
The dependencies are as follows (the Flink version is 1.11.1):

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>com.alibaba.ververica</groupId>
    <artifactId>flink-connector-mysql-cdc</artifactId>
    <version>1.1.0</version>
</dependency>
```
![image](https://user-images.githubusercontent.com/58650751/97846841-9116fe80-1d29-11eb-9b2b-8370524b64c7.png)
When it first starts running, the log output shows that data can be read. But as soon as I perform an update on the MySQL table (inserting a row), the following exception is thrown:
```
Caused by: org.apache.kafka.connect.errors.ConnectException: Encountered change event for table test.student whose schema isn't known to this connector
	at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:230)
	at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:207)
	at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:600)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1130)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:978)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:581)
	at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:860)
	at java.lang.Thread.run(Thread.java:748)
```
PS: MySQL server version is 5.7.31.
As far as I know, Debezium depends on Kafka. Do I need to correctly configure Kafka Connect locally for this to work?