
Data inserted in a loop is randomly lost #3846


Description

@donghaihu

Versions: Iceberg 0.13; Flink 1.13.2;
Catalog: the problem occurs with both the hadoop and hive catalogs;
No obvious error messages;
The code is as follows:
EnvironmentSettings settings = null;
TableEnvironment tableEnv = null;
String dbName = "test_ods_preview_read_db";
String tableName = "test_ods_preview_read_data";
String warehouse = "hdfs://cdh2:8020/user/hive/warehouse/zbc";
@Before
public void init() {

    settings = EnvironmentSettings.newInstance().build();

    tableEnv = TableEnvironment.create(settings);

    String toWithClause = "(\n" +
            "  'type'='iceberg',\n" +
            "  'catalog-type'='hadoop',\n" +
            "  'clients'='2',\n" +
            "  'property-version'='1',\n" +
            "  'warehouse'='hdfs://cdh2:8020/user/hive/warehouse/zbc'\n" +
            ")";

    tableEnv.executeSql("CREATE CATALOG test_hadoop_catalog WITH " + toWithClause);
    tableEnv.executeSql("USE CATALOG test_hadoop_catalog");
    tableEnv.executeSql("CREATE DATABASE IF NOT EXISTS test_ods_preview_read_db");
    tableEnv.executeSql("USE test_ods_preview_read_db");
    tableEnv.executeSql("CREATE TABLE IF NOT EXISTS test_ods_preview_read_db.test_ods_preview_read_data(sensor_id STRING, ts BIGINT)");
    
    for (int i = 0; i < 10; i++) {
        String sql = String.format("INSERT INTO test_ods_preview_read_data SELECT 'sensor_id_%d',%d" ,i ,i);
        tableEnv.executeSql(sql); // Executed 10 times, but only 1-10 rows end up written at random; not every insert succeeds, and there is no obvious error.
        // Thread.sleep(2000); with this sleep added, the write success rate goes up.
    }

    tableEnv.getConfig().getConfiguration().setBoolean("table.dynamic-table-options.enabled", true);
}

I have not been able to track down the root cause. Help! Thanks.
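
For reference, the symptoms (a random subset of rows written, and a higher success rate with Thread.sleep) look consistent with the INSERT jobs still running when the test method returns, since executeSql submits an INSERT asynchronously. Below is a minimal sketch of the same loop that blocks on each job via TableResult.await() (a method that does exist on org.apache.flink.table.api.TableResult in Flink 1.13); whether asynchronous submission is actually the cause here is an assumption.

    for (int i = 0; i < 10; i++) {
        String sql = String.format("INSERT INTO test_ods_preview_read_data SELECT 'sensor_id_%d', %d", i, i);
        try {
            // executeSql() only submits the INSERT job; await() blocks until that
            // job has finished, so the end of the test cannot overtake an
            // in-flight commit.
            tableEnv.executeSql(sql).await();
        } catch (Exception e) {
            throw new RuntimeException("insert " + i + " failed", e);
        }
    }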
