
[SUPPORT] After upgrading to 0.11.1, DELETE in Spark SQL misbehaves #6146

@bigdata-spec

Description


The environment is CDH 6.3.2 with Hudi 0.11.1.

I want to test DELETE in Spark SQL. The table contains 4 records, with vehicle_model_id values [100, 101, 102, 105].

When I run the following in Spark SQL:

delete from zone_test.hudi_spark_table0719_0101 where vehicle_model_id = '102';

it works fine, and 3 records remain in the table.

But when I run:

delete from zone_test.hudi_spark_table0719_0101 where vehicle_model_id = '109';

(109 does not exist in the table), the statement does not behave as expected (see screenshot).

When I then query the table through Hive, the result is also wrong (see screenshot), while querying through Trino returns normal results.

Is Hudi unable to delete a record that does not exist?
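For reference, the steps above can be condensed into a minimal reproduction sketch (database, table, and column names are those from the report; the comments restate the observed behavior):

```sql
-- Table initially holds 4 records: vehicle_model_id in (100, 101, 102, 105)

-- Case 1: predicate matches an existing record — works, 3 records remain
DELETE FROM zone_test.hudi_spark_table0719_0101 WHERE vehicle_model_id = '102';

-- Case 2: predicate matches no record (109 is not in the table) — this is the failing case
DELETE FROM zone_test.hudi_spark_table0719_0101 WHERE vehicle_model_id = '109';
```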

Metadata

Labels

area:sql (SQL interfaces), priority:high (Significant impact; potential bugs)


Status

✅ Done
