Describe the bug
When there is more than one filter condition, triggering the Bloom filter causes the task to fail.
To Reproduce
- Enable subquery reuse:
  set spark.sql.execution.reuseSubquery=true;
- Run a query with more than one filter condition so that the Bloom filter is triggered.
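A minimal repro sketch of the steps above. The table and column names (`fact`, `dim`, `key`, `col1`, `col2`) are hypothetical; any join whose build side carries two filter conditions and injects a runtime Bloom filter subquery should exercise the same path:

```sql
-- Assumed setup: a fact/dim join eligible for a runtime Bloom filter.
SET spark.sql.execution.reuseSubquery=true;

SELECT f.*
FROM fact f
JOIN dim d
  ON f.key = d.key
WHERE d.col1 = 'a'   -- first filter condition
  AND d.col2 > 100;  -- second filter condition: with two conditions,
                     -- the reused Bloom filter subquery fails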
Expected behavior
The query completes successfully; enabling subquery reuse should not cause the task to fail.
Screenshots
UI:

executor:

driver:

Additional context
When the subquery is reused, the presence of two filter conditions exposes a bug in the conversion between Arrow and Spark, which corrupts the schema information of the reused subquery.