
[SPARK-46287][PYTHON][CONNECT] DataFrame.isEmpty should work with all datatypes #44209

Closed

Conversation

zhengruifeng
Contributor

@zhengruifeng zhengruifeng commented Dec 6, 2023

What changes were proposed in this pull request?

`DataFrame.isEmpty` should work with all datatypes.

The schema may not be compatible with Arrow, so `collect`/`take` should not be used to check `isEmpty`.
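As a rough sketch of the approach (assuming the fix mirrors the Scala client by projecting zero columns before fetching a row; the exact patch may differ, and the helper name below is illustrative):

```python
from pyspark.sql import DataFrame


def is_empty(df: DataFrame) -> bool:
    # Project zero columns first: the row count is preserved, but no column
    # values have to go through Arrow, so Arrow-incompatible types such as
    # YEAR TO MONTH intervals never reach the conversion path.
    return len(df.select().take(1)) == 0
```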

Why are the changes needed?

bugfix

Does this PR introduce any user-facing change?

Before:

```
In [1]: spark.sql("SELECT INTERVAL '10-8' YEAR TO MONTH AS interval").isEmpty()
23/12/06 20:39:58 WARN CheckAllocator: More than one DefaultAllocationManager on classpath. Choosing first found
--------------------------------------------------------------------------- / 1]
KeyError                                  Traceback (most recent call last)
Cell In[1], line 1
----> 1 spark.sql("SELECT INTERVAL '10-8' YEAR TO MONTH AS interval").isEmpty()

File ~/Dev/spark/python/pyspark/sql/connect/dataframe.py:181, in DataFrame.isEmpty(self)
    180 def isEmpty(self) -> bool:
--> 181     return len(self.take(1)) == 0

...

File ~/.dev/miniconda3/envs/spark_dev_311/lib/python3.11/site-packages/pyarrow/public-api.pxi:208, in pyarrow.lib.pyarrow_wrap_array()

File ~/.dev/miniconda3/envs/spark_dev_311/lib/python3.11/site-packages/pyarrow/array.pxi:3659, in pyarrow.lib.get_array_class_from_type()

KeyError: 21
```

After:

```
In [1]: spark.sql("SELECT INTERVAL '10-8' YEAR TO MONTH AS interval").isEmpty()
23/12/06 20:40:26 WARN CheckAllocator: More than one DefaultAllocationManager on classpath. Choosing first found
Out[1]: False
```

How was this patch tested?

Added a unit test.
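For illustration only, a pytest-style sketch of what such a test might look like (the `spark` fixture and the test name are assumptions, not the actual test added in this PR):

```python
def test_is_empty_with_year_month_interval(spark):
    # A YEAR TO MONTH interval column previously broke the Arrow conversion
    # path that isEmpty() relied on; after the fix both calls succeed.
    df = spark.sql("SELECT INTERVAL '10-8' YEAR TO MONTH AS interval")
    assert not df.isEmpty()
    assert df.limit(0).isEmpty()
```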

Was this patch authored or co-authored using generative AI tooling?

no

@zhengruifeng
Contributor Author

cc @HyukjinKwon

@zhengruifeng
Contributor Author

The Scala client doesn't have this issue:

```scala
def isEmpty: Boolean = select().limit(1).withResult { result =>
  result.length == 0
}
```


@dongjoon-hyun dongjoon-hyun left a comment


+1, LGTM.

@dongjoon-hyun
Member

Merged to master. Thank you all!

@zhengruifeng zhengruifeng deleted the py_connect_df_isempty branch December 7, 2023 00:24
@zhengruifeng
Contributor Author

Thank you @dongjoon-hyun @HyukjinKwon for the reviews.

dbatomic pushed a commit to dbatomic/spark that referenced this pull request Dec 11, 2023
Closes apache#44209 from zhengruifeng/py_connect_df_isempty.

Authored-by: Ruifeng Zheng <ruifengz@apache.org>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>